Reliability and Satisfaction: What the Measures Mean
More than 63,000 PCWorld readers responded to our online and print advertisements or email messages and volunteered to participate in our survey. With the help of statistical consultant Ferd Britton, we analyzed the resulting survey data to determine which companies’ numbers were reliably above or below the average of all responses for a particular product type. Note that our survey results don’t necessarily reflect the opinions of a given company’s customers as a whole. And because our data comes only from PCWorld readers who chose to take part in the survey, our results don’t necessarily reflect the opinions of PCWorld readers in general.
About the Survey
PCWorld readers rated hardware vendors in seven product categories: laptop PCs, desktop PCs, tablets, printers, smartphones, HDTVs, and digital cameras. For each category, our survey included at least four measures of the reliability of a brand’s products, such as failed components (a laptop hard drive, say) and problems that the user noticed right away (“out of the box”).
In the laptop, desktop, and printer categories, we also asked readers about their experiences with customer support.
This year’s survey included a series of questions asking readers how satisfied they were with the performance or specific features of a brand’s products (Samsung smartphone owners were asked to rate the phone’s touchscreen, for instance).
For each reliability, service, and product satisfaction measure, we determined whether the vendor’s score was significantly better than, not significantly different from, or significantly worse than the average of its peers.
If a vendor received fewer than 50 responses in a subsection, we discarded those results as too small a sample to analyze reliably. This threshold prevented us from rating some companies.
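The peer-average comparison described above can be sketched as a standard two-proportion z-test. The actual statistical method used for the survey isn’t specified, so the test choice, the function name, and the significance level here are illustrative assumptions:

```python
from math import sqrt
from statistics import NormalDist

MIN_RESPONSES = 50  # vendors below this cutoff are not rated


def rate_vendor(vendor_problems, vendor_n, peer_problems, peer_n, alpha=0.05):
    """Classify a vendor's problem rate against its peers' pooled rate.

    Returns 'better', 'average', 'worse', or 'not rated' (too few responses).
    A generic two-proportion z-test sketch, not the survey's actual method.
    """
    if vendor_n < MIN_RESPONSES:
        return "not rated"
    p_vendor = vendor_problems / vendor_n
    p_peers = peer_problems / peer_n
    # Pooled proportion under the null hypothesis of no difference
    p_pool = (vendor_problems + peer_problems) / (vendor_n + peer_n)
    se = sqrt(p_pool * (1 - p_pool) * (1 / vendor_n + 1 / peer_n))
    if se == 0:
        return "average"
    z = (p_vendor - p_peers) / se
    # Two-sided p-value from the standard normal distribution
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    if p_value >= alpha:
        return "average"
    return "better" if p_vendor < p_peers else "worse"  # fewer problems is better
```

For example, a vendor with 5 reported problems in 100 responses, against peers with 200 problems in 1,000 responses, would rate significantly better than average under this sketch.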
We rated smartphone makers on four reliability criteria and five ease-of-use criteria. For wireless carriers that sell smartphones, we evaluated five aspects of their customer support and two aspects of their network performance: wireless Internet service quality and voice call quality.
Reliability Measures
Problems on arrival (all devices): Based on the percentage of survey respondents who reported any problem with the device out of the box.
Any significant problem (all devices): Based on the percentage of survey respondents who reported any problem at all during the product’s lifetime.
Any failed component replaced (laptop and desktop PCs): Based on the percentage of survey respondents who reported replacing one or more original components because the components had failed.
Dead PC (laptop and desktop PCs): Based on the percentage of survey respondents who reported a failure of the processor, motherboard, power supply, hard drive, system memory, or graphics board/chip at any time during the life of their laptop or desktop PC.
Severe problem (HDTVs, phones, cameras, tablets, and printers): Based on the percentage of survey respondents who reported a problem that rendered their device unusable.
Overall satisfaction with reliability (all devices): Based on the owner’s overall satisfaction with the reliability of the device.
Service Measures (Laptop PCs, Desktop PCs, and Printers)
Phone hold time: Based on the average time a product’s owners waited on hold to speak to a phone support representative.
Average phone and Web service rating: Based on readers’ ratings of several aspects of their experience in using the company’s phone-based or Web-based technical support services. Among the factors considered were whether the information was easy to understand, and whether the phone support rep spoke clearly and knowledgeably.
Unresolved problem: Based on the percentage of survey respondents who said their problem was never fixed despite contacting the company’s support service.
Service experience: Based on readers’ responses to a series of questions focusing on 11 specific aspects of their experience with the company’s service department.