Consumer Reports Car Rating Methodology Is Flawed
Generally speaking, Consumer Reports provides useful and informative buying advice and product reviews for a wide range of products. From flat-screen TVs to vacuum cleaners to car seats, Consumer Reports often uses a careful and refined testing procedure that generates some great advice.
As a long-time reader and current subscriber, I am 100% satisfied with Consumer Reports...except for their automotive reliability rankings.
Here's why: Car rankings are based exclusively on surveys completed by Consumer Reports (CR) readers. This, in my view, is a fatally flawed approach.
1. CR subscribers aren't representative of the general public. Quantcast.com, which estimates demographic and user data for millions of websites, has provided the following demographic "snapshot" of ConsumerReports.org (see the original report here):
Demographic data about the ConsumerReports.org website audience, as determined by Quantcast.com
As you can see, the typical ConsumerReports.org visitor is more likely to be wealthy ($100k+ annual household income) and college educated. While there's nothing wrong with being wealthy or educated, I suspect these consumers are a bit biased against American car brands.
- Hybrid buyers, for example, are significantly more likely to be educated and wealthy (see Scarborough Research). The best-selling hybrid? A Toyota.
- American cars don't sell well in the country's wealthiest cities, additional evidence that wealthy people are biased against American vehicle brands.
For anyone who thinks that Quantcast's data might be off, check out this 2009 study of CR's auto buying guide, which was sponsored by CR. According to the data on page 34, the average CR reader (either online or via magazine subscription) is wealthier and more educated than average.
2. CR data is noisy. By "noisy," I mean that it varies quite a bit from year to year. In this year's study, Volvo and Chrysler fell 10 and 8 spots in the rankings, while GMC, Cadillac, and Audi skyrocketed 10, 14, and 16 (!) slots.
Are we honestly supposed to believe that Audi was ranked as one of the least reliable brands last year, and yet somehow ranked top 10 in reliability this year? This is obviously the result of a limited amount of data, which brings me to...
3. CR uses as few as 100 surveys to rate vehicles! That's right, folks - 100 measly surveys is all it takes for Consumer Reports to assess a specific vehicle's reliability rating.
100 data points is hardly enough to form a scientific evaluation - it's embarrassing that CR would admit to this methodology, but they've done precisely that:
...The scores are presented as a percentage better or worse than the average of all cars. The minimum sample size is 100 vehicles, but Consumer Reports often gets many more.
While CR might "often" get hundreds of surveys, this hardly seems like a rigorous system. It also explains Audi's wild swing in the rankings, doesn't it?
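To put that minimum sample size in perspective, here's a quick back-of-the-envelope sketch (the 15% problem rate below is a hypothetical number for illustration, not CR data): with only 100 surveys, the standard 95% margin of error on an observed problem rate is enormous.

```python
import math

n = 100    # CR's stated minimum sample size per vehicle
p = 0.15   # hypothetical problem rate: 15 problems reported per 100 vehicles

# Standard 95% confidence half-width for a sample proportion:
# 1.96 * sqrt(p * (1 - p) / n)
moe = 1.96 * math.sqrt(p * (1 - p) / n)

print(f"observed rate: {p:.0%} +/- {moe:.1%}")  # roughly 15% +/- 7%
```

In other words, at this sample size a brand that looks nearly twice as problem-prone as another could be statistically indistinguishable from it, which would go a long way toward explaining those 14- and 16-slot jumps.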
The bottom line: Don't trust Consumer Reports quality and reliability data, at least as far as automobiles are concerned.
At best, use CR automotive rankings as a supplement to other data sources. See their official 2012 rankings here.