Misleading Gimmicks from Consumer Reports

"You better cut the pizza in four pieces because I'm not hungry enough to eat six."
— Yogi Berra (allegedly)

The other day I received an envelope from Consumer Reports. It was soliciting contributions for a raffle fundraiser. The mailing had nine raffle tickets in it. Consumer Reports was requesting that I send back the tickets with a suggested donation of $9 (one dollar per ticket). The mailing had a lot of paper.

The raffle had a grand prize that would be the choice of an undisclosed, top-rated car or $35,000. There were a number of smaller prizes bringing the total amount up for grabs to about $50,000.

The materials included a lot of gimmicky text. Things like:

  • “If you’ve been issued the top winning raffle number, then 1 of those tickets is definitely the winner of a top-rated car — or $35,000 in cash.”
  • “Why risk throwing away what could be a huge pay day?”
  • “There’s a very real chance you could be the winner of our grand prize car!”

Consumer Reports also indicates that they’ll send a free, surprise gift to anyone who donates $10 or more.

It feels funny to be asked to donate money based on the premise that I might win more than I donate, but I get it. Fundraising gimmicks work.

I don’t think it would be productive, or even reasonable, to pretend that humans are (or even ought to be) creatures that donate to non-profits for exclusively cold, calculated, altruistic reasons.

That said, I get frustrated when the gimmicks border on dishonest. Doubly so when the deception comes from an organization like Consumer Reports, which brands itself as committed to integrity.

One of the pieces of paper in the mailing came folded with print on each side. Here’s the first part that was visible:

Unfolding that paper and looking on the other side, I found a letter from someone involved in Consumer Reports marketing. The letter goes to some length to explain that it would be silly to not at least see if I had winning tickets. Here’s a bit of it:

It amazes me that among the many people who receive our Consumer Reports Raffle Tickets — containing multiple tickets, mind you, not just one — some choose not to mail them in. And they do this, despite the fact there is no donation required for someone to find out if he or she has won… So when people don’t respond it doesn’t make any sense to me at all.

This is ridiculous on several levels.

First, the multiple tickets bit is silly. It’s like the Yogi Berra line at the opening of the post. It doesn’t matter how many tickets I have unless I get more tickets than the typical person.

Second, it seems pretty obvious that Consumer Reports doesn’t care if a non-donor decides not to turn in tickets. The most plausible explanation for why Consumer Reports includes the orange letter is that people who would ignore the mailing may end up feeling guilty enough to make a donation. Checking the “I choose not to donate at this time, but please enter me in the Raffle” box on the envelope doesn’t feel great.

Finally, it makes perfect sense why I might not respond. Writing my name on each ticket, reading the materials, and mailing the tickets takes time. My odds of winning are low. I’d also have to pay for a stamp. Nothing about the rationale for not sending in the tickets is complicated. Consumer Reports knows that.

I’ll pretend that the only reason not to participate is that the stamp used to mail in the tickets isn’t free. That stamp is 55 cents at the moment.[1] Is my expected reward greater than 55 cents?

Consumer Reports has about 6 million subscribers.[2]

Let’s give Consumer Reports the benefit of the doubt and assume it can print everything, send the initial mailings, handle the logistics of the raffle, and send gifts back to donors for only $0.50 per subscriber. That puts the promotion’s cost at about $3 million. The $50,000 in prizes is trivial in comparison. Let’s assume Consumer Reports runs the promotion expecting the additional donations it brings in to cover that cost.

The suggested donation is $9. Let’s say the average additional funding brought in by this campaign comes out to $10 per respondent.[3]

To break even, Consumer Reports needs about 300,000 respondents ($3 million ÷ $10 per respondent).

With 300,000 respondents, nine tickets each, and $50,000 in prizes, the expected return is about 1.7 cents per ticket.[4] Sixteen cents total.[5] Not even close to the cost of a stamp.
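
The back-of-envelope math above can be sketched in a few lines. This is only a rough check under the post's stated assumptions (a $3 million promotion cost, prizes spread across returned tickets only, 300,000 respondents at nine tickets each); the exact cent figures depend on the footnoted details.

```python
# Rough expected-value check for the raffle, using the post's assumptions.
# Assumes the $50,000 in prizes is spread across returned tickets only.
SUBSCRIBERS = 6_000_000
COST_PER_SUBSCRIBER = 0.50      # assumed all-in promotion cost per mailing
AVG_NET_DONATION = 10.00        # assumed additional funding per respondent
TICKETS_PER_MAILING = 9
TOTAL_PRIZES = 50_000.00
STAMP = 0.55                    # first-class stamp price at the time

promotion_cost = SUBSCRIBERS * COST_PER_SUBSCRIBER          # $3,000,000
break_even_respondents = promotion_cost / AVG_NET_DONATION  # 300,000

tickets_in_pool = break_even_respondents * TICKETS_PER_MAILING
ev_per_ticket = TOTAL_PRIZES / tickets_in_pool
ev_per_mailing = ev_per_ticket * TICKETS_PER_MAILING        # well under $0.55

print(f"Break-even respondents: {break_even_respondents:,.0f}")
print(f"Expected return per mailing: ${ev_per_mailing:.2f}")
print(f"Worth a ${STAMP:.2f} stamp? {ev_per_mailing > STAMP}")
```

Under these assumptions the expected return per mailing comes out to roughly 17 cents, in the same ballpark as the post's sixteen-cent figure and nowhere near the cost of a 55-cent stamp.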

4/12/2019 Update: I received a second, almost-identical mailing in early April.

Issues with Consumer Reports’ 2017 Cell Phone Plan Rankings


Consumer Reports offers ratings of cellular service providers based on survey data collected from Consumer Reports subscribers. Through subscriber surveying in 2017, Consumer Reports collected data on seven metrics:[1]

  1. Value
  2. Data service quality
  3. Voice service quality
  4. Text service quality
  5. Web service quality
  6. Telemarketing call frequency
  7. Support service quality

I don’t understand why data service and web service were evaluated with separate metrics. Unfortunately, Consumer Reports doesn’t offer much information about the methodology behind the survey.

The surveys collected data from over 100,000 subscribers.[2] I believe Consumer Reports would frown upon a granular discussion of the exact survey results, so I’ll remain vague about exact ratings in this post. Consumer Reports subscribers who would like to see the full results of the survey can do so here.

Survey results

Results are reported for 20 service providers. Most of these providers are mobile virtual network operators (MVNOs). MVNOs don’t operate their own network hardware but make use of other companies’ networks. For the most part, MVNOs use networks provided by the Big Four (Verizon, Sprint, AT&T, and T-Mobile).

Interestingly, the Big Four do poorly in Consumer Reports’ evaluation. Verizon, AT&T, and Sprint receive the lowest overall ratings and take the last three spots. T-Mobile doesn’t do much better.

This is surprising. The Big Four do terribly, even though most MVNOs use the Big Four’s networks. Generally, I would expect the Big Four to offer their direct subscribers network access that is as good as or better than what MVNO subscribers receive.

It’s possible that the good ratings can be explained by MVNOs offering prices and customer service far better than the Big Four—making them deserving of the high ratings for reasons separate from network quality.

Testing the survey’s validity

To test the reliability of Consumer Reports’ methodology, we can compare MVNOs to the Big Four using only the metrics about network quality (ignoring measures of value, telemarketing call frequency, and support quality). In many cases, MVNOs use more than one of the Big Four’s networks. However, several MVNOs use only one network, allowing for easy apples-to-apples comparisons.[3]

  • Boost Mobile is owned by Sprint.
  • Virgin Mobile is owned by Sprint.
  • Cricket Wireless is owned by AT&T.
  • MetroPCS is owned by T-Mobile.
  • GreatCall runs exclusively on Verizon’s network.
  • Page Plus Cellular runs exclusively on Verizon’s network.

When comparing network quality ratings between these MVNOs and the companies that run their networks:

  • Boost Mobile’s ratings beat Sprint’s ratings in every category.
  • Virgin Mobile’s ratings beat Sprint’s ratings in every category.
  • Cricket Wireless’s ratings beat or tie AT&T’s ratings in every category.
  • MetroPCS’s ratings beat or tie T-Mobile’s ratings in every category.
  • GreatCall doesn’t have a rating for web quality due to insufficient data. GreatCall’s ratings match or beat Verizon in the other categories.
  • Page Plus Cellular doesn’t have a rating for web quality due to insufficient data. Page Plus’ ratings match or beat Verizon in the other categories.
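
The apples-to-apples check described above can be sketched as a small script. The MVNO-to-host-network mapping comes from the post, but the numeric ratings below are hypothetical placeholders (I'm deliberately staying vague about Consumer Reports' actual scores), so this only illustrates the comparison, not the real data.

```python
# Compare each single-network MVNO against its host carrier on the
# network-quality metrics only (data, voice, text, web service).
# Host mapping is from the post; the 1-5 ratings are made-up placeholders.
NETWORK_METRICS = ("data", "voice", "text", "web")

HOST = {
    "Boost Mobile": "Sprint",
    "Virgin Mobile": "Sprint",
    "Cricket Wireless": "AT&T",
    "MetroPCS": "T-Mobile",
}

RATINGS = {  # hypothetical ratings, for illustration only
    "Sprint":           {"data": 2, "voice": 3, "text": 3, "web": 2},
    "AT&T":             {"data": 3, "voice": 3, "text": 4, "web": 3},
    "T-Mobile":         {"data": 3, "voice": 4, "text": 4, "web": 3},
    "Boost Mobile":     {"data": 4, "voice": 4, "text": 4, "web": 4},
    "Virgin Mobile":    {"data": 3, "voice": 4, "text": 4, "web": 3},
    "Cricket Wireless": {"data": 3, "voice": 4, "text": 4, "web": 3},
    "MetroPCS":         {"data": 4, "voice": 4, "text": 4, "web": 4},
}

def matches_or_beats_host(mvno: str) -> bool:
    """True if the MVNO ties or beats its host on every network metric."""
    host, mine = RATINGS[HOST[mvno]], RATINGS[mvno]
    return all(mine[m] >= host[m] for m in NETWORK_METRICS)

for mvno in HOST:
    print(f"{mvno} matches or beats {HOST[mvno]}: {matches_or_beats_host(mvno)}")
```

If the surveys measured network quality accurately, you would expect each MVNO to score roughly the same as its host network, not to consistently tie or beat it across the board.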
Taken at face value, these are odd results. There are complicated stories you could tell to salvage the results, but I think it’s much more plausible that Consumer Reports’ surveys just don’t work well for evaluating the relative quality of cell phone service providers.

Why aren’t the results reliable?

I’m not sure why the surveys don’t work, but I see three promising explanations:

  • Metrics may not be evaluated independently. For example, consumers might take a service’s price into account when providing a rating of its voice quality.
  • Lack of objectivity. Consumers may not provide objective evaluations. Perhaps consumers are aware of some general stigma about Sprint that unfairly affects how they rate Sprint’s quality (while that same stigma isn’t applied to MVNOs that use Sprint’s network).
  • Selection bias. Individuals who subscribe to one carrier are probably, on average, different from individuals who subscribe to another carrier. Perhaps individuals who have used Carrier A tend to use small amounts of data and are lenient when rating data service quality. Individuals who have used Carrier B may get more upset about data quality issues. Consumer Cellular took the top spot in the 2017 rankings. I don’t think it’s coincidental that Consumer Cellular has pursued branding and marketing strategies to target senior citizens.[4]

Consumer Reports’ website gives the impression that their cell phone plan rankings will be reliable for comparison purposes.[5] They won’t be.

The ratings do capture whether survey respondents are happy with their services. However, the ratings have serious limitations for shoppers trying to assess whether they’ll be satisfied with a given service.

I suspect Consumer Reports’ ratings for other product categories that rely on similar surveys will also be unreliable. However, the concerns I’m raising only apply to a subset of Consumer Reports’ evaluations. A lot of Consumer Reports’ work is based on product testing rather than consumer surveys.