Last updated: May 2019
In my opinion, Verizon has the most reliable nationwide network. AT&T’s network is also quite good. T-Mobile’s network is notably worse than AT&T’s or Verizon’s networks, but T-Mobile’s network still surpasses Sprint’s network in terms of reliability.
For people who travel often, especially outside of metro areas, nationwide network quality is an important consideration when selecting a carrier. For those who tend to stay in a few small areas, I suggest placing little weight on nationwide assessments and instead investigating coverage quality in the relevant areas.
There are four U.S. wireless networks with nationwide infrastructure: AT&T, T-Mobile, Verizon, and Sprint. These carriers are sometimes referred to collectively as the “Big Four.” Any other carrier that offers nationwide service in the U.S. is, at least in some regions, piggybacking off of at least one of the Big Four networks. Accordingly, this article’s focus is the relative quality of Sprint, Verizon, AT&T, and T-Mobile’s nationwide networks.
In my opinion, average download speed is an overrated metric of network quality. For most consumers, it’s going to be more important that a network can consistently deliver good enough speeds to perform the activities they want to do on their phones. Accordingly, I’ll place more weight on measures of network reliability than measures of raw speeds.
I believe RootMetrics’ methodology is particularly well-suited for assessing nationwide reliability. RootMetrics drives high-end phones connected to the major networks all around the country. During the drives, the phones conduct tests of network performance. RootMetrics’ drive testing has substantial geographic coverage, and RootMetrics’ nationwide results may be less prone to selection bias than other companies’ results.
As of April 2019, RootMetrics’ most recent national report is based on data collected in the second half of 2018. In that report, Verizon takes the top spot overall with AT&T coming in a close second. T-Mobile comes in a more distant third. Sprint takes the last spot in the rankings. Here’s a snapshot showing the “Overall performance” scores:
The rankings are unchanged and the scores are quite similar when looking at the scores that specifically assess network reliability:
Verizon didn’t just do well in the most recent reporting period. Verizon has taken the top overall ranking and the top network reliability ranking in every biannual period since at least the second half of 2013.
Unfortunately, RootMetrics is not transparent about how exactly it reaches its final scores or what exactly the scores mean. While I expect RootMetrics does a substantially better job of ranking major networks in terms of nationwide quality than other evaluators, it may still be worth considering the results found by other evaluators that use somewhat-defensible methodologies.
Opensignal relies on crowdsourced data from consumers. While I’m concerned that Opensignal’s nationwide results may be affected by massive selection bias—especially geographic selection bias—the methodology Opensignal uses has a lot going for it. Opensignal tests real-world scenarios with actual consumers and a lot of these tests take place indoors.
As of April 2019, Opensignal’s most recent report was published in January 2019. The report was based on a data collection period from mid-September to mid-December of 2018. The report doesn’t appear to list an overall winner but instead ranks carriers in five separate categories. Verizon alone wins in 3 of the 5 categories, and Verizon ties for the top spot in another. Here’s a snapshot of the results in the “4G availability” category (the category I think is most relevant to overall reliability):
I’m not convinced that T-Mobile actually has substantially more reliable 4G coverage than AT&T. T-Mobile may have an atypical proportion of its subscriber base in densely populated areas. That might cause the results to skew in T-Mobile’s favor. It’s also worth keeping in mind that the category only accounts for 4G reliability. AT&T likely has substantially better coverage than T-Mobile under pre-4G technologies.
I would like to see analyses of Opensignal’s data that include attempts to control for geographic differences and other potential sources of selection bias. Unless I see Opensignal transparently attempting these sorts of analyses, I expect I’ll remain awfully skeptical of Opensignal’s assessments. That’s a shame. I expect a lot of valuable insights could be gleaned from Opensignal’s data.
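To illustrate why geographic skew could matter so much, here’s a toy simulation with made-up numbers (none of them reflect any real carrier’s data). Two hypothetical carriers have identical true 4G availability in every region, but one carrier’s testers are concentrated in urban areas where availability is higher. A naive average of test results makes that carrier look better; reweighting results to match the actual population distribution (post-stratification) removes the artifact:

```python
# Hypothetical illustration of geographic selection bias in crowdsourced
# network tests. All numbers are invented for demonstration purposes.

# True 4G availability by region type (identical for both carriers here).
availability = {"urban": 0.95, "rural": 0.75}

# Where each carrier's *testers* happen to be. Carrier B's testers are
# concentrated in urban areas; Carrier A's mirror the population.
tester_share = {
    "A": {"urban": 0.60, "rural": 0.40},
    "B": {"urban": 0.90, "rural": 0.10},
}

# Actual population distribution, used as post-stratification weights.
population_share = {"urban": 0.60, "rural": 0.40}

def naive_score(carrier):
    """Average availability weighted by where the testers happen to be."""
    return sum(tester_share[carrier][r] * availability[r] for r in availability)

def reweighted_score(carrier):
    """Average availability reweighted to the true population distribution.

    In this toy example availability is the same for both carriers, so
    reweighting equalizes their scores, exposing the naive gap as an
    artifact of where the testers are rather than of network quality.
    """
    return sum(population_share[r] * availability[r] for r in availability)

for c in ("A", "B"):
    print(c, round(naive_score(c), 3), round(reweighted_score(c), 3))
# Naive scores: A = 0.87, B = 0.93 — B looks 6 points better.
# Reweighted scores: both 0.87 — the networks are actually identical.
```

This is the kind of analysis I’d like to see evaluators publish: real post-stratification would need per-region, per-carrier results and finer geographic bins, but the basic mechanics are the same.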
I don’t take other evaluation firms’ nationwide results too seriously due to limited public information and/or concerns about methodologies. That said, if you want to take other firms’ results as weak signals, it looks like these firms generally arrive at results similar to RootMetrics’ results. Based on my interpretation, Consumer Reports’ metrics related to network quality come out in Verizon’s favor, with AT&T and T-Mobile coming next, and Sprint coming in last. Tutela’s most recent U.S. report as of April 2019 isn’t exactly national in scope, but it covers 10 large cities. Verizon takes the top ranking among Big Four carriers in 9 of the 10 cities.
None of the third-party firms evaluating wireless network performance are as transparent as I would like. Despite the lack of transparency, I expect RootMetrics’ methodology is better suited for assessing nationwide network quality than other evaluators’ methodologies. RootMetrics’ results suggest that Verizon has the best nationwide network followed in order by AT&T, T-Mobile, and Sprint. This ordering fits with my personal experience and is, for the most part, consistent with recent results published by other somewhat-rigorous evaluation firms.
- Some operators with only regional infrastructure offer nationwide service via roaming agreements that allow subscribers to use other networks when outside of the covered region. Mobile virtual network operators don’t have their own network infrastructure and instead resell access to other networks.
- With crowdsourced data, I worry that users on some networks may be systematically different from users on other networks. Differences in network quality based on crowdsourced data could be due to either true differences in networks’ quality or differences in the behavior and location of users on each network.
RootMetrics’ tests are pretty well-controlled. High-end devices are used, and the phones from major networks all conduct tests in the same locations at the same times. I go into more detail in my article on RootMetrics’ methodology.
- I’m unsure whether RootMetrics provided the same kind of rankings prior to 2013. To see RootScore Awards in previous reporting periods, scroll to the bottom of the most recent report.
- “After the tests are evaluated for accuracy, the results are converted into scores using a proprietary algorithm.”
From RootMetrics’ Methodology page as of 4/23/2019 (archived here).
- While my impression is that RootMetrics primarily conducts drive tests, its drivers occasionally bring phones indoors, and the drive test data is supplemented with some crowdsourced data. Unfortunately, RootMetrics isn’t transparent about the weight different kinds of testing receive in analyses. I go into more detail in my page on RootMetrics’ methodology.
- The data collection period at the top of the report (archived here) is noted as “Sep 16 – Dec 14, 2018”.
- Verizon takes the top spot for 4G availability, upload speed experience, and video experience. Verizon and T-Mobile tie for the top spot in the download speed experience category. AT&T alone takes the top spot in the latency experience category.
- Past reports can be accessed here.
- For example, Nielsen might collect useful data and perform useful analysis, but I’ve been unable to find much information about either its results or its methodology.
- For what it’s worth, I’m really negative on Consumer Reports’ methodology for assessing wireless services. It may even be better to ignore the results than to treat them as a weak signal. Based on my interpretation of the 2017 results (the most recent as of April 2019), Consumer Reports has four metrics related to network quality. Verizon gets four “good” ratings on these metrics. AT&T gets two “good” ratings and two “fair” ratings. T-Mobile gets three “fair” ratings and one “good” rating. Sprint gets two “fair” ratings and two “poor” ratings. Subscribers to Consumer Reports can see the results here.
- T-Mobile narrowly beats Verizon in Houston. In a few cities, a mobile virtual network operator beats all of the Big Four carriers for the top ranking.