Seventh
Measuring Broadband America
Fixed Broadband Report
A Report on Consumer Fixed Broadband Performance
in the United States
Federal Communications Commission
Office of Engineering and Technology
Table of Contents
- Chart 1: Maximum advertised download speed among the measured service tiers
- Chart 2: Consumer migration to higher advertised download speeds
- Chart 3: Median download speeds by ISP
- Chart 4: The ratio of weighted median speed to advertised speed for each ISP
- Chart 5: The percentage of consumers whose median download speed was greater than 95%, between 80% and 95%, and less than 80% of the advertised download speed
- Chart 6: The ratio of 80/80 consistent median download speed to advertised download speed
- Chart 7: Latency by ISP
- Chart 8: Percentage of consumers whose peak-period packet loss was less than 0.4%, between 0.4% and 1%, or greater than 1%, by ISP
- Chart 9: Average webpage download time, by advertised download speed
- Chart 10: Maximum advertised upload speed among the measured service tiers
- Chart 11: Median upload speeds by ISP
- Chart 12: Median download and upload speeds by technology
- Chart 13.1: The ratio of median download speed to advertised download speed
- Chart 13.2: The ratio of median upload speed to advertised upload speed
- Chart 14: The percentage of consumers whose median upload speed was greater than 95%, between 80% and 95%, and less than 80% of the advertised upload speed
- Chart 15.1: Complementary cumulative distribution of the ratio of median download speed to advertised download speed
- Chart 15.2: Complementary cumulative distribution of the ratio of median download speed to advertised download speed (continued)
- Chart 15.3: Complementary cumulative distribution of the ratio of median download speed to advertised download speed, by technology
- Chart 15.4: Complementary cumulative distribution of the ratio of median upload speed to advertised upload speed
- Chart 15.5: Complementary cumulative distribution of the ratio of median upload speed to advertised upload speed (continued)
- Chart 15.6: Complementary cumulative distribution of the ratio of median upload speed to advertised upload speed, by technology
- Chart 16.1: The ratio of median download speed to advertised download speed, peak versus off-peak
- Chart 16.2: The ratio of median upload speed to advertised upload speed, peak versus off-peak
- Chart 17.1: The ratio of median download speed to advertised download speed, M-F 2 hour time blocks, terrestrial ISPs
- Chart 17.2: The ratio of median download speed to advertised download speed, M-F 2 hour time blocks, satellite ISPs
- Chart 18.1: The ratio of 80/80 consistent upload speed to advertised upload speed
- Chart 18.2: The ratio of 70/70 consistent download speed to advertised download speed
- Chart 18.3: The ratio of 70/70 consistent upload speed to advertised upload speed
- Chart 19: Latency, by technology and by advertised download speed
- Chart 20.1: The ratio of median download speed to advertised download speed, by ISP (0-5 Mbps)
- Chart 20.2: The ratio of median download speed to advertised download speed, by ISP (6-10 Mbps)
- Chart 20.3: The ratio of median download speed to advertised download speed, by ISP (12-15 Mbps)
- Chart 20.4: The ratio of median download speed to advertised download speed, by ISP (18-25 Mbps)
- Chart 20.5: The ratio of median download speed to advertised download speed, by ISP (30-50 Mbps)
- Chart 20.6: The ratio of median download speed to advertised download speed, by ISP (60-200 Mbps)
- Chart 21.1: The ratio of median upload speed to advertised upload speed, by ISP (0.256-0.64 Mbps)
- Chart 21.2: The ratio of median upload speed to advertised upload speed, by ISP (0.768-1.5 Mbps)
- Chart 21.3: The ratio of median upload speed to advertised upload speed, by ISP (2-5 Mbps)
- Chart 21.4: The ratio of median upload speed to advertised upload speed, by ISP (6-10 Mbps)
- Chart 21.5: The ratio of median upload speed to advertised upload speed, by ISP (20-100 Mbps)
- Chart 22.1: The percentage of consumers whose median download speed was greater than 95%, between 80% and 95%, and less than 80% of the advertised download speed, by service tier (DSL)
- Chart 22.2: The percentage of consumers whose median download speed was greater than 95%, between 80% and 95%, and less than 80% of the advertised download speed (cable)
- Chart 22.3: The percentage of consumers whose median download speed was greater than 95%, between 80% and 95%, and less than 80% of the advertised download speed (fiber and satellite)
- Chart 23.1: The percentage of consumers whose median upload speed was greater than 95%, between 80% and 95%, and less than 80% of the advertised upload speed (DSL)
- Chart 23.2: The percentage of consumers whose median upload speed was greater than 95%, between 80% and 95%, and less than 80% of the advertised upload speed (cable)
- Chart 23.3: The percentage of consumers whose median upload speed was greater than 95%, between 80% and 95%, and less than 80% of the advertised upload speed (fiber and satellite)
- Chart 24.1: Average webpage download time, by ISP (1-3 Mbps)
- Chart 24.2: Average webpage download time, by ISP (5-10 Mbps)
- Chart 24.3: Average webpage download time, by ISP (12-15 Mbps)
- Chart 24.4: Average webpage download time, by ISP (18-25 Mbps)
- Chart 24.5: Average webpage download time, by ISP (30-50 Mbps)
- Chart 24.6: Average webpage download time, by ISP (60-200 Mbps)
- Table 1: List of ISP service tiers whose broadband performance was measured in this report
- Table 2: Peak Period Median download speed, by ISP
- Table 3: Complementary cumulative distribution of the ratio of median download speed to advertised download speed, by technology, by ISP
- Table 4: Complementary cumulative distribution of the ratio of median upload speed to advertised upload speed, by technology, by ISP
1. Executive Summary
The Seventh Measuring Broadband America Fixed Broadband Report (“Seventh Report”) discusses data collected and validated in September 2016 from fixed Internet Service Providers (ISPs) as part of the Federal Communications Commission’s (FCC) Measuring Broadband America (MBA) program. This program is an ongoing, rigorous, nationwide study of consumer broadband performance in the United States. We measure the network performance delivered on selected service tiers to a representative sample set of the population. The thousands of volunteer panelists are drawn from subscribers of Internet service providers serving over 80% of the residential marketplace.[1]
The initial Measuring Broadband America Fixed Broadband Report was published in August 2011,[2] and presented the first broad-scale study of directly measured consumer broadband performance throughout the United States. As part of an open data program, all methodologies used in the program are fully documented and all data collected is published for public use without restriction. Including this latest report, seven reports have now been issued.[3] These reports provide a snapshot of fixed broadband Internet access service performance in the United States. They present analysis of broadband information in a variety of ways and have evolved to make the information more understandable and useful, and to reflect the changing applications supported by the nation’s broadband infrastructure.
A. Major Findings of the Seventh Report
The key findings of this report, based on measurements taken in September 2016,[4] are as follows:
- The maximum advertised download speeds among the service tiers measured by the FCC ranged from 3 Mbps to 200 Mbps for the period covered by this report.
- The median speed experienced by subscribers of the participating ISPs was 57 Mbps.
- For most of the major broadband providers that were tested, measured download speeds were 100% of advertised speeds or better during the peak hours (7 p.m. to 11 p.m. local time).
- Fourteen ISPs were evaluated in this report. Of these, AT&T, Cincinnati Bell, Frontier and Verizon employed multiple different technologies to provide service across the country. Overall, 18 different ISP/technology configurations were evaluated in this report. Of those, 11 met or exceeded their advertised download speeds, all performed better than 75% of their advertised download speed, and only three performed below 90% of their advertised download speed.
- In addition to providing download and upload speed measurements for each ISP, this report also presents a measure of how consistently ISPs deliver their advertised speeds, using our “80/80” metric. The 80/80 metric is the minimum speed that at least 80% of subscribers experience at least 80% of the time during peak periods.
These and other findings are described in greater detail within this report.
B. Use of Median Speeds and Subscriber-Weighted Speeds
The Seventh Report retains two changes made in the 2016 Report affecting how metrics are calculated and presented, namely the use of median speeds and subscriber-weighted speeds. First, consistent with the 2016 Report, we continue to present ISP broadband performance as the median,[5] rather than mean (average), of speeds experienced by panelists within a specific service tier.[6] Our focus in these reports is on the most common service tiers used by an ISP’s subscribers.[7]
Second, consistent with the 2016 Report, we continue to compute ISP performance by weighting the median for each service tier by the number of subscribers in that tier. Similarly, in calculating the overall average speed of all ISPs in a specific year, the median speed of each ISP is used and weighted by the number of subscribers of that ISP as a fraction of the total number of subscribers across all ISPs.
In calculating weighted medians, we have drawn on two sources for determining the number of subscribers per service tier. ISPs can voluntarily contribute subscriber counts for each surveyed service tier, which provide the most recent and authoritative data, and many ISPs have chosen to do so.[8] When such information has not been provided by an ISP, we rely on the FCC’s Form 477 data.[9] All facilities-based broadband providers are required to file data with the FCC twice a year (Form 477) regarding deployment of broadband services, including subscriber counts. For this report, we used the June 2016 Form 477 data. It should be noted that the Form 477 subscriber counts are for a month that generally lags the measurement month, and therefore there are likely to be small inaccuracies in the tier ratios. For this reason, we encourage ISPs to provide us with subscriber numbers for the measurement month.
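To make the weighting concrete, the short sketch below computes a subscriber-weighted median download speed for a single hypothetical ISP. The tier labels, panelist speeds, and subscriber counts are invented for illustration only and are not drawn from MBA data; the program's exact calculation is defined in the Technical Appendix.

```python
from statistics import median

# Hypothetical per-tier data for one ISP. Each tier carries the panelists'
# average peak-period download speeds (Mbps) and a subscriber count supplied
# by the ISP or taken from FCC Form 477 filings.
tiers = {
    "25 Mbps": {"panelist_speeds": [24.1, 25.3, 23.8, 26.0], "subscribers": 400_000},
    "50 Mbps": {"panelist_speeds": [51.2, 48.7, 49.9], "subscribers": 250_000},
    "100 Mbps": {"panelist_speeds": [97.5, 102.3, 99.1], "subscribers": 150_000},
}

def weighted_median_speed(tiers):
    """Weight each tier's median panelist speed by that tier's share of subscribers."""
    total_subs = sum(t["subscribers"] for t in tiers.values())
    return sum(
        median(t["panelist_speeds"]) * t["subscribers"] / total_subs
        for t in tiers.values()
    )

print(f"Subscriber-weighted median download speed: {weighted_median_speed(tiers):.1f} Mbps")
```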
C. Use of Other Performance Metrics
As in our previous reports, we found that for most ISPs, actual speeds experienced by subscribers nearly meet or exceed advertised service tier speeds. However, since we started our MBA program, consumers have changed their Internet usage habits. In 2011, consumers mainly browsed the web and downloaded files; thus, we reported average speeds since they were likely to closely mirror user satisfaction. By contrast, by September 2016, the measurement period for this report, many consumers streamed video for entertainment and education.[10] Both the median measured speed and how consistently the service performs are likely to influence the perception and usefulness of Internet access service and we have expanded our network performance analytics to better capture this.
Specifically, we use two kinds of metrics to reflect the consistency of service delivered to the consumer: First, we report the minimum actual speed experienced by at least 80% of panelists during at least 80% of the daily peak usage period (“80/80 consistent speed” measure). Second, we show what fraction of consumers obtains median speeds greater than 95%, between 80% and 95%, and less than 80% of advertised speeds.
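The sketch below shows one straightforward way to compute both consistency measures from per-panelist peak-period samples. The percentile conventions and the sample values are illustrative assumptions; the program's exact definitions are given in the Technical Appendix.

```python
import numpy as np

def consistent_speed(panelist_samples, pct=80):
    """'pct/pct' consistent speed: the speed attained by at least pct% of
    panelists during at least pct% of peak-period tests (pct=70 gives the
    70/70 metric discussed later in the report)."""
    # Speed each panelist reaches at least pct% of the time
    # = the (100 - pct)th percentile of that panelist's measurements.
    per_panelist = [np.percentile(s, 100 - pct) for s in panelist_samples]
    # Speed reached by at least pct% of panelists.
    return np.percentile(per_panelist, 100 - pct)

def consistency_bands(median_speeds, advertised):
    """Fraction of panelists whose median speed is >95%, 80-95%, or <80% of advertised."""
    ratios = np.asarray(median_speeds) / advertised
    return {
        "> 95%": float(np.mean(ratios > 0.95)),
        "80-95%": float(np.mean((ratios >= 0.80) & (ratios <= 0.95))),
        "< 80%": float(np.mean(ratios < 0.80)),
    }

# Hypothetical example: three panelists on a 100 Mbps tier.
samples = [np.array([98, 95, 97, 60]),
           np.array([101, 99, 100, 98]),
           np.array([82, 85, 80, 79])]
print(consistent_speed(samples, pct=80))
print(consistency_bands([np.median(s) for s in samples], advertised=100))
```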
Although download and upload speeds remain the network performance metric of greatest interest to the consumer, we also spotlight two other key network performance metrics in this report: latency and packet loss. These metrics can significantly affect the overall quality of Internet applications.
Latency (or delay) is the time it takes for a data packet to travel across a network from one point on the network to another. High latencies may affect the perceived quality of some interactive services such as phone calls over the Internet, video chat and video conferencing, or online multiplayer games. All network access technologies have a minimum latency that is largely determined by the technology. In addition, network congestion can lead to an increase in measured latency. Technology-determined latencies are typically small for terrestrial broadband services and are thus unlikely to affect the perceived quality of applications. The higher latencies of geostationary satellite-based broadband services may impair the perceived quality of such highly interactive applications. Not all applications are affected by high latencies; for example, entertainment video streaming applications are tolerant of relatively high latencies.
Packet loss measures the fraction of data packets sent that fail to be delivered to the intended destination. Packet loss may affect the perceived quality of applications that do not request retransmission of lost packets, such as phone calls over the Internet, video chat, some online multiplayer games, and some video streaming. High packet loss also degrades the achievable throughput of download and streaming applications. However, packet losses of a few tenths of a percent are unlikely to significantly affect the perceived quality of most Internet applications and are common. During network congestion, both latency and packet loss typically increase.
The Internet is continuing to evolve in architecture, performance, and services. We will therefore continue to adapt our measurement and analysis methodologies to help consumers understand the performance characteristics of their broadband Internet access service, and thus make informed choices about their use of such services.
2. Summary of Key Findings
A. Most Popular Advertised Service Tiers
A list of the ISP download and upload service tiers measured in this report is shown in Table 1. It should be noted that while upload and download speeds are measured independently and shown separately, they are typically offered by the ISP as combined download/upload configurations. Together, these plans serve the majority of Internet users of the participating ISPs. Generally, service tiers are initially added to this report when five percent or more of an ISP’s customers subscribe to that tier and there are at least 30,000 subscribers in that tier. Each tier requires a certain number of panelists to meet the program’s target sample size, and it becomes difficult and costly to recruit panelists for tiers with few subscribers or across a very large number of tiers.
| Technology | Company | Speed Tiers (Download, Mbps) | Speed Tiers (Upload, Mbps) |
| --- | --- | --- | --- |
| DSL | AT&T DSL | 1.5*, 3, 6 | 0.256*, 0.384, 0.512 |
| DSL | AT&T IPBB | 3, 6, 12, 18, 24, 45 | 0.768, 1, 1.5, 3, 6 |
| DSL | CenturyLink | 1.5, 3, 7, 10, 12, 20, 40 | 0.512, 0.768, 0.896, 5 |
| DSL | Cincinnati Bell DSL | 5, 10, 30 | 0.768, 1 |
| DSL | Frontier DSL | 3, 6, 12 | 0.384, 0.768, 1 |
| DSL | Verizon DSL | 0.5-1, 1.1-3 | 0.384, 0.384-0.768 |
| DSL | Windstream | 3, 6, 12 | 0.768 |
| Cable | Optimum | 25, 50, 101 | 5, 25, 35 |
| Cable | Charter | 60, 100 | 4*, 5 |
| Cable | Comcast | 25, 75, 105, 150 | 5, 10, 20 |
| Cable | Cox | 5*, 15, 25*, 50, 100 | 1, 2*, 5, 10 |
| Cable | Mediacom | 15, 50, 100 | 1, 5, 10 |
| Cable | Time Warner Cable | 15, 20, 30, 50, 100, 200 | 1, 2, 5, 10, 20 |
| Fiber | Cincinnati Bell Fiber | 10*, 30 | 1*, 3 |
| Fiber | Frontier Fiber | 25, 50, 75 | 10*, 25*, 50, 75 |
| Fiber | Verizon Fiber | 25, 50, 75, 100 | 25, 50, 75, 100 |
| Satellite | Hughes | 5, 10 | 1 |
| Satellite | ViaSat | 12 | 3 |
Table 1: List of ISP service tiers whose broadband performance was measured in this report
Chart 1: Maximum advertised download speed among the measured service tiers[11]
The maximum advertised download speed tier included in this report for ISPs using satellite technology is between 10 and 12 Mbps. Similarly, the maximum advertised download speed included in this report for DSL providers ranges from 3 to 45 Mbps. In contrast, ISPs using cable and fiber technology offer much higher maximum advertised download speeds. The maximum advertised download speeds included in this report for cable technology range from 100 to 200 Mbps. Among participating ISPs, only Cincinnati Bell, Frontier, and Verizon use fiber as the access technology for a substantial number of their customers, and their maximum speed offerings included in this report range from 30 to 100 Mbps. A key differentiator between the providers using fiber technology and those using other technologies is that two of the fiber ISPs offer symmetric maximum advertised upload and download speeds. This is in sharp contrast to the asymmetric offerings of providers using other technologies, for which the maximum advertised upload speeds are typically 5 to 10 times lower than the maximum advertised download speeds.
Chart 2 plots the migration of panelists to a higher service tier based on their access technology.[12] Specifically, the horizontal axis of Chart 2 partitions the September 2015 panelists by the advertised download speed of the service tier to which they were subscribed. For each such set of panelists who also participated in the September 2016 collection of data,[13] the vertical axis of Chart 2 displays the percentage of panelists that migrated by September 2016 to a service tier with a higher advertised download speed. There are two ways that such a migration could occur: (1) if a panelist changed their broadband plan during the intervening year to a service tier with a higher advertised download speed, or (2) if a panelist did not change their broadband plan but the panelist’s ISP increased the advertised download speed of the panelist’s subscribed plan.[14]
Chart 2: Consumer migration to higher advertised download speeds
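As a rough illustration of the calculation behind Chart 2, the sketch below groups panelists by the advertised download speed of their September 2015 tier and reports the share of continuing panelists whose September 2016 advertised speed is higher, without distinguishing plan changes from ISP-initiated upgrades. The panelist IDs and speeds are hypothetical.

```python
# Hypothetical records mapping panelist id -> advertised download speed (Mbps)
# of the subscribed tier in each measurement period.
sept_2015 = {"p1": 25, "p2": 25, "p3": 50, "p4": 100, "p5": 50}
sept_2016 = {"p1": 50, "p2": 25, "p4": 150, "p5": 50}  # p3 left the panel

def migration_by_2015_tier(before, after):
    """For each 2015 tier, the share of continuing panelists whose 2016
    advertised download speed is higher."""
    result = {}
    for tier in sorted(set(before.values())):
        stayed = [p for p, speed in before.items() if speed == tier and p in after]
        if stayed:
            moved_up = sum(after[p] > tier for p in stayed)
            result[tier] = moved_up / len(stayed)
    return result

print(migration_by_2015_tier(sept_2015, sept_2016))  # {25: 0.5, 50: 0.0, 100: 1.0}
```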
B. Median Download Speeds
Advertised download speeds may differ from the speeds that subscribers experience. Some ISPs more consistently meet network service objectives than others or meet them unevenly across their geographic coverage area. Also, speeds experienced by a consumer may vary during the day if the network cannot carry the aggregate user demand during busy hours. Unless stated otherwise, all actual speeds are measured only during peak usage periods, which we define as 7 p.m. to 11 p.m. local time.
To compute the average ISP performance, we weight the median speed for each tier by its subscriber count. Subscriber counts for the weightings were provided by the ISPs themselves or, when unavailable, taken from FCC Form 477 data.
Chart 3 shows the median download speeds experienced by the subscribers of the ISPs participating in MBA, averaged across all analyzed service tiers, geography, and time, for 2016. The median download speed, averaged across all participating ISPs, was approximately 57 Mbps in September 2016. As shown in this chart, there is considerable variance of median download speed by both ISP and by technology. While most cable and fiber providers had median speeds ranging from 46 to 95 Mbps, the DSL and satellite providers had median download speeds ranging from 2 to 18 Mbps.
Chart 3: Median download speeds by ISP
However, as we observed above when examining advertised download speeds, the increase in median download speeds is not uniform across access technologies and companies.
Chart 4 shows the ratio of the weighted median speeds experienced by an ISP’s subscribers to that ISP’s advertised speeds. The ratios for both download and upload speeds to the advertised download and upload speeds are illustrated. The actual speeds experienced by most ISPs’ subscribers are close to or exceed the advertised speeds. However, DSL broadband ISPs continue to advertise “up-to” speeds that on average exceed the actual speeds experienced by their subscribers. Verizon, instead, advertises a speed range for DSL performance and has requested that we include this range in relevant charts; we indicate this speed range with shading on all bar charts describing Verizon DSL performance. Out of the 18 ISP/technology configurations shown, 11 met or exceeded their advertised download speed and all reached at least 75% of their advertised download speed. Only AT&T-DSL (at 82%), Cincinnati-DSL (at 76%) and ViaSat (at 78%) performed below 90% of their advertised download speed.
Chart 4: The ratio of weighted median speed to advertised speed for each ISP
C. Variations In Speeds
As discussed earlier, actual speeds experienced by individual consumers may vary by location and time of day. Chart 5 shows, for each ISP, the percentage of panelists who experienced a median download speed (averaged over the peak usage period during our measurement period) that was greater than 95%, between 80% and 95%, or less than 80% of the advertised download speed.
Chart 5: The percentage of consumers whose median download speed was greater than 95%, between 80% and 95%, or less than 80% of the advertised download speed
Even though the median download speeds experienced by most ISPs’ subscribers nearly meet or exceed the advertised download speeds, for each ISP, there are some customers for whom the median download speed falls significantly short of the advertised download speed. Relatively few subscribers of cable or fiber broadband service experience this. The best performing ISPs, when measured by this metric, are Optimum, Charter, Cox, TWC, Frontier-Fiber, Verizon-Fiber and Hughes; more than 85% of their panelists were able to attain an actual median download speed of at least 95% of the advertised download speed.
In addition to variation based on a subscriber’s location, speeds experienced by a particular consumer may fluctuate during the day. This is typically caused by increased traffic demand and the resulting stress on different parts of the network infrastructure. In order to examine this aspect of performance, we use the term “80/80 consistent speed” to refer to a metric designed to assess temporal and spatial variations in measured values of the download speed.[15] Consistency of speed is in itself an intrinsically valuable service characteristic and its impact on consumers will hinge on variations in usage patterns and needs.
Chart 6 summarizes, for each ISP, the ratio of 80/80 consistent median download speed to advertised download speed, and, for comparison, the ratio of median download speed to advertised download speed shown previously in Chart 4. The ratio of 80/80 consistent median download speed to advertised download speed is less than the ratio of median download speed to advertised download speed for all participating ISPs due to congestion periods when median download speeds are lower than the overall average. When the difference between the two ratios is small, the median download speed is fairly insensitive to both geography and time. When the difference between the two ratios is large, there is a greater variability in median download speed, either based on location or variations during the peak usage period.
Chart 6: The ratio of 80/80 consistent median download speed to advertised download speed.
D. Latency
Latency is the time it takes for a data packet to travel from one point to another in a network. It has a fixed component that depends on the distance, the transmission speed, and transmission technology between the source and destination, and a variable component that increases as the network path congests with traffic. The MBA program measures latency by measuring the round-trip time from the consumer’s home to the closest measurement server and back.
Chart 7 shows the median latency for each participating ISP. In general, higher-speed service tiers have lower latency, as it takes less time to transmit each packet. Satellite technologies inherently experience longer latencies since packets must travel approximately 44,500 miles from an earth station to the satellite and back. Therefore, the median latencies of satellite-based broadband services are much higher, at 594 ms to 624 ms, than those for terrestrial-based broadband services, which range from 11 ms to 43 ms in our measurements.
Chart 7: Latency by ISP
Among terrestrial technologies, DSL latencies (25 ms to 43 ms) were slightly higher than cable latencies (15 ms to 35 ms). Fiber ISPs showed the lowest latencies (11 ms to 14 ms). The differences in median latencies among terrestrial-based broadband services are relatively small, and are unlikely to affect the perceived quality of highly interactive applications.
E. Packet Loss
Packet loss is the percentage of packets that are sent by the source but not received at the destination. The most common reason that a packet is not received is that it encountered congestion along the route. A small amount of packet loss is expected, and indeed some Internet protocols use the packet loss to infer Internet congestion and to adjust the sending rate accordingly. The MBA program considers a packet lost if the round-trip latency exceeds 3 seconds.
Chart 8 shows the average peak-period packet loss for each participating ISP, grouped into three bands, which allows a more granular view of the packet loss performance of each ISP’s network. The breakpoints for the three bands were chosen with an eye toward commonly accepted packet loss standards and the packet loss service level agreements (SLAs) offered by providers. Specifically, the 1% threshold is referenced in international documents and is commonly accepted as the point at which highly interactive applications such as VoIP experience significant degradation in quality.[16] The 0.4% breakpoint was chosen as a compromise between the highly desirable performance of 0% packet loss described in many documents and the 1% level that is generally considered unacceptable; it is broadly consistent with the network performance and SLA commitments published by major ISPs. Indeed, most SLAs guarantee packet loss of 0.1% to 0.3%,[17] but these are generally for enterprise-level services, which have more stringent performance requirements.
Chart 8: Percentage of consumers whose peak-period packet loss was less than 0.4%, between 0.4% and 1%, or greater than 1%.
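A minimal sketch of the three-band classification shown in Chart 8 appears below; the per-panelist loss rates are invented for illustration and are expressed as fractions (0.004 = 0.4%).

```python
# Hypothetical average peak-period packet-loss rates, one value per panelist.
peak_loss_rates = [0.0005, 0.002, 0.012, 0.0031, 0.0008, 0.025, 0.004]

def loss_bands(rates, low=0.004, high=0.01):
    """Share of panelists with packet loss below 0.4%, between 0.4% and 1%, and above 1%."""
    n = len(rates)
    return {
        "< 0.4%": sum(r < low for r in rates) / n,
        "0.4% - 1%": sum(low <= r <= high for r in rates) / n,
        "> 1%": sum(r > high for r in rates) / n,
    }

print(loss_bands(peak_loss_rates))
```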
F. Web Browsing Performance
The MBA program also conducts a specific test to gauge web browsing performance. The web browsing test accesses nine popular websites that include text and images, but not streaming video. The time required to download a webpage depends on many factors, including the consumer’s in-home network, the download speed within an ISP’s network, the web server’s speed, congestion in other networks outside the consumer’s ISP’s network (if any), and the time required to look up the network address of the webserver. Only some of these factors are under the control of the consumer’s ISP. Chart 9 displays the average webpage download time as a function of the advertised download speed. As shown by this chart, webpage download time decreases as download speed increases, from about 7.4 seconds at a 0.5 Mbps download speed to about 0.8 seconds at a 25 Mbps download speed. Subscribers to service tiers exceeding 25 Mbps do not experience further significant decreases in webpage download times. These download times assume that a single user is using the Internet connection when the webpage is downloaded, and do not account for more typical scenarios where multiple users within a household are simultaneously using the Internet connection for viewing web pages as well as other applications such as real-time gaming or video streaming.
Chart 9: Average webpage download time, by advertised download speed.
3. Methodology
A. Participants
Fourteen ISPs participated in the Fixed MBA program in September 2016.[18] They are:
- AT&T
- CenturyLink
- Charter Communications
- Cincinnati Bell
- Comcast
- Cox Communications
- Frontier Communications Company
- Hughes Network Systems
- Mediacom Communications Corporation
- Optimum
- Time Warner Cable
- Verizon
- ViaSat
- Windstream Communications
The methodologies and assumptions underlying the measurements described in this Report are reviewed at meetings that are open to all interested parties, and documented in public ex parte letters filed in GN Docket No. 12-264. Policy decisions regarding the MBA program involving issues such as inclusion of tiers, test periods, mitigation of operational issues affecting the measurement infrastructure, and terms-of-use notifications to panelists were discussed at these meetings prior to adoption. Participation in the MBA program is open and voluntary. Participants are drawn from academia, consumer equipment vendors, telecommunications vendors, network service providers, and consumer policy groups, as well as our contractor for this project, SamKnows. In 2016-2017, participants at these meetings (collectively and informally referred to as “the broadband collaborative”) included all fourteen participating ISPs and the following additional organizations:
- Center for Applied Internet Data Analysis (CAIDA)
- International Technology and Trade Associates (ITTA)
- Internet Society (ISOC)
- Level 3 Communications (“Level 3”)
- Massachusetts Institute of Technology (“MIT”)
- M-Lab
- NCTA – The Internet and Television Association
- New America Foundation
- Practicum Team, NCSU, Institute for Advanced Analytics
- Princeton University
- United States Telecom Association (“US Telecom”)
- University of California - Santa Cruz
Participants have contributed in important ways to the integrity of this program and provide valuable input to FCC decisions for this program. Initial proposals for test metrics and testing platforms were discussed and critiqued within the broadband collaborative. M-Lab and Level 3 contributed their core network testing infrastructure, and both parties continue to provide invaluable assistance in helping to define and implement the FCC testing platform. We thank the participants for their continued contributions to the MBA program.
B. Measurement Process
The measurements that provided the underlying data for this report relied both on measurement clients and measurement servers. The measurement clients (i.e., whiteboxes) resided in the homes of 6,193 panelists who received service from one of the 14 participating ISPs. The participating ISPs collectively accounted for over 80% of U.S. residential broadband Internet connections. After the measurement data was processed, as described in greater detail in the Appendix, test results from 4,545 panelists were used in this report.
The measurement servers were hosted by M-Lab and Level 3 Communications, and were located in nine cities across the United States near a point of interconnection between the ISP’s network and the network on which the measurement server resided.[19]
The measurement clients collected data throughout the year, and this data is available as described below. However, only data collected from September 1 through 11 and September 21 through October 9, 2016, referred to throughout this report as the “September 2016” reporting period, were used to generate the charts in this Report.[20]
Broadband performance varies with the time of day. At peak hours, more people are attempting to use their broadband Internet connections, giving rise to a greater potential for congestion and degraded user performance. Unless otherwise stated, this Report focuses on performance during peak usage period, which is defined as weeknights between 7:00 p.m. to 11:00 p.m. local time at the subscriber’s location. Focusing on peak usage period provides the most useful information because it demonstrates the performance users can expect when the Internet in their local area is experiencing the highest demand from users.
Our methodology focuses on the network performance of each of the participating ISPs. The metrics discussed in this Report are derived from traffic flowing between a measurement client, located within the modem or router within a panelist’s home, and a measurement server, located outside the ISP’s network. For each panelist, the tests automatically choose the measurement server that has the lowest latency to the measurement client. Thus, the metrics measure performance along a path within each ISP’s network, through a point of interconnection between the ISP’s network and the network on which the chosen measurement server resides.
However, the service performance that a consumer experiences may differ from our measured values for several reasons. First, as noted, we measure performance only to a single measurement server rather than to multiple servers, following the approach chosen by most network measurement tools. ISPs, in general, attempt to maintain consistent performance throughout their network. However, at times, some paths or interconnection points within an ISP’s network may be more congested than others and this can affect a specific consumer’s service.
Congestion beyond an ISP’s network, which is not measured in our study, can affect the overall performance a consumer experiences with their service. A consumer’s home network, rather than the ISP’s network, may be the bottleneck. We measure the performance of the ISP’s service delivered to the consumer’s home network, but this connection is often shared among simultaneous users and applications within the home. This in-home network, which typically includes Wi-Fi, may not have sufficient capacity to support peak loads.[21]
In addition, consumers typically experience performance through the set of applications that they utilize, not as raw speed, latency or packet loss. The performance of an application depends on both the network performance and on the architecture and implementation of the application itself and the operating system and hardware on which it runs. While network performance is considered in this Report, application performance is generally not.
C. Measurement Tests And Performance Metrics
This Report is based on the following measurement tests:
- Download speed: This test measures the download speed of each whitebox over a 10-second period, once every hour during the peak hours (7 p.m. to 11 p.m.) and once during each of the following periods: midnight to 6 a.m., 6 a.m. to noon, and noon to 6 p.m. The results for each whitebox are then averaged across the measurement month; the median of these average speeds across the set of whiteboxes is used to determine the median download speed for a service tier. The overall ISP download speed is computed by weighting the median for each service tier by that tier’s subscriber count.
- Upload speed: This test measures the upload speed of each whitebox over a 10-second period, with the same measurement intervals as the download speed test. The speed measured in the last five seconds of the 10-second interval is retained, the results for each whitebox are then averaged over the measurement period, and the median of these average speeds across the set of whiteboxes is used to determine the median upload speed for a service tier. The ISP upload speed is computed in the same manner as the download speed.
- Latency and packet loss: These tests measure the round-trip times for approximately 2,000 packets per hour, sent at randomly distributed intervals. Response times of less than three seconds are used to determine the mean latency. If the whitebox does not receive a response within three seconds, the packet is counted as lost (a brief sketch of this rule follows this list).
- Web browsing: The web browsing test measures the total time it takes to request and receive webpages, including the text and images, from nine popular websites and is performed once every hour. The measurement includes the time required to translate the web server name (URL) into the webserver’s network (IP) address.
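The sketch below illustrates the latency and packet-loss rule described in the list above for a single whitebox over one hour of probes, using made-up round-trip times; it is not the program's actual test software.

```python
import math

TIMEOUT_S = 3.0  # responses slower than 3 seconds count as lost packets

# Hypothetical round-trip times in seconds for one whitebox during one hour;
# math.inf marks probes that never received a response.
rtts = [0.021, 0.019, 0.025, math.inf, 0.022, 3.4, 0.020]

received = [r for r in rtts if r < TIMEOUT_S]
mean_latency_ms = 1000 * sum(received) / len(received)
packet_loss = 1 - len(received) / len(rtts)

print(f"mean latency: {mean_latency_ms:.1f} ms, packet loss: {packet_loss:.1%}")
```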
This Report focuses on three key performance metrics of interest to consumers of broadband Internet access service, as they are likely to influence how well a wide range of consumer applications work: download and upload speed, latency, and packet loss. Download and upload speeds are also the primary network performance characteristic advertised by ISPs. However, as discussed above, the performance observed by a user in any given circumstance depends not only on the actual speed of the ISP’s network, but also on the performance of other parts of the Internet and on that of the application itself.[22]
The Technical Appendix to this Report describes each test in more detail, including additional tests not contained in this Report.
D. Availability Of Data
The Validated Data Set[23] on which this Report is based, as well as the full results of all tests, are available at http://www.fcc.gov/measuring-broadband-america. To encourage additional research, we also provide raw data for the reference month and other months. Previous reports of the MBA program, as well as the data used to produce them, are also available there.
Both the Commission and SamKnows, the Commission’s contractor for this program, recognize that, while the methodology descriptions included in this document provide an overview of the project, interested parties may be willing to contribute to the project by reviewing the software used in the testing. SamKnows welcomes review of its software and technical platform, consistent with the Commission’s goals of openness and transparency for this program.[24]
4. Test Results
A. Most Popular Advertised Service Tiers
Chart 1 above summarized the maximum advertised download speeds among the measured service tiers[25] for each participating ISP, for September 2016, grouped by the access technology used to offer the broadband Internet access service (DSL, cable, fiber, or satellite). Chart 10 below shows the corresponding maximum advertised upload speeds among the measured service tiers. As shown in Chart 10, the maximum upload speeds of ISPs using DSL and satellite technology lag behind those of ISPs using cable and fiber technologies. In particular, the maximum advertised upload speed for ISPs using DSL technology is between 0.5 and 6 Mbps, and for ISPs using satellite technology it is between 1 and 3 Mbps. In contrast, among cable-based broadband providers, the maximum advertised upload speeds among the measured service tiers range from 5 to 35 Mbps. Similarly, for ISPs using fiber technology, the maximum upload speed ranged from 3 to 100 Mbps. As was previously noted, except for Cincinnati Bell Fiber, the upload and download speed offerings for fiber technologies are symmetric. The computed weighted average of the maximum upload speed of all participating ISPs is 13 Mbps.
Chart 10: Maximum advertised upload speed among the measured service tiers.
B. Observed Median Download and Upload Speeds
Chart 3 above showed the median download speeds experienced by each ISP’s participating subscribers in September 2016. Chart 11 below shows the corresponding median upload speeds. The median upload speed for this period across all consumers was 12 Mbps.
Chart 11: Median upload speeds by ISP.
Chart 12 shows the median download and upload speeds by technology for September 2016. As shown, the median download speeds for DSL and satellite technologies, which are 14 and 12 Mbps respectively, lag behind the median download speeds for cable and fiber technologies, which are 79 and 63 Mbps. Similarly, the median upload speeds for DSL and satellite technologies, which are 2 and 3 Mbps respectively, lag behind the median upload speeds of cable and fiber technologies, which are 9 and 69 Mbps.
Observing both the download and upload speeds, fiber technology is more symmetric in its actual upload and download speeds. Other technologies tend to be far more asymmetric with the upload speed values lower than the download speed values. This asymmetry is reflective of actual usage in that consumers typically download significantly more data than they upload.
Chart 12: Median download and upload speeds by technology
Chart 4 (in Section 2.B above) showed the ratio in September 2016 of the weighted median of both download and upload speeds of each ISP’s subscribers to advertised speeds. Charts 13.1 and 13.2 below show the same ratios separately for download speed and upload speed.[26] The median download speeds of most ISPs’ subscribers have been close to, or have exceeded, the advertised speeds. Exceptions to this were the following DSL providers: AT&T-DSL, CenturyLink, Cincinnati Bell, Frontier DSL and Windstream with their median download speed at 81%, 85%, 93%, 86% and 94%, respectively, of their advertised download speed.
Chart 13.1: The ratio of median download speed to advertised download speed.
Chart 13.2 shows the median upload speed as a percentage of the advertised upload speed. As was the case with download speeds, most ISPs meet or exceed their advertised speeds except for most DSL providers: AT&T-DSL, CenturyLink, Cincinnati Bell DSL, Frontier DSL and Windstream which had values of 81%, 85%, 93%, 86% and 78%, respectively.
Chart 13.2: The ratio of median upload speed to advertised upload speed.
C. Variations In Speeds
As noted, median speeds experienced by consumers may vary based on location and time of day. Chart 5 above showed, for each ISP, the percentage of consumers (across the ISP’s service territory) who experienced a median download speed over the peak usage period that was either greater than 95%, between 80% and 95%, or less than 80% of the advertised download speed. Chart 14 below shows the corresponding percentage of consumers whose median upload speed fell in each of these ranges.
Even though the median upload speeds experienced by most subscribers were close to or exceeded the advertised upload speeds, for each ISP, there were some subscribers whose median upload speed fell significantly short of the advertised upload speed. This issue was most prevalent for ISPs using DSL technology. ISPs using cable and fiber technology generally showed very good consistency in service based on this metric.
We can learn more about the variation in network performance by separately examining variation across geography and across time. We start by examining the variation across geography within each participating ISP’s service territory. For each ISP, we first calculate the ratio of the median download speed (over the peak usage period) to the advertised download speed for each panelist subscribing to that ISP. We then examine the distribution of this ratio across the ISP’s service territory.
Charts 15.1 and 15.2 show the complementary cumulative distribution of the ratio of median download speed (over the peak usage period) to advertised download speed for each participating ISP. For each ratio of actual to advertised download speed on the horizontal axis, the curves show the percentage of panelists subscribing to each ISP that experienced at least this ratio.[27] For example, the Cincinnati Bell fiber curve in Chart 15.1 shows that 90% of its subscribers experienced a median download speed exceeding 92% of the advertised download speed, while 70% experienced a median download speed exceeding 94% of the advertised download speed and 50% experienced a median download speed exceeding 95% of the advertised download speed.
Chart 15.1: Complementary cumulative distribution of the ratio of median download speed to advertised download speed.
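The sketch below shows how a complementary cumulative distribution of this ratio can be read off a set of per-panelist values: for each threshold, it reports the percentage of panelists whose ratio is at least that large. The ratios are hypothetical; the thresholds mirror the kind of points reported in Tables 3 and 4.

```python
import numpy as np

# Hypothetical per-panelist ratios of median peak-period download speed to
# advertised download speed for one ISP.
ratios = np.array([1.10, 0.97, 1.02, 0.88, 1.05, 0.95, 1.15, 0.70, 1.01, 0.99])

def ccdf_at(ratios, threshold):
    """Percentage of panelists whose ratio is at least `threshold`."""
    return 100 * np.mean(ratios >= threshold)

for t in (0.80, 0.95, 1.00):
    print(f"{ccdf_at(ratios, t):.0f}% of panelists reached {t:.0%} of the advertised speed")
```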
The curves for cable-based broadband and fiber-based broadband are steeper than those for DSL-based broadband and satellite-based broadband. This can be more clearly seen in Chart 15.3, which plots aggregate curves for each technology. Approximately 82% of subscribers to cable and 66% of subscribers to fiber-based technologies experience median download speeds exceeding the advertised download speed. In contrast, only 38% of subscribers to DSL-based services experience median download speeds exceeding the advertised download speed.[28]
Chart 15.3: Complementary cumulative distribution of the ratio of median download speed to advertised download speed, by technology.
Charts 15.4 to 15.6 show the complementary cumulative distribution of the ratio of median upload speed (over the peak usage period) to advertised upload speed for each participating ISP (Charts 15.4 and 15.5) and by access technology (Chart 15.6).
Chart 15.4: Complementary cumulative distribution of the ratio of median upload speed to advertised upload speed.
Chart 15.5: Complementary cumulative distribution of the ratio of median upload speed to advertised upload speed (continued).
Chart 15.6: Complementary cumulative distribution of the ratio of median upload speed to advertised upload speed, by technology.
All actual speeds discussed above are measured only during peak usage periods. In contrast, Charts 16.1 and 16.2 below compare the ratio of actual speed to advertised speed during peak and off-peak times.[29] Charts 16.1 and 16.2 show that while most ISPs show only a slight degradation from off-peak to peak hour performance, satellite ISPs show a markedly larger degradation. Hughes customers experience a drop from 243% to 166% in the ratio of median download speed to advertised speed from off-peak hours to peak hours. Similarly, ViaSat customers experience a corresponding drop from 106% to 78%.
Chart 16.1: The ratio of median download speed to advertised download speed, peak versus off-peak.
Chart 16.2: The ratio of median upload speed to advertised upload speed, peak versus off-peak.
Charts 17.1[30] and 17.2 below show the download ratio in each two-hour time block during weekdays for each ISP. The ratio is lowest during the busiest four-hour time block (7:00 p.m. to 11:00 p.m.).
Chart 17.1: The ratio of median download speed to advertised download speed, Monday-to-Friday two-hour time blocks, terrestrial ISPs.
Chart 17.2: The ratio of median download speed to advertised download speed, Monday-to-Friday two-hour time blocks, satellite ISPs.
Chart 6 (in section 2.C above) illustrated, for each ISP, the ratio of the 80/80 consistent median download speed to advertised download speed, and for comparison, the ratio of median download speed to advertised download speed shown previously in Chart 4.
Chart 18.1 illustrates information for 80/80 consistent upload speed. For all ISPs, the 80/80 upload speed is lower than the median upload speed. For most ISPs, the 80/80 upload speed is only slightly lower than the median speed. However, in the case of Hughes, ViaSat, and Verizon DSL, the 80/80 upload speed was considerably lower than the median speed.
Chart 18.1: The ratio of 80/80 consistent upload speed to advertised upload speed.
Charts 18.2 and 18.3 below illustrate similar consistency metrics for 70/70 consistent speeds, i.e., the minimum speed (as a percentage of the advertised speed) experienced by at least 70% of panelists during at least 70% of the peak usage period. The ratios for 70/70 consistent speeds are higher than the corresponding ratios for 80/80 consistent speeds. In fact, for many ISPs, the 70/70 consistent download speed is close to the median download speed. Once again, ISPs using satellite technology showed a considerably smaller value for the 70/70 download and upload speed as compared to the download and upload median speed, respectively.
Chart 18.2: The ratio of 70/70 consistent download speed to advertised download speed.
Chart 18.3: The ratio of 70/70 consistent upload speed to advertised upload speed.
D. Latency
Chart 19 below shows the weighted median latency, by technology and by advertised download speed, for terrestrial technologies. For a given technology, latency varies little with advertised download speed. DSL service typically has higher latency than cable and fiber.
Chart 19: Latency for Terrestrial ISPs, by technology and by advertised download speed.
5. Additional Test Results
A. Actual Speed, By Service Tier
As shown in Charts 20.1-20.6, peak usage period performance varied by service tier among participating ISPs during the September 2016 period. On average, during peak periods, the ratio of median download speed to advertised download speed for all ISPs was 66% or better, and 90% or better for most ISPs. However, the ratio of median download speed to advertised download speed varies among service tiers. It should be noted that for Verizon-DSL, which advertises a range of speeds, we have calculated a range of values corresponding to its advertised range.
Chart 20.1: The ratio of median download speed to advertised download speed, by ISP (0-5 Mbps).
Chart 20.2: The ratio of median download speed to advertised download speed, by ISP (6-10 Mbps).
Chart 20.3: The ratio of median download speed to advertised download speed, by ISP (12-15 Mbps).
Chart 20.4: The ratio of median download speed to advertised download speed, by ISP (18-25 Mbps).
Chart 20.5: The ratio of median download speed to advertised download speed, by ISP (30-50 Mbps).
Chart 20.6: The ratio of median download speed to advertised download speed, by ISP (60-200 Mbps).
Charts 21.1-21.5 depict the ratio of median upload speeds to advertised upload speeds for each ISP by service tier.
Chart 21.1: The ratio of median upload speed to advertised upload speed, by ISP (0.256-0.64 Mbps).
Chart 21.2: The ratio of median upload speed to advertised upload speed, by ISP (0.768-1.5 Mbps).
Chart 21.3: The ratio of median upload speed to advertised upload speed, by ISP (2-5 Mbps).
Chart 21.4: The ratio of median upload speed to advertised upload speed, by ISP (6-10 Mbps).
Chart 21.5: The ratio of median upload speed to advertised upload speed, by ISP (20-100 Mbps).
Table 2 lists the advertised download service tiers included in this study and compares this with the ISP’s median download speed results. As in past reports, we note that the download speeds listed here are based on national averages and may not represent the performance experienced by any particular consumer at any given time or place.
Table 2: Peak period median download speed, by ISP, sorted by advertised download speed
| Actual Download Speed (Mbps) | Advertised Download Speed (Mbps) | ISP | Actual Speed / Advertised Speed (%) |
| --- | --- | --- | --- |
| 0.81 | 0.5 - 1 | Verizon DSL | 81 - 162 |
| 2.07 | 1.1 - 3 | Verizon DSL | 69 - 188 |
| 1.29 | 1.5 | CenturyLink | 86 |
| 2.41 | 3 | AT&T DSL | 80 |
| 3.29 | 3 | AT&T IPBB | 110 |
| 2.68 | 3 | CenturyLink | 89 |
| 2.74 | 3 | Windstream | 91 |
| 2.48 | 3 | Frontier DSL | 82 |
| 3.30 | 5 | Cincinnati Bell DSL | 66 |
| 9.81 | 5 | Hughes | 196 |
| 5.00 | 6 | AT&T DSL | 83 |
| 6.90 | 6 | AT&T IPBB | 115 |
| 5.89 | 6 | Windstream | 98 |
| 5.67 | 6 | Frontier DSL | 94 |
| 6.89 | 7 | CenturyLink | 98 |
| 9.66 | 10 | CenturyLink | 97 |
| 8.47 | 10 | Cincinnati Bell DSL | 85 |
| 15.35 | 10 | Hughes | 154 |
| 14.31 | 12 | AT&T IPBB | 119 |
| 13.06 | 12 | CenturyLink | 109 |
| 11.15 | 12 | Frontier DSL | 93 |
| 9.40 | 12 | ViaSat | 78 |
| 11.38 | 12 | Windstream | 95 |
| 15.98 | 15 | Cox | 107 |
| 20.54 | 15 | Mediacom | 137 |
| 17.30 | 15 | TWC | 116 |
| 21.29 | 18 | AT&T IPBB | 118 |
| 19.22 | 20 | CenturyLink | 96 |
| 23.62 | 20 | TWC | 118 |
| 27.87 | 24 | AT&T IPBB | 116 |
| 28.07 | 25 | Optimum | 112 |
| 29.49 | 25 | Comcast | 118 |
| 25.32 | 25 | Frontier Fiber | 101 |
| 28.90 | 25 | Verizon Fiber | 116 |
| 27.89 | 30 | Cincinnati Bell DSL | 93 |
| 28.40 | 30 | Cincinnati Bell Fiber | 95 |
| 36.61 | 30 | TWC | 122 |
| 40.52 | 40 | CenturyLink | 101 |
| 48.05 | 45 | AT&T IPBB | 107 |
| 55.89 | 50 | Optimum | 112 |
| 58.16 | 50 | Cox | 116 |
| 48.77 | 50 | Frontier Fiber | 98 |
| 55.79 | 50 | Mediacom | 112 |
| 58.35 | 50 | TWC | 117 |
| 56.81 | 50 | Verizon Fiber | 114 |
| 64.67 | 60 | Charter | 108 |
| 87.74 | 75 | Comcast | 117 |
| 81.65 | 75 | Frontier Fiber | 109 |
| 81.81 | 75 | Verizon Fiber | 109 |
| 118.29 | 100 | Charter | 118 |
| 109.45 | 100 | Cox | 109 |
| 91.81 | 100 | Mediacom | 92 |
| 100.90 | 100 | TWC | 101 |
| 99.31 | 100 | Verizon Fiber | 99 |
| 112.66 | 101 | Optimum | 112 |
| 111.08 | 105 | Comcast | 106 |
| 140.72 | 150 | Comcast | 94 |
B. Variations In Speed
In Section 4.C above, we presented speed consistency metrics for each ISP based on test results averaged across all service tiers. In this section, we provide detailed results for each individual service tier for each ISP. Consistency of speed is important for services such as video streaming. A significant reduction in speed for more than a few seconds can force a reduction in video resolution or an intermittent loss of service.
Charts 22.1-22.3 below show the percentage of consumers that achieved greater than 95%, between 80% and 95%, or less than 80% of the advertised download speed for each ISP speed tier. Consistent with past performance, ViaSat/Exede showed low consistency of speed, with 52% of consumers experiencing an average service speed of 80% or less of the advertised speed. ISPs using DSL technology also frequently fail to deliver advertised service rates. ISPs quote a single ‘up-to’ speed, but the actual speed of DSL depends on the distance between the subscriber and the serving central office.
Cable companies, in general, show a high consistency of speed. However, tiers of 100 Mbps and above appear to provide a somewhat lower level of consistency. Fiber-based systems, in general, offer a high level of consistency of speed.
Similarly, Charts 23.1 to 23.3 show the percentage of consumers that achieved greater than 95%, between 80% and 95%, or less than 80% of the advertised upload speed for each ISP speed tier.
Chart 23.2: The percentage of consumers whose median upload speed was greater than 95%, between 80% and 95%, or less than 80% of the advertised upload speed (cable).
Chart 23.3: The percentage of consumers whose median upload speed was greater than 95%, between 80% and 95%, or less than 80% of the advertised upload speed (fiber and satellite).
In Section 4.C above, we presented complementary cumulative distributions for each ISP based on test results across all service tiers. Below, we provide tables showing selected points on these distributions by each individual ISP and technology. Overall, performance depends less on a specific technology and more on the engineering and marketing choices made by each provider. For example, Optimum and Charter, which are cable-based companies, provided average download speeds over 95% and 96%, respectively, of advertised rates to 95% of their panelists. Cox and Mediacom, also cable-based companies, provided median speeds of at least 79% and 59% of advertised speed to 95% of their panelists. Verizon’s fiber-based service provided speeds of 88% or better to 95% of its panelists, whereas Frontier Fiber provided speeds of 91% or better to 95% of its panelists.
Table 3: Complementary cumulative distribution of the ratio of median download speed to advertised download speed, by technology, by ISP
| ISP | 20% | 50% | 70% | 80% | 90% | 95% |
| --- | --- | --- | --- | --- | --- | --- |
| AT&T - DSL | 90% | 83% | 77% | 74% | 70% | 64% |
| AT&T - IPBB | 124% | 112% | 105% | 99% | 90% | 83% |
| CenturyLink | 109% | 95% | 85% | 79% | 72% | 60% |
| Cincinnati Bell Fiber | 95% | 95% | 94% | 94% | 92% | 89% |
| Cincinnati Bell DSL | 93% | 85% | 77% | 64% | 37% | 25% |
| Charter | 109% | 108% | 107% | 105% | 102% | 96% |
| Comcast | 119% | 116% | 109% | 98% | 82% | 62% |
| Cox | 119% | 114% | 107% | 106% | 96% | 79% |
| Frontier Fiber | 111% | 101% | 98% | 96% | 94% | 91% |
| Hughes | 212% | 165% | 136% | 121% | 88% | 70% |
| Frontier DSL | 97% | 89% | 80% | 73% | 51% | 38% |
| Mediacom | 115% | 109% | 95% | 89% | 74% | 59% |
| Optimum | 113% | 112% | 110% | 109% | 104% | 95% |
| TWC | 122% | 116% | 113% | 108% | 92% | 82% |
| Verizon Fiber | 114% | 109% | 100% | 99% | 96% | 88% |
| Verizon DSL | 123% | 108% | 92% | 75% | 53% | 47% |
| ViaSat/Exede | 94% | 78% | 66% | 61% | 54% | 43% |
| Windstream | 101% | 97% | 90% | 85% | 73% | 49% |
C. Web Browsing Performance, By Service Tier
Below, we provide the detailed results of the webpage download time for each individual service tier of each ISP. Generally, website loading time decreases steadily until the speed tier reaches 15 Mbps and does not change markedly above that.
Chart 24.1: Average webpage download time, by ISP (1-3 Mbps).
Chart 24.2: Average webpage download time, by ISP (5-10 Mbps).
Chart 24.3: Average webpage download time, by ISP (12-15 Mbps).
Chart 24.4: Average webpage download time, by ISP (18-25 Mbps).
Chart 24.5: Average webpage download time, by ISP (30-50 Mbps).
Chart 24.6: Average webpage download time, by ISP (60-200 Mbps).
[1] In 2016, we added a large regional operator, Cincinnati Bell, to the MBA program for the first time. Cincinnati Bell primarily serves northern Kentucky and southwestern Ohio.
[2] All reports can be found at https://www.fcc.gov/general/measuring-broadband-america.
[3] The First Report (2011) was based on measurements taken in March 2011, the Second Report (2012) on measurements taken in April 2012, and the Third (2013) through Sixth (2016) Reports on measurements taken in September of the previous year.
[4] The actual dates used for measurements for this Seventh Report were September 1-11, 2016 inclusive and September 21-October 9, 2016 inclusive.
[5] We first determine the mean value over all the measurements for each individual panelist’s “whitebox.” (Panelists are sent “whiteboxes” that run pre-installed software on off-the-shelf routers that measure thirteen broadband performance metrics, including download speed, upload speed, and latency.) For individual speed tiers, we then compute the median of the mean values of all the panelists/whiteboxes. The median is the value separating the top half of values in a sample set from the lower half; it can be thought of as the middle value in an ordered list of values. For calculations involving multiple speed tiers, we compute the weighted average of the medians for each tier. The weightings are based on the relative subscriber numbers for the individual tiers. (An illustrative sketch of this calculation appears after these notes.)
[6] See 2016 Report at https://www.fcc.gov/reports-research/reports/measuring-broadband-americ….
[7] As described more fully in section 2, a service tier is initially added to this report only if it contains at least 30,000 subscribers and has 5% or more of an ISP’s total number of broadband subscribers.
[8] The ISPs that provided SamKnows, the FCC’s contractor supporting the MBA program, with weights for each of their tiers were: AT&T, Cincinnati Bell, CenturyLink, Charter, Comcast, Cox, Hughes, Mediacom, Optimum, Time Warner Cable, and Verizon.
[9] See https://transition.fcc.gov/form477/477inst.pdf (explaining FCC Form 477 filing requirements and required data).
[10] Video traffic comprised 70% of Internet traffic in 2015, and some expect it to grow to 82% by 2020. See Cisco Visual Networking Index: Forecast and Methodology, 2014-2020 White Paper, http://www.cisco.com/c/en/us/solutions/collateral/service-provider/ip-ngn-ip-next-generation-network/white_paper_c11-481360.html (last accessed May 7, 2018).
[11] This chart lists only the most populous service tiers of the ISPs tested. It should be noted that ISPs may offer other tiers at higher or lower speeds.
[12] Where several technologies are plotted at the same point in the chart, this is identified as “Multiple Technologies.”
[13] Of the 6,241 panelists who participated in the September 2015 collection of data, 4,707 panelists continued to participate in the September 2016 collection of data.
[14] We do not attempt here to distinguish between these two cases.
[15] For a detailed definition and discussion of this metric, please refer to the Technical Appendix.
[16] See VoIP-Info, QoS (last visited July 2, 2018), https://www.voip-info.org/wiki/view/QoS and http://www.ciscopress.com/articles/article.asp?p=357102.
[17] See ITU, Recommendation ITU-R M.1079-2: Performance and quality of services requirements for International Mobile Telecommunications-2000 (IMT-2000) access networks, www.itu.int/dms_pubrec/itu-r/rec/m/r-rec-m.1079-2-200306-i!!msw-e.doc.
[18] The 2014 Report and earlier reports also included Insight Communications, which has merged with Time Warner Cable, and Qwest Communications, which is part of CenturyLink. Hughes Network Systems joined the program in 2014. ViaSat operates under the brand name Exede Internet.
[19] For this report, we excluded some measurements using the M-Lab measurement servers, due to a problem with the architecture of those servers that affected the higher service tiers.
[20] The period of September 12-20, 2016 was omitted because the release of Apple’s iOS 10 operating system caused widespread network congestion. This determination was made consistent with the FCC’s data collection policy for fixed MBA data. See FCC, Measuring Fixed Broadband, Data Collection Policy, https://www.fcc.gov/general/measuring-broadband-america-measuring-fixed-broadband (explaining that the FCC has developed policies to deal with impairments in the data collection process with potential impact for the validity of the data collected).
[21] Independent research, drawing on the FCC’s MBA test platform (see https://www.fcc.gov/general/mba-assisted-research-studies), suggests that home networks are a significant source of end-to-end service congestion. See Srikanth Sundaresan et al., Home Network or Access Link? Locating Last-Mile Downstream Throughput Bottlenecks, PAM 2016 - Passive and Active Measurement Conference, at 111-123 (March 2016).
[22] Performance observed by a user may also depend on other factors, including the capabilities of their device and the performance of network devices within their home.
[23] The September 2016 data set was validated to remove anomalies that would have produced errors in the Report. This data validation process is described in the Technical Appendix.
[24] The software that was used for the MBA program will be made available for noncommercial purposes. To apply for noncommercial review of the code, interested parties may contact SamKnows directly at team@samknows.com, with the subject heading “Academic Code Review.”
[25] As discussed above, measured service tiers were tiers which constituted 5% or more of an ISP’s broadband subscriber base and had at least 30,000 subscribers.
[26] In these charts, we show Verizon’s median speed as a percentage of the mid-point between their lower and upper advertised speed range.
[27] In Reports prior to the 2015 MBA Report, for each ratio of actual to advertised download speed on the horizontal axis, the cumulative distribution function curves showed the percentage of measurements, rather than panelists subscribing to each ISP, that experienced at least this ratio. The methodology used in both this and last year’s Report, i.e., using panelists subscribing to each ISP, more accurately illustrates performance from the point of view of the consumer.
[28] The speed achievable by DSL depends on the distance between the subscriber and the central office. Thus, the complementary cumulative distribution function will fall slowly unless the broadband ISP adjusts its advertised rate based on the subscriber’s location. (Chart 17 illustrates that the performance during non-busy hours is similar to the busy hour, making congestion less likely as an explanation.)
[29] Verizon DSL download and upload results are shown as a range because Verizon advertises its DSL speed as a range rather than as a specific speed.
[30] In this chart, we have shown the median download speed of Verizon-DSL as a percentage of the midpoint of the advertised speed range for its tier.
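As a companion to note [5], the following minimal sketch illustrates the three-step calculation that note describes: a mean per whitebox, a median of those means within each tier, and a subscriber-weighted average across tiers. The tier names, measurements, and subscriber counts below are hypothetical, and the code is illustrative rather than the MBA software itself.

```python
# Minimal sketch of the calculation described in note [5], using
# hypothetical whitebox measurements (Mbps) and tier subscriber counts.
from statistics import mean, median

def tier_median(whitebox_measurements):
    """Median across panelists of each whitebox's mean measured speed."""
    return median(mean(samples) for samples in whitebox_measurements)

def weighted_median_across_tiers(tiers):
    """Subscriber-weighted average of the per-tier medians.
    `tiers` maps tier name -> (list of per-whitebox sample lists, subscribers)."""
    total_subscribers = sum(subs for _, subs in tiers.values())
    return sum(tier_median(boxes) * subs for boxes, subs in tiers.values()) / total_subscribers

# Hypothetical example: two measured tiers of one ISP.
tiers = {
    "25 Mbps tier": ([[24.0, 25.1, 23.8], [25.5, 24.9], [22.0, 23.5]], 60_000),
    "100 Mbps tier": ([[98.0, 101.0], [95.0, 96.5, 97.0]], 40_000),
}
print(f"Weighted median download speed: {weighted_median_across_tiers(tiers):.1f} Mbps")
```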