FCC Plans for Second ISP Speed Test
The FCC plans a nationwide broadband speed test, following up on the limited tests it ran last year, which showed that customers at 13 of the largest ISPs were generally receiving performance at or above advertised levels. Industry players have been meeting with the commission over the past several weeks to hammer out details of how the expanded tests will be done and what they'll measure, according to a series of ex parte filings. An agency source said that in addition to raw speed measurements, the new tests will include data on jitter, latency and variance.
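The three metrics named by the agency source can be illustrated with a short sketch. The code below is a hypothetical summary of round-trip-time samples, not the FCC's actual methodology; it uses one common definition of jitter (mean absolute difference between consecutive samples), and the real tests may define the metrics differently.

```python
import statistics

def summarize_rtt(samples_ms):
    """Summarize round-trip-time samples (in ms) into the three metrics
    the FCC source named: latency, jitter and variance.
    This is an illustrative definition, not the FCC's test spec."""
    # Latency: average round-trip time across all samples.
    latency = statistics.mean(samples_ms)
    # Jitter: mean absolute difference between consecutive samples
    # (one common convention; other definitions exist).
    jitter = statistics.mean(
        abs(b - a) for a, b in zip(samples_ms, samples_ms[1:])
    )
    # Variance: sample variance of the raw measurements.
    variance = statistics.variance(samples_ms)
    return {"latency_ms": latency, "jitter_ms": jitter, "variance": variance}

print(summarize_rtt([20.0, 22.0, 21.0, 35.0, 20.5]))
```

A connection with low average latency can still score poorly on jitter and variance if a few samples spike, which is why the expanded tests report all three rather than speed alone.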
In its first “Measuring Broadband America Report,” the commission worked with broadband measurement service SamKnows to take more than 100 million direct measurements of broadband performance from thousands of volunteers. The report (http://xrl.us/bk3w33) found that the 13 ISPs in the study were generally meeting or exceeding advertised speeds (CD Aug 3 p1). Now, the commission is proceeding to what some in the industry are calling “Phase II.” The FCC plans to expand the study into new regions of the country, and “publish more kinds of data” in two new reports this year, said a public notice Friday (http://xrl.us/bmskj3). “Performance information is only useful to consumers if it is accurate and up-to-date,” it said. “The FCC will continue its commitment to test and report broadband information transparently, and in collaboration with key stakeholders.”
Representatives from more than a dozen ISPs met earlier this month with commission staff to discuss how to recruit volunteer panelists, how to license report data to carriers in advance of the public release, and how to allow for academic and research review of the testing architecture, said an ex parte filing in docket 09-158 (http://xrl.us/bmskkf). A draft code of conduct presented at the meeting said the participants and stakeholders in the broadband testing program would agree to act in good faith, keep results confidential until the FCC releases the data, and ensure they don’t do anything to “enhance, degrade, or tamper with the results of any test for any individual panelist or broadband provider.”
Level 3 discussed “the importance of end-to-end Internet metrics, given that poor performance of any link from the beginning of the transmission to the end can result in a poor consumer experience,” the company said about a separate FCC meeting (http://xrl.us/bmskku). Level 3 suggested the use of trace routing to identify links in the end-to-end transmission that cause performance issues. “Measurement standards should be agreed upon and implemented on an industry wide basis, as opposed to only involving certain providers or types of providers involved in Internet transmissions,” the Internet backbone company wrote.
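Level 3's suggestion — locating the weak link in an end-to-end path via trace routing — can be sketched as follows. The hop data here is hypothetical, standing in for the per-hop round-trip times a tool like traceroute would report; the link where cumulative delay jumps the most is a rough signal of where the problem sits.

```python
def worst_link(hops):
    """Given (hop_name, cumulative_rtt_ms) pairs in path order, return the
    link where round-trip time increases the most -- a rough indicator of
    which segment is adding the delay. Hop names are hypothetical."""
    worst_name = None
    worst_delta = float("-inf")
    prev_rtt = 0.0
    for name, rtt in hops:
        delta = rtt - prev_rtt          # delay added by this link
        if delta > worst_delta:
            worst_delta = delta
            worst_name = name
        prev_rtt = rtt
    return worst_name, worst_delta

# Hypothetical path from a consumer connection to a measurement server.
path = [("isp-gw", 5.0), ("regional", 9.0), ("backbone", 12.0),
        ("peering", 55.0), ("server", 58.0)]
print(worst_link(path))
```

In this example the peering link adds far more delay than any other hop — exactly the kind of finding Level 3 argued should be measured industry-wide rather than attributed to a single provider.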
Mediacom has concerns about the FCC’s methodology in the earlier tests, which it and NCTA relayed to commission officials, the company said. “During the initial round of broadband speed testing by SamKnows last year, the testing methodology did not always account for significant differences between ISPs in terms of the relative distances between the various ISPs and the selected M-Lab measurement points,” Mediacom attorney Craig Gilley wrote in an ex parte filing (http://xrl.us/bmskmr). “Measuring results using a limited number of servers located in metropolitan areas disadvantaged providers such as Mediacom who do not have their own networks serving such locations.”
Mediacom did not fare as well as some of the other ISPs in last year’s broadband test. Although its average sustained download speeds were frequently over 90 percent of its advertised speed, during evening peak hours its speed dropped to around 75 percent. The “small size of the test sample” overemphasized the “anomalous results of just a few geographically remote users,” Gilley wrote. Mediacom raised the possibility of deploying additional measuring servers, at its own cost, at one or more of its “significant peering locations.”
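Gilley's small-sample argument can be made concrete with a toy calculation. The figures below are hypothetical, not Mediacom's actual panel data: with only ten panelists on an advertised 12 Mbps tier, two geographically remote users measuring far below the advertised rate are enough to drag the panel average well under what the typical subscriber sees.

```python
def panel_average_pct(measured_mbps, advertised_mbps):
    """Average measured speed across a panel of test users, expressed as a
    percent of the advertised rate. Inputs here are hypothetical."""
    return 100.0 * sum(measured_mbps) / (len(measured_mbps) * advertised_mbps)

# Hypothetical 12 Mbps tier: eight panelists near the advertised speed,
# two remote panelists far below it.
typical = [11.5] * 8
remote = [3.0, 5.0]

print(round(panel_average_pct(typical + remote, 12.0)))  # with outliers
print(round(panel_average_pct(typical, 12.0)))           # without outliers
```

The gap between the two averages illustrates why Mediacom argued for more measurement servers closer to its network: a larger, better-placed sample dilutes the effect of a few anomalous users.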