ISP Speed Test Data Gathering Phase Affected By Server Issues
The data-gathering phase for the FCC’s 2012 Measuring Broadband America report was delayed by about a month, to this April, because of server issues, according to an ex parte filing posted last week in docket 09-158 (http://xrl.us/bm55z3). An FCC engineer working on the ISP speed tests told us the issues were relatively minor, though the filing said they occurred over a period of several months.
"Over a five month period, a very gradual degradation of network capacity has been identified,” M-Lab engineers Thomas Gideon and Sasha Meinrath said at an April 4 meeting with FCC staff, according to the filing. “While many of M-Lab existing monitoring mechanisms track changes in performance of M-Lab servers, the gradual pace and long timeline of the performance degradations made the problem difficult to track.” This was the reason monitoring mechanisms failed to identify the performance degradations, they said. Rebooting the servers corrected the problem and returned performance to optimal levels.
SamKnows, a U.K. broadband performance measurement company, found evidence of congestion on March 23, and a few days later, it reported the throughput issue to M-Lab, according to a slide presentation at the FCC meeting (http://xrl.us/bm6abv). On March 30, SamKnows noticed further throughput issues to the New York and Los Angeles test nodes. “March data for these test nodes has therefore been compromised,” the presentation said.
"Anomalies” in the network were affecting some of the measurement locations, Walter Johnston, chief of the FCC’s Electromagnetic Compatibility Division, told us. M-Lab and the FCC team performing measurements noticed it because they do a lot of auditing to make sure the network is performing optimally, he said. “It wasn’t a major issue,” Johnston said, saying the data collected through April provides fair comparative tests across all ISPs tested. “I've done a lot of network testing in my life,” he said. “This was basically a minor hiccup."
At the meeting, commission staff sought input on appropriate steps to disclose any potential impact of the server issues on the quality of data that was gathered from October through March, before procedures were updated, the filing said. Because the data is used to make comparisons of performance between broadband providers, participants suggested caution in how the data is released. Some participants suggested a “prominent disclosure” of the potential impact on data accuracy, and others suggested waiting and analyzing the scope of the impact on data before discussing disclaimers, the filing said. Commission staff said they would write a draft disclaimer that could accompany data in the potentially affected period.
The FCC’s comparative speed testing results are heavily touted by broadband service providers that fared well. Verizon promotional material frequently cites a chart in the 2011 test (http://xrl.us/bm59zg) that showed its FiOS service to be the fastest and most consistent high-speed Internet provider.