CDT/Fitbit Report Seeks To Address Dearth of Privacy Best Practices in Wearables
Wearables, whether worn on the wrist or embedded in clothing to monitor physical activity and other wellness metrics, are being adopted quickly, and many people freely share their personal data with friends, family and even device manufacturers. But the consumer-facing wearables industry has little privacy guidance on collecting, storing and sharing users' personal data, said the authors of a new report from the Center for Democracy and Technology and Fitbit, as well as several observers.
The CDT/Fitbit report, released a month ago with funding from the Robert Wood Johnson Foundation, sought to address that gap by spelling out best practices for building a culture of privacy, security and ethics in wearable technology, with a particular focus on Fitbit's internal research and development team (see 1605180014). Personal data collected by wearable companies are largely unregulated, falling outside the scope of the Health Insurance Portability and Accountability Act (HIPAA) and other laws.
Wearable users like to share their personal wellness information. An Ericsson global survey released last week said that, "contrary to expectation," many users aren't concerned about showing their results and other personal information to others. "Half of users of wearables share the data from their wearables online, and 60 percent feel in control of the data they are sharing and who has access to it," said the report, which surveyed 5,000 smartphone users (2,500 of them wearable users) online in Brazil, China, South Korea, the U.K. and U.S. Ericsson said 67 percent of users are open to letting third parties use that data as long as it's anonymized and they get some value in return. Users also are more willing to share such data with wearable makers than with their doctors and insurance companies, the report found, mainly because of "a belief that the insights and services provided are of value" and because wearable users appear to be more familiar with online risks than other smartphone users.
CDT/Fitbit report authors -- Michelle De Mooy, deputy director of CDT's Privacy and Data Project, and Shelten Yuen, Fitbit's vice president-research -- said in interviews over the past week that the collaboration examined the data practices of Fitbit's internal R&D team, such as its use of wellness information from the company's own employees in developing products. Internal R&D "still remains a crucial, crucial part of whether or not the company is going to make it" and is an area where data use is significant, said De Mooy.
They said it's still too early to gauge their report's impact on the wearables industry. “What I’ve gotten is this cautious interest," said De Mooy. But the report's broad recommendations -- user dignity, data stewardship and social good -- are resonating in the wearable sector, she said. The recommendations are designed to help companies shift from thinking of themselves as data silos to seeing themselves as part of a larger "social conversation" with social impact, she said. Yuen said the impetus for developing such best practices wasn't to get out ahead of potential federal or state regulation, nor was it a response to serious misuses of data by technology companies. "It was driven by a desire to get this out in front of people so they can start thinking about it, too," he said.
Fitbit has a "really strong privacy statement" for users, said Yuen, and internally has a "culture of data privacy" because employees serve as test subjects for sensors and help develop new products. "When you do experiments on yourself, you tend to think through all the sort of weird scenarios," he said. "And a lot of us have experience in human subject research already, either coming from clinical backgrounds or medical research. But we didn’t have a formal process in place at the time." He said CDT suggested Fitbit adopt one. This internal practice, he added, "helps set the stage for how you then think about your own users."
The report is "very reasonable" in terms of achieving privacy and security by design, said Adam Thierer, senior research fellow with George Mason University's Mercatus Center. He called it a "smart approach" for developers, especially because the report focuses on embedding privacy protections in the R&D process, including de-identification and anonymization, data security, encryption and limits on the use of sensitive data. "So long as that’s in the realm of voluntary best practices and not mandatory, it doesn’t raise any red flags for those of us who are concerned about sort of top-down approaches to privacy and security law," he said. "I don’t see any call in there for that."
Morgan Reed, executive director of ACT|The App Association, said the CDT/Fitbit report was "very interesting," but not the final word. "I think [the reason] you see so many efforts around this is that the landscape is moving incredibly rapidly. So today’s industry best practice is likely to run into difficulties as new sensors are added, new capabilities are provided," he said. CTA developed its own privacy and security guidelines last year. Other companies, health industry groups like the American Medical Association and government agencies such as the Office of the National Coordinator for Health Information Technology are looking at developing similar privacy- and security-related recommendations, said Reed.
Editor's note: This is the first in a two-part series looking more closely at Fitbit's use of employee data for R&D, possible government enforcement for wellness data, and development of other privacy guidelines.