Wielding artificial intelligence, investigators scanned huge amounts of information collected from wearables and discerned users’ height, weight, sex, age, and other characteristics, according to a recently released paper by scientists from the University of Cambridge and the Alan Turing Institute. The people whose information was in the data were not identified, but the study raises privacy concerns about what could happen to the information such devices collect, experts say.

“Devices such as smartwatches are not covered by HIPAA or other privacy laws, meaning any data collected can go back to the seller. In addition, most people inadvertently waive protections when they set up device software and ‘accept’ the terms of service,” healthcare attorney Heather Macre said in an email interview. “This puts a lot of data in the seller’s hands and from there, anyone knowledgeable with health data can extrapolate a lot of information.”

AI May Not Be Your Friend

To conduct the study, researchers developed an AI system called Step2Heart, which used machine learning to predict health-related outcomes from anonymized health datasets. Step2Heart was able to classify sex, height, and weight with a high degree of confidence, the researchers said. Inferring metrics like BMI, blood oxygen, and age was possible, but more difficult.

The kind of health data collected in the study could stay available for longer than you might expect, experts say. “Health data, such as existing health conditions, can be highly sensitive and may be sensitive for a long time, e.g. the rest of the individual’s life,” Sean Butler, the director of product marketing at data privacy company Privitar, said in an email interview. “Full location traces [the history of where someone has been, and when] are highly revealing and cannot effectively be anonymized.”

Health data uncovered by AI could potentially be used to discriminate, observers say. “Being able to revert the de-identification or de-anonymization of health data is not a good direction as it opens the prospect of being singled out and offered less favorable services by banks and insurances just because a person is in a certain socio-demographic group, without any transparency,” Dirk Schrader, the global vice president of product marketing and business development at cybersecurity company New Net Technologies, said in an email interview. “The ability to infer age, sex, and other fitness metrics or health data means that the companies using this kind of data are biased towards a certain group that seems less profitable for their model.”
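The paper’s actual models aren’t reproduced here, but the underlying idea, that a generic classifier can recover sensitive attributes from innocuous-looking wearable aggregates, is easy to sketch. The snippet below is a toy illustration, not Step2Heart: every feature, number, and correlation in it is invented, and an off-the-shelf scikit-learn random forest stands in for the study’s models.

```python
# Toy sketch (not Step2Heart): recovering a sensitive attribute from
# wearable-style aggregate features. All data and correlations here
# are synthetic; the point is the technique, not the numbers.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000

# Hypothetical per-user aggregates a wearable might expose, with weak
# correlations to the attribute deliberately baked in.
sex = rng.integers(0, 2, n)                          # the 0/1 label to recover
resting_hr = rng.normal(68 - 4 * sex, 6, n)          # resting heart rate (bpm)
daily_steps = rng.normal(7000 + 800 * sex, 1500, n)  # mean daily step count
cadence = rng.normal(105 + 3 * sex, 8, n)            # steps per minute

X = np.column_stack([resting_hr, daily_steps, cadence])
X_tr, X_te, y_tr, y_te = train_test_split(X, sex, test_size=0.3, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
```

The specific model matters little; the takeaway is that several weak statistical traces of an attribute, combined, are often enough to beat chance by a wide margin, which is exactly what makes “de-identified” sensor data risky.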

Expert: If You Want Privacy, Stay Offline

New York University bioethicist Arthur L. Caplan says that wearable users need to know what they are giving up when they operate their devices. “There’s so much data out there and so many hackers that I fear privacy is impossible at this point unless you stay offline,” he said in an email interview.

But some experts say there are steps users can take to reduce the risk that their personal data could be exposed by wearables. “Read the license agreements of the apps you use and your watch. If you can’t find one, that’s really bad,” Schrader said.

Don’t blindly accept the default privacy settings, and “instead take the view that the default settings are designed to enable the company to collect as much data from you as possible,” Paul Lipman, CEO of cybersecurity company BullGuard, said in an email interview. “It’s also fairly standard for wearable manufacturers to provide privacy options on the device, the app linked to the device, and a web portal, so you will need to step through each one. Also, turn off location tracking. Up to five data points tied to a location provide enough information to identify someone.” A toy sketch at the end of this article shows just how quickly a few location points can narrow things down.

You might want to think carefully the next time you strap on your smartwatch. The data it’s collecting could end up in more places than you expect.
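To see why so few location points are enough, consider a toy re-identification sketch (ours, not from the article’s sources): generate synthetic daily traces for 10,000 users, then watch how quickly the pool of users consistent with a handful of known (place, hour) observations collapses.

```python
# Toy sketch of location-trace uniqueness: all traces are fabricated.
# Each "observation" is a (cell, hour) pair an attacker might learn,
# e.g. from a check-in, a tagged photo, or a leaked log entry.
import random

random.seed(0)
NUM_USERS, CELLS, HOURS = 10_000, 400, 24

# Each synthetic user visits up to a dozen (cell, hour) pairs per day.
traces = [
    {(random.randrange(CELLS), random.randrange(HOURS)) for _ in range(12)}
    for _ in range(NUM_USERS)
]

def candidates(observed):
    """Users whose trace contains every observed (cell, hour) point."""
    return [u for u, t in enumerate(traces) if observed <= t]

target = sorted(traces[42])   # points the attacker learns, one at a time
revealed = set()
for point in target[:5]:
    revealed.add(point)
    print(f"{len(revealed)} point(s) known -> {len(candidates(revealed))} candidate(s)")
```

In a run like this, the candidate pool typically collapses to a single user within two or three points, which is the intuition behind Lipman’s warning about location tracking.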