
Let's look at wearables and tackle the ethical challenge

7th August 2017
Anna Flockett

It is clear that over the past few years wearable devices have moved forward, and they are continuing to do so. They have evolved from their early role as novelty gadgets and sports accessories into more powerful and generally more useful products - and who knows what is next.

Guest blog by Mark Patrick, Mouser Electronics.

Wearables these days - which include smart watches, hearable devices, smart glasses and so on - promise to provide us with constant, easy access to information, communications, entertainment and health/fitness monitoring.

Not only are there benefits for end users - wearables are also a bonanza for developers and manufacturers. Analysts predict strong growth, from a market value of less than $20bn in 2016 to almost $60bn in 2022, with hundreds of millions of devices being sold every year.

However, the headlong growth of this market is raising ethical and privacy concerns that developers and designers need to be aware of, in order to protect both end users and themselves.

Wearables - dream or nightmare?
To help us better understand the challenge, let’s consider some of the more alarming scenarios that smart, connected wearables could soon make possible.

For example, your smart watch’s built-in microphone detects when an advert or a song is playing nearby. It then monitors your heart rate and pulse, your voice, and your movement to determine whether you are excited, attentive or uninterested. That data is sold, and may be used to target marketing at you in future. Perhaps this sounds improbable, but recent reports show that such scenarios are not far from reality: smartphone apps are already using the microphones in their handsets to detect ultrasonic signals added to adverts, or broadcast in stores, in order to track users and measure ad exposure.
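To make that mechanism concrete, here is a minimal sketch (in Python, with NumPy) of how an app might check a microphone buffer for energy at an ultrasonic beacon frequency. The sampling rate, beacon frequency and threshold are illustrative assumptions; real tracking SDKs use proprietary encodings rather than a single pure tone.

import numpy as np

SAMPLE_RATE = 48_000   # Hz; assumed microphone sampling rate
BEACON_FREQ = 18_500   # Hz; hypothetical beacon tone, above most adults' hearing
THRESHOLD = 10.0       # assumed ratio of beacon-bin energy to the spectrum mean

def beacon_present(samples: np.ndarray) -> bool:
    """Return True if the buffer has strong energy near BEACON_FREQ."""
    windowed = samples * np.hanning(len(samples))
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / SAMPLE_RATE)
    bin_idx = np.argmin(np.abs(freqs - BEACON_FREQ))
    # Compare the beacon bin against the average energy across the spectrum
    return spectrum[bin_idx] > THRESHOLD * spectrum.mean()

# One second of background noise with a faint 18.5kHz tone mixed in
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
buffer = 0.01 * np.sin(2 * np.pi * BEACON_FREQ * t) + 0.001 * np.random.randn(SAMPLE_RATE)
print(beacon_present(buffer))   # True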

Or, imagine your insurer offers you cheaper insurance if you wear a health-monitoring wristband. However, the wristband detects an irregular heartbeat, and your next monthly premium is increased by 1,000%. Constant, mandatory health monitoring might sound unlikely, but companies such as BP are already handing out free or reduced-price fitness trackers to their employees.

The programme is voluntary, but insurance premiums are reduced if employees exercise regularly (BP stresses that it does not misuse employees’ personal data). Research firm Gartner predicts that “by 2018, two million employees will be required to wear health and fitness tracking devices as a condition of employment” - with the primary goal being to protect employees’ health, but also to help protect employers from medical liability claims.

How about smart glasses that track your gaze direction, identify products and marketing material you seem interested in, and market similar items to you? Although this is technically feasible, no such product yet exists - but there are precedents. For example, in a case pre-dating modern wearable devices, the Target retail chain researched ways of determining whether a shopper was pregnant, based on the items they purchased, in order to subsequently market babycare products to them.

In fact, accurate personal data is so valuable to advertisers, marketers and insurers, that it is conceivable they will offer wearable devices for free, in exchange for the data they gather. Perhaps in some countries, governments will consider doing the same.

Challenges and solutions
All of the above scenarios present thorny ethical and moral dilemmas for device and service vendors. They also raise questions of legal liability. To shield themselves from risks, developers can begin by familiarising themselves with data protection laws in major markets.

These regulations limit the types of data that can be collected and how it is used - and they may impose financial penalties on corporations that do not safeguard customer information. In addition, regulations often require service providers to fully delete personal data when requested by customers, and to let customers view their own data. Compliance with these laws can be an onerous task, so it’s wise to carefully consider and limit which data needs to be stored, in order to reduce the potential future burden of handling it.

As services reach hundreds of millions of customers, companies may decide to avoid storing valuable data at all, because liability costs could be enormous if it were stolen. One way to mitigate such risks is to use asymmetric cryptography, so that only the customer - and not the company itself - can read the personal data. However, an unattractive side effect of this technique is that server-side data processing becomes difficult. For example, if customers want to search through their medical history, they have to retrieve the data, decrypt it with their private key, and then run the search on their own client device.
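As a concrete illustration, the sketch below uses Python's cryptography package to implement this pattern: the service holds only the customer's public key, so it can store encrypted records but never read them. Wrapping a fresh symmetric key with RSA-OAEP is a common hybrid scheme (RSA alone can only encrypt small payloads); the record contents here are invented.

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Customer's device: generate a key pair; only the public key leaves it
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

# Service side: encrypt each record with a fresh symmetric key, then wrap
# that key with the customer's public key. The service can store both
# ciphertexts, but it cannot recover the plaintext.
record = b'{"date": "2017-08-07", "resting_heart_rate": 58}'   # hypothetical
sym_key = Fernet.generate_key()
encrypted_record = Fernet(sym_key).encrypt(record)
wrapped_key = public_key.encrypt(sym_key, OAEP)

# Customer's device again: only the private-key holder can unwrap the key,
# so any search over the data has to run on the customer's own device.
recovered_key = private_key.decrypt(wrapped_key, OAEP)
assert Fernet(recovered_key).decrypt(encrypted_record) == record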

Another common privacy protection strategy is to anonymise personal data by completely removing identifying details such as names. However, there have been incidents in which such data was de-anonymised, generally by matching the behaviour patterns of anonymous users with publicly available information about individuals.

In one infamous case, Netflix released a public database of anonymised data about its users’ movie preferences, and challenged developers to use it to create a better algorithm to suggest movies to users. However, researchers demonstrated how the anonymous Netflix user profiles could be matched with the movie preferences of real people found elsewhere on the internet. Netflix quickly terminated the project.
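The mechanics of such a linkage attack are simple. The toy sketch below - with entirely invented data - re-identifies "anonymised" records by matching quasi-identifiers such as postcode and date of birth against a public dataset.

# Toy linkage attack: join anonymised records to a public dataset purely
# on quasi-identifiers. All names and values are invented for illustration.
anonymised = [
    {"id": "u1", "postcode": "SW1A 1AA", "dob": "1980-03-14", "condition": "arrhythmia"},
    {"id": "u2", "postcode": "M1 1AE",   "dob": "1975-07-02", "condition": "asthma"},
]
public = [  # e.g. scraped from an electoral roll or a social network
    {"name": "A. Example", "postcode": "SW1A 1AA", "dob": "1980-03-14"},
    {"name": "B. Sample",  "postcode": "M1 1AE",   "dob": "1975-07-02"},
]

def link(anon_rows, public_rows, keys=("postcode", "dob")):
    """Yield (name, sensitive value) for anonymised rows whose
    quasi-identifiers match exactly one public record."""
    index = {}
    for row in public_rows:
        index.setdefault(tuple(row[k] for k in keys), []).append(row)
    for row in anon_rows:
        matches = index.get(tuple(row[k] for k in keys), [])
        if len(matches) == 1:   # a unique match is a re-identification
            yield matches[0]["name"], row["condition"]

print(list(link(anonymised, public)))
# [('A. Example', 'arrhythmia'), ('B. Sample', 'asthma')]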

The risk is that if someone can gather enough data points associated with a person, the combination becomes as unique as a fingerprint. So simply anonymising data becomes increasingly ineffective as more information about an individual is stored. A further refinement is to aggregate the anonymised data - gathering users into groups and storing only group-level statistics.

For example, in the BP employee fitness programme mentioned earlier, data gathered by the smart wristbands is eventually stored in aggregate form, so that the company does not have access to any individual’s specific details. This approach reduces liability risks without losing all the value of the data that has been compiled - and it is generally more attractive to privacy-savvy users.
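A minimal sketch of that aggregation step might look like the following: individual readings are collapsed into per-group statistics, and any group smaller than a minimum size is suppressed so that no single person can be singled out. The group key, threshold and figures are illustrative assumptions, not details of BP's actual programme.

from collections import defaultdict
from statistics import mean

MIN_GROUP_SIZE = 5   # assumed suppression threshold; tiny groups leak identity

def aggregate(readings, group_key="department"):
    """Collapse per-person step counts into per-group averages, dropping
    any group too small to hide an individual."""
    groups = defaultdict(list)
    for r in readings:
        groups[r[group_key]].append(r["daily_steps"])
    return {
        g: {"people": len(steps), "avg_daily_steps": round(mean(steps))}
        for g, steps in groups.items()
        if len(steps) >= MIN_GROUP_SIZE
    }

# Invented sample data: the two-person 'ops' group is suppressed entirely
readings = (
    [{"department": "engineering", "daily_steps": 6000 + 500 * i} for i in range(6)]
    + [{"department": "ops", "daily_steps": 9000},
       {"department": "ops", "daily_steps": 4000}]
)
print(aggregate(readings))
# {'engineering': {'people': 6, 'avg_daily_steps': 7250}}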
