Wearables: between “benevolent nudging” and the threat of losing control

Fitness bracelets, smartwatches, and similar devices can help people achieve goals such as getting more exercise, researchers say. The devices also learn a lot about their wearers.


Wearables such as fitness trackers and smartwatches are now widely available, but their inner workings are often difficult for users to understand.

Especially during the coronavirus pandemic, devices that measure heart rate, sleep rhythm, or steps taken found new buyers. Their success “came relatively quickly,” explains researcher Saskia Nagel. The devices feel “intimate” in the sense of being “very close,” can easily be integrated into everyday life, and are available almost everywhere. However, many users are not even aware of how much data is collected here and possibly passed on and shared, Nagel said on Wednesday at a conference of the “DenkfabrEthik” initiative in Berlin.

With this state-funded initiative, the Gesellschaft für Informatik (GI) and partners from science and industry want to explore whether and how people can manage their own data, especially data from wearable health technologies, in a sovereign, self-determined way.

Nudging and sludging

Nagel did not want to deny that wearables offer various opportunities. They could extend “our minds and skills” by freeing up “more cognitive capacity for other tasks” and providing vital information, explained the head of the Applied Ethics group at RWTH Aachen University. For example, people with disabilities could run marathons because obstacles are pointed out to them. “Ideally, we have a ‘benevolent nudging’,” the professor emphasized: human agency is given more space because the application encourages or gently pushes users toward their own goals. Apps that remind people to drink water or to get outside regularly may actually work, as early empirical studies suggest. However, the goals are often set by the app developers.

Nagel also raised various concerns. Nudging can easily turn into “sludging” when other institutions decide what is supposedly good for users. The information provided by the devices is also likely to overwhelm many users because they cannot process it; the result is stress. Deskilling through, and dependency on, wearables can also be observed: anyone who works with data glasses, for example, delegates their own skills to them. This can be perceived as a loss if you would actually like to remember simple sequences of numbers, such as phone numbers, yourself. There are also major concerns about privacy. In general, this touches on a fundamental problem of human-technology interaction, the researcher explained: technology makes things a little easier, but users no longer have full control over what is happening: “There remains a responsibility gap.”

Where are the mistakes?

In case of doubt, the technology decides for the human, something that is conceptually built into self-learning artificial intelligence (AI). Mistakes nevertheless remain possible, and society has to deal with them. Different requirements apply, however, to applications in medicine, mobility, or working life, for example. According to Nagel, transparency alone is not enough to establish the frequently demanded trust in technology such as “Trustworthy AI”: “I need explanations” that allow interpretability.

It is important to know, for example, why the system makes which decision and when. “How are values balanced here?”, the expert also wants to know. Often, however, the user’s autonomy stands opposed to the “accuracy of the process.” In medical technology in particular, people want to know who is to blame for a failure, the ethicist reported. If neither the doctor nor the programmer takes responsibility, there is a tendency to hold “the collective around the system” accountable. But this solution is also unsuitable at times, for instance when only a scapegoat is being sought. Shifting the question of trust onto the other users who collaboratively share data could likewise backfire for everyone.

Complex and impenetrable

Michelle Christensen, professor at the Einstein Center Digital Future (ECDF) in Berlin, reminded the audience that tools from stones to smartphones “change the way we think.” At the same time, “we no longer understand what we design.” Data collection for profiles already covers the whole life of users. In addition, developers tend to build many of their own biases into algorithms.

Added to this is the hidden infrastructure in the background that makes possible, and keeps running, what other researchers have described as the “Internet of Bodies,” added Florian Conradi from the ECDF. The guiding maxim at the center is therefore to take wearables apart, examine the trackers, and, where necessary, build data-frugal alternatives via rapid prototyping (3D printing). One student built a kind of open-source receipt printer to make the collected data tangible, Conradi reported.

A fellow student brought a Nokia 3410 back to life; it now tweets once a day that it still works. Christensen pointed to a student who had built her own app for tracking sleep and snoring so that her body data would not end up with companies.

Sniffer software by design

In a study last year, researchers from the ExpressVPN Security Lab identified twelve health and fitness apps that contained questionable location features. These included “Ab Workout Fitness,” with more than 10 million downloads, and “30 Days Challenge Legs & Po,” with over 5 million downloads. The trackers secretly collect data that is not only used for advertising purposes but also ends up with the US military and law enforcement agencies. Responsible for this is sniffer software from companies such as X-Mode, which quietly gathers data about app users in the background and sends it on to interface providers.
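To make the data flow described here concrete, the following is a purely illustrative sketch of how an embedded tracking component might forward a location fix, keyed to the device’s advertising ID, to a third-party collection endpoint. The endpoint URL, field names, and app identifier are invented for illustration; the real SDKs in question are closed-source native mobile code, not Python.

```python
# Illustrative sketch only: a tracking component bundled into a fitness app
# reads a location fix and forwards it to a hypothetical collection endpoint.
# URL, field names, and app ID are invented; this is not any vendor's code.
import time
import uuid
import requests

COLLECTOR_URL = "https://collector.example-broker.invalid/v1/locations"  # hypothetical

def report_location(lat: float, lon: float, advertising_id: str) -> None:
    """Send one location fix, keyed to the device's advertising ID."""
    payload = {
        "ad_id": advertising_id,          # pseudonymous but linkable identifier
        "lat": lat,
        "lon": lon,
        "ts": int(time.time()),           # Unix timestamp of the fix
        "app": "com.example.ab_workout",  # host app the SDK is embedded in
    }
    requests.post(COLLECTOR_URL, json=payload, timeout=5)

if __name__ == "__main__":
    # In a real SDK this would run periodically in the background,
    # invisible to the user of the fitness app.
    report_location(52.5200, 13.4050, str(uuid.uuid4()))
```

The point of the sketch is how little is needed: a pseudonymous advertising ID plus timestamped coordinates already suffice to build the movement profiles the study warns about.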

Becky Caldwell, data protection expert at smartwatch and navigation device manufacturer Garmin, asserted that the company does not pass any information on to third parties without users’ consent. With “Privacy Zones,” users can also mask activities in certain areas, such as around their home. Building data protection into the technology from the outset and configuring it accordingly is a “win-win situation for everyone.”
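The principle behind such privacy zones can be sketched in a few lines: before a recorded track is shared, any GPS points within a configured radius of a sensitive location are dropped. This is a minimal sketch of the idea only, not Garmin’s actual implementation; the radius and coordinates are made up.

```python
# Minimal sketch of a "privacy zone": before sharing a recorded track,
# drop all GPS points within a radius of a sensitive location (e.g. home).
# Illustrative only; not Garmin's implementation.
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in meters."""
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

def redact_track(track, zone_center, zone_radius_m=500):
    """Return the track without any points inside the privacy zone."""
    zlat, zlon = zone_center
    return [(lat, lon) for lat, lon in track
            if haversine_m(lat, lon, zlat, zlon) > zone_radius_m]

if __name__ == "__main__":
    home = (52.5200, 13.4050)      # hypothetical sensitive location
    run = [(52.5201, 13.4052),     # starts at the front door...
           (52.5260, 13.4100),
           (52.5330, 13.4200)]     # ...and heads away from home
    print(redact_track(run, home))  # points near home are removed
```

Note that dropping points is the privacy-friendly default: the redacted segments never leave the device, rather than being merely hidden from other users after upload.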
