
Why Can't Healthcare Wearables Get You Out Of Work?


Earlier this year, Nintendo unveiled the first few details of its new line of “Quality of Life” products that will be released in fiscal 2015. Speaking to shareholders, Nintendo president Satoru Iwata described a bedside sleep sensor that will gather data on its users overnight, upload them to Nintendo’s servers for analysis, and then produce suggestions for getting a better night’s rest. Nintendo is one of the last to enter the market for consumer health analytics, a manic growth-industry that has drawn in Apple, Nike, Jawbone and Garmin.

But Nintendo came late to arcade games too, and not counting a short-lived Odyssey clone, its NES console arrived a decade after the home console market had emerged. While the market may seem perilously overlarge already, there is much that could be done to shape its development. As is always the case when technology is given credit for engineering self-improvement, all the pressure for change is directed at the individual. If you’re sleeping poorly, are overstressed, or in bad health, it’s your responsibility to find a way out of these statistically suboptimal demographics. And if you can’t manage re-engineering your own life, there is a ready narrative to rationalize how your failure in the economy of individual betterment makes you a drag on the collective, unworthy of further material support or care.

“There is no argument that whether or not we have sound sleep or not significantly affects our health, and many of us recognize through our daily lives that accumulated fatigue makes it difficult to maintain good health,” Iwata told investors. “Fatigue and sleep are themes that are rather hard to visualize in more objective ways. At Nintendo, we believe that if we could visualize them, there would be great potential for many people.”

IBM is similarly unfolding plans to use its Watson AI in healthcare applications, the most recent of which is Panorama, a similar “Quality of Life” technology built around gathering health data from users and applying computational intelligence toward convincing them to make their daily routines better comport with statistical norms.

The perils of these sorts of prescriptive applications, alongside the rapidly evolving variety of automata powering them, are myriad and have often been criticized. My friend and sometime editor Rob Horning describes the unquenchable complexity of big data not as a nefarious antagonist but as a catalyst for deepening social divisions through scientifically-engineered indifference.

“We don’t need to be watched and brainwashed to [be made] docile,” Horning writes, “we just need to be situated within social dynamics whose range of outcomes have all been modeled as safe for the status quo. It’s not: ‘I see what you are doing, Rob Horning, stop that.’ It’s: ‘Rob Horning can be included in these different data sets, which means he should be offered these prices, these jobs, these insurance policies, these friends’ status updates, and he’ll likely be swayed by these facts.’”

In this light, lifestyle technologies like run trackers, heartbeat monitors, calorie counters, and sleep sensors don’t so much improve one’s quality of life as they preserve the comparative advantages of those already living in circumstances that predispose them to good health, while building an unscalable data wall in front of those in need of what another group already has too much of. Health too easily becomes a synonym for goodness, and using it as the foundation for behavioral norms makes their resultant demographic divisions self-perpetuating.

As Kate Crawford argued in The Atlantic, health insurance companies are eager to use the collected data to limit their liabilities for individual claims, and while they cannot force individuals to use healthcare devices, they can use court orders to request data from people who do. It’s never proposed to use devices and data as a means of changing our shared environments instead of narrowly focusing on individuals.

In an alternate universe, you might imagine a sleep sensor discovering poor sleep habits and increased stress levels in a high percentage of employees working for a certain company, which could be used to impose penalties on the company or restrict its ability to demand overtime or weekend work hours. Or perhaps instead of feeding you drips of digital guilt for eating an ice cream cone or bag of potato chips, your phone’s healthcare app could give you permission to leave the office at 4PM, telling your boss you won’t be allowed to take your laptop or Blackberry with you: doctor’s orders.

But health monitoring devices are not designed to liberate us from the exploitive troughs of labor and social resentment by reflecting toxic environmental pressures back onto those who’ve architected that environment. They are not meant to change things so much as convince us of our own dubious worth as individuals. In the same way that Lambert Pharmacal invented halitosis as a clinical diagnosis in need of treatment in order to find a market angle for Listerine, the idea of self-improvement through data and machinery is a solution to a problem that no one has. It’s not that you’re sleeping wrong, but that you’re working too much and that too much of your life is at risk when you think about straying too far off the careerist path.

Healthcare devices are the byproduct of our diseased politics and economy, just as the social scourge of bad breath among Europeans was largely attributable to diet, driven by the industries that made bacterial feeder foods like flour and sugar cheap and easily accessible. Until we can imagine devices capable of fighting back against the calamitous cruelty of our prefabricated work environments and the illusory economy they’re meant to honor, we will remain as communally sick as we have ever been, no matter how statistically perfect any one person’s sleep.