We turned our bodies into dashboards and our lives into optimization problems. The numbers aren't setting us free.

You slept 7 hours and 23 minutes last night. Your sleep score was 74. Your REM was below average. Your resting heart rate was 58, which is good, but your heart rate variability was 34, which is below your baseline. You took 8,247 steps yesterday, falling short of your 10,000-step goal. Your screen time was 4 hours and 12 minutes, up 17% from last week. Your readiness score is 68. Your body battery is at 43%.
How do you feel?
That's a trick question. It doesn't matter how you feel. The numbers have already told you how you're doing.
In 2016, Jordan Etkin, a researcher at Duke University, published a study that should have made every fitness tracker manufacturer nervous. She found that measuring an activity reduced the enjoyment people derived from it. Participants who tracked their walking reported less pleasure in the activity than those who didn't, even when they walked the same amount. The act of measurement transformed walking from something you do into something you perform. The number on the screen became the point, and the experience itself became instrumental.
This is the Hawthorne effect turned inward. The original Hawthorne studies, conducted at Western Electric's Hawthorne Works beginning in the mid-1920s, found that workers' productivity changed simply because they knew they were being observed. The observation itself altered the behavior. What Etkin demonstrated is that self-observation does the same thing. When you measure your own experience, you're not a person going for a walk. You're a person generating step data.
And that shift, from experiencing to measuring, changes everything about how you relate to your own body and your own life.
In 2017, researchers at Rush University Medical Center and Northwestern University coined the term orthosomnia to describe a new clinical phenomenon: patients who were developing sleep disturbances as a direct result of obsessing over their sleep tracker data. People were lying awake worrying about whether they were getting enough deep sleep. They'd check their scores in the morning and, if the numbers were bad, feel tired regardless of how they actually felt before looking at the screen.
The researchers, led by Kelly Glazer Baron, documented cases where patients' self-reported sleep quality worsened after they started using trackers, even as their objective sleep (measured by polysomnography) stayed the same or improved. The tracker wasn't revealing a problem. It was creating one.
This gets at something the quantified self movement doesn't want to confront. These devices aren't neutral measurement instruments. They're interpretation machines. They take continuous biological signals, run them through proprietary algorithms, and spit out a number. That number feels authoritative (it's data, after all) but it's an abstraction layered on top of a much more complex reality. Your body doesn't have a sleep score. Whoop gave you one.
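None of the vendors publish their formulas, but the general shape of the computation is easy to imagine. Here's a minimal sketch in Python; the signal types are real, but every weight, target, and threshold below is invented for illustration:

```python
# A hypothetical sleep-score function. The inputs are real signal types,
# but the weights and targets are made up -- real vendor algorithms are
# proprietary, which is rather the point.

def sleep_score(total_minutes, rem_minutes, deep_minutes,
                awakenings, resting_hr, baseline_hr):
    duration = min(total_minutes / 480, 1.0)        # reward 8 hours, cap at 100%
    rem = min(rem_minutes / 105, 1.0)               # arbitrary REM target
    deep = min(deep_minutes / 90, 1.0)              # arbitrary deep-sleep target
    continuity = max(1.0 - 0.05 * awakenings, 0.0)  # penalize each awakening
    hr = max(1.0 - 0.02 * max(resting_hr - baseline_hr, 0), 0.0)

    # Fold incommensurable things into one authoritative-looking number.
    return round(100 * (0.30 * duration + 0.25 * rem + 0.20 * deep
                        + 0.15 * continuity + 0.10 * hr))

print(sleep_score(total_minutes=443, rem_minutes=70, deep_minutes=55,
                  awakenings=3, resting_hr=58, baseline_hr=55))  # -> 79
```

Nudge any weight and the same night of sleep gets a different verdict. The number's precision is an artifact of the arithmetic, not of your body.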
And once you have the number, you can't unknow it. You can't just feel rested anymore. You have to be rested, as defined by an algorithm you can't inspect.
Deborah Lupton, a sociologist at the University of New South Wales, has written extensively about self-tracking as a cultural phenomenon. In The Quantified Self, she argues that self-tracking technologies don't simply record our lives; they produce a specific kind of self. A self that is knowable through data. A self that can be optimized. A self whose value is legible in metrics.
This is a profound shift in how we understand what it means to be a person. For most of human history, self-knowledge was qualitative. You knew yourself through reflection, through relationships, through experience. You knew how you felt. The quantified self replaces this with a regime of numbers. You don't know how you feel. You know what your HRV is.
Gina Neff and Dawn Nafus, in their book Self-Tracking, make a related observation. They note that self-tracking devices don't just measure; they normalize. They establish baselines, define acceptable ranges, flag deviations. They create a standard you're perpetually measured against, one that's derived from population-level data and may have nothing to do with your individual physiology.
When your tracker tells you your VO2 max is "below average," it's comparing you to everyone else in its dataset. It doesn't know about your asthma, your genetics, your age, the altitude you live at. But the number feels like a verdict. And you internalize it.
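The mechanics of that verdict are worth seeing spelled out. Here's a minimal sketch, with a made-up population distribution standing in for the vendor's dataset:

```python
# How a "below average" verdict gets made: compare one reading against a
# population distribution that knows nothing about the individual.
# The distribution parameters here are invented for illustration.

from statistics import NormalDist

population = NormalDist(mu=40.0, sigma=7.0)  # hypothetical VO2 max distribution

def verdict(vo2max):
    percentile = population.cdf(vo2max)
    return "below average" if percentile < 0.5 else "above average"

# There is no parameter for asthma, genetics, age, or altitude.
print(verdict(37.5))  # -> "below average"
```

The function returns a judgment, and the judgment contains none of you.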
Here's the thing nobody wants to say plainly: a device that monitors your heart rate, sleep patterns, location, activity, and stress levels 24 hours a day is a surveillance device. The fact that you bought it voluntarily and it sends the data to your phone instead of a government database doesn't change what it is.
Apple, Google, Fitbit, Oura, Whoop: they all have access to your biometric data. Their privacy policies are long and lawyered. The data is "anonymized," which means roughly what it sounds like: roughly. Researchers have demonstrated repeatedly that anonymized health data can be re-identified, especially when combined with other data sources.
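The classic demonstration is Latanya Sweeney's: a five-digit ZIP code, birth date, and sex are enough to uniquely identify most Americans. The attack is just a join on those quasi-identifiers. Here's a toy version with synthetic records:

```python
# A toy linkage attack: join "anonymized" health records with a public
# dataset on quasi-identifiers. All records here are synthetic.

anonymized_health = [
    {"zip": "02139", "birthdate": "1984-07-02", "sex": "F", "resting_hr": 58},
    {"zip": "60601", "birthdate": "1991-11-15", "sex": "M", "resting_hr": 71},
]

public_records = [  # e.g. a voter roll or a data-broker file
    {"name": "J. Doe", "zip": "02139", "birthdate": "1984-07-02", "sex": "F"},
]

def reidentify(health, public):
    key = lambda r: (r["zip"], r["birthdate"], r["sex"])
    lookup = {key(p): p["name"] for p in public}
    return [(lookup[key(h)], h) for h in health if key(h) in lookup]

for name, record in reidentify(anonymized_health, public_records):
    print(name, "->", record["resting_hr"])  # the "anonymous" row has a name
```

No encryption gets broken. The anonymized table simply never stopped pointing at you.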
But the privacy issue, while real, is almost secondary to the psychological one. The more insidious surveillance is the one you conduct on yourself. You've installed a panopticon on your wrist. You are both the watcher and the watched. And the watching never stops.
Michel Foucault would have had a field day. The disciplinary power he described, power that operates not through force but through constant visibility, through the internalization of the observer's gaze, is exactly what these devices enable. Except the power doesn't emanate from a prison guard or a supervisor. It emanates from a widget on your home screen.
Streaks are the gamification layer on top of the surveillance layer. Close your rings. Hit your step goal. Maintain your meditation streak. Don't break the chain.
Streaks exploit a well-documented psychological vulnerability: loss aversion. We hate losing progress more than we enjoy making it. A 30-day meditation streak doesn't make you want to meditate on day 31. It makes you terrified of not meditating on day 31. The motivation isn't positive. It's anxious.
I've watched people exercise while sick because they didn't want to break a streak. I've watched people stress about their stress scores, the irony of which should be disqualifying for the entire industry. I've watched people check their sleep data before they've even noticed how they feel in the morning, letting a number override their own embodied experience.
At what point does self-improvement become self-harassment?
When every dimension of your life has a metric, you start to devalue the dimensions that don't. How do you quantify a good conversation? What's the score for feeling at peace? Where's the data on whether you were truly present with your kids at dinner or just physically in the room while mentally reviewing your activity stats?
The quantified self doesn't just measure what it measures. It implicitly argues that what it measures is what matters. Steps matter. Calories matter. Sleep stages matter. The unmeasurable (contentment, meaning, depth of experience) gets pushed to the periphery. Not because you consciously decide it's less important, but because the dashboard doesn't have a field for it.
This is the logic of what the philosopher Ian Hacking calls making up people. The categories we create don't just describe the world. They change it. When you define a person by their metrics, you create a person who sees themselves through metrics. The tool shapes the user.
I'm not arguing for ignorance. If you have a heart condition, monitor your heart rate. If you're training for a marathon, track your mileage. Measurement has its place: specific, bounded, in service of a defined goal.
But that's not what the quantified self movement sells. It sells total legibility. The idea that if you just collect enough data about yourself, you'll finally understand yourself. That self-knowledge is an engineering problem, solvable with better sensors and smarter algorithms.
It's not. Self-knowledge is a human problem. It requires sitting with uncertainty, tolerating ambiguity, and sometimes accepting that you don't know why you feel the way you feel, and that's okay.
You're not a dashboard. You're not a score. You're not a streak.
Take the damn watch off once in a while and just go for a walk. Don't count the steps. Notice the trees.