Seduced By The Machine
Why We Allow Apps and Corporations To Tell Us How To Live
I’ve written before about how we debate the growing power of digital technologies from the wrong perspective. We’re so obsessed by the question of whether machines are rising to the level of humans that we fail to notice how the humans are becoming more like machines. Humans are strange, idiosyncratic, perverse and surprising, which is why we’re fascinated by each other. But we’re trying to make our unruly selves more predictable. Rather than wondering how we might avoid obsolescence by leaning into who we are, we seem to be saying to the machines, “Don’t worry, we’ll come and meet you half-way.”
How polite of us!
The philosopher C. Thi Nguyen makes an intriguing argument along similar lines in a new paper, now in pre-print. Nguyen is interested in how and why people allow their core values to be colonised by metrics and rankings. He is slightly vague about what he means by “values”, but I take him to mean something like “the things we care about most and aim for”. Our individual goals and purposes; what we want to do with our time on the planet.
How do we arrive at our values? Nguyen suggests we start with an inchoate, semi-articulated desire for something, like good health, or expertise, or a job we love, and then journey towards our own version of it through a process of trial and error. We try out different preferences, poses, ambitions, activities, getting feedback from experience along the way. If all goes well, we gradually adjust towards happiness and fulfilment, and away from unhappiness and boredom.
When that process is given time to unfold the values we end up with are rich and subtle and, crucially, self-tailored. They’re rooted in who we are, or aspire to be, as individuals. We’re happiest when we’ve achieved a fit between these values and our individual circumstances: the job that suits me, the fitness regime that makes me feel good, the diet that gives me sustenance and pleasure.
Increasingly we also have our goals defined for us by technology and by modern bureaucratic systems (governments, schools, corporations). But instead of providing us with something equally rich and well-fitted, they can only offer us pre-fabricated values, standardised for populations.
Apps and employers issue instructions with quantifiable metrics. You want good health? You need to do this many steps, or achieve this BMI. You want expertise? You need to get these grades. You want a promotion? Hit these performance numbers. You want refreshing sleep? Better raise your average number of hours.
Nguyen’s core point is that these metrics do more than give us a way to monitor the progress of our own intrinsic values; they become our values. We internalise them. The metrics become what we care about, what we dream of. He calls this “value capture”.
When our values are captured we put aside our vague goal of feeling good about ourselves and do what the fitness app proposes, even if it makes us miserable. We forget that Instagram is meant to be fun and obsess over the number of likes we get for each post. We turn every mealtime into a health and nutrition challenge. Instead of aiming to become a good pianist, we seek a pass grade in the next exam. Instead of choosing a university that accords with our vague and developing sense of who we are, we pick one that’s in the top ten of the latest ranking.
In all such cases, we outsource our own aspirations to metrics set by external agents. Nguyen sees value capture happening to individuals and to institutions, in a self-reinforcing loop. Corporations increasingly assess employees on standardised measures which can be used in any department, office or country. Those measures tend to be similar to the ones used by all the other corporations. Employees, whatever their background, learn to chase these same metrics. Everyone converges.
In a famous study of US law schools, cited by Nguyen, the sociologists Wendy Espeland and Michael Sauder showed how the ranking system introduced in 1990 destroyed the diverse ecosystem that had developed over the preceding century, as every school started chasing the same “thin metrics”. The programs on offer became homogenised and less accessible to students with different ideas about what they wanted from a legal education and career. Students stopped reflecting on what they themselves cared about and simply assumed they should go to the “best” school, as defined by the ranking.
Nguyen isn’t against quantification or technology and his paper is by no means a crude rant against the spread of metrics. Neither does he pretend that our values can ever be totally authentic or autonomous. We always take them from elsewhere, to some extent. But he does believe that value capture, in the form he describes, comes at a cost to happiness and human flourishing - that over time, it induces boredom, worry, and ennui. Why? Because it prevents us from achieving a fine fit between our values and our unique circumstances.
Without the constant presence of metrics, you discover what makes you happy in a kind of meandering conversation with the world. As I’ve noted before, the example of artistic development is helpful. If you’re a pop musician, you try out different styles and approaches, sometimes taking off in the wrong direction, other times hitting gold by accident - that’s how you discover your own voice. But if the game is to get a certain number of downloads on Spotify, then you have to accept Spotify’s definition of a successful pop artist, and write songs that fit its metrics. Your self-development is stunted. You feel unfulfilled as a result.
At the everyday level, you might have discovered an idiosyncratic exercise regime that makes you feel good, but since there’s no app for it, you do whatever FitBit deems appropriate, even if it doesn’t quite feel right for you. In all such cases you give up on fine-tuning your values - on the messy process of fashioning your self - and simply accept a set of ready-made values which have little to do with your individual context, and which will therefore always be a bad fit for you. You sacrifice your fulfilment to please the machine.
Of course, apps offer individualisation, but they do so in a crude and inflexible way, by necessity, since they are serving millions of people at once. Nguyen draws parallels with how modern bureaucracies, whether states or global corporations, render the impossibly complex variety of human life into a series of “legible” units that can be measured, standardised, and moved around. That narrows their vision. They overlook what can’t be measured or standardised, which happens to coincide with much of what makes life worth living, like individual dignity.
Just as individuals inside a bureaucracy can’t do much about the rules on which it is run, users of apps can’t engineer how an app works beyond whatever simple customisation options the app manufacturer deigns to offer. Here’s Nguyen:
Value capture, even when consensual, involves a low degree of granular control over the details of the contents of one’s value. It puts you in the same relationship with your values as you have with, say, your iPhone’s End User License Agreement. When you click to sign a EULA, you did, technically, consent, and you are, technically, responsible. But you only have one binary choice: accept the whole package or not. When we permit ourselves to be value-captured by institutional values, we have the same low granularity of control over our values: we either accept the whole package, or not. You can’t get control over how your FitBit counts steps, or how the edifice of higher education counts citation rates and impact factors.
Nguyen’s argument reminded me of a recent post by Rory Sutherland on the inflexibility of search engines. Most online search and selection tools show you results only for the query you’ve given them. That sounds fine, doesn’t it? The problem is we often don’t know what we really want. Offline, we work that out in a messy, recursive, iterative process of searching. Online search just assumes we already know, and over-efficiently hides from us information that doesn’t meet our stipulations but which might help us.
A property search gives you homes in a given area and within certain price bounds while hiding what might be a more appealing property just outside those parameters. Or take online dating. Maybe the person you’re actually going to fall in love with is an inch shorter or a year younger than the limit you set online, or has very different tastes in music or politics. Online search won’t tell you that. It offers no surprises or wild cards. It’s a cliché of romantic comedies that people only think they know what they want in a partner. Often we only discover who we’re looking for by meeting the person we had no idea was right for us. What’s the algorithm for that?
All this raises the question of why we allow our values to be captured. Why are we surrendering personal autonomy so cheaply? Well, these apps are really useful. They can improve our lives. They’re often very good at motivating us to do things we want to do but might not otherwise get round to. To some extent, we want them to capture our values. But it’s like a founder hiring a notoriously ruthless financial director to cut costs, only to find, a year down the line, that she has been completely marginalised and the FD is calling all the shots.
Human beings tend to prefer certainty over doubt, comprehensibility to confusion. Quantified metrics are extremely good at offering certainty and comprehensibility. They seduce us with the promise of what Nguyen calls “value clarity”. Hard and fast numbers enable us to easily set goals, justify decisions, and communicate what we’ve done. But humans reach true fulfilment by way of doubt, error and confusion. We’re odd like that.
This post is free to read, so do share it if you enjoyed it. After the jump: a rattle bag of goodies. Paid subscriptions are what enable me to write this thing at all. So buy yourself, or a friend, a present and sign up today. It’s easy and cheap.