April 29, 2019 (1740 words)
Why software engineers need to worry about the social, economic, and political implications of the products they build.
Tags: working-in-tech, personal, game-of-thrones
This post is day 119 of a personal challenge to write every day in 2019. See the other fragments, or sign up for my weekly newsletter.
For the software designer, programming with interaction involves seeking a kind of magical moment of transformation, a moment when one begins to get back more than what was put in; an unexpected moment when the system seems not only just to work, but to almost come to life; a moment when what had previously been a noisy mess of buggy half-working mechanisms seems to flow together and become a kind of organic whole.
The elusive chase for this kind of transformative moment is the essential reason why geeks keep banging away at their keyboards, deep into the night, deprived of sleep and propped up by caffeine and sugar and the adrenaline of the experience of feeling in contact with something larger than oneself.
(A quote from Michael Murtaugh’s essay “Interaction” in the 2008 book, Software Studies: A Lexicon, edited by Matthew Fuller.)
Three and a half years after my startup was incorporated, it got acquired by another company. To be clear, this was not a successful exit; it was an acquisition for parts, at bargain-basement rates, because we didn’t want anything to do with the business anymore. I had already started a master’s program in a wholly unrelated topic, and I was just beginning to shake off my former life in favour of diving into left politics. I personally would have been okay with letting the company die and (metaphorically) throwing the body into the Hudson.
And yet, if you had asked me two years before that how I felt about our prospects, I would have defended the company at all costs. Not because I strongly believed in the mission or anything - even then, I found our mission (such as it was) kind of dumb, the product an unnecessary addition to an already crowded space, and the industry vaguely parasitical. But because I was so attached to the technical stack I had built that I needed to see it thrive, even if it meant being folded into a bigger company whose mission I found even more meaningless. It was my baby, and I almost didn’t care what it was used for - I just needed it to continue.
It’s embarrassing to reflect, now, on how much pride I had in that system. To be fair to my past self, there was a reasonably complicated data pipeline that involved machine learning, and over time I had come up with tricks to manage its various components: a testing framework that nicely incorporated live data, a seamless way to reload training models, instant notifications whenever anything needed manual attention. It was the first time I’d been given complete technical autonomy over such a large project, and there was something so intellectually satisfying about zooming in on a well-defined problem like this.
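To give a flavour of the kind of mechanism I mean, here’s a minimal, hypothetical sketch of the model-reloading idea - the class and its details are invented for illustration (it assumes a pickled, scikit-learn-style model with a predict method), not lifted from the actual system, which was messier:

```python
import os
import pickle
import threading


class ReloadableModel:
    """Illustrative sketch: serve predictions from a pickled model,
    transparently swapping in a new version whenever the file on disk
    changes, so the serving process never needs a restart."""

    def __init__(self, path):
        self._path = path
        self._lock = threading.Lock()
        self._mtime = 0.0  # last-seen modification time of the model file
        self._model = None
        self._maybe_reload()

    def _maybe_reload(self):
        mtime = os.path.getmtime(self._path)
        if mtime > self._mtime:
            # Deserialise outside the lock so a slow load
            # doesn't block in-flight predictions.
            with open(self._path, "rb") as f:
                fresh = pickle.load(f)
            with self._lock:
                self._model, self._mtime = fresh, mtime

    def predict(self, features):
        self._maybe_reload()  # a cheap stat() call on every request
        with self._lock:
            return self._model.predict(features)
```

The point isn’t the code itself, but how seductive it is to build things like this: tidy, self-contained solutions to problems whose worth you never stop to question.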
But I didn’t actually care about the product. We barely had any users, especially in the early days, but even in our heyday, when we had real customers paying good money to use our product, I just couldn’t bring myself to care about them. At best, we were helping brands to more accurately understand their customers so they could market to them more efficiently. At best. And yet, I was still engrossed by the technical challenges of building this product, to the point where it basically took over my life.
Cognitive dissonance, I suppose. I wanted to work on a cool technical problem, and in the back of my mind, I wanted it to mean something more, something larger and more meaningful. But I also knew that it didn’t - that in the best case, it would shift some more commodities into the hands of consumers who didn’t really want them anyway, in order to raise some numbers in a PowerPoint that no one would pay attention to. I guess I tried not to think about it too much.
Thinking back on it now, I marvel at how easy it was to be carried away by the craft of it. I spent an ungodly amount of time submerged in this neat little self-contained system, coming up with elegant ways to solve problems that I merely assumed would be worth solving. If I’d had to explicitly decide what to work on, would I have chosen the field of surreptitiously collecting data on people so that brands could target them better? Hell no. But it was the business model we pivoted into, and I was happy enough with the technical challenges at hand that I didn’t spend too much time thinking about the ethical or social implications of what I was building.
And maybe I would have kept at it for a lot longer, but then all the political wildfires of 2016 happened and suddenly it felt like everything was exploding. I came to see that my neat little system was a distraction, and that the elegance of the code behind my vanity project didn’t matter if the world was coming apart at the seams.
Two years ago, after the infamous incident where United Airlines had a passenger forcibly removed from a flight, Albert Burneko of Deadspin wrote a response titled “The Corporation Does Not Always Have To Win”. It’s a good piece, and you should read the whole thing, but I think about this passage in particular a lot:
If I were a corporation, the spectrum of possible responses to these distractions [the noise from his dog and bees outside] would include killing my dog and encasing my home in soundproofed concrete […]. Alas, I am a human—I cannot be a corporation, and a corporation cannot be me—so the dog gets to live. For now.
But the point is: You are not the corporation. You are the human. It is okay for the corporation to lose a small portion of what it has in terrifying overabundance (money, time, efficiency) in order to preserve what a human has that cannot ever be replaced (dignity, humanity, conscience, life). It is okay for you to prioritize your affinity with your fellow humans over your subservience to the corporation, and to imagine and broker outcomes based on this ordering of things. It is okay for the corporation to lose. It will return to its work of churning the living world into dead sand presently.
There’s a similar point to be made about the tech industry. There’s something about getting stuck into a technical problem that makes it tempting to block out all other concerns - specifically, the broader context in which the technical problem arises. I’m sure this occurs in domains other than software as well, but I think it’s especially prevalent in tech, because you often get the chance to work on something with massive, far-reaching impacts.
And if you think about those impacts in the abstract, you can be proud of your work. You’re organising the world’s information and bringing the world closer together. You’re enabling frictionless transportation and fundamentally redesigning the way cities operate. But when you look at the details, the picture starts to look less pretty: you’re writing code to steal tips from workers who are already underpaid and overworked. You’re building augmented reality tools to help the US military kill innocent people abroad. You’re getting paid shit tons of money to get people to click more ads while your (female, non-white) colleagues who don’t have Stanford or MIT on their resumes are getting 3-month contracts with lousy pay and no health insurance.
Once you start to think about all the ethical implications, it all starts to feel very unpleasant, very quickly. Hence the temptation to ignore it: why should you have to care about the moral issues in the first place? Can’t you just set aside all the political and social stuff and focus on your code, which you’re really proud of?
But you are not your code. You are a mortal human being whose ability to use the Internet, go to work, and write great code depends on the continued functioning of a society that provides all these things. This society is currently crumbling due to (among other things) vertiginous wealth inequality and ecological destruction, and the code written by people just like you is (perhaps unwittingly) making things worse by further concentrating wealth in the hands of those who will never fully relinquish their ill-gotten power.
I’m not saying that it’s a bad thing to occasionally lose yourself in code, or whatever it is that you enjoy doing. Believe me, I get the appeal of escaping into a clean, beautiful system that’s under your control. But it’s also worth reflecting, every so often, on what your efforts are being used for, and whether that outcome aligns with the world you’d want to live in.
The problem with the tech industry as it stands today is that you cannot just take for granted that your code is making the world better - sometimes, it may be actively making the world worse. If you don’t want that to be the case, you will probably have to fight for it, by organising a union or engaging in broader activism to change the underlying political landscape. The tech industry - a reflection of the economy at large - is not, at the moment, set up so that employees’ natural inclination to get lost in a system is channelled towards positive outcomes. Instead, it is structurally set up to funnel that energy towards ventures that make rich people richer, whether or not they serve moral ends.
And honestly, it’s exhausting to have to think about all this; life would be much simpler if we could all just get paid tons of money to work on fun technical problems without having to worry about the crumbling legitimacy of our political system. But this is the world we were born into, and this is the fight before us.
At the risk of sounding overly dramatic: I recently watched the latest episode of Game of Thrones, and I bet most of the people fighting at Winterfell wished they were somewhere else, living in peacetime rather than facing the horror show that is the impending zombie invasion. But that’s the battle that’s in front of them, and fleeing isn’t really an option - it doesn’t postpone the fight or make the problem go away; it just shifts the burden of fighting onto others, who may fail. And if they fail …