The Tricky Business of Computing Ethical Values
Why it’s so hard to compute ethics.
An expert in computing responds to Tara Isabella Burton’s ‘I Know Thy Works.’
In 2018 researchers from the Massachusetts Institute of Technology Media Lab, Harvard University, the University of British Columbia, and Université Toulouse Capitole shared the results of one of the largest moral experiments conducted to date. They recorded 40 million ethical decisions from millions of people across 233 countries. The experiment’s ‘Moral Machine’ posed variations of the classic trolley problem to users, recasting the trolley as a self-driving car. Should the car swerve and collide with jaywalking pedestrians or maintain its current trajectory, which would yield inevitable doom for the passengers inside? What if the jaywalkers are elderly? What if the passengers are doctors?
The results of the study highlight various preferences and implicit biases in decision making (for example, sparing doctors over elderly people), as well as cultural preferences among countries clustered into global Western, Eastern, and Southern categories. (For example, countries in the Southern category showed a strong preference for sparing physically fit individuals.) There’s debate over whether the trolley problem is really the right framing for interrogating ethical choices (specifically, whether decisions made in the simulation reflect what you would choose in real life), and the experiment itself has also been criticized for both its setup and its assumptions. But the Moral Machine does provide a fascinating example of an algorithm that observes your decisions in a handful of tense driving situations and then spits out a bulleted list of your preferences for saving babies over dogs, along with how those compare with the rest of the world (or at least the rest of the people who have taken the test). There’s something oddly compelling about seeing a neat, orderly enumeration of your ethical values—the Moral Machine is much quicker at computing those supposed values than the college roommate you stayed up all night with, debating the sticking points of utilitarianism.
Tara Isabella Burton’s ‘I Know Thy Works’ revolves around a similar human desire to skip the lengthy debates and figure out the next best ethical thing to do in a logical, calculated fashion. The story does this through the Arete system, whose goal is to ‘outsource morality.’ First, users input their meta-ethic, the driving principle they want to base decisions on (The pursuit of truth at all costs or The greatest happiness for the greatest number). Then the app builds recommendations and suggestions to comply with that ethic: for example, what jobs users can work or what pleasures they can indulge in. The app’s ethical accounting also tracks users’ every action and calculates a publicly posted score, which is linked to things like CVs and dating profiles. The plot follows characters who try to briefly escape the system through clandestine Black Dinners, during which they leave their phones—and their meta-ethics—behind.
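As a thought experiment, the mechanics Burton describes (pick a meta-ethic, log every action, post a score) can be caricatured in a few lines of code. Everything below, from the `EthicalLedger` class to the crude `utilitarian` scorer, is a hypothetical sketch invented for illustration, not anything drawn from the story:

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class EthicalLedger:
    """Hypothetical Arete-style accounting: a meta-ethic reduced to a scoring function."""
    meta_ethic: Callable[[str], float]  # maps an action description to a score
    log: list = field(default_factory=list)

    def record(self, action: str) -> None:
        """Track an action along with the score the meta-ethic assigns it."""
        self.log.append((action, self.meta_ethic(action)))

    def public_score(self) -> float:
        """The publicly posted number: here, just the sum of all action scores."""
        return sum(score for _, score in self.log)

# One crude 'greatest happiness' meta-ethic: actions described as helping
# others score a point; everything else counts for nothing.
utilitarian = lambda action: 1.0 if "help" in action else 0.0

ledger = EthicalLedger(meta_ethic=utilitarian)
ledger.record("help a neighbor move")
ledger.record("skip a phone call")
print(ledger.public_score())  # 1.0
```

Even this toy version exposes where the moral weight of such a system actually sits: in a scoring function that flattens every action into a number.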
The Arete system is fascinating in its conception, and perhaps not too distant from our own reality. Effective computability, broadly defined as the domain of phenomena that can be algorithmically measured and computed, was once confined to physical quantities like vehicle speed or electrons flowing down a copper wire. Now tech’s inextricability from our lives means we can compute more abstract quantities, such as one’s FICO score or the attributes of a perfect romantic partner. Smartphones and wearable devices have folded the outside world into our cognitive processes as extended minds, which in turn has made users reliant on the algorithms and applications that organize our groceries, help us communicate with our loved ones, and remind us to take our medication. The final frontier for this creep in effective computability is ethics and morality: Can we outsource our own ethical decisions to machines? And if we can, do we want to?
Indeed, Arete builds upon a culture of tracking and surveillance that has accompanied our lives online, especially when it comes to personal well-being and self-help apps. People log their every minute on productivity apps and every dollar on budget apps, they meditate on mental health apps and pray on religious ones, they count steps and calories—and they willingly consent to sharing their personal information with companies and advertisers for this convenience. Beyond personal apps, we’re also enmeshed in larger government and economic systems that leverage user surveillance and tracking to build profiles of individuals and make algorithmic decisions, which in turn often alter the trajectories of lives. These systems range from credit scores (which endeavor to judge financial trustworthiness) to pretrial risk assessments. There are also examples like China’s social credit system, which claims to assess financial and ‘social’ creditworthiness. These systems have their own meta-ethics and automated processes to compute alignment with those ethics, and they are rife with examples of bias and discrimination against minority groups (for example, higher mortgage-application denial rates for Black Americans).
Conceptually, it would be challenging to design a computational ethics algorithm like the one described in ‘I Know Thy Works.’ Implicit in Burton’s story is the idea that the meta-ethics are linguistic in nature and translatable into logic compatible with Arete’s code base. In reality, it would be hard to map a philosophical principle (the greatest happiness for the greatest number) to a computational output. Formulating such clear ethical statements has perplexed philosophers for centuries, and it is unlikely that a person could articulate a coherent set of meta-ethics for themselves within Arete. Indeed, characters in Burton’s story have trouble picking a meta-ethic and sticking with it. These sorts of dilemmas are familiar: How can one simultaneously express a desire for privacy and security in personal life while yearning to display that life on social media? As the story notes, ‘We all want exactly two things in this life: to be seen, and to be invisible. The question is which one we want more.’
At the same time, modern algorithms grounded in machine learning do not rely solely on explicit logical rules—they also crowdsource their decision making based on statistical patterns gleaned from large amounts of training data. Although this has led to great mimicry of intelligent behavior (hello, ChatGPT), developing an algorithm that mimics the statistical majority of ethical decisions in its recommendations is fraught. These models aren’t good at handling the complex moral quandaries that arise at the fringes.
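To caricature that crowd-sourced approach in code (all scenarios, votes, and names below are invented for illustration, not drawn from the Moral Machine data): a model that simply replays the majority vote answers confidently for well-covered scenarios, but for a fringe scenario it has never seen, it can only fall back on the herd default.

```python
from collections import Counter

# Invented 'crowd' decisions for trolley-style scenarios.
training = [
    (("pedestrian", "elderly"), "swerve"),
    (("pedestrian", "elderly"), "stay"),
    (("pedestrian", "child"), "swerve"),
    (("pedestrian", "child"), "swerve"),
]

# Tally votes per scenario, plus an overall majority as the fallback.
by_scenario = {}
for scenario, choice in training:
    by_scenario.setdefault(scenario, Counter())[choice] += 1
global_majority = Counter(c for _, c in training).most_common(1)[0][0]

def decide(scenario):
    """Return the crowd's most common choice; for unseen (fringe) scenarios,
    there is no reasoning to fall back on, only the global majority."""
    votes = by_scenario.get(scenario)
    if votes is None:
        return global_majority
    return votes.most_common(1)[0][0]

print(decide(("pedestrian", "child")))   # clear crowd preference: swerve
print(decide(("doctor", "jaywalking")))  # fringe case: just the global default
```

The toy model never reasons about a scenario; it only looks it up, which is exactly why statistical mimicry struggles at the moral fringes.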
In the story, as Arete’s algorithmized ethics govern the minutiae of daily life, the idea of personal resistance becomes central: How do you assert ethical independence? There’s a performative aspect to ethics, as with other aspects of identity. The public scoring systems on both Arete and other, real-world apps (like step counters or Instagram) are designed to facilitate that performance. The story has another notable stage: the Gothic ambience of the Black Dinners, distinguished by excess and the desire for escape. Although the theatrics of the story’s particular Black Dinner ultimately culminate in tragedy, there is an undercurrent of hope for individuals coming together, bending and shifting their values in defiance of a cold ethical calculus, avoiding the ‘wrenching emptiness in [their] sternum.’ I’d like to think that, in that world, I would also turn off my app and turn up my apocalyptic playlist until morning.