I’ve talked before about how there’s no such thing as true randomness – that is, the term ‘random’ is just a shorthand for events with causality too complex to be determined by a human observer. However, because all events are connected by causality, because we live in a world of things affecting other things affecting other things, of butterfly effects, the term ‘random’ could be fairly used to describe nearly anything. For the same reason that there’s no such thing as random, then, there’s also no such thing as certain. You can never know with absolute certainty that a given event will happen, that a known effect will follow a cause, because there are a pseudo-infinite number of factors outside of our direct observation that may change the outcome.
What makes the idea of certainty so loaded is that so many hypotheticals are premised on certainty. This is part of the problem I was describing in my essay on strategy games last week: The outcomes for all your choices are known and quantifiable beforehand, making the choice of outcomes largely one of preference rather than guesswork and damage control. We can describe a trolley problem, where you choose to sacrifice one life to save five, but the real-world outcomes of our choices are impossible to conclusively determine beforehand, and all such moral and mortal calculus becomes suspect. When we believe that the ends justify the means, there’s an implied and unstated leap of faith that the ends are certain – which they never are.
On one level we know this. We comprehend that nothing is ever certain, that we cannot wholly rely on the outcome of our actions being what we predict – but it’s difficult to navigate day-to-day life keeping this in mind, so we just assume everything will basically function as intended until it fails to, and we’ll handle that on a case-by-case basis. This is fine. It’s fine! However, a great deal of discretion and self-interest go into when we take that leap of faith: If it’s more convenient for us to believe that something will definitely always work, we are certain. If we’d rather not take the plunge required by being certain of an outcome, even if it’s an extremely likely one, we will frame it by its uncertainty in our mind. Because nothing can be completely certain, anything we perceive to be certain or uncertain is perceived that way as much to motivate ourselves towards particular behaviors as from any serious evaluation of its likely outcome. We quietly nudge ourselves one way or the other by reframing and restating the expected outcome: If you don’t want to tackle a task, you emphasize the undeniable risk of failure, and if you do you think only of what you will do after success. Or, frequently, the likelihood of a predicted outcome is not evaluated at all, but treated as axiomatic: Punish the child to prevent the misbehavior, lock up the criminal to prevent the crime, wage the war to end the injustice. Regardless of whether we have any information about whether these work, we definitely feel that they ought to work, that they are narratively if not statistically sound.
Since I mentioned hypotheticals, let’s bust out one of those hoary old questions. Say you’re given a box with a button: Every time you press the button, you get $10 and somewhere a random person dies. Well, that’s what you’re told anyway: All you can directly observe is that when you press the button you get $10. It’s obviously wrong to kill a person for $10, but it would be very easy to convince yourself that no one actually died when you pressed the button. I mean how would that even make any sense? Printing money is way more plausible than instant randomized death at range. Even if you did take it at its word, and were morally upstanding, would the next person who inherited the button be so? Or the next?
Eventually someone will convince themselves that it’s probably fine. It’s fine! Besides, they need the money. They press that button every second for a full 8-hour work day, 2,000 hours a year. At the end of the year, they’ve made $72 million and killed 7.2 million people. $72 million is approximately 0.4% of what Jeff Bezos makes in a year. 7.2 million deaths is approximately a 13% increase in the average number of deaths per year, so it would probably raise some eyebrows, but still, I don’t know, maybe it’s Coronavirus or something. Most folks would stop. Not everyone would. Certainly not everyone would if the box were less lethal – say if it were a lever that gave $100 but could only be pulled every ten seconds, or a $600 wheel that took a full minute to rotate. I don’t know if people would really notice an increased death rate of 0.2% per year. Maybe.
I suppose if there were perhaps 100 of these boxes, even if eventually the rate of death climbed towards that extra 13% it would be easy to assume it could be anything causing the extra deaths. Still, I imagine many box owners would stop using them once they understood what was going on, settle for their billions of dollars. Not all of them though. Over time, all 100 boxes would filter into the hands of one person, the guy who Does Not Give A Shit. This person has a lot of money, so he can afford to have other people use his money-death-wheel boxes for him. So now we have 100 boxes that kill one person every minute, in use 24/7. This would kill 52 million people a year, slightly less than doubling the annual death rate of humanity – but still not higher than the birth rate, so this probably wouldn’t kill the species. This would also earn $31.5 billion a year. This is approximately half again what Jeff Bezos earns per year.
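If you want to check my back-of-envelope math, here’s a quick sketch of it (the ~56 million baseline for annual global deaths is my own rough assumption, used only to sanity-check the percentages):

```python
# One button: a press per second over a 2,000-hour work year.
presses = 2000 * 3600                  # 7,200,000 presses
button_dollars = presses * 10          # $72,000,000
button_deaths = presses                # 7,200,000 deaths

# Assumed baseline: roughly 56 million deaths worldwide per year.
baseline_deaths = 56_000_000
print(f"Button: ${button_dollars:,}, +{button_deaths / baseline_deaths:.0%} deaths")

# 100 wheels: one $600 pull per minute each, running 24/7 all year.
minutes_per_year = 60 * 24 * 365       # 525,600 minutes
wheel_deaths = 100 * minutes_per_year  # 52,560,000 deaths/year
wheel_dollars = wheel_deaths * 600     # $31,536,000,000/year
print(f"Wheels: ${wheel_dollars:,}, {wheel_deaths:,} deaths")
```

The numbers hold up: the button yields $72 million and a ~13% bump in deaths, and the hundred wheels yield about $31.5 billion and 52.5 million deaths a year.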
Now. None of this is happening. There are no magic death money wheel boxes. However, the same dynamics are at play: Power accumulates in the hands of those willing to accumulate it most ruthlessly, with the most disregard for the well-being of others. Part of the reason this is true is that it is so easy to believe in your own plausible deniability, to create uncertainty or certainty as best suits you: uncertainty over whether these deaths are really caused by you, certainty that it’s fine (it’s fine!), that you earned this money, that really the problem with the world is the surplus population anyway, certainty that you’re just doing what anyone would do. And yet the wheels turn, once a minute, and fall slowly but surely into the hands of those who care least about the future of humanity.