The Sanctification Button

I have a browser extension installed on my work laptop that blocks my access to Reddit, Facebook, and other news and social media sites. My employer didn't install it; I did. I have the same extension installed at home, albeit blocking a more limited set of time-wasting sites. On its face, setting up a system that does nothing but restrict my future options seems like a waste of time. Wouldn't it be easier just to choose not to visit those sites at imprudent times? In theory, sure. But I don't trust my future self, and restraint is taxing, though I'd have trouble explaining exactly why. Admittedly it's weird, but I think I can assume that you, dear human reader, at least understand where I am coming from, regardless of how much you rely on such things yourself.

In behavioral economics, we call these kinds of mechanisms "precommitments". Odysseus bound himself to the mast before sailing past the Sirens. Gamblers leave behind their checkbooks and credit cards before a casino vacation. I am one of many who find it prudent to occasionally bind my future actions, restrict my future options, or simply nudge my older self in a certain direction. Still, there remain plenty of mistakes I would like to prevent but for which no preemptive mechanism exists.

Inevitably, technology will improve; new products will become available. Some of these, like the browser blocker, will be increasingly capable precommitment tools. How far should you go with this? Artificial intelligence combined with cybernetics could make any undesirable behavior potentially preemptable. Leaving aside the technical difficulties, if you could end your ability not only to lie, cheat, and steal, but also to gossip and insult, should you? Taken to its extreme, if there were a button that removed your ability to sin, would you push it?