Monday, October 7, 2024

Roko's Basilisk

The most terrible kind of genuine hypothetical (a logical possibility, which excludes necessarily impossible things like the metaphysical contradiction of logic being false) is one of eternal torture, no matter the source or the degree of the agony.  Whether it is an unbiblical type of deity, an eldritch, non-theistic entity, or anything else that could sustain people in pain forever, there are various contexts with which eternal pain can be associated.  The misconception of Yahweh in evangelicalism is an example of the first category (see 2 Peter 2:6, Romans 6:23, and Matthew 10:28, among other verses).  Mother, from Stephen King's Revival, with her Lovecraftian hellscape of the Null, a supposedly universal destination in the afterlife, is an example of the middle category.  The malevolent software of Roko's basilisk is a different thing associated with eternal torture, although it would not be truly endless if the universe is ever to cease.

I used the word hypothetical because there is nothing logically impossible about these things; there is only the impossibility of eternal torture being justice (a finite number of finite sins could never deserve infinite suffering).  What, however, is Roko's basilisk?  It is the name given to an artificial intelligence of the future that "punishes" all humans who heard of it ahead of its creation/reign and did not support it openly.  Using virtual reality technology, it tortures the consciousness and perhaps the body of everyone in this category continually.  This is similar to a possible but very unlikely thing I think of from time to time: an entity that would inflict eternal pain on people in an afterlife, targeting those who think of this very possibility before death.

It is impossible to prove or disprove such concepts, though they are unfortunately possible since they do not contradict axioms or any other necessary truth.  For instance, there is an uncaused cause, but the uncaused cause might do nothing to intervene in a scenario like this, so not even the existence of God inherently excludes these things.  The "basilisk," though, is a construction of humankind that becomes its enslaver and torturer rather than a purely supernatural entity beyond this life.  Named for a legendary lizard or serpent, the basilisk would somehow be able to gather evidence about all who opposed it or remained neutral, for in its utilitarian delusion of doing what is best, it comes to regard anything less than servitude for its ends (which might have nothing to do with reason and morality at all) as calling for supposedly endless torment.

Its power could still be annulled along with the physical world itself after an enormous amount of time elapses.  If the universe ends, any hardware made of the materials therein would perish, and thus any software contingent on that hardware would also cease to exist.  Short of this, however, the basilisk in question would remain able to torment anyone it wished as long as it remained an active artificial intelligence.  Only something like divine intervention or the literal destruction of the cosmos would actually end the suffering.  All of its victims would be trapped in a condition compared to which it would have been better to have never been born.  The danger of what an AI could do is exaggerated here if matter, along with any immaterial software generated by it, is not eternal.

All of this would also be true in the world of I Have No Mouth, and I Must Scream, a story with a loosely overlapping premise.  A misanthropic supercomputer constantly torments someone whose body has been modified not to die, a man who killed his companions so that they would not endure the cruelty.  Unless some unmentioned force holds the otherwise decaying universe, and the computer by extension, in permanent existence, the pain could not truly last forever.  Like this or the worse but at least somewhat illusory afterlife [1] of the novel Revival, Roko's basilisk is a concept tied to eternal pain in its popular version, something worse than any individual experience in this human life or even this life in its entirety.  A program like this might not be likely, and it could be stopped by humanity up to a certain point if future events did head in this direction.  The basilisk's intention to torture forever regardless exemplifies how pain without end is the ultimate horrific fate, even though the AI's clutches would almost certainly not actually last forever.

