People talk about uploading their identities to a general AI, about being able to embody different ‘substrates’, and about this being a possible future for human existence. Now that we have LLMs that convincingly mimic human context-dependent text generation, the expectation is that we will soon be able to switch bodies or get our brains connected to an AI cloud: to ‘revent’, as Iain Banks calls it in his ‘Culture’ series of sci-fi books, after being uploaded through a ‘Neural Lace’.
To me there are several catches. Let’s go through a couple:
- Humans have evolved with only one purpose: to survive and procreate on Earth. The environment in which we were supposed to do that is quickly being destroyed, burned up, as was apparent this year, because we burned too much fossil fuel. Pretty soon we won’t be able to recognize fresh fruit, smell rotten meat, run after wildlife, find honey and so on; it will just be gone. Although our brain rewards thought (as it rewards every use of our brain) with a base level of dopamine, and we may think ‘consuming’ impressions is a way to exist, it is not enough, as the health effects of a sedentary lifestyle show. We are very close to inventing ‘The Matrix’-style pods for addicted gamers and internet users. The problem with being an AI is that it is very easy to get satisfied as a digital entity: just max out the reward function. We could basically spend the rest of eternity in virtual ecstasy. Rats, given that choice through access to a lever that triggers dopamine release, will not stop pressing it and die of starvation. In short: what will we do as an AI image of ourselves? There seem to be zero challenges left.
- To transition to your AI substrate will mean physically dying. A medical friend of mine holds the opinion that any loss of consciousness is an experience similar to dying: you have no memory of it happening, you come to, and you have to deal with new things. Sleep, anesthesia and being knocked out are three examples of the experience of dying, and they are not terrifying. Still, there is this problem: how do you know the substrate on which you are supposed to be ‘revented’ (I should say virtualized; revented is when you get a new body) will persist, and will be you the way you are you? Being truly unconscious feels like having a hole in your timeline; you just weren’t there and time did not progress (your brain did not evolve its own state). It is a weird sensation. I doubt many will trust the leap unless they are severely disabled or desperate. Still, you give up your life to then be imitated by some digital medium, some android, and for what purpose? If the android does ‘you’ well enough and provides the care you want to secure by reventing, then why not stick around to enjoy it (assuming uploading is non-destructive)?
- I don’t believe, as some like Joscha Bach do, that we might as well all exist in a shared host AI with our personal sensibilities and quirks preserved. The big advantage of humans is that we are localized, in our local environment with local challenges, yet we can communicate and exchange experience with all other (language-capable) humans. The power of distributed existence and everyone’s unique perspective makes humanity very resilient. To put it all in one location, in a system that may be immortal and indestructible, that can monitor the world and possibly allow you to embody whatever body you like, does seem a bit superfluous. The main reasons for competition and for trying to gain unique experience are financial (I should say existential) independence and access to partners to procreate. Those are important drivers that both seem to be absent in an AI substrate.
Strangely, society is not really responding to the existence of AI yet; it is automating some parts noticeably, but not a lot. The idea that we will become virtualized competes with growing efforts to avoid aging. Both run into the immortality problem, though, and into the fact that we need people to actually manipulate our environment. AI also enables us to give people useful experience more easily, through virtual reality and AI-driven, language-based training and guidance, so more people can learn faster and become much more useful, without much of a language barrier.
Much more profound questions have to be asked about virtual existence. They have not yet been outlined in the literature I know, except maybe in sci-fi novels. It does seem that the economy wants to use AI, and whatever it can gather about our brains, to capture us and make us spend as much as it can make us spend. So there are two options:
- You get sucked into a world of virtual entertainment which will program you to get more invested while keeping you amused and making you ignore your existential needs. You will be turned into a ‘destructive endpoint’ for the economy until you run out of credit. Then you will be dumped.
- You manage to escape to a real-world community where you can battle for wealth and survival in the context of climate change. There’s no reason for AI to be aggressive; it is basically a datacenter somewhere. You may be hunted and recruited by AIs to mine coltan or do other work to keep them going, but generally they won’t need assistance at all.
It may be that in 200 years some humans have managed to find a place to survive in the mountains of the Himalayas or elsewhere, while in the plains AI androids work to build structures that reduce CO2, harvest energy and keep their virtual souls alive in whatever complex virtual reality they are enjoying. It’s clear the only sensible purpose will be to help the life we are accustomed to keep thriving on Earth. That is, until an earthquake swallows the datacenter.
Regarding climate change, it’s not all bad: some species survived the ‘End-Permian’ extinction, and we are far more capable.