BLOG

Forgetting ourselves

It’s a funny old time of the year. Mustard yellow leaves stick to our shoes wherever we walk, a perpetual reminder of the looming winter and darkness. As we mark Halloween and All Saints’ Day, we remember the dead and recognise the connection between the living and the departed. In the UK, our costumes evoke death and the mythical. In continental Europe, we light candles and lay wreaths by graves. Elsewhere, people might fly kites or burn incense. It’s an occasion to think of and speak to loved ones who have passed away. But rarely do we expect them to answer back.

Cue the digital afterlife. Despite the many technical constraints, pop culture has long taken the afterlife to new levels. Most recently, the Netflix series Black Mirror has repeatedly conceived dystopian scenarios in which our brains are uploaded to the cloud and digitally resurrected as a chatbot or a robot (depending, of course, on what you can afford). In the 2017 episode USS Callister, the protagonist creates sentient digital clones of his colleagues and inflicts punishments on them. San Junipero (series 3, episode 4) is a simulated reality the deceased can inhabit. Watching these, it’s rather a relief to know that, given our current understanding of the human mind and our technological capabilities, producing a digital copy of a person’s consciousness is a long way off. That’s not to say there aren’t those giving it a good shot.

Eterni.me is an app in development that collects stories, memories and thoughts from the user’s digital footprint – from Facebook, LinkedIn, etc. – and uploads them to a digital avatar. Loved ones can then interact with it after the person’s death. The company thinks of it as “a library that has people instead of books”. A similar app, Replika, was born out of the grief its founder, Eugenia Kuyda, felt after the death of her best friend. 2.5 million people have signed up, and its core users fall within the 18–25 age bracket. Part of the general attraction of chatbots appears to be that they don’t judge users in the way humans might; it seems people yearn for communication, but whom they communicate with is secondary.

In his posthumously published book, Brief Answers to the Big Questions, Stephen Hawking hails brain-computer interfaces (BCIs) as the “future of communication”. These are devices that convert messages carried by an individual’s nervous system into commands that can control external software or hardware, such as a computer or a robotic arm. Benevolent applications include research into restoring function to people with severe motor or sensory disabilities. Other applications focus on augmenting human capacity, and on extracting thoughts from the brain rather than from our Twitter threads. Elon Musk’s Neuralink wants to merge AI with our brains, so that we could upload and download digital information directly. For now, existing tools can access only a tiny percentage of our neurons, BCI-induced movements are slow, and bionic eyes offer very low-resolution vision. And yes, the brain must be living.

As scientists continue to stretch our technological limits, and as pop culture stretches our imagination, ethical concerns multiply. Months ago, we posed an oft-repeated question in our report ‘Ethical, Social, and Political Challenges of AI’: “Just because we can, should we?” Questions of equality of access arise too – brain-computer interfaces and robotic limbs are expensive. And then there is the question of how ethical it is to monetise grief. How do we know whether the ambition to mummify our memories is driven by the industry or by a profound human yearning? These developments will also be shaped by the dialogue around data. For example, the European Court of Justice is expected to rule on the right to be forgotten in 2019, which could allow citizens to demand that outdated, irrelevant or excessive data about them be removed from online search results (and which some argue is a threat to free speech).

Carl Öhman and Luciano Floridi, both at the Oxford Internet Institute, refer to this as the ‘Digital Afterlife Industry’, or DAI for short. Startups and tech giants like Facebook and Google offer a variety of services, from simple password deposits to the avatars and chatbots discussed above. The authors point to the ethical and regulatory frameworks already established in archaeology, which emphasise dignity irrespective of whether the subject is aware. They call, as a minimum, for industry agreements that incorporate requirements to inform users and protect individuals. Eterni.me’s co-founder, Marius Ursache, once said he plans to hire a psychologist, but developing sophisticated AI comes first. Personally, I sense some irony in resurrecting the deceased as avatars based on social media personas that reveal only a fraction of our true selves. I also wonder about their impact on the grieving process, and on traditions like Halloween and All Saints’ Day.

Nika Strukelj