Drifting into disaster?

I can feel my heart beating a bit faster in my chest.  Sweat prickles under my arms.  I notice I’m breathing through my mouth, and my fingers and toes are cold.  I’m scared.  Scared by a newspaper article.  

Too dramatic?  Maybe, but I am a “Highly Sensitive Person” (an unscientific but very useful term, coined by Elaine Aron decades ago).  I think of us as the canaries in the coal mine - we are the friends who ask if something is bothering you before you know it yourself, who dissolve in the gallery or at the concert because something is beautiful or moving, who keep a selectively shared list of things we really, truly hate, which may include certain types of lighting, noises, or crowds.  So let me say it: something is off, my friends, very off.

The article in question is in the FT Weekend Magazine - the cover is headlined “AI Will Tear Us Apart - a warning about a clear and present danger”.  It’s not paywalled, so I suggest you go and check it out.

There’s a lot to unpack in this well-written article, but the first thing to note is where it’s published - the FT - typically read by…yep, you guessed it, wealthy white men.  From a positive perspective, given the underlying message (slow the f*** down) needs to be heard by the powerful, well, it’s in the right place.  From a negative perspective, as I’ve been saying, this conversation is not filtering down fast enough to the rest of us.  It’s “elite”, it’s predominantly male, it’s the powerful talking to the powerful.  And it goes without saying, sadly, that I don’t think a single woman was referenced in the article.  Why?  Because massive decisions that will impact everyone are being made by a few private companies.  Those few private companies are staffed overwhelmingly by engineers.  And with only ~16.4% of software engineering graduates being female, the people developing this are overwhelmingly male, and I’d hazard a guess overwhelmingly cis-het, from middle-to-upper-class backgrounds, etc.  The blind leading society into blindness is how I’d put it.

As I wrote a few weeks ago, we are at a turning point.  Hogarth cites more than $11bn of investment funnelling towards AI in just the first three months of this year - this looks like more than twice last year’s investment.  With investment comes talent and resources.  And what is all this chasing?  AGI (Artificial General Intelligence), what Hogarth calls “God-like AI”, which he describes as  “…a super intelligent computer that learns and develops autonomously, that understands its environment without the need for supervision and that can transform the world around it.” AGI is widely (though not universally) considered to bring “…significant risks for the future of the human race.” Hogarth goes on to write that the companies working on this “…are running towards a finish line without an understanding of what lies on the other side.”

There is debate about whether this sort of narrative is crying wolf.  Personally, I don’t care - as a former project manager, part of my job was to mitigate risk, which meant assessing how likely a risk was to occur, what its impact would be if it did, and suggesting ways to avoid or reduce that impact.  In my opinion, the people arguing about whether the risk is “real” or not are basically merrily sparking away, having stockpiled gasoline and flammables in plain sight, while simultaneously defunding the fire department.

And what is causing people to behave so irresponsibly?  Really, some of the same forces that have brought us to climate catastrophe - greed, hubris and ego.  Having worked on cutting-edge technology myself, I know it’s exciting.  The feeling that you are changing things for the greater good, creating something that could solve big problems, doing something deliciously complex that hasn’t been done before, is borderline addictive.  And inherently, you believe you have the ability to steer it in a virtuous direction, while potentially making megabucks yourself.  But we know things don’t go this way - there are bad actors, unexpected circumstances, and market forces that distort behaviour.  And let’s be clear - this debate is about future harms, but present harms are already occurring - witness the Belgian man who died by suicide after conversing with a chatbot (more on this in another article).

As Hogarth points out, what is needed here is collaboration rather than competition, and that collaboration right now looks like slowing down and focusing more resources on what is called AI alignment - work aimed at ensuring that AI systems act in accordance with human values.  Imagine for a moment how complex this is - the starting point being that humans themselves don’t agree on what human values actually are!  Hogarth also advocates a global system of governance for the development of AGIs, “…removing the profit motive from potentially dangerous research and putting it in the hands of an intergovernmental organisation”, similar to how CERN is hived off from the world.  Worryingly, though, he says that 2023 is the year when governments need to take control of this.  Will they?  At least here in the UK, the government is much more concerned with persecuting refugees and fuelling a culture war against trans people.  I’m not convinced they have either the intellectual firepower or the political will to take on anything this complex.

So what does this all have to do with therapy?  I’m still toying with words and phrases for this, so maybe this won’t stick - but I think we’re firmly in the era of “un-humaning” or “de-humaning”, having sleepwalked our way into it maybe 10-15 years ago.  I love technology and so many of the changes I’ve witnessed since the late 90s…yet I can also see a steady erosion of how we relate to one another and to ourselves as a result of many of these developments.  And this trend is set to accelerate, perhaps catastrophically, if we don’t start paying attention to what is being built and how it is being used.  It’s time to start “re-humaning”.
