As an addendum to last week's post, I was pondering the enthusiasm of Andrej Karpathy and many others (including myself) for the development of AGI - either generally intelligent AI, or general AI that becomes super-intelligent through its ability to recursively self-improve by editing its own source code.
I was wondering how much of that enthusiasm might be anchored in cognitive bias. We should always be suspicious of the rationality of our motives when we feel strongly about them. When we want something too much.
Creating AGI and investing it with the power to address world-spanning existential dangers amounts to raising up a sole godlike power. Almost a monarch. An authority quite unlike a committee.
It was only in September 2022 that we witnessed extraordinary scenes in Britain during the mourning period for the late Queen, as the new sovereign, King Charles, accompanied Queen Elizabeth's coffin and toured the Kingdom. I was amazed to hear crowds passionately shouting 'Long live the King!' to drown out the rare protestor heckling the King. These are words I had long believed consigned to history.
I am mildly Royalist for reasons of logic and practicality, but seeing this love for the new king put a small lump in my throat.
Even such a potentially hostile arena as Northern Ireland yielded vast, adoring crowds desperate to feel a connection with the new sovereign. It seemed to me like evidence of a deep desire for a benevolent leader. Given the present-day contempt for career politicians, there always seems to be a sublimated hope that a divinely-ordained leader will take the reins and do a good job.
I say 'divinely-ordained', because look at how monarchs and emperors present themselves to their subjects. They attempt to suggest their unity with the divine, and so with the cosmos, by the grandeur of their throne rooms and their luminous clothing. They often even claim descent from the gods. Godlike, they are unassailable, protected by the power of the state through legal systems and heavily armed guards. They display no involvement with anything so mundane as having cash in their pockets or needing to visit a supermarket.
In the same way, we would expect the AGI not to display unseemly behaviours like feathering its own nest, or craving popularity. We imagine such an entity as aloof, lacking material needs and uninterested in satisfying base desires. Godlike.
Perhaps this is partly my own societal conditioning as a British national, but in societies as diverse as Russia, Japan, Thailand, and the USA, there does seem to be a primal hankering for a monolithic, enlightened autocrat.
At each coronation there is the hope that the new monarch might be selfless and wise. With a few exceptions, however, history has generally proven this to be a triumph of cultural programming and optimism over realism.
So I wonder whether this deep-seated need is now manifesting itself in the enthusiasm for super-intelligent AI, and in our urge to invest it with all our fears and hopes.
If true, then how much of our rationalising about creating AGI is a cultural artefact? All of it, or some of it?
Just a thought...