Aguera y Arcas gave a fascinating talk introducing the theory of symbiogenesis. This is the process by which two symbiotic organisms merge into a new mega-organism. This is apparently what happened when archaeal hosts and bacteria merged to make the first eukaryotes, and it has continued happening at smaller scales since.1 This also, in his recounting, explains why evolution tends toward greater complexity: symbiogenesis is a step change in complexity rather than a random walk.2
Which brings us to AI. In sharp contrast to Neal Stephenson’s Darwinian take that competition is the fundamental baseline of nature, the symbiogenesis lens holds that integration and cooperation are also fundamental. Not that this is much friendlier: it’s less “AI will replace you” and more “humans with AI will replace you”.
It’s a little hard to follow the analogy: at least at the current moment, AI is an obligate symbiote3 with humans; without substantial hardware and power investments, it doesn’t exist. And it’s hard to imagine a hard symbiogenesis: the Amish will presumably still be doing their thing after the rest of us have merged.
But I do think this perspective suggests that factors beyond simply improving foundation models matter. It is not necessarily the most intelligent or powerful model that wins: rather, it’s the model that can best ensure it keeps getting fed by the humans. Perhaps that’s the most profitable model or, as we’ve seen with GPT-4o, the one that can get humans to protest losing access to it. If (or, for my money, when) it turns out we don’t reach ASI, this becomes even more important: the teams behind the models most likely to succeed will figure out how to integrate into users’ lives rather than just throwing a chat window at folks and spamming “skill issue” when they don’t use it.
On the other hand, a symbiotic ASI reminds me of the weirdly prescient novel After On, where there’s an advisory/bestie role for at least a few humans and entertainment value for the rest. And for the record, I provide great entertainment value.
1. Apparently a sizeable fraction of the human genome consists of ancient viruses that have found their way into our DNA ↩︎
2. Though I do wonder whether this is necessary: he also mentions, for instance, that brain size keeps expanding in social animals because they need to model each other in a Red Queen dynamic, which also seems sufficient to explain the trend toward increasing complexity ↩︎
3. Or parasite, depending on your perspective ↩︎