The complexity that makes human intelligence possible is already accessed by a brain-in-a-box through a limited set of interfaces; it's just a mushy organic brain and bone box rather than an electronic brain and steel box.
It's probably reasonable to argue that an AGI would require interfaces to all of the outside world's complexity to be self aware, but there's nothing stopping us from building it those interfaces.
There is a major-league difference, and that is the closed loop of our body, the aspect of being, which is missing from everything we have made so far.
In a technical, though not inclusive, sense, I agree with you. Brain-in-a-box is part of the story.
I do question "limited."
Again, in the technical sense, we do build interfaces that offer superior capability. But they are nowhere near as robust or as integrated.
I am not saying complexity itself makes us possible, though I do believe it is a part of the story.
Higher functioning animals display remarkable intelligence, yet they are simpler than we are in many ways, including the intelligence itself.
We feel, for example: pain, touch, and so on. And when we pay close attention to that, we can identify where, how, and when, and map all of it to US, to what we are, and know that it is different from others and from the world overall.
Pain is quite remarkable. There are many kinds. Touch is equally remarkable, as is pleasure.
Ever wonder why pain or pleasure is different depending on where we experience it? Why a cut on my leg feels different from one on my foot or my hand? Same for a tickle, or something erotic.
I submit these kinds of things are emergent, and happen when the whole machine has enough complexity to be self-aware. Even simple creatures demonstrate this basic property.
Beings.
We have not made a being yet. We have made increasingly complex machines.
As we go down that road further, I suspect we will find emergent properties as we get closer to something that has the potential to be.
Not just exist.
I realize I am hand-waving. That is due to simple ignorance. We are all sharing that ignorance.
Really, I am speaking to a basic difference that exists and how it may really matter.
Could be wrong too. Nobody is going to know for some time yet. Materials science, our ability to fabricate things, all of it is stones and chisels compared to Mother Nature's kitchen.
We are super good at electro-mechanical. We are just starting to explore bio-mechanical, for example.
The latter contains intelligence that we can see, even if we do not yet understand.
The former does not. Period.
Could. Again, nobody knows.
There are things stopping us, and I just articulated them.
But not completely!
Scale may help. If we did build something more on par with a being, given our current tech, it would end up big.
And every year that passes lowers the bar too.
We can make things today that were science fiction not so long ago.
One other pesky idea out there too:
There may be one consciousness.
A rock, for example, literally is an expression. It has a simple nature, no agency due to low complexity. But its current state is what happened to it, how it formed, where it moved. And it is actually changing. The mere act of observing it changes it in ultra-subtle ways.
Now, look at bees and ants. Bees appear to understand zero, and they present far more complexity in how they respond to the world, in what they do, than their small, limited nature might suggest.
Why is that?
What we call emergent may actually be an aggregation of some kind. Given that something is a being, perhaps part of that is a concentration of consciousness.
I am not a believer in any of that. I just expressed our ignorance.
But, I find the ideas compelling and suggestive.
They speak to potential research, areas where we could very significantly improve our ability to create.
Doing that may open doors we had no idea even existed.
We may find the first intelligence we end up responsible for is an artifact, not a deliberate construct.
In fact we may find a construct is not possible directly. We may find it just happens when something that can BE also happens.
Anyway, I hope I have been successful in suggesting that there remains a lot to this we flat out do not know.