I'm not sure. If we think about it mechanically, and the output of a computer system is a function of its input, then perhaps we are woefully underestimating the role of the body as both a monitor and an input system.
I don't think an artificial intelligence has to resemble anything we would recognize as human intelligence in order to be "intelligent". The motivations of humans and the motivations of computers are very different. If we can give a computer motivation, as well as a way to act and react within an environment, and a framework for learning (neural networks?), then what it "thinks" will be determined by the experience of its own existence. I'm not sure we're there yet.
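To make that idea concrete: one common way to formalize "motivation plus a way to act plus a framework for learning" is reinforcement learning. Here's a minimal sketch — the two-armed environment, its payoff probabilities, and all the parameter values are invented purely for illustration, not a claim about how a real agent would be built:

```python
import random

# A toy "world": two possible actions, each paying off with some
# probability. These numbers are made up for the sake of the sketch.
random.seed(0)
PAYOFF = {0: 0.3, 1: 0.8}

def environment(action):
    """The environment reacts to the agent; reward is its only 'motivation'."""
    return 1.0 if random.random() < PAYOFF[action] else 0.0

def train(steps=2000, epsilon=0.1, lr=0.1):
    """Epsilon-greedy value learning: act, observe reward, update estimates."""
    q = [0.0, 0.0]  # the agent's learned value of each action
    for _ in range(steps):
        if random.random() < epsilon:              # occasionally explore
            action = random.randrange(2)
        else:                                      # otherwise exploit experience
            action = max(range(2), key=lambda a: q[a])
        reward = environment(action)
        q[action] += lr * (reward - q[action])     # nudge estimate toward reward
    return q

q = train()
print(q)  # whatever the agent "believes" comes only from its own history
```

The point of the sketch is the last comment: the agent's value estimates are shaped entirely by the experience of its own interaction with the environment, which is the sense in which what it "thinks" need not resemble anything human.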
At this stage at least, the recreation of the interaction between that artificial body and the environment will be impressionistic at best...
Like we don't even have a full conscious understanding of what we are made from, or what we need to survive.
How do we 'install' those ideas and that imperative in an agent, when we don't fully understand it ourselves?
I'm not entirely supporting one side or another, but I think it's reasonable to bet against the imminent arrival of AGI at this stage, unless some radical discovery comes to light soon.
And if (when?) that discovery eventually comes, I'd suspect it to be a biological one. But then, who knows...
What is "a body" other than inputs and outputs to interface with the environment, and perhaps a mechanism of perceiving the environment and remembering what was perceived?
The capital-T truth is that no one really knows, since we're all observing the system subjectively from the inside.
Many who practice meditation and/or experiment with psychedelic drugs will tell you that there's something in there that's not a computer.
We could assume, as many do, that they're fooling themselves, since we can't measure it. Or we could trust our authentic experience of the real world even when it can't (yet) be measured.