Interesting thoughts, but I cannot agree. An approach that generates this many opinions about what the next step should be is clearly far from exhausted; if anything, it is hardly yet mature. Moreover, the existence of many competing opinions on how to achieve AGI does not mean that none of them is right.
The human brain clearly achieves general intelligence through connectivity in a neural network, unless the whole of neurology has been barking up the wrong dendrite. Full-scale digital models of the human brain are in the pipeline. Artificial general intelligence, given these premises, may arrive before we understand how we managed it, but it shall arrive.
By the same token, copying the human brain will achieve the same kind of intelligence as the human brain. It may have required embodiment in phylogeny, to evolve; it may have required embodiment in ontogeny, to be built during embryology and maturation. Unless there is something about brain physics that we are missing, however, it cannot require embodiment to copy it.
Neural networks are very different from classical statistical approaches in machine learning. They are also far from mined out. My feeling is that AGI will come when we learn how to apply metaphor: literally, how to use neural networks to generalise. When they can start applying learned approaches to completely unrelated problems, AI will by definition become general.
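The nearest thing machine learning has today is transfer of representations: a network's internal features, developed on one task, are reused for a different one, with only a lightweight readout retrained. Here is a deliberately toy sketch of that idea (my own invention for illustration, not anyone's actual AGI proposal): the hidden layer is random rather than trained, standing in for a learned representation, and the two "unrelated problems" are just two different target functions.

```python
import numpy as np

rng = np.random.default_rng(0)

# A fixed nonlinear hidden layer, standing in for a learned representation.
W = rng.normal(size=(1, 64))
b = rng.normal(size=64)

def features(x):
    # Shared hidden activations, reused across tasks.
    return np.tanh(x[:, None] * W + b)  # shape (n, 64)

x = np.linspace(-3, 3, 200)
H = features(x)

# Task A: fit a linear readout on sin(x) over the shared features.
w_a, *_ = np.linalg.lstsq(H, np.sin(x), rcond=None)

# Task B: an unrelated target, |x|, reusing the SAME features --
# only the cheap linear readout is retrained.
w_b, *_ = np.linalg.lstsq(H, np.abs(x), rcond=None)

mse_a = np.mean((H @ w_a - np.sin(x)) ** 2)
mse_b = np.mean((H @ w_b - np.abs(x)) ** 2)
print(mse_a, mse_b)  # both fits come out close on this toy domain
```

The point of the sketch is only that one representation can serve problems it was never built for; whether scaling that trick up amounts to "metaphor" in the sense above is, of course, exactly the open question.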