Many AI researchers believe that neural models are the only true way to build an artificial intelligence. I disagree. Requiring neurons for intelligence is like requiring cells for an android. Just because our main example of natural intelligence, humanity, runs on a massive neural network does not mean that one is required for artificial intelligence. Looking at the average human, I would despair if that were true.
I feel that the only true requirement for an AI is a functional one. You can define that function however you like; until we get closer to AI, it's all speculation.
My theory does happen to follow the brain, but more the triune brain theory. This states that there are three main functional areas of the brain, which are then broken down further: the reptilian portion, the paleomammalian portion, and the neomammalian portion. The reptilian areas of the brain gave earlier life the ability to act rather than simply react. I would say that this is the current state of AI research, minus the forays into extremely specific higher functions. Once AI has successfully reached the equivalent of the paleomammalian functions, I would call it an artificial pre-intelligence, though an hour ago I would have called it a first-generation AI. Now, I would require at least a simple form of neomammalian functionality before calling something an artificial intelligence.
As a final thought, I will say that I'm not dismissing neural networks as a basic building block of an AI. I'm just saying that we should not get hung up on one particular method when it's the functionality that matters.