I was reading an email from Ben Goertzel, a leading researcher in and proponent of artificial general intelligence (AGI), and something he said stuck out to me.
"The central problem [of AI/AGI] is coordinating various components together in a synergetic way, so that they can help each other avoid combinatorial explosion, rather than exacerbating each others' problems via 'error magnification'"
The issue with many AI projects, such as building a common-sense engine, is that they need a massive number of interacting objects (such as rules) to produce anything realistic. Earlier today, I saw a TED Talk about simplicity and how few scholars are studying it, even though much is known about complexity.
Putting those two ideas together, I started thinking about balancing the rules of complexity with rules of simplicity. There needs to be a balance that allows for a massive number of interactions without triggering a combinatorial explosion.
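As a rough illustration of why the number of interactions gets out of hand so quickly (this sketch is my own, not something from the email or the talk): with n rules, even the pairwise interactions grow quadratically, and the number of possible rule subsets grows exponentially.

```python
from math import comb

def pairwise_interactions(n: int) -> int:
    """Unordered pairs among n rules: n * (n - 1) / 2 -- quadratic growth."""
    return comb(n, 2)

def rule_subsets(n: int) -> int:
    """Possible subsets of n rules: 2 ** n -- exponential growth."""
    return 2 ** n

# Going from 10 rules to 1,000 rules multiplies the pairwise
# interactions by roughly 10,000x, and the subset count is astronomical.
for n in (10, 100, 1000):
    print(n, pairwise_interactions(n), rule_subsets(n))
```

Any realistic common-sense engine sits far out on that curve, which is why coordination between components matters more than the raw number of rules.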