Friday, May 7, 2010

Symbolic Meaning

I spent some more time thinking about the problem of meaning for symbolic AI.  I still think it might be a non-issue: our neurons, which simply fire on and off, have no trouble representing the smell of a rose, the memory of a baby being born, or an imagined world full of dungeons, dragons, and magic.  Could meaning be a social, emergent entity?  Does meaning even exist without a mind?  Is cognition required?  Could it emerge from the neurons in our brain?

If so, could we use symbols to represent meaning (as AI did back in the day) and not worry much about Searle's Chinese Room argument?  How can we explain away his thought experiment?

We have information theory to quantify the amount of information in data, but could we create a meaning theory (preferably under a different name) to quantify the amount of meaning in data?  We would still need some kind of general representation.  Does data need external organization to acquire meaning?  If so, then how do we get internal meaning within ourselves?  This is why I ask whether meaning might be social.  The development of language would have allowed meaning (memes?) to be shared with and expanded by a much broader audience.
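For contrast, here is a minimal Python sketch of Shannon entropy, the quantity information theory actually gives us (the function name and toy strings are just my own illustration).  It measures statistical surprise in a stream of symbols and is completely blind to what, if anything, those symbols mean:

```python
import math
from collections import Counter

def shannon_entropy(data: str) -> float:
    """Average bits of information per symbol (Shannon, 1948)."""
    counts = Counter(data)
    total = len(data)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# Same length, very different information content:
print(shannon_entropy("aaaaaaaaaa"))  # 0.0 bits -- perfectly predictable
print(shannon_entropy("abcdefghij"))  # ~3.32 bits -- maximally varied

# Entropy sees only symbol statistics: a heartfelt sentence and a
# shuffled copy of it score exactly the same.  Whatever a "meaning
# theory" would measure, it has to be something more than this.
```

That blindness is exactly the gap: a sentence and its scrambled anagram carry identical information in Shannon's sense, yet only one of them means anything to us.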
