Our brains are a massive network of nodes communicating in a constant frenzy of electrochemical stimulations and repressions. MRI and PET technology tells us that neuronal activity increases in various areas of the brain as we perform various tasks. What type of information processing is going on here? It seems fairly certain to me that there is no simple mapping between what neurons do and what chips do, at least not the kind of chips that contain logic gates, flip-flops, and typical microprocessors.
If we hope to find an analogue to neural processing in today's state-of-the-art digital technology, we would probably do better to look at specialized chips such as Digital Signal Processors (although the mapping onto brain circuits would still be of the coarsest kind).
I believe a good part of the function of biological neural networks is transformation, specifically along the lines of the Fourier and inverse Fourier transforms. These sorts of transforms take signals in the spatial domain (for example, an image) or the time domain (music or speech) and decompose them into components in the frequency domain (and vice versa). These transformations are used extensively in digital signal processing to find patterns, remove noise, and accentuate specific types of information; applications for them are ubiquitous. It would be an insult to mathematics and a travesty of nature if the brain did not exploit similar transformations, even if they are not strictly Fourier transforms. (As it turns out, the Fourier transform is just one member of a much broader family of transforms; wavelet transforms [1, 2, 3], for example, began stealing the limelight in the 90s.) In fact, if you google "Fourier and Neural Networks" or "Transform and Neural Networks" you will find quite a few academic papers that explore joint applications of these technologies.
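To make that a little more concrete, here is a minimal sketch (Python with NumPy assumed; nothing about it is meant as a literal model of what neurons do) of the kind of frequency-domain processing described above: a noisy sine wave is carried into the frequency domain with the FFT, the weak components are thrown away, and the inverse FFT brings back a cleaner signal.

# A minimal sketch (Python with NumPy assumed) of frequency-domain processing:
# transform a noisy signal, discard weak frequency components, transform back.
import numpy as np

rng = np.random.default_rng(0)

# A 5 Hz sine wave sampled at 1 kHz for one second, plus additive noise.
fs = 1000
t = np.arange(fs) / fs
clean = np.sin(2 * np.pi * 5 * t)
noisy = clean + 0.5 * rng.standard_normal(fs)

# Time domain -> frequency domain.
spectrum = np.fft.rfft(noisy)

# Crude noise removal: zero out everything below 20% of the peak magnitude.
spectrum[np.abs(spectrum) < 0.2 * np.abs(spectrum).max()] = 0

# Frequency domain -> time domain.
denoised = np.fft.irfft(spectrum, n=fs)

print("mean squared error before:", np.mean((noisy - clean) ** 2))
print("mean squared error after: ", np.mean((denoised - clean) ** 2))

Thresholding the spectrum like this is the crudest possible filter, but it illustrates why the frequency domain is such a natural place to separate pattern from noise.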
If you've read this earlier post, then you know that I am interested in applying more mathematically oriented models to semantic representation than have traditionally been employed in AI. This is not to say that first-order logic (one of the mainstays of AI) is not mathematical. Rather, I am talking about branches of mathematics where numeric rather than symbolic computation is the focus (of course, at its foundations all mathematics is symbolic, but this is not the level at which one typically operates when doing vector math or analysis).
If semantic knowledge can be modeled in vector form, then many of the tools of mathematical analysis open up to AI. One of those tools is, of course, the Fourier transform and its friends. These are the kind of musings that give me goose bumps!
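To give a flavour of what I mean, here is a toy sketch. The three "semantic features" and the concept names are entirely made up for illustration, but once concepts live in a vector space, ordinary numerical tools such as cosine similarity (and, in principle, transforms over those vectors) become available.

# A toy sketch: hand-made "semantic vectors" over three invented features
# [is_animal, can_fly, is_machine], purely to illustrate that vector-space
# representations make standard numerical tools applicable to meaning.
import numpy as np

concepts = {
    "sparrow":  np.array([1.0, 1.0, 0.0]),
    "dog":      np.array([1.0, 0.0, 0.0]),
    "airplane": np.array([0.0, 1.0, 1.0]),
}

def cosine(a, b):
    # Cosine similarity: 1.0 for parallel vectors, 0.0 for orthogonal ones.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print("sparrow vs dog:     ", cosine(concepts["sparrow"], concepts["dog"]))       # share 'animal'
print("sparrow vs airplane:", cosine(concepts["sparrow"], concepts["airplane"]))  # share 'can fly'
print("dog vs airplane:    ", cosine(concepts["dog"], concepts["airplane"]))      # share nothing

Nothing here is deep, but once meaning is numeric, the whole apparatus of analysis, Fourier transforms included, is on the table.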