Monday, June 16, 2014

Conceptor Networks

I read today about a new variant of recurrent neural nets called Conceptor Networks, which looks pretty interesting.

In fact this looks kinda like a better-realized variant of the idea of "glocal neural nets" that my colleagues and I experimented with a few years ago.

The basic idea, philosophically (abstracting away loads of important details), is to

  • create a recurrent NN
  • use PCA to classify the states of the NN
  • create explicit nodes or neurons corresponding to these state-categories, and then imprint these state-patterns directly on the network dynamics (a rough sketch of this follows the list)
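
To make that a bit more concrete, here's a minimal NumPy sketch of the core computation as I understand it from the report (a toy reservoir with my own made-up parameters and signal, not the authors' Matlab code): drive a reservoir with a pattern, form the correlation matrix R of its states, and build the conceptor C = R (R + alpha^-2 I)^-1, a soft projection onto the region of state-space that the pattern excites.

```python
import numpy as np

rng = np.random.default_rng(0)

# -- a small random reservoir (echo-state-style recurrent net) --
N = 100                                      # reservoir size (arbitrary choice)
W = rng.normal(0, 1, (N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))    # scale spectral radius below 1
W_in = rng.normal(0, 1, (N, 1))
b = rng.normal(0, 0.2, (N, 1))

# -- drive the reservoir with a simple periodic pattern (hypothetical signal) --
T, washout = 1000, 100
p = np.sin(2 * np.pi * np.arange(T) / 9.3)
x = np.zeros((N, 1))
states = []
for t in range(T):
    x = np.tanh(W @ x + W_in * p[t] + b)
    if t >= washout:
        states.append(x.ravel())
X = np.array(states).T                       # N x (T - washout) state matrix

# -- conceptor: soft projection onto the directions this pattern excites --
# C = R (R + alpha^-2 I)^{-1}, with R the state correlation matrix
alpha = 10.0                                 # "aperture" parameter
R = X @ X.T / X.shape[1]
C = R @ np.linalg.inv(R + alpha**-2 * np.eye(N))

# States belonging to this pattern are (approximately) preserved by C,
# which is what lets the conceptor act as an explicit "concept node"
# gating the reservoir dynamics.
err = np.linalg.norm(C @ X - X) / np.linalg.norm(X)
print("relative reconstruction error on the driving pattern:", err)
```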

So there is a loop of "recognizing patterns in the NN and then incorporating these patterns explicitly in the NN dynamics", which is a special case of "a mind identifying patterns in itself and then embodying those patterns explicitly in itself", a process I long ago conjectured to be critical to cognition in general (and one that underlies the OpenCog design on a philosophical level...)

There is some hacky Matlab code here implementing the idea, but as code it's pretty specialized to the exact experiments described in the technical report...

My intuition is that, for creating a powerful approach to machine perception, a Conceptor Network would fit very well inside a DeSTIN node, for a couple of reasons:
  1. It has demonstrated ability to infer complex dynamical patterns in time series
  2. It explicitly creates "concept nodes" representing the patterns recognized, which could then be cleanly exported into a symbolic system like OpenCog (see the sketch after this list)
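
As a very rough, hypothetical illustration of point 2 (the function names and the multi-pattern setup are my own, building on the sketch above): each stored conceptor C_j scores an incoming window of reservoir states by the evidence x^T C_j x, and the winning label is the kind of discrete, concept-level token a DeSTIN node could hand up to a symbolic layer.

```python
def conceptor_evidence(C, X):
    """Mean evidence x^T C x over a batch of reservoir states X (N x T)."""
    return float(np.mean(np.sum(X * (C @ X), axis=0)))

def classify_states(X, conceptors):
    """Pick the stored conceptor whose subspace best matches the states;
    the winning label is what a node would export to a symbolic system."""
    scores = {label: conceptor_evidence(C, X) for label, C in conceptors.items()}
    return max(scores, key=scores.get), scores

# Usage (assuming conceptors C_sine, C_square were built as in the sketch above,
# one per driving pattern):
# label, scores = classify_states(X_new, {"sine": C_sine, "square": C_square})
# print("recognized pattern:", label)
```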

Of course, Conceptor Networks are still at the research stage, so getting them to really work inside DeSTIN nodes would require a significant amount of fiddling...

But anyhow it's cool stuff ;)
