
FIT5167 – Natural Computation Week 6

Natural Computation entered week 6 with an introduction to unsupervised learning, that is, learning in a neural network without a target output. This is generally achieved through classification and clustering, most notably with self-organising maps (SOMs).

Self-organising map

The networks for SOMs are actually a little simpler than an MLP, and the process for creating clusters is quite intuitive. Each neuron in the feature-map layer has its own weight vector. For each input, the neuron whose weight vector has the smallest Euclidean distance to the input vector wins the competition, and the winner's weight values are then moved a fraction of the way toward the input:

 

SOM weight update (source: week 6 notes)
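A minimal sketch of that competitive step in Python (the map size, learning rate, and toy data below are my own illustrative choices, not from the notes):

```python
import numpy as np

rng = np.random.default_rng(0)
weights = rng.random((9, 2))   # 9 feature-map neurons, each with a 2-D weight vector
x = np.array([0.3, 0.7])       # one input vector

# Winner: the neuron whose weight vector has the smallest
# Euclidean distance to the input
winner = np.argmin(np.linalg.norm(weights - x, axis=1))

# Move the winner's weights a fraction (the learning rate) of the
# way toward the input: w <- w + lr * (x - w)
lr = 0.5
weights[winner] += lr * (x - weights[winner])
```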

The concept of decaying the learning rate was also introduced during the lecture, but this must be done carefully. If one trains a network until the weight adjustments stabilise, a decaying rate guarantees that training ends after a certain number of epochs regardless of how well the network has clustered the data.
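One common schedule is an exponential decay (the eta0 and tau values below are illustrative assumptions, not figures from the lecture), which makes that trade-off concrete:

```python
import numpy as np

eta0, tau = 0.5, 100.0   # initial rate and decay constant (assumed values)

def learning_rate(t):
    """Exponentially decaying learning rate: eta(t) = eta0 * exp(-t / tau)."""
    return eta0 * np.exp(-t / tau)

# The rate shrinks toward zero, so weight adjustments die out after
# roughly tau epochs whether or not the clustering is any good
print([round(learning_rate(t), 4) for t in (0, 50, 100, 500)])
```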

Finally, the concept of a 'topological neighborhood' was introduced. In actual brains, the weights of neighboring neurons are also updated when a neuron wins the competitive activation; logically, this results in similar classifications being held by neighboring neurons. The neighboring updates can be scaled using Gaussian or exponential decay functions:

 

Update neighboring neurons too! (source: week 6 notes)
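As a sketch, a Gaussian neighborhood function can scale every neuron's update by its grid distance from the winner (the 3x3 grid, sigma, and learning rate here are assumptions for illustration):

```python
import numpy as np

def gaussian_neighbourhood(grid, winner, sigma):
    """Update strength for each neuron, falling off with its
    grid distance from the winning neuron (Gaussian kernel)."""
    d = np.linalg.norm(grid - grid[winner], axis=1)
    return np.exp(-(d ** 2) / (2 * sigma ** 2))

# 3x3 grid of neuron coordinates in the feature map
grid = np.array([(i, j) for i in range(3) for j in range(3)], dtype=float)
weights = np.random.default_rng(1).random((9, 2))
x = np.array([0.3, 0.7])

winner = np.argmin(np.linalg.norm(weights - x, axis=1))
h = gaussian_neighbourhood(grid, winner, sigma=1.0)

# Every neuron moves toward the input, scaled by its neighborhood strength
lr = 0.5
weights += lr * h[:, None] * (x - weights)
```

Neurons far from the winner receive vanishingly small updates, which is what gradually organises the map so that neighboring neurons hold similar classifications.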
