
FIT5167 – Natural Computation Week 5

Part 2 of the MLP lectures was completed in week 5. We ran through some extended examples, including batch and online learning methods. The issues of local minima and overfitting were also introduced, along with some ways of overcoming the limitations they impose.

It turns out that batch learning is the most common method of learning. We ran through an example where the weight updates are proportional to the gradient of the Mean Square Error [MSE], then a further example applying momentum.
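
For my own notes, here is a minimal sketch of a batch MSE update with momentum for a single linear neuron. The data, learning rate and momentum coefficient are my own illustrative choices, not values from the lecture:

    X = [0 1 2 3 4];            % inputs
    T = 2*X + 1;                % targets following a known linear rule
    w = 0; b = 0;               % weight and bias, initialised to zero
    eta = 0.05;                 % learning rate
    alpha = 0.9;                % momentum coefficient
    dw = 0; db = 0;             % previous updates, reused by the momentum term
    for epoch = 1:300
        Y = w*X + b;                        % forward pass over the whole batch
        E = Y - T;                          % per-pattern errors
        dw = -eta*2*mean(E.*X) + alpha*dw;  % MSE gradient step + momentum
        db = -eta*2*mean(E)    + alpha*db;
        w = w + dw;
        b = b + db;
    end
    fprintf('w = %.3f, b = %.3f, MSE = %.5f\n', w, b, mean((w*X + b - T).^2));

The key point is that the weights are changed once per epoch, using the error averaged over the entire batch, and the momentum term alpha*dw carries part of the previous update forward to smooth the descent.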

[Figure: The crux of batch learning]

The concept and reasoning behind each operation in back-propagation and batch learning are quite clear; however, I definitely need to do some repetition to memorize the process under exam conditions.
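
As a memory aid, here is my own minimal version of those operations for a one-hidden-layer network on the XOR problem (not the lecture's exact notation): a forward pass, delta terms propagated backwards from the output, then a batch weight update. Results vary with the random initialisation, and the run can occasionally stall in a local minimum, which is exactly the issue raised in the lecture:

    X = [0 0 1 1; 0 1 0 1];                  % 2 x 4 input patterns (XOR)
    T = [0 1 1 0];                           % 1 x 4 targets
    Xb = [X; ones(1,4)];                     % constant 1 row acts as the bias input
    W1 = 0.5*randn(3,3);                     % 3 hidden units, 2 inputs + bias
    W2 = 0.5*randn(1,4);                     % 1 output unit, 3 hidden + bias
    eta = 0.3;                               % learning rate
    for epoch = 1:10000
        H  = tanh(W1*Xb);                    % forward pass: hidden layer
        Hb = [H; ones(1,4)];
        Y  = 1./(1 + exp(-W2*Hb));           % forward pass: logistic output
        E  = Y - T;                          % output errors
        dY = E .* Y .* (1 - Y);              % output delta (error x logistic slope)
        dH = (W2(:,1:3)'*dY) .* (1 - H.^2);  % hidden delta (back-propagated, tanh slope)
        W2 = W2 - eta*(dY*Hb');              % batch updates, summed over all patterns
        W1 = W1 - eta*(dH*Xb');
    end
    disp(1./(1 + exp(-W2*[tanh(W1*Xb); ones(1,4)])))  % should approach [0 1 1 0]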

The next topic was network generalization, whereby the fit of the model is deliberately relaxed. This ensures that noise and idiosyncratic patterns in the sample data do not degrade the ability of the trained network to predict unseen values.

[Figure: Generalization is required for effective modelling]

Other methods for preventing overfitting, thrashing and intractable learning were (a short sketch combining the first two follows the list):

  • Early stopping [set number of epochs]
  • Regularization/weight decay
  • Data normalization
  • More to be covered in next week's lecture
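
Here is a hedged sketch of how the first two methods might look on a simple model; the data split, decay strength lambda and the 10% rise threshold are my own assumptions, not prescriptions from the lecture:

    X = linspace(0, 1, 20);                 % illustrative 1-D data
    T = 2*X + 1 + 0.1*randn(1, 20);         % noisy targets around a line
    tr = 1:2:20;  va = 2:2:20;              % alternate points: train / validation
    w = 0; b = 0; eta = 0.1;
    lambda = 0.01;                          % weight-decay strength
    bestErr = Inf;
    for epoch = 1:5000
        E = (w*X(tr) + b) - T(tr);                  % training-set errors
        w = w - eta*(2*mean(E.*X(tr)) + lambda*w);  % MSE gradient + weight decay
        b = b - eta*2*mean(E);
        valErr = mean(((w*X(va) + b) - T(va)).^2);  % monitor validation MSE
        if valErr < bestErr
            bestErr = valErr; bestW = w; bestB = b; % remember the best weights
        elseif valErr > 1.1*bestErr
            break                                   % early stop: validation error rising
        end
    end
    w = bestW;  b = bestB;                          % roll back to the best epoch

The weight decay term lambda*w constantly shrinks the weights towards zero, while early stopping halts training once the held-out validation error starts climbing, both of which keep the network from fitting the noise.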

The tutorial enabled us to start using MATLAB. The nprtool and nntool GUIs were used to create neural networks, which could then be exported and manually modified to meet specific requirements. I found MATLAB fairly easy to use, with the exception of the plotting tools, which I could not get to produce exactly what I wanted.
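
For reference, script-based training looks roughly like the sketch below, assuming the Neural Network Toolbox. The function names are from R2010b-era releases onward (older versions used newff instead of feedforwardnet), and simplefit_dataset is just a stand-in for whatever data the exported GUI code loads:

    [x, t] = simplefit_dataset;             % small demo dataset shipped with the toolbox
    net = feedforwardnet(10);               % one hidden layer with 10 neurons
    net.divideParam.trainRatio = 0.70;      % the data split that drives early stopping
    net.divideParam.valRatio   = 0.15;
    net.divideParam.testRatio  = 0.15;
    net.trainParam.epochs = 300;            % cap on the number of epochs
    net.performParam.regularization = 0.1;  % weight-decay style penalty on the MSE
    [net, record] = train(net, x, t);       % stops early if validation error rises
    y = net(x);                             % simulate the trained network
    fprintf('MSE = %.5f\n', perform(net, t, y));

This is essentially what the exported GUI code boils down to, and tweaking the divideParam, trainParam and performParam properties is how the early stopping and regularization ideas from the lecture are applied in practice.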

 
