Late post, as I forgot to publish: the last week of new topics in Natural Computation covered recurrent networks for time series forecasting. The main point of difference between the methodologies is how previous time steps are structured and fed back into the network.

Elman Networks:

elman network
source: Week 11 lecture notes FIT5167

Jordan Networks:

jordan networks
source: Week 11 lecture notes FIT5167
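The Jordan network differs only in what gets fed back: the previous *output*, rather than the hidden state, is routed into the hidden layer. A sketch under the same assumed sizes:

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_hidden, n_out = 1, 4, 1  # assumed sizes for illustration

W_in = rng.normal(scale=0.5, size=(n_hidden, n_in))   # input -> hidden
W_fb = rng.normal(scale=0.5, size=(n_hidden, n_out))  # output -> hidden feedback
W_out = rng.normal(scale=0.5, size=(n_out, n_hidden))  # hidden -> output

def jordan_forward(series):
    prev_out = np.zeros(n_out)  # state units hold the previous output
    outputs = []
    for x in series:
        # Hidden layer sees the current input plus the previous prediction
        hidden = np.tanh(W_in @ np.atleast_1d(x) + W_fb @ prev_out)
        prev_out = W_out @ hidden
        outputs.append(prev_out)
    return np.array(outputs)

preds = jordan_forward([0.1, 0.5, -0.2, 0.3])
```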

Fully recurrent:

Fully recurrent time series forecasting network
source: Week 11 Lecture notes FIT5167
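In the fully recurrent case, every unit feeds back to every unit, so the whole state vector recurs through a single recurrent weight matrix. A minimal sketch with an assumed linear readout:

```python
import numpy as np

rng = np.random.default_rng(2)
n_units, n_in = 5, 1  # assumed sizes for illustration

W_rec = rng.normal(scale=0.3, size=(n_units, n_units))  # every unit -> every unit
W_in = rng.normal(scale=0.5, size=(n_units, n_in))      # input -> units
readout = rng.normal(scale=0.5, size=n_units)           # units -> forecast

def fully_recurrent_forward(series):
    state = np.zeros(n_units)
    preds = []
    for x in series:
        # The entire state vector recurs, not just a hidden or output layer
        state = np.tanh(W_in @ np.atleast_1d(x) + W_rec @ state)
        preds.append(readout @ state)
    return np.array(preds)

preds = fully_recurrent_forward([0.1, 0.5, -0.2, 0.3])
```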

These networks operate very similarly to standard multilayer perceptrons. Self-organising maps have been proposed as one possible method for selecting input variables, and genetic algorithms were also noted as an alternative input selector.

Review of this unit:

I found FIT5167 to be a very thought-provoking subject, with excellent resources provided by the subject lecturer, Grace Rumantir. The best part of the subject was the assignments, where we got some very useful practical experience constructing neural networks. With the statistical analysis that NNs allow, the skills learned in the subject can be applied to a very wide range of problems. I would recommend this subject to anyone studying MIT at Monash, even if their major is not Intelligent Systems.