
FIT5047 – Intelligent Systems Week 6

Intelligent Systems’ sixth week took a swing into soft computing and probabilistic systems. We had a quick introduction to probability theory, which had the usual intuition-breaking outcomes. The use of Venn diagrams to explain parts of Kolmogorov’s axioms was particularly useful. The definition of conditional probability did catch me a little off guard, however:

[Figure: Conditional Probability]

Although in review this does seem much clearer: given the knowledge of B [yellow], what is the probability of A [red]? As per the diagram and the definition, the answer is the probability of the intersection of A and B [green] divided by the probability of B: P(A|B) = P(A ∩ B) / P(B).
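As a tiny numerical check (my own example, not from the lecture): roll a fair die, and let A = “the roll is even”, B = “the roll is greater than 3”:

```python
from fractions import Fraction

# Sample space for one roll of a fair six-sided die
omega = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}   # the roll is even
B = {4, 5, 6}   # the roll is greater than 3

def p(event):
    return Fraction(len(event), len(omega))

# P(A|B) = P(A ∩ B) / P(B)
print(p(A & B) / p(B))   # 2/3
```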

A revision of elementary probability reminded me that, although at first glance it seems a trivial subject, probability requires some use of the brain and calculator:

Suppose a “once in a century” flood has probability 0.01 of occurring in a year. How long do we expect to wait for one?

The answer: find the number of years n by which a flood is as likely as not to have occurred, i.e. for which the probability of no flood in n consecutive years is 0.5:

(1 − p)^n = 0.5

log((1 − p)^n) = log(0.5)

n log(1 − p) = log(0.5)

n = log(0.5) / log(1 − p) ≈ 69 years
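A quick sanity check of that arithmetic (a minimal sketch; only p = 0.01 comes from the example):

```python
import math

p = 0.01   # annual probability of the "once in a century" flood

# Solve (1 - p)^n = 0.5 for n: the year by which a flood is as likely as not
n = math.log(0.5) / math.log(1 - p)
print(f"{n:.1f} years")   # ~69.0 years

# For contrast, the mean wait (geometric distribution) is 1/p = 100 years
```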

Next came some discussion over notation and then, more importantly, an introduction to Bayes’ Theorem, which lets us invert conditional probabilities: P(A|B) = P(B|A) P(A) / P(B).
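A worked example (my own numbers, not the lecture’s): a test for a disease is 99% sensitive with a 5% false-positive rate, and 1% of the population has the disease. Bayes’ Theorem gives the surprisingly low probability of disease given a positive test:

```python
# Bayes' Theorem: P(A|B) = P(B|A) * P(A) / P(B)
# All numbers below are hypothetical, for illustration only.
p_disease = 0.01              # prior P(A)
p_pos_given_disease = 0.99    # sensitivity P(B|A)
p_pos_given_healthy = 0.05    # false-positive rate P(B|not A)

# Total probability of a positive test: P(B)
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Posterior P(A|B)
print(p_pos_given_disease * p_disease / p_pos)   # ~0.167
```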


Discussion then continued to some of the silly mistakes in probability theory that litter its past. I’m sure that in 20 years many of the financial tools in use today will appear on the lecture slides in the same category.

Kevin also made the time to add some material about the nature of probability. The suggestion made in Russell and Norvig is that probability is simply used to represent an agent’s belief state. If this is kept in mind, it is understandable why Bayesian networks have been such a boon over the past 15 years.


FIT5059 – Adv. Prog. for DB Applications Week 6

Week 6 saw an introduction to custom forms. It makes a lot more sense now why we completed the basic introductions to each component in an automated fashion. All of the elements that we have already put into practice can be added to a window and customized. Using smart triggers in conjunction with stored procedures gives us the tools we need to make decent database applications.

The Property Palette style of customization used in Oracle Forms Developer makes it very similar in feel to ASP.NET development.

More advanced error/exception handling was also discussed this week. Given the event-driven nature of Oracle Forms, raising exceptions is a convenient way to handle the madness users subject programs to. Customizing system messages will also make the flow of information from the system to the user much more decipherable.

The final topic, introduced in preparation for next week, was the choice between using multiple canvases or multiple forms for more complex applications. While multiple forms offer much stronger encapsulation, message passing needs to be done using global variables D:<

On the other hand, using multiple canvases allows for much easier message passing, though distributed development is decidedly more difficult.

[Figure: Multi form or multi canvas]

Doing the assignment will be the best way to ensure that the processes for each element have been memorized.


FIT5167 – Natural Computation Week 6

Natural Computation entered week 6 with an introduction to unsupervised learning: that is, learning in a neural network without a target output. This is generally achieved through classification/clustering/self-organising maps (SOMs).

[Figure: Self organising map]

The networks for SOMs are actually a little simpler than MLPs. The process for creating clusters is also quite intuitive. Each neuron in the feature map layer has a unique weight vector; if an input results in that neuron being the most activated (i.e. its weight vector has the lowest Euclidean distance from the input vector), then its weight values move closer to those of the input:

[Figure: SOM weight update (source: week 6 notes)]
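A minimal sketch of that winner-take-all update in NumPy (my own illustration, not the unit’s code; the learning-rate name `lr` and its value are my choices):

```python
import numpy as np

def som_update(weights, x, lr=0.5):
    """Move the best-matching neuron's weight vector toward the input x.

    weights: (n_neurons, n_features) array, one weight vector per neuron.
    lr:      learning rate (an assumed name/value for illustration).
    """
    # Winner = neuron whose weight vector is closest to x (Euclidean distance)
    winner = np.argmin(np.linalg.norm(weights - x, axis=1))
    # Standard SOM rule: w <- w + lr * (x - w), pulling w toward x
    weights[winner] += lr * (x - weights[winner])
    return winner

# Three neurons with 2D weight vectors, one input
w = np.array([[0.0, 0.0], [1.0, 1.0], [0.5, 0.9]])
win = som_update(w, np.array([0.9, 1.1]))
print(win, w[win])   # neuron 1 wins and moves toward the input
```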

The concept of decaying the learning rate was introduced during the lecture, but this must be done carefully. If one were to train a network until the weight adjustments stabilize, a decaying learning rate means training will end after a certain number of epochs regardless of how well the network has clustered the data.
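For instance, a simple exponential decay schedule (illustrative numbers only) shows the learning rate, and hence the size of the weight adjustments, shrinking toward zero no matter what the data look like:

```python
import math

lr0, decay = 0.5, 0.01   # initial learning rate and decay constant (made up)

for epoch in range(0, 501, 100):
    lr = lr0 * math.exp(-decay * epoch)
    print(epoch, round(lr, 4))   # lr shrinks toward 0 regardless of the data
```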

Finally, the concept of a ‘topological neighborhood’ was introduced. In actual brains, the weights of neighboring neurons are also updated when a neuron wins the competitive activation. Logically this will result in similar classifications being held by neighboring neurons. The update of the neighboring weights can be scaled using Gaussian or exponential decay functions:

[Figure: Update neighboring neurons too! (source: week 6 notes)]
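And a sketch with a Gaussian neighborhood (again my own illustration; `sigma`, the neighborhood width, is an assumed parameter). Each neuron’s update is scaled by its distance from the winner on the map grid, not in weight space:

```python
import numpy as np

def som_update_neighborhood(weights, positions, x, lr=0.5, sigma=1.0):
    """Update every neuron, scaled by a Gaussian over grid distance to the winner.

    weights:   (n_neurons, n_features) weight vectors.
    positions: (n_neurons, 2) coordinates of each neuron on the feature map.
    sigma:     neighborhood width (an assumed parameter for illustration).
    """
    winner = np.argmin(np.linalg.norm(weights - x, axis=1))
    # Gaussian neighborhood h = exp(-d^2 / (2 sigma^2)), where d is the
    # neuron's distance from the winner on the map grid, not in weight space
    grid_dist = np.linalg.norm(positions - positions[winner], axis=1)
    h = np.exp(-grid_dist**2 / (2 * sigma**2))
    # Every neuron moves toward x, weighted by its neighborhood value
    weights += lr * h[:, None] * (x - weights)
    return winner

# A 2x2 feature map with 2D inputs
pos = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
w = np.random.rand(4, 2)
som_update_neighborhood(w, pos, np.array([0.2, 0.8]))
```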

FIT5044 – Network Security Week 6

Week 6 continued on the topic of firewalls, with some more detailed discussion about how the concept is implemented in practice: for example, hardware versus software, integration into the OS kernel, and so on. The question of where and how the firewall engine (FE) should be implemented generates a number of options. One thing to remember is that, all else being equal, a more secure configuration will result in less convenience for the business/users. The security and convenience requirements of each case must be the driving force behind decisions such as the FE implementation.

Next week’s lecture will be focused on the IP layer and IP security. This week the focus for network security has been the RSA assignment, so this is a short post!

[Figure: TCP/IP stack]