
FIT5108 – DoS Reading Unit Part 2

This week’s summary will be a review of two papers from my reading pack: http://mchost/sourcecode/DoS/DoS%20Docs/JournalandBook/

Adaptive Defense Against Various Network Attacks, 2006, Zou, C., Duffield, N., Towsley, D., Gong, W., IEEE.

Summary:

The method discussed in the paper was not focused on improving current malicious packet identification methods, but on increasing the efficiency of their application by adjusting their tunable parameters based on current and recent network conditions. One example was drawn from the Hop-Count Filtering [HCF] method for mitigating DDoS attacks, which relies on the assumption that attackers do not know the real hop count from their spoofed sources to the target. The effectiveness of this particular mitigation method is not central to the argument; what matters is that HCF has adjustable parameters in its filtering. By adjusting the ‘strictness’ of the HCF using a simple, low-overhead, low-computational-cost method, the authors were able to significantly improve its performance.

Note that performance was measured against a curve in which the costs of false positives and false negatives were arbitrarily defined.
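
To make the idea concrete, below is a minimal sketch of the adaptive principle as I understand it, not the authors’ actual algorithm: a cheap monitor estimates the probability that an attack is underway and maps that estimate onto HCF’s strictness (here modelled as the allowed hop-count deviation, a parameter name of my own choosing).

    # Sketch only: a low-cost monitor estimates attack likelihood and tunes
    # HCF strictness accordingly. Parameter names and thresholds are illustrative.

    def estimate_attack_probability(recent_mismatch_rate, baseline_mismatch_rate):
        """Crude indicator: how far the recent hop-count mismatch rate exceeds its baseline."""
        if recent_mismatch_rate <= baseline_mismatch_rate:
            return 0.0
        return min(1.0, (recent_mismatch_rate - baseline_mismatch_rate) / (1.0 - baseline_mismatch_rate))

    def hcf_tolerance(attack_probability, relaxed=3, strict=0):
        """Interpolate the allowed hop-count deviation: relaxed in normal operation, strict under attack."""
        return round(relaxed - attack_probability * (relaxed - strict))

    def hcf_accept(observed_hops, learned_hops, tolerance):
        """Accept a packet only if its observed hop count is within the current tolerance."""
        return abs(observed_hops - learned_hops) <= tolerance

    # Example: a quiet network keeps filtering relaxed; a likely attack tightens it.
    p = estimate_attack_probability(recent_mismatch_rate=0.40, baseline_mismatch_rate=0.05)
    print(hcf_tolerance(p))  # smaller tolerance as attack probability rises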

Relevance to my thoughts on intelligent systems in network security and DDoS mitigation:

  • Computational cost is almost always relevant
  • Network overhead is always relevant
  • The utility, or cost/reward heuristic for the adaptive system must be provided to the system
  • Parameter management of multiple non-adaptive mitigation or defense systems can be handled by a single adaptive service which monitors network conditions and estimates the probability and severity of a current attack.
  • The proposed system does not use any intelligent systems in the actual identification of malicious packets; perhaps this is due to the computational cost.
  • Adaptive systems can be used to achieve cost minimization for security services.

A Distributed Throttling Approach for Handling High Bandwidth Aggregates, 2007, Tan, C. W., Chiu, D.-M., Lui, J., Yau, D., IEEE.

Summary:

This article approaches the breakdown of network communication in the case of flash crowds and DDoS attacks, both of which direct high aggregate traffic from distributed sources to a single location. The authors propose what I would describe as a layered router throttling approach. Throughout the article the term ‘dropped traffic’ is used to describe the effect of router throttling. The article provides some background on the router throttling strategy, but I am somewhat confused about the dropping of traffic once a certain bandwidth level is exceeded. Does this mean that all incoming packets will be dropped regardless of the existence of a TCP session? Does it mean that existing sessions will remain alive until they time out? I will need to do some further reading on the router throttling mitigation method. A key requirement of this strategy is having a number of routers in the preceding network hops subscribed to the method. See below:

[Figure, drawn from the paper: a desirable router structure for distributed throttling]

The paper goes on to propose a number of algorithms with lightweight communication between routers, and presents evidence that by dropping traffic the distributed throttling method can keep target servers alive. Although this solution would undeniably be effective in keeping a server alive, it drops traffic based purely on the traffic level at the router it passes through. I feel that there is a very bad worst case where the probability of a packet being dropped has very low correlation with whether or not it is malicious. The lack of header/packet inspection does make it very computationally efficient, however.
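
As a point of reference for how indiscriminate the dropping is, here is a minimal sketch of rate-based dropping at a single router (my own illustration under assumed parameters, not the paper’s algorithms, which additionally negotiate the caps between routers):

    import time
    from collections import deque

    class RateThrottle:
        """Sliding-window throttle: forward packets while the recent byte rate is
        under a cap, drop everything above it regardless of which flow it belongs to."""

        def __init__(self, max_bytes_per_sec, window_sec=1.0):
            self.max_bytes_per_sec = max_bytes_per_sec
            self.window_sec = window_sec
            self.forwarded = deque()  # (timestamp, size) of recently forwarded packets

        def allow(self, packet_size, now=None):
            now = time.time() if now is None else now
            # discard bookkeeping entries that have left the sliding window
            while self.forwarded and now - self.forwarded[0][0] > self.window_sec:
                self.forwarded.popleft()
            in_window = sum(size for _, size in self.forwarded)
            if in_window + packet_size > self.max_bytes_per_sec * self.window_sec:
                return False  # throttled: legitimate and malicious packets alike
            self.forwarded.append((now, packet_size))
            return True

    # Example: a 1 MB/s cap starts rejecting once the window fills, whoever sent the traffic.
    throttle = RateThrottle(max_bytes_per_sec=1_000_000)
    print([throttle.allow(300_000) for _ in range(5)])  # [True, True, True, False, False]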

  • This solution could be considered somewhat of a benchmark that intelligent mitigation methods would need to improve on; the indiscriminate dropping of packets will result in DoS for legitimate users approaching the server via the same routes as the DDoS traffic. Keeping a server alive is probably the primary goal of DoS mitigation, but service availability should stand right next to that goal.
  • If this defense strategy were widespread it appears to have numerous vulnerabilities that attackers would surely exploit. Attackers could test the thresholds for tripping packet dropping and possibly launch attacks that deny service to specific regions with less cost than an attack on a service that was not protected by distributed router throttling.
  • I get a sense that this strategy could work in a macro sense, perhaps piggybacking on border routing protocols; however the ‘dumb’ nature of throttling seems a very limiting factor. I will obviously need to investigate the router throttling methods more, as with my current understanding this solution seems sub-optimal.



FIT5037 – Advanced Network Security Week 1

Week 1 of Advanced Network Security, lectured by Dr Phu Dung Le, provided an introduction to the topics covered in the unit:

  • Modern computing and network security
  • Elliptic curve public key encryption
  • Design and implementation of RSA and ECC
  • Intrusion detection systems
  • Network and distributed software security
  • Advanced wireless security
  • Large computer security systems
  • Security, load balancing and network performance
  • Main research in security

The lecture broke off into some very interesting discussion of information retrieval from encrypted data sources. The example provided seems like a one-off case, but this problem will become increasingly relevant with the rise of cloud computing. For example, as large companies such as Sony find strong efficiency and financial motivators to outsource their data storage to cloud providers, encryption of that data is paramount. With a large, off-site, encrypted data source there are issues with the efficient retrieval of data and the point of decryption. For example:

  • If searching for similar images given an initial image, how can this be accomplished without downloading and decrypting the entire database?
  • When retrieving data, at what point does decryption occur? If at the client, then all the incoming data will fly straight past firewalls, intrusion detection systems and anti-virus software.

A paper proposes a solution described as:

an encryption scheme where each authorised user in the system has his own keys to encrypt and decrypt data. The scheme supports keyword search which enables the server to return only the encrypted data that satisfies an encrypted query without decrypting it.

http://mchost/sourcecode/papers/Sharedandsearchableencrypteddataforuntrustedservers.pdf
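
To get my head around the keyword-search idea, the sketch below shows the basic mechanism in its simplest single-key form (my own illustration, not the paper’s multi-user scheme): the client indexes encrypted documents under deterministic keyword tokens, so the server can answer a query without learning the keywords or the content.

    import hmac, hashlib, os

    def keyword_token(key: bytes, keyword: str) -> bytes:
        """Deterministic token: the same keyword and key always produce the same tag."""
        return hmac.new(key, keyword.lower().encode(), hashlib.sha256).digest()

    class UntrustedIndex:
        """What the server stores: opaque token -> list of (encrypted) document ids."""
        def __init__(self):
            self.index = {}

        def add(self, token: bytes, doc_id: str):
            self.index.setdefault(token, []).append(doc_id)

        def search(self, token: bytes):
            # the server matches tokens without ever seeing a plaintext keyword
            return self.index.get(token, [])

    # Client side: the key never leaves the client; document names are placeholders.
    search_key = os.urandom(32)
    server = UntrustedIndex()
    server.add(keyword_token(search_key, "invoice"), "doc-001.enc")
    server.add(keyword_token(search_key, "payroll"), "doc-002.enc")

    # A query sends only the token for "invoice"; the server returns matching ciphertext ids.
    print(server.search(keyword_token(search_key, "invoice")))  # ['doc-001.enc']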

The problem of similar-image retrieval is still not easily addressed by this solution, although it could be argued that a categorization schema could work effectively. I wonder at the plausibility of using unsupervised neural networks in conjunction with the hash algorithm to provide a method that is not dependent on designer-imposed categorization. I imagine the network would need to be infinitely complex to follow the hashing, however…

The tutorial introduced Snort (a leading intrusion detection system) – http://www.snort.org/

Installing and making a basic configuration for Snort was the task. I am not a big fan of the Red Hat Linux distro that we have access to in the tutorials, so I completed the install of Snort 2.9.0.5 along with Snort Report 1.3.1 on my home gateway. I used the latest dynamic rules from

The tutorial I followed loosely for the install can be viewed at: http://www.symmetrixtech.com/articles/001-snortinstallguide.pdf (*note that following the instructions blindly will result in disaster).
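
For my own reference, this is roughly the shape of configuration the setup ends up with; an illustrative fragment only, with the network range, rule and sid as placeholders of my own rather than anything taken from the guide:

    # snort.conf – declare the protected network and pull in a local rules file
    ipvar HOME_NET 192.168.1.0/24
    ipvar EXTERNAL_NET !$HOME_NET
    include $RULE_PATH/local.rules

    # local.rules – a simple test rule to confirm Snort is inspecting traffic
    alert icmp $EXTERNAL_NET any -> $HOME_NET any (msg:"ICMP ping detected"; sid:1000001; rev:1;)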

It was also mentioned in the lecture that we would be investigating RSA in comparison to elliptic curve cryptography [ECC]. I had no idea what ECC was; a good video I found provides a brief explanation:


FIT5185 – IT Research Methods Week 1

Week 1 of IT Research Methods was a lecture by Dr Jose Kuzic on the nature of research. The lecture bounced between subjective opinions drawn from his experience in research and a framework for conducting research:

  • Formulating Questions
  • Literature Analysis
  • Case Studies
  • Surveys
  • Qualitative data analysis
  • Quantitative data analysis
  • Communication research

Also introduced were some research paradigms:

  • Scientific research (positivist)
  • Applied research (practical)
  • Social research (interpretive)

I feel that being aware of these paradigms is valuable, but self-imposing mutual exclusivity or black-and-white generalization would be counterproductive (i.e. ‘oh well, that’s just a positivist view’ or ‘I can’t do that, I am doing applied research’). A more pragmatic approach, using whatever method best reaches an outcome for the posed question regardless of paradigm, would be required for good research.

[Figure: Induction and deduction in science (source: week 1 lecture notes)]

Details of Assignments 1 and 2 were also made available on Moodle this week. Assignment 1, a literature review and presentation, seems like it will be an enjoyable assignment that will allow some synergy with other subjects.