Low Orbital Ion Cannon

December 9th, 2010

Anonymous have mobilized. Well, a subset of Anonymous, operating under the banner of Operation Payback, has started attacking organizations that have discontinued services to WikiLeaks (at the time of writing WikiLeaks has no DNS name and is reachable only by IP address). After reading several articles on the subject, I became familiar with the tool that Anonymous is using to attack these sites. From what I can tell there are several versions of the tool, named Low Orbital Ion Cannon (or LOIC for short). LOIC is a modified version of an open source load tester; it was modified to connect to an IRC channel and fetch a list of targets to start attacking.

This is interesting in and of itself, as the users are joining an opt-in botnet. I did not have time to download it and take a look, but I did run into a JavaScript version that piqued my interest a bit. I went ahead and grabbed a copy of it and put it up here. It is pretty basic: you put the target URL in the first box and launch with “IMMA CHARGING MEH LAZER”. After looking at a packet capture of the traffic, it does exactly what you would expect it to: it opens several connections to the server in question. At this point it only makes HTTP and HTTPS connections, but it would be interesting to see it support other protocols such as SSH and DNS.

Command-line Fu: Parsing logs

July 7th, 2010

Tonight I was going through some output from a very large debug file for a router. The only information I had to go off of was that file. When I say large, I mean 120,955 lines of network debug information. That is far too much to go through manually unless you know exactly what you are looking for. I needed to look at each individual session that ran through the device, and instead of going through line by line looking for each session, I knew I could script it somehow. Just for reference, here is basically what a single packet looks like:

Jul 5 10:56:55 10:56:55.923221:CID-01:FPC-07:PIC-00:THREAD_ID-14:RT:<>;6> matched filter TDS_debug:

Jul 5 10:56:55 10:56:55.923257:CID-01:FPC-07:PIC-00:THREAD_ID-14:RT:packet [40] ipid = 24977, @7a2200e8

Jul 5 10:56:55 10:56:55.923276:CID-01:FPC-07:PIC-00:THREAD_ID-14:RT:---- flow_process_pkt: (thd 14): flow_ctxt type 13, common flag 0x0, mbuf 0xe50a000, rtbl_idx = 2405

Jul 5 10:56:55 10:56:55.923304:CID-01:FPC-07:PIC-00:THREAD_ID-14:RT: flow process pak fast ifl 80 in_ifp reth1.412

Jul 5 10:56:55 10:56:55.923316:CID-01:FPC-07:PIC-00:THREAD_ID-14:RT:flow_np_session_id2nsp: NP hdr: session id – 654724464, Flag – 8

Jul 5 10:56:55 10:56:55.923337:CID-01:FPC-07:PIC-00:THREAD_ID-14:RT: flow session id 413040

Jul 5 10:56:55 10:56:55.923352:CID-01:FPC-07:PIC-00:THREAD_ID-14:RT: vsd 1 is active

Jul 5 10:56:55 10:56:55.923363:CID-01:FPC-07:PIC-00:THREAD_ID-14:RT: tcp seq check.

Jul 5 10:56:55 10:56:55.923371:CID-01:FPC-07:PIC-00:THREAD_ID-14:RT:mbuf 0xe50a000, exit nh 0xec753c2

Jul 5 10:56:55 10:56:55.923388:CID-01:FPC-07:PIC-00:THREAD_ID-14:RT: ----- flow_process_pkt rc 0x0 (fp rc 0)

I figured I should match on the source address and source port. The source and destination address and port appear between the angle brackets on the first line of each packet.


The first part of my command to filter out just source address and port was

$ cat debug | grep "\<\/"

Doing just this gave me a whole bunch of lines. Looking through them, I found that some of them were not the lines I was looking for (they had to do with NAT), so I refined my command by adding another grep to the end of it:

$ cat debug | grep "\<\/" | grep "matched filter"

Doing this gave me all the lines I wanted. Counting the output with "wc -l" gave me 2827 lines. That is more manageable than before, but I still want to cut out all of the excess fluff on each line and get rid of any duplicate lines. For the fluff, I turned to my friend cut:
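As a quick sanity check, here is this filtering step run against a couple of made-up sample lines. The addresses and the <src-ip/port->dst-ip/port;proto> tuple below are invented for illustration, and I filter on the "<" that opens the tuple rather than the pattern above:

```shell
# Build a tiny sample debug file; the addresses and the
# <src/port->dst/port;proto> tuple are invented for illustration.
printf '%s\n' \
  'Jul 5 10:56:55 ...RT:<10.1.1.5/33006->10.2.2.9/1433;6> matched filter TDS_debug:' \
  'Jul 5 10:56:55 ...RT:packet [40] ipid = 24977, @7a2200e8' \
  > debug.sample

# Keep only lines that contain the address tuple AND the filter match.
grep '<.*/' debug.sample | grep 'matched filter'
```

Only the first sample line survives both greps; the packet-detail line is dropped.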

$ cat debug | grep "\<\/" | grep "matched filter" | cut -d "<" -f 2

The cut command at the end sets a delimiter of the character "<". The delimiter tells cut where to break each line up into fields. I then passed it the -f flag to tell it to show the second field (basically everything to the right of the "<"). This gave me several lines like the following:

>;6> matched filter TDS_debug:
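To see what that cut does in isolation, here is a one-liner on an invented line of the same shape (the address tuple is an assumption):

```shell
# -f fields are numbered from 1, so -f 2 is everything after the first '<'.
# The addresses are invented for illustration.
line='Jul 5 10:56:55 ...RT:<10.1.1.5/33006->10.2.2.9/1433;6> matched filter TDS_debug:'
echo "$line" | cut -d '<' -f 2
# prints: 10.1.1.5/33006->10.2.2.9/1433;6> matched filter TDS_debug:
```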

I got rid of the fluff at the beginning; now I need to get rid of the fluff at the end:

$ cat debug | grep "\<\/" | grep "matched filter" | cut -d "<" -f 2 | cut -d "-" -f 1

Again, I used cut, this time setting the delimiter to the character "-" and showing the first field (everything to the left of the first "-").

With this command I get just the source address and port.
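The two cuts chained together look like this on an invented line of the same shape. Note that the real lines have hyphens in the prefix (CID-01, FPC-07, and so on), which is why the "<" cut has to come first:

```shell
# First take everything after '<', then everything before the first '-'
# (the '-' of '->'), leaving source-ip/port. The address is invented.
line='Jul 5 10:56:55 ...RT:<10.1.1.5/33006->10.2.2.9/1433;6> matched filter TDS_debug:'
echo "$line" | cut -d '<' -f 2 | cut -d '-' -f 1
# prints: 10.1.1.5/33006
```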

All that is left now is to sort them and get rid of any duplicates with the sort and uniq commands

$ cat debug | grep "\<\/" | grep "matched filter" | cut -d "<" -f 2 | cut -d "-" -f 1 | sort | uniq

Here is an extremely abbreviated output of what I have now
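The dedup step on its own, with a few invented session strings:

```shell
# uniq only collapses ADJACENT duplicates, which is why sort must come first.
printf '%s\n' '10.7.7.7/5060' '10.1.1.5/33006' '10.1.1.5/33006' | sort | uniq
# prints:
# 10.1.1.5/33006
# 10.7.7.7/5060
```

As a side note, "sort -u" does the same thing as "sort | uniq" in one command.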

The last thing left to do is to redirect all of that output into a file:

$ cat debug | grep "\<\/" | grep "matched filter" | cut -d "<" -f 2 | cut -d "-" -f 1 | sort | uniq > sessions

We are now done, and I can file this command away for future use.
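For completeness, here is the whole pipeline run end to end against an invented sample file. The addresses and the tuple shape are assumptions, and the first grep here matches the "<" of the tuple rather than the pattern used above:

```shell
# Invented sample: two packets from the same session, one from another
# session, plus a packet-detail line that should be filtered out.
printf '%s\n' \
  'Jul 5 10:56:55 ...RT:<10.1.1.5/33006->10.2.2.9/1433;6> matched filter TDS_debug:' \
  'Jul 5 10:56:55 ...RT:packet [40] ipid = 24977, @7a2200e8' \
  'Jul 5 10:56:56 ...RT:<10.1.1.5/33006->10.2.2.9/1433;6> matched filter TDS_debug:' \
  'Jul 5 10:56:57 ...RT:<10.7.7.7/5060->10.2.2.9/1433;6> matched filter TDS_debug:' \
  > debug.sample

# Filter, trim, dedup, and save one source-ip/port per session.
grep '<.*/' debug.sample | grep 'matched filter' \
  | cut -d '<' -f 2 | cut -d '-' -f 1 | sort | uniq > sessions

cat sessions
# prints:
# 10.1.1.5/33006
# 10.7.7.7/5060
```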
