Hello Team,
I read your email regarding the CheckMates Toolbox Contest and wanted to share a one-liner I created for analyzing heavy connections.
1. What's the purpose of my one-liner?
Going through the output of "fw ctl multik print_heavy_conn" can be cumbersome, especially when there is a large number of heavy connections. Analyzing the output manually is inefficient, so I created a one-liner that produces a simple summary consisting of Source Address, Destination Address, Destination Port and the number of times a heavy connection occurred between a source and a destination on a specific destination port.
2. The One-liner
fw ctl multik print_heavy_conn | awk '{print $3, $5}' | sed 's/:[0-9]*//' | sort > /dev/shm/all_heavy_cons.txt; sort -u /dev/shm/all_heavy_cons.txt > /dev/shm/uniq_cons.txt; while read line; do echo "$line" $(grep -cF "$line" /dev/shm/all_heavy_cons.txt); done < /dev/shm/uniq_cons.txt; rm -f /dev/shm/all_heavy_cons.txt /dev/shm/uniq_cons.txt
3. Code Breakdown and explanation
fw ctl multik print_heavy_conn | awk '{print $3, $5}' # Print all heavy connections and use awk to extract fields 3 and 5, which are Source:SourcePort and Destination:DestinationPort
| sed 's/:[0-9]*//' | sort # Remove the first port on each line (the source port, regardless of its length), leaving Source and Destination:DestinationPort, then sort the output
> /dev/shm/all_heavy_cons.txt; # Write the Source and Destination:DestinationPort to a temporary file
sort -u /dev/shm/all_heavy_cons.txt > /dev/shm/uniq_cons.txt; # Reduce the Source and Destination:DestinationPort list to unique entries and write them to a second temporary file
while read line; do echo "$line" $(grep -cF "$line" /dev/shm/all_heavy_cons.txt); done < /dev/shm/uniq_cons.txt; # Read each unique connection and use grep to count how many times it appears in all_heavy_cons.txt, then print the result to the screen (-F treats the dotted addresses as literal strings rather than regular expressions)
rm -f /dev/shm/all_heavy_cons.txt /dev/shm/uniq_cons.txt # Remove both temporary files
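As a side note, the counting stage (the two temporary files plus the grep loop) can also be expressed with sort | uniq -c, which avoids writing to /dev/shm entirely. Below is a minimal sketch of that alternative; since "fw ctl multik print_heavy_conn" is only available on a gateway, canned Source and Destination:DestinationPort lines stand in for the real feed, which is an assumption for demonstration purposes.

```shell
# Alternative counting stage: sort | uniq -c replaces the temp files and
# the grep loop. On a gateway you would feed it from:
#   fw ctl multik print_heavy_conn | awk '{print $3, $5}' | sed 's/:[0-9]*//'
# Canned lines stand in for that feed here.
printf '%s\n' \
  '10.11.98.1 211.8.1.10:80' \
  '10.11.98.1 211.8.1.10:80' \
  '10.10.1.16 24.214.55.123:80' \
| sort | uniq -c | awk '{print $2, $3, $1}'
# The trailing awk restores the original column order: Source,
# Destination:DestinationPort, count.
```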
4. Sample Output
[Expert@fw1:0]# fw ctl multik print_heavy_conn | awk '{print $3, $5}' | sed 's/:[0-9]*//' | sort > /dev/shm/all_heavy_cons.txt; sort -u /dev/shm/all_heavy_cons.txt > /dev/shm/uniq_cons.txt; while read line; do echo "$line" $(grep -cF "$line" /dev/shm/all_heavy_cons.txt); done < /dev/shm/uniq_cons.txt; rm -f /dev/shm/all_heavy_cons.txt /dev/shm/uniq_cons.txt
10.10.1.16 24.214.55.123:80 1
10.11.28.15 200.211.10.187:80 1
10.11.98.1 211.8.1.10:80 4
10.135.17.113 1.2.3.4:443 1
10.135.211.162 1.25.7.225:443 1
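When the list is long, it can help to sort the output by the count column so the noisiest source/destination pairs appear first. A minimal sketch, appending a numeric reverse sort on field 3 to the one-liner's output (shown here on canned lines matching the sample above):

```shell
# Sort the one-liner's output by the count column (field 3), highest first.
# Canned lines matching the sample output stand in for the real one-liner.
printf '%s\n' \
  '10.10.1.16 24.214.55.123:80 1' \
  '10.11.98.1 211.8.1.10:80 4' \
  '10.135.17.113 1.2.3.4:443 1' \
| sort -k3,3 -rn
# The pair with 4 occurrences (10.11.98.1 -> 211.8.1.10:80) is listed first.
```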
Enjoy!
Best regards, yephex