We jump into security with an automated nmap script, tcpdump, and Wireshark. We start off with a quick review of the network engineering process Dorothy is working on. Then we walk through our nmap script, which looks for unauthorized devices and services. From there we move on to tcpdump: how to capture packets, which options we use, and why they work for us. The output is saved to a file, ready for packet analysis in Wireshark.

LINKS

  1. autonmap.sh – The original nmap script on GitHub (see our version below)
  2. xsltproc man page – A program that converts XML files into HTML (see the example after this list)
  3. xsltproc Ubuntu package – The Ubuntu 16.04 package
  4. packetlife.net – Cheat sheets
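
For item 2, a quick illustration of what xsltproc does with an nmap report: it applies an XSL stylesheet to the XML output and writes HTML. This is only a sketch, assuming nmap's stylesheet is installed at /usr/share/nmap/nmap.xsl (a common location on Ubuntu; adjust the paths for your system):

# Explicitly apply nmap's stylesheet to an XML scan and write an HTML report
xsltproc -o scan.html /usr/share/nmap/nmap.xsl scan.xml

The script below takes the shortcut of passing only the XML file, which works as long as xsltproc can find the stylesheet referenced inside nmap's own output.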

TCPDUMP COMMANDS

  1. tcpdump -D = Lists the network interfaces available for capturing packets
  2. tcpdump -i 3 -nvvv -s0 -XX -w capture.pcap
    1. -i 3 = Capture on interface number 3 (from the -D list)
    2. -nvvv = Don't resolve hostnames or port numbers; extra verbose
    3. -s0 = Capture the entire packet (no snap-length limit)
    4. -XX = Print each packet, including its link-level header, in hex and ASCII
    5. -w capture.pcap = Write everything to the file capture.pcap (see the read-back example below)
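
Once capture.pcap exists, it helps to sanity-check it from the command line before loading it into Wireshark. A small sketch, not from the episode; the port 53 and http.request filters are just example filters:

# Read the saved capture back with no name resolution, showing only DNS traffic
tcpdump -r capture.pcap -n port 53

# Or pre-filter with tshark (Wireshark's command-line tool) using a display filter
tshark -r capture.pcap -Y "http.request"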

NMAP SCRIPT

This is our version of the nmap script. A sample cron entry for running it daily follows at the end.

#!/bin/bash

DATE=`date +%F:%H:%M`

## Begin Config

# The directory for autonmap data/scans
RUN_DIRECTORY="/usr/local/autonmap/"

# The directory you want the web report to live in
WEB_DIRECTORY="/var/www/autonmap/"

# The subnets you want to scan daily, space separated.
SCAN_SUBNETS="10.20.20.0/24"

# The full path (http) to where the report will be hosted by your webserver. This is included in the email report.
# I suggest setting up auth using htpasswd etc., in which case you can include the auth in the URL for simplicity if you want.
WEB_URL="http://website/scan-$DATE.html"

# The full path to your chosen nmap binary
NMAP="/usr/local/bin/nmap"

# The path to the ndiff tool provided with nmap
NDIFF="/usr/local/bin/ndiff"

# The email address(es), space separated, that you wish to send the email report to.
EMAIL_RECIPIENTS="Email Address"

## End config

echo "$DATE - Welcome to AutoNmap2."

# Ensure we can change to the run directory
cd "$RUN_DIRECTORY" || exit 2
echo "`date` - Running nmap, please wait. This may take a while."
# Full TCP port scan (disabled); the active line below runs a ping sweep only
#$NMAP --open -T4 -PN $SCAN_SUBNETS -n -oX scan-$DATE.xml > /dev/null
$NMAP --open -sn -n $SCAN_SUBNETS -oX scan-$DATE.xml > /dev/null
echo "`date` - Nmap process completed with exit code $?"

# Convert the XML into HTML - added by Damien Hull
xsltproc scan-$DATE.xml -o scan-$DATE.html

# If this is not the first time autonmap2 has run, we can check for a diff. Otherwise skip this section; tomorrow, when the link exists, we can diff.
if [ -e scan-prev.xml ]
then
    echo "`date` - Running ndiff..."
    # Run ndiff against yesterday's scan and today's scan
    DIFF=`$NDIFF scan-prev.xml scan-$DATE.xml`

    echo "`date` - Checking ndiff output"
    # There are always two lines of difference: the run headers that contain the time/date. So we can discount those.
    if [ `echo "$DIFF" | wc -l` -gt 2 ]
    then
        echo "`date` - Differences detected. Sending mail."
        echo -e "AutoNmap2 found differences in a scan for '${SCAN_SUBNETS}' since the last scan. \n\n$DIFF\n\nFull report available at $WEB_URL" | mail -s "AutoNmap2" $EMAIL_RECIPIENTS
    else
        echo "`date` - No differences, skipping mail."
    fi

else
    echo "`date` - There is no previous scan (scan-prev.xml). Cannot diff today; will do so tomorrow."
fi

# Copy the scan report to the web directory so it can be viewed later.
echo "`date` - Copying HTML to web directory."
cp scan-$DATE.html $WEB_DIRECTORY

# Create the link from today’s report to scan-prev so it can be used tomorrow for diff.
echo "`date` - Linking today's scan to scan-prev.xml"
ln -sf scan-$DATE.xml scan-prev.xml

echo "`date` - AutoNmap2 is complete."
exit 0
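
The config comments describe daily scans, so the script is meant to be driven by cron. A minimal crontab sketch, assuming the script is saved as /usr/local/autonmap/autonmap.sh and made executable (the path, time, and log file are placeholders, not from the episode):

# Run the scan every day at 06:00 and append the script's progress messages to a log
0 6 * * * /usr/local/autonmap/autonmap.sh >> /var/log/autonmap.log 2>&1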

 
