NETRESEC Network Security Blog


NetworkMiner 2.2 Released

NetworkMiner 2.2 = Harder Better Faster Stronger

NetworkMiner 2.2 is faster, better and stronger than ever before! The PCAP parsing speed has more than doubled and even more details are now extracted from analyzed packet capture files.

The improved parsing speed of NetworkMiner 2.2 can be enjoyed regardless of whether NetworkMiner is run on Windows or Linux. Additionally, the user interface is more responsive and flickers far less while capture files are being loaded.

User Interface Improvements

The keyword filter available in the Files, Messages, Sessions, DNS and Parameters tabs has been improved, so that the rows can now be filtered on a single column of choice by selecting the desired column in a drop-down list. There is also an “Any column” option, which can be used to search for the keyword in all columns.

Keyword drop-down in NetworkMiner's Parameters tab

The Messages tab has also received an additional feature that allows the filter keyword to be matched against the text in the message body, as well as the email headers, when the “Any column” option is selected. This allows for efficient analysis of messages (such as emails sent or received through SMTP, POP3 and IMAP, as well as IRC messages and some HTTP based messaging platforms), since the messages can be filtered just like in a normal e-mail client.

We have also given up on locale-dependent timestamp formats; timestamps are now shown using the yyyy-MM-dd HH:mm:ss format, with the time zone explicitly stated.
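For those who want to render timestamps the same way in their own scripts: the format above corresponds to the strftime pattern %Y-%m-%d %H:%M:%S. A minimal Python illustration (my own example, not NetworkMiner code):

from datetime import datetime, timezone

# Render a timestamp the way NetworkMiner 2.2 displays it:
# yyyy-MM-dd HH:mm:ss with the time zone stated explicitly.
ts = datetime(2017, 8, 22, 11, 37, 0, tzinfo=timezone.utc)
print(ts.strftime('%Y-%m-%d %H:%M:%S %Z'))   # 2017-08-22 11:37:00 UTC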

Protocol Parsers

NetworkMiner 2.2 comes with a parser for the Remote Desktop Protocol (RDP), which rides on top of COTP and TPKT. The RDP parser is primarily used to extract usernames from RDP cookies and show them on the Credentials tab. This new version also comes with better extraction of SMB1 and SMB2 details, such as NTLM SSP usernames.

RDP Cookies extracted with NetworkMiner 2.2
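To give an idea of what such a cookie looks like on the wire: the X.224 Connection Request that starts an RDP session may carry an ASCII token of the form “Cookie: mstshash=<username>”. The sketch below shows how a username could be pulled out of such a payload; it is an illustration only (with simplified, not byte-accurate example data), not NetworkMiner’s actual parser:

import re

def extract_rdp_username(payload: bytes):
    """Look for an mstshash cookie in an X.224 Connection Request payload."""
    match = re.search(rb'Cookie: mstshash=([^\r\n]+)', payload)
    return match.group(1).decode('ascii', errors='replace') if match else None

# Simplified example payload (the TPKT/X.224 header bytes are not accurate)
example = b'\x03\x00\x00\x2c\x27\xe0\x00\x00\x00\x00\x00Cookie: mstshash=alice\r\n'
print(extract_rdp_username(example))   # alice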

One big change that has been made behind the scenes of NetworkMiner is the move from .NET Framework 2.0 to version 4.0. For most Microsoft Windows users this move doesn’t require any special measures, since the 4.0 Framework is typically already installed on these machines. If you’re running NetworkMiner in Linux, however, you might want to check out our updated blog post on how to install NetworkMiner in Linux.

We have also added an automatic check for new versions of NetworkMiner, which runs every time the tool is started. This update check can be disabled by adding a -noupdatecheck switch to the command line when starting NetworkMiner.

NetworkMiner Professional

Even though NetworkMiner 2.2 now uses ISO-like time representations, the tool still has to decide which time zone to use for the timestamps. The default has always been to use the same time zone as the local machine, but NetworkMiner Professional now comes with an option that allows the user to select whether timestamps should be displayed in UTC (as nature intended), the local time zone, or some other custom time zone. The time zone setting can be found in the “Tools > Settings” menu.

The Port-Independent-Protocol-Detection (PIPI) feature in NetworkMiner Pro has been improved for more reliable identification of HTTP, SSH, SOCKS, FTP and SSL sessions running on non-standard port numbers.
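Netresec hasn’t published how PIPI works internally, but the general idea behind port-independent protocol detection can be illustrated with a few naive payload heuristics. The sketch below is merely an illustration of the concept; a real implementation needs far more robust (typically statistical) matching:

def guess_protocol(first_payload: bytes) -> str:
    """Very naive protocol guess based on the first bytes a TCP client sends."""
    if first_payload.startswith(b'SSH-'):
        return 'SSH'            # SSH banner, e.g. b'SSH-2.0-OpenSSH_7.4'
    if first_payload[:1] == b'\x16' and first_payload[1:2] == b'\x03':
        return 'SSL/TLS'        # TLS handshake record (content type 0x16, version 3.x)
    if first_payload.split(b' ')[0] in (b'GET', b'POST', b'HEAD', b'PUT', b'DELETE', b'OPTIONS'):
        return 'HTTP'
    if first_payload[:1] == b'\x05':
        return 'SOCKS5'         # SOCKS5 greeting starts with the version byte 0x05
    return 'unknown'

print(guess_protocol(b'SSH-2.0-OpenSSH_7.4\r\n'))    # SSH
print(guess_protocol(b'\x16\x03\x01\x00\xc8\x01'))   # SSL/TLS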

CASE / JSON-LD Export

We are happy to announce that the professional edition of NetworkMiner 2.2 now has support for exporting extracted details using the Cyber-investigation Analysis Standard Expression (CASE) format, which is a JSON-LD format for digital forensics data. The CASE export is also available in the command line tool NetworkMinerCLI.
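Since CASE is built on JSON-LD, an export is essentially a graph of JSON objects annotated with @context, @id and @type. The snippet below only illustrates that general JSON-LD shape; the property and type names are made up for this example and are not the actual CASE vocabulary, nor the exact output produced by NetworkMiner:

import json

# Hypothetical JSON-LD structure; the type and property names are NOT real CASE terms.
export = {
    "@context": {"example": "http://example.org/case-like-vocabulary#"},
    "@graph": [
        {
            "@id": "example:file-1",
            "@type": "example:ExtractedFile",
            "example:fileName": "index.html",
            "example:extractedFrom": "example:flow-1"
        },
        {
            "@id": "example:flow-1",
            "@type": "example:NetworkFlow",
            "example:sourceAddress": "192.0.2.10",
            "example:destinationAddress": "198.51.100.20"
        }
    ]
}
print(json.dumps(export, indent=2))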

We would like to thank Europol for recommending that we implement the CASE export format as part of their effort to adopt CASE as a standard digital forensic format. Several other companies in the digital forensics field are currently looking into implementing CASE in their tools, including AccessData, Cellebrite, Guidance, Volatility and XRY. We believe CASE will become a popular format for exchanging digital forensic data between forensics tools, log correlation engines and SIEM solutions.

We will, however, still continue supporting and maintaining the CSV and XML export formats in NetworkMiner Professional and NetworkMinerCLI alongside the new CASE format.

Credits

I would like to thank Sebastian Gebhard and Clinton Page for reporting bugs in the Credentials tab and the TFTP parsing code, which have now been fixed. I would also like to thank Jeff Carrell for providing a capture file that has been used to debug an issue in NetworkMiner’s OpenFlow parser. There are also a couple of users who have suggested new features that have made it into this release of NetworkMiner. Marc Lindke suggested the powerful deep search of extracted messages, and Niclas Hirschfeld proposed a new option in the PCAP-over-IP functionality that allows NetworkMiner to receive PCAP data via a remote netcat listener.

Upgrading to Version 2.2

Users who have purchased a license for NetworkMiner Professional 2.x can download a free update to version 2.2 from our customer portal.

Those who instead prefer to use the free and open source version can grab the latest version of NetworkMiner from the official NetworkMiner page.

Short URL: http://netres.ec/?b=17888CB

Posted by Erik Hjelmvik on Tuesday, 22 August 2017 11:37:00 (UTC/GMT)


Network Forensics Training in London

The Flag of the United States by Sam Howzit (CC BY 2.0)

People sometimes ask me when I will teach my network forensics class in the United States. The US is undoubtedly the country with the most advanced and mature DFIR community, so it would be awesome to be able to give my class there. However, not being a U.S. person and not working for a U.S. company makes it rather difficult for me to teach in the United States (remember what happened to Halvar Flake?).

So if you’re from the Americas and would like to take my network forensics class, then please don’t wait for me to teach my class at a venue close to you – because I probably won’t. My recommendation is that you instead attend my upcoming training at 44CON in London this September.

London Red Telephone Booth Long Exposure by negativespace.co (CC0)

The network forensics training in London will cover topics such as:

  • Analyzing a web defacement
  • Investigating traffic from a remote access trojan (njRAT)
  • Analyzing a Man-on-the-Side attack (much like QUANTUM INSERT)
  • Finding a backdoored application
  • Identifying botnet traffic through whitelisting
  • Rinse-Repeat Threat Hunting

The first day of training will focus on analysis using only open source tools. The second day will primarily cover training on commercial software from Netresec, i.e. NetworkMiner Professional and CapLoader. All students enrolling in the class will get a full six-month license for both of these commercial tools.

NetworkMiner CapLoader

Hope to see you at the 44CON training in London!

Short URL: http://netres.ec/?b=174EFE8

Posted by Erik Hjelmvik on Tuesday, 25 April 2017 14:33:00 (UTC/GMT)


Domain Whitelist Benchmark: Alexa vs Umbrella

Alexa vs Umbrella

In November last year Alexa admitted in a tweet that they had stopped releasing their CSV file with the one million most popular domains.

Yes, the top 1m sites file has been retired

Members of the Internet measurement and infosec research communities were outraged, surprised and disappointed, since this domain list had become the de-facto tool for evaluating the popularity of a domain. As a result, Cisco Umbrella (previously OpenDNS) released a free top 1 million list of their own in December that same year. However, by then Alexa had already announced that their “top-1m.csv” file was back up again.

The file is back for now. We'll post an update before it changes again.

The Alexa list was unavailable for only about a week, but this was enough for many researchers, developers and security professionals to move to alternative lists, such as the one from Umbrella. This move was perhaps fueled by Alexa saying that the “file is back for now”, which hints that they might decide to remove it again later on.

We’ve been leveraging the Alexa list for quite some time in NetworkMiner and CapLoader in order to do DNS whitelisting, for example when doing threat hunting with Rinse-Repeat. But we haven’t made the move from Alexa to Umbrella, at least not yet.


Malware Domains in the Top 1 Million Lists

Threat hunting expert Veronica Valeros recently pointed out that there are a large number of malicious domains in the Alexa top one million list.

Researchers using Alexa top 1M as legit, you may want to think twice about that. You'd be surprised how many malicious domains end there.

I often recommend that analysts use the Alexa list as a whitelist to remove “normal” web surfing from their PCAP dataset when doing threat hunting or network forensics. And, as previously mentioned, both NetworkMiner and CapLoader make use of the Alexa list in order to simplify domain whitelisting. I therefore decided to evaluate just how many malicious domains there are in the Alexa and Umbrella lists.
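The benchmark itself boils down to counting how many entries from each malware domain list also appear in a top-1m.csv file. The sketch below shows roughly how such a comparison can be done (the file names are placeholders, and this is my own illustration rather than the exact script used to produce the numbers below):

import csv

def load_top1m(path):
    """top-1m.csv has lines like '1,google.com'; return the set of domain names."""
    with open(path, newline='') as f:
        return {row[1].strip().lower() for row in csv.reader(f) if len(row) >= 2}

def load_blocklist(path):
    """One domain per line, e.g. a malware domain blocklist."""
    with open(path) as f:
        return {line.strip().lower() for line in f if line.strip() and not line.startswith('#')}

top1m = load_top1m('top-1m.csv')                  # the Alexa or Umbrella list
malware = load_blocklist('malware-domains.txt')   # placeholder blocklist file

whitelisted = malware & top1m
print(len(whitelisted), 'malicious domains are whitelisted')
print('%.2f%% of the malicious domains are whitelisted' % (100.0 * len(whitelisted) / len(malware)))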

hpHosts EMD (by Malwarebytes)

                                           Alexa   Umbrella
Whitelisted malicious domains:              1365       1458
Percent of malicious domains whitelisted:   0.89%      0.95%

Malware Domain Blocklist

                                           Alexa   Umbrella
Whitelisted malicious domains:                84         63
Percent of malicious domains whitelisted:   0.46%      0.34%

CyberCrime Tracker

                                           Alexa   Umbrella
Whitelisted malicious domains:                15         10
Percent of malicious domains whitelisted:   0.19%      0.13%

The results presented above indicate that Alexa and Umbrella both contain roughly the same number of malicious domains. The percentages also reveal that using Alexa or Umbrella as a whitelist, i.e. ignoring all traffic to the top one million domains, might result in ignoring up to 1% of the traffic going to malicious domains. I guess this is an acceptable level of false negatives, since techniques like Rinse-Repeat Intrusion Detection aren’t intended to replace traditional intrusion detection systems; they are meant to be used as a complement in order to hunt down the intrusions that your IDS failed to detect. Working on a reduced dataset that still contains 99% of the malicious traffic is an acceptable price to pay for having removed all the “normal” traffic going to the one million most popular domains.


Sub Domains

One significant difference between the two lists is that the Umbrella list contains subdomains (such as www.google.com, safebrowsing.google.com and accounts.google.com), while the Alexa list only contains main domains (like “google.com”). In fact, the Umbrella list contains over 1800 subdomains for google.com alone! This means that the Umbrella list in practice covers fewer main domains than the one million main domains in the Alexa list. We estimate that roughly half of the domains in the Umbrella list are redundant if you are only interested in main domains. However, having subdomains can be an asset if you need to match the full domain name rather than just the main domain name.
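To put a number on that redundancy, one can reduce the Umbrella list to main domains and count the unique entries. The sketch below uses a naive “last two labels” heuristic for illustration; a proper implementation should consult the Public Suffix List, since the main domain of e.g. www.example.co.uk is example.co.uk rather than co.uk:

import csv

def naive_main_domain(fqdn: str) -> str:
    """Keep only the last two labels, e.g. 'safebrowsing.google.com' -> 'google.com'.
    Deliberately naive: 'www.example.co.uk' would incorrectly become 'co.uk'."""
    labels = fqdn.strip('.').split('.')
    return '.'.join(labels[-2:])

with open('top-1m.csv', newline='') as f:    # the Umbrella list
    fqdns = [row[1].lower() for row in csv.reader(f) if len(row) >= 2]

main_domains = {naive_main_domain(d) for d in fqdns}
print(len(fqdns), 'entries in the list,', len(main_domains), 'unique main domains')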


Data Sources used to Compile the Lists

Image: The Alexa Extension for Firefox

The two lists are compiled in different ways, which can be important to be aware of depending on what type of traffic you are analyzing. Alexa primarily receives web browsing data from users who have installed one of Alexa’s many browser extensions (such as the Alexa browser toolbar shown above). They also gather additional data from users visiting web sites that include Alexa’s tracker script.

Cisco Umbrella, on the other hand, compiles its data from “the actual world-wide usage of domains by Umbrella global network users”. We’re guessing this means building statistics from DNS queries sent through the OpenDNS service that was recently acquired by Cisco.

This means that the Alexa list might be better suited if you are only analyzing HTTP traffic from web browsers, while the Umbrella list is probably the better choice if you are analyzing non-HTTP traffic or HTTP traffic that isn’t generated by browsers (for example HTTP API communication).


Other Quirks

As noted by Greg Ferro, the Umbrella list contains test domains like “www.example.com”. These domains are not present in the Alexa list.

We have also noticed that the Umbrella list contains several domains with non-authorized gTLDs, such as “.home”, “.mail” and “.corp”. The Alexa list, on the other hand, only seems to contain real domain names.


Resources and Raw Data

Both the Alexa and Cisco Umbrella top one million lists are CSV files named “top-1m.csv”, which can be downloaded from Alexa’s and Cisco Umbrella’s respective websites.

The analysis results presented in this blog post are based on top-1m.csv files downloaded from Alexa and Umbrella on March 31, 2017. The malware domain lists were also downloaded from the three respective sources on that same day.

We have decided to share the “false negatives” (malware domains that were present in the Alexa and Umbrella lists) for transparency. You can download the lists with all false negatives from here:
https://www.netresec.com/files/alexa-umbrella-malware-domains_170331.zip


Hands-on Practice and Training

If you want to learn more about how a list of common domains can be used to hunt down intrusions in your network, then please register for one of our network forensics trainings. The next training will be a pre-conference training at 44CON in London.

Short URL: http://netres.ec/?b=1743FAE

Posted by Erik Hjelmvik on Monday, 03 April 2017 14:47:00 (UTC/GMT)


CapLoader 1.5 Released

CapLoader 1.5 Logo

We are today happy to announce the release of CapLoader 1.5. This new version of CapLoader parses pcap and pcap-ng files even faster than before and comes with new features, such as a built-in TCP stream reassembly engine, as well as support for Linux and macOS.

Support for ICMP Flows

CapLoader is designed to group packets together that belong to the same bi-directional flow, i.e. all UDP, TCP and SCTP packets with the same 5-tuple (regardless of direction) are considered to be part of the same flow.

5-tuple
/fʌɪv ˈtjuːp(ə)l/
noun

A combination of source IP, destination IP, source port, destination port and transport protocol (TCP/UDP/SCTP) used to uniquely identify a flow or layer 4 session in computer networking.

The flow concept in CapLoader 1.5 has been extended to also include ICMP. Since there are no port numbers in the ICMP protocol, CapLoader sets the source and destination port of ICMP flows to 0. The addition of ICMP in CapLoader also allows input filters and display filters like “icmp” to be leveraged.

Flows tab in CapLoader 1.5 with display filter BPF 'icmp'
Image: CapLoader 1.5 showing only ICMP flows due to display filter 'icmp'.
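A bi-directional flow key can be built by sorting the two (address, port) endpoints, so that packets in both directions map to the same tuple; for ICMP the port fields are simply set to 0, as described above. A minimal sketch of such a key function (my own illustration, not CapLoader’s internal implementation):

def flow_key(src_ip, src_port, dst_ip, dst_port, transport):
    """Return the same key for both directions of a flow. For ICMP, pass 0 as both ports."""
    first, second = sorted([(src_ip, src_port), (dst_ip, dst_port)])
    return (first, second, transport)

# Both directions of the same TCP session yield the same key
print(flow_key('10.0.0.1', 51234, '192.0.2.80', 80, 'TCP'))
print(flow_key('192.0.2.80', 80, '10.0.0.1', 51234, 'TCP'))

# An ICMP flow, with both ports set to 0
print(flow_key('10.0.0.1', 0, '192.0.2.80', 0, 'ICMP'))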

TCP Stream Reassembly

One of the foundations for making CapLoader a super fast tool for reading and filtering PCAP files is that it doesn’t attempt to reassemble TCP streams. This means that CapLoader’s Transcript view will show out-of-order segments in the order they were received, and retransmitted segments will be displayed twice.

The basic concept has been to let other tools do the TCP reassembly, for example by exporting a PCAP for a flow from CapLoader to Wireshark or NetworkMiner.

The steps required to reassemble a TCP stream to disk with Wireshark are:

  1. Right-click a TCP packet in the TCP session of interest.
  2. Select “Follow > TCP Stream”.
  3. Choose direction in the first drop-down list (client-to-server or server-to-client).
  4. Change format from “ASCII” to “Raw” in the next drop-down menu.
  5. Press the “Save as...” button to save the reassembled TCP stream to disk.

Follow TCP Stream window in Wireshark

Unfortunately Wireshark fails to properly reassemble some TCP streams. As an example, the current stable release of Wireshark (version 2.2.5) shows duplicate data in “Follow TCP Stream” when there are retransmissions with partially overlapping segments. We have also noticed some additional bugs related to TCP stream reassembly in other recent releases of Wireshark. However, we’d like to stress that Wireshark does perform a correct reassembly of most TCP streams; it is only in some specific situations that Wireshark produces a broken reassembly. Unfortunately a minor bug like this can have serious consequences, for example when the TCP stream is analyzed as part of a digital forensics investigation or when the extracted data is being used as input for further processing. We have therefore decided to include a TCP stream reassembly engine in CapLoader 1.5. The steps required to reassemble a TCP stream in CapLoader are:

  1. Double click a TCP flow of interest in the “Flows” tab to open a flow transcript.
  2. Click the “Save Client Byte Stream” or “Save Server Byte Stream” button to save the data stream for the desired direction to disk.

Transcript window in CapLoader 1.5

Extracting TCP streams from PCAP files this way not only ensures that the data stream is correctly reassembled, it is also both faster and simpler than having to pivot through Wireshark’s Follow TCP Stream feature.
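The core of the reassembly problem can be illustrated with a few lines of code: segments for one direction are ordered by sequence number, and retransmitted or overlapping bytes must only be written once. The toy sketch below (no sequence number wrap-around, no policy for conflicting overlaps) is only meant to show the principle, not the engine that ships in CapLoader:

def reassemble(segments, initial_seq):
    """segments: iterable of (seq, payload) tuples for one direction of a TCP stream.
    Returns the reassembled byte stream, writing overlapping bytes only once."""
    stream = bytearray()
    for seq, payload in sorted(segments, key=lambda s: s[0]):
        offset = seq - initial_seq
        end = offset + len(payload)
        if offset < 0 or end <= len(stream):
            continue                                         # old data or pure retransmission
        if offset > len(stream):
            stream.extend(b'\x00' * (offset - len(stream)))  # gap caused by a missing segment
        stream.extend(payload[len(stream) - offset:])        # skip bytes we already have
    return bytes(stream)

# Out-of-order delivery plus a partially overlapping retransmission
segments = [(1006, b'world'), (1000, b'hello'), (1003, b'lo wo')]
print(reassemble(segments, initial_seq=1000))   # b'hello world'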

PCAP Icon Context Menu

CapLoader 1.5 PCAP icon Save As...

The PCAP icon in CapLoader is designed to allow easy drag-and-drop operations in order to open a set of selected flows in an external packet analysis tool, such as Wireshark or NetworkMiner. Right-clicking this PCAP icon will bring up a context menu, which can be used to open a PCAP with the selected flows in an external tool or copy the PCAP to the clipboard. This context menu has been extended in CapLoader 1.5 to also include a “Save As” option. Previous versions of CapLoader required the user to drag-and-drop from the PCAP icon to a folder in order to save filtered PCAP data to disk.

Faster Parsing with Protocol Identification

CapLoader can identify over 100 different application layer protocols, including HTTP, SSL, SSH, RTP, RTCP and SOCKS, without relying on port numbers. The protocol identification has previously slowed down the analysis quite a bit, which has caused many users to disable this powerful feature. This new release of CapLoader comes with an improved implementation of the port-independent protocol identification feature, which enables PCAP files to be loaded twice as fast as before with the “Identify protocols” feature enabled.

Works in Linux and macOS

One major improvement in CapLoader 1.5 is that this release is compatible with the Mono framework, which makes CapLoader platform independent. This means that you can now run CapLoader on your Mac or Linux machine if you have Mono installed. Please refer to our previous blog posts about how to run NetworkMiner in various flavors of Linux and macOS to find out how to install Mono on your computer. You will, however, notice a performance hit when running CapLoader under Mono instead of on Windows, since the Mono framework isn’t yet as fast as Microsoft’s .NET Framework.

CapLoader 1.5 running in Linux with Mono
Image: CapLoader 1.5 running in Linux (Xubuntu).

Credits

We’d like to thank Sooraj for reporting a bug in the “Open With” context menu of CapLoader’s PCAP icon. This bug has been fixed in CapLoader 1.5 and Sooraj has been awarded an official “PCAP or it didn’t happen” t-shirt for reporting the bug.

Image: PCAP or it didn't happen t-shirt

Have a look at our Bug Bounty Program if you also want to get a PCAP t-shirt!

Downloading CapLoader 1.5

Everything mentioned in this blog post, except for the protocol identification feature, is available in our free trial version of CapLoader. To try it out, simply grab a copy here:
https://www.netresec.com/?page=CapLoader#trial (no registration needed)

All paying customers with an older version of CapLoader can download a free update to version 1.5 from our customer portal.

Short URL: http://netres.ec/?b=1738EBA

Posted by Erik Hjelmvik on Tuesday, 07 March 2017 09:11:00 (UTC/GMT)


Enable file extraction from PCAP with NetworkMiner in six steps

NetworkMiner can reassemble files transferred over protocols such as HTTP, FTP, TFTP, SMB, SMB2, SMTP, POP3 and IMAP simply by reading a PCAP file. NetworkMiner stores the extracted files in a directory called “AssembledFiles” inside of the NetworkMiner directory.

NetworkMiner 2.1.1 with Files tab open.
Files extracted by NetworkMiner from the DFRWS 2008 challenge file suspect.pcap

NetworkMiner is a portable tool that is delivered as a zip file. The tool doesn’t require any installation; you simply extract the zip file to your PC. We don’t provide any official guidance regarding where to place NetworkMiner; users are free to place it wherever they find it most fitting. Some put the tool on the Desktop or in “My Documents” while others prefer to put it in “C:\Program Files”. However, please note that normal users usually don’t have write permissions to sub-directories of %programfiles%, which will prevent NetworkMiner from performing file reassembly.

Unfortunately, previous versions of NetworkMiner didn’t alert the user when they failed to write to the AssembledFiles directory. This means that the tool would silently fail to extract any files from a PCAP file. This behavior has been changed with the release of NetworkMiner 2.1. Now the user gets a window titled “Insufficient Write Permissions” with a text like this:

User is unauthorized to access the following file:
C:\Program Files\NetworkMiner_2-1-1\AssembledFiles\cache\FILENAME

File(s) will not be extracted!

Follow these steps to set adequate write permissions to the AssembledFiles directory in Windows:

  1. Open the Properties window for the AssembledFiles directory
  2. Open the “Security” tab
  3. Press “Edit” to change permissions
  4. Select the user who will be running NetworkMiner
  5. Check the “Allow” checkbox for Write permissions
  6. Press the OK button

Press Edit to change permissions for AssembledFiles folder

If you are running NetworkMiner under macOS (OS X) or Linux, then please make sure to follow our installation instructions, which include this command:

sudo chmod -R go+w AssembledFiles/
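If you want to verify up front that NetworkMiner will be able to write extracted files, one simple approach is to try creating a temporary file inside the AssembledFiles directory (actually writing a file is more reliable than os.access, especially with Windows ACLs). This is just a hypothetical helper for illustration, not something that ships with NetworkMiner:

import os
import tempfile

def can_write_to(directory: str) -> bool:
    """Return True if the current user can create files in the given directory."""
    try:
        fd, path = tempfile.mkstemp(dir=directory)
        os.close(fd)
        os.remove(path)
        return True
    except OSError:
        return False

print(can_write_to(r'C:\Program Files\NetworkMiner_2-1-1\AssembledFiles'))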

Once you have set up the appropriate write permissions you should be able to start NetworkMiner and open a PCAP file in order to have the tool automatically extract files from the captured network traffic.

Short URL: http://netres.ec/?b=173588E

Posted by Erik Hjelmvik on Friday, 03 March 2017 09:44:00 (UTC/GMT)


10 Years of NetworkMiner

I released the first version of NetworkMiner on February 16, 2007, which is exactly 10 years ago today.

NetworkMiner 0.79 in Windows XP

One of the main uses of NetworkMiner today is to reassemble file transfers from PCAP files and save the extracted files to disk. However, as you can see in the screenshot above, the early versions of NetworkMiner didn’t even have a Files tab. In fact, the task that NetworkMiner was originally designed for was simply to provide an inventory of the hosts communicating on a network.

How it all started

So, why did I start designing a passive asset detection system when I could just as well have used a port scanner like Nmap to fingerprint the devices on a network? Well, at the time I was working with IT security at the R&D department of a major European energy company. As part of my job I occasionally performed IT security audits of power plants. During these audits I typically wanted to ensure that there were no rogue or unknown devices on the network. The normal way of verifying this would be to perform an Nmap scan of the network, but that wasn’t an option for me since I was dealing with live industrial control system networks. I knew from personal experience that a network scan could cause some of the industrial control system devices to drop their network connections or even crash, so active scanning wasn’t a viable option. Instead I chose to set up a SPAN port at a central point of the network, or even install a network TAP, and then capture network traffic to a PCAP file for a few hours. I found the PCAP files to be a great source, not only for identifying the hosts present on a network, but also for discovering misconfigured devices. However, I wasn’t really happy with the tools available for visualizing the devices on the network, which is why I started developing NetworkMiner in my spare time.

Network Forensics

As I continued improving NetworkMiner I pretty soon ended up writing my own TCP reassembly engine as well as parsers for HTTP and the CIFS protocol (a.k.a. SMB). With these protocols in place I was able to use NetworkMiner to extract files downloaded through HTTP or SMB to disk, which turned out to be a killer feature.

Image: Monthly downloads of NetworkMiner from SourceForge

With the ability to extract file transfers from PCAP files NetworkMiner steadily gained popularity as a valuable tool in the field of network forensics, which motivated me to make the tool even better. Throughout these past 10 years I have single-handedly implemented over 60 protocols in NetworkMiner, which has been a great learning experience for me.

NetworkMiner Milestones

Looking Forward

People sometimes ask me what I’m planning to add to the next version of NetworkMiner. To be honest, I never really know. In fact, I’ve realized that those with the best ideas for features or protocols to add to NetworkMiner are those who use NetworkMiner as part of their jobs, such as incident responders and digital forensics experts across the globe.

I therefore highly value feedback from users, so if you have requests for new features to be added to the next version, then please feel free to reach out and let me know!

Short URL: http://netres.ec/?b=17218C7

Posted by Erik Hjelmvik on Thursday, 16 February 2017 09:11:00 (UTC/GMT)

