NETRESEC Network Security Blog - Tag: DNS


Installing a Fake Internet with INetSim and PolarProxy

Image: INetSim + PolarProxy

This is a tutorial on how to set up an environment for dynamic malware analysis, which can be used to analyze otherwise encrypted HTTPS and SMTPS traffic without allowing the malware to connect to the Internet. Dynamic malware analysis (or behavioral analysis) is performed by observing the behavior of a malware sample while it is running. The victim machine, which executes the malware, is usually a virtual machine that can be rolled back to a clean state when the analysis is complete. The safest way to prevent the malware from infecting other machines, or doing other bad things like sending SPAM or taking part in DDoS attacks, is to run the victim machine in an offline environment. However, network traffic analysis is a central part of dynamic malware analysis, which is why a “fake Internet” is needed in most malware labs.

INetSim and PolarProxy

INetSim is a software suite that simulates common Internet services like HTTP, DNS and SMTP, which is useful when analyzing the network behavior of malware samples without connecting them to the Internet. INetSim also has basic support for TLS encrypted protocols, like HTTPS, SMTPS, POP3S and FTPS, but requires a pre-defined X.509 certificate to be loaded at startup. This can cause malware to terminate because the Common Names (CN) in the presented certificates don’t match the requested server names; the victim machine actually gets the exact same certificate regardless of which web site it visits. INetSim’s TLS encryption also inhibits analysis of the network traffic captured in the malware lab, such as C2 traffic or SPAM runs, because the application layer traffic is encrypted. PolarProxy solves both these issues, because it generates certificates on the fly, where the CN value is dynamically set to the requested host name, and it saves the network traffic in decrypted form to PCAP files. It is therefore a good idea to replace the TLS services in INetSim with PolarProxy, which will act as a TLS termination proxy that forwards the decrypted traffic to INetSim’s cleartext services.

Malware Lab Setup

Install Linux

The first step is to install a Linux VM, which will act as a fake Internet for the victim machine(s). I'm using Ubuntu Server 18.04.3 LTS in this tutorial, but you can use almost any 64-bit Linux distro. I'm adding two network interfaces to the Linux VM: one interface with Internet access and one that connects to an isolated offline network, to which the victim VMs will be connected. The offline interface is configured with the static IP 192.168.53.19.

Important: Do not bridge, bond or enable IP forwarding between the two interfaces!

Image: Network connection config, Ubuntu Server 18.04
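On Ubuntu Server 18.04 the interfaces are configured with netplan. A minimal sketch of such a config is shown below; the interface names enp0s3 (Internet) and enp0s8 (offline) as well as the file name are assumptions, so adjust them to your own VM. The offline interface name should match the one used in the iptables rules later in this post.

# /etc/netplan/01-fakeinternet.yaml (file name is an assumption)
network:
  version: 2
  ethernets:
    enp0s3:                            # Internet-facing interface, address via DHCP
      dhcp4: true
    enp0s8:                            # offline lab interface used by INetSim and PolarProxy
      dhcp4: false
      addresses: [192.168.53.19/24]

sudo netplan apply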

Install INetSim

INetSim is available in Ubuntu's repo, so it is possible to install it with "apt install inetsim". However, I recommend adding INetSim's own APT repository, as described in the official documentation, in order to get the latest packaged version of INetSim.

sudo -s

echo "deb http://www.inetsim.org/debian/ binary/" > /etc/apt/sources.list.d/inetsim.list

curl https://www.inetsim.org/inetsim-archive-signing-key.asc | apt-key add -

apt update

apt install inetsim

exit

INetSim listens on 127.0.0.1 by default. Change this to INetSim's offline IP address by un-commenting and editing the service_bind_address variable in /etc/inetsim/inetsim.conf:

service_bind_address    192.168.53.19

Also configure INetSim's fake DNS server to resolve all domain names to the IP of INetSim with the dns_default_ip setting:

dns_default_ip    192.168.53.19

Finally, comment out the "start_service https" and "start_service smtps" lines, because these services will be replaced with PolarProxy:

start_service dns
start_service http
#start_service https
start_service smtp
#start_service smtps

Restart the INetSim service after changing the config.

sudo systemctl restart inetsim.service

Verify that you can access INetSim's HTTP server with curl:

curl http://192.168.53.19

<html>
  <head>
    <title>INetSim default HTML page</title>
  </head>
  <body>
    <p></p>
    <p align="center">This is the default HTML page for INetSim HTTP server fake mode.</p>
    <p align="center">This file is an HTML document.</p>
  </body>
</html>

It looks like INetSim's web server can be accessed alright.

Install PolarProxy

The next step is to install PolarProxy as a systemd service, following the official installation instructions:

sudo adduser --system --shell /bin/bash proxyuser

sudo mkdir /var/log/PolarProxy

sudo chown proxyuser:root /var/log/PolarProxy/

sudo chmod 0775 /var/log/PolarProxy/

sudo su - proxyuser

mkdir ~/PolarProxy

cd ~/PolarProxy/

curl https://www.netresec.com/?download=PolarProxy | tar -xzvf -

exit

sudo cp /home/proxyuser/PolarProxy/PolarProxy.service /etc/systemd/system/PolarProxy.service

We will need to modify the PolarProxy service config file a bit before we start it. Edit the ExecStart setting in /etc/systemd/system/PolarProxy.service to configure PolarProxy to terminate the TLS encryption for HTTPS and SMTPS (implicitly encrypted email submission). The HTTPS traffic should be redirected to INetSim's web server on tcp/80 and the SMTPS to tcp/25.

ExecStart=/home/proxyuser/PolarProxy/PolarProxy -v -p 10443,80,80 -p 10465,25,25 -x /var/log/PolarProxy/polarproxy.cer -f /var/log/PolarProxy/proxyflows.log -o /var/log/PolarProxy/ --certhttp 10080 --terminate --connect 192.168.53.19 --nosni nosni.inetsim.org

Here's a break-down of the arguments sent to PolarProxy through the ExecStart setting above:

  • -v : verbose output in syslog (not required)
  • -p 10443,80,80 : listen for TLS connections on tcp/10443, save decrypted traffic in PCAP as tcp/80, forward traffic to tcp/80
  • -p 10465,25,25 : listen for TLS connections on tcp/10465, save decrypted traffic in PCAP as tcp/25, forward traffic to tcp/25
  • -x /var/log/PolarProxy/polarproxy.cer : Save certificate to be imported to clients in /var/log/PolarProxy/polarproxy.cer (not required)
  • -f /var/log/PolarProxy/proxyflows.log : Log flow meta data in /var/log/PolarProxy/proxyflows.log (not required)
  • -o /var/log/PolarProxy/ : Save PCAP files with decrypted traffic in /var/log/PolarProxy/
  • --certhttp 10080 : Make the X.509 certificate available to clients over http on tcp/10080
  • --terminate : Run PolarProxy as a TLS termination proxy, i.e. data forwarded from the proxy is decrypted
  • --connect 192.168.53.19 : forward all connections to the IP of INetSim
  • --nosni nosni.inetsim.org : Accept incoming TLS connections without SNI, behave as if server name was "nosni.inetsim.org".

Finally, start the PolarProxy systemd service:

sudo systemctl enable PolarProxy.service

sudo systemctl start PolarProxy.service

Verify that you can reach INetSim through PolarProxy's TLS termination proxy using curl:

curl --insecure --connect-to example.com:443:192.168.53.19:10443 https://example.com

<html>
  <head>
    <title>INetSim default HTML page</title>
  </head>
  <body>
    <p></p>
    <p align="center">This is the default HTML page for INetSim HTTP server fake mode.</p>
    <p align="center">This file is an HTML document.</p>
  </body>
</html>

Yay, it is working! Do the same thing again, but also verify the certificate against PolarProxy's root CA this time. The root certificate is downloaded from PolarProxy via the HTTP service running on tcp/10080 and then converted from DER to PEM format using openssl, so that it can be used with curl's "--cacert" option.

curl http://192.168.53.19:10080/polarproxy.cer > polarproxy.cer

openssl x509 -inform DER -in polarproxy.cer -out polarproxy-pem.crt

curl --cacert polarproxy-pem.crt --connect-to example.com:443:192.168.53.19:10443 https://example.com

<html>
  <head>
    <title>INetSim default HTML page</title>
  </head>
  <body>
    <p></p>
    <p align="center">This is the default HTML page for INetSim HTTP server fake mode.</p>
    <p align="center">This file is an HTML document.</p>
  </body>
</html>

Yay #2!

Now let's add iptables rules that redirect all incoming HTTPS traffic to PolarProxy's service on tcp/10443 and all SMTPS traffic to tcp/10465. The final REDIRECT rule additionally sends ALL other incoming traffic to INetSim, regardless of which IP address it is destined to. Make sure to replace "enp0s8" with the name of your offline interface.

sudo iptables -t nat -A PREROUTING -i enp0s8 -p tcp --dport 443 -j REDIRECT --to 10443

sudo iptables -t nat -A PREROUTING -i enp0s8 -p tcp --dport 465 -j REDIRECT --to 10465

sudo iptables -t nat -A PREROUTING -i enp0s8 -j REDIRECT

Verify that the iptables port redirection rules are working from another machine connected to the offline 192.168.53.0/24 network:

curl --insecure --resolve example.com:443:192.168.53.19 https://example.com

<html>
  <head>
    <title>INetSim default HTML page</title>
  </head>
  <body>
    <p></p>
    <p align="center">This is the default HTML page for INetSim HTTP server fake mode.</p>
    <p align="center">This file is an HTML document.</p>
  </body>
</html>

Yay #3!

curl --insecure --resolve example.com:465:192.168.53.19 smtps://example.com

214-Commands supported:
214- HELO MAIL RCPT DATA
214- RSET NOOP QUIT EXPN
214- HELP VRFY EHLO AUTH
214- ETRN STARTTLS
214 For more info use "HELP <topic>".

Yay #4!

It is now time to save the firewall rules, so that they will survive reboots.

sudo apt-get install iptables-persistent
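The installer will ask whether to save the current IPv4 and IPv6 rules. If you modify the rules later on, they can be saved again with the netfilter-persistent tool that ships with the package (a minimal sketch):

# save the currently loaded iptables rules so they are restored at boot
sudo netfilter-persistent save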

Install the Victim Windows PC

Give the victim Windows host a static IP address on the offline 192.168.53.0/24 network, and set the INetSim machine (192.168.53.19) as both default gateway and DNS server.

Image: Windows IPv4 Properties
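The same settings can also be applied from an elevated command prompt with netsh. This is just a sketch: the interface name "Ethernet" and the victim IP 192.168.53.20 are assumptions, so adjust them to your setup.

rem static IP, netmask, default gateway (the INetSim machine) and gateway metric
netsh interface ip set address name="Ethernet" static 192.168.53.20 255.255.255.0 192.168.53.19 1
rem use the INetSim machine as DNS server
netsh interface ip set dns name="Ethernet" static 192.168.53.19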

Download the X.509 root CA certificate from your PolarProxy installation here: http://192.168.53.19:10080/polarproxy.cer

  1. Double-click on "polarproxy.cer"
  2. Click [Install Certificate...]
  3. Select 🔘 Local Machine and press [Next]
  4. Select 🔘 Place all certificates in the following store and press [Browse...]
  5. Choose "Trusted Root Certification Authorities" and press [OK], then [Next]
  6. Press [Finish]
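If you prefer the command line, the certificate can presumably be added to the local machine's Trusted Root Certification Authorities store with certutil instead (a sketch; run from an elevated command prompt and assumes polarproxy.cer is in the current directory):

rem add the PolarProxy root CA to the machine-wide "Trusted Root Certification Authorities" store
certutil -addstore -f Root polarproxy.cer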

You might also want to install the PolarProxy certificate in your browser. This is how you install it to Firefox:

  1. Options / Preferences
  2. Press [Privacy & Security]
  3. Scroll down to "Certificates" and press [View Certificates...]
  4. In the "Authorities" tab, press [Import...]
  5. Open "polarproxy.cer"
  6. ☑ Trust this CA to identify websites. (check the box)
  7. Press [OK]

Now, open a browser and try visiting some websites over HTTP or HTTPS. If you get the following message regardless of what domain you try to visit, then you've managed to set everything up correctly:

This is the default HTML page for INetSim HTTP server fake mode.

This file is an HTML document.

Accessing the Decrypted Traffic

PCAP files with decrypted HTTPS and SMTPS traffic are now available in /var/log/PolarProxy/

PolarProxy will start writing to a new capture file every 60 minutes. However, the captured packets are not written to disk instantly because PolarProxy uses buffered file writing in order to improve performance. You can restart the proxy service if you wish to flush the buffered packets to disk and have PolarProxy rotate to a new capture file.

sudo systemctl restart PolarProxy

I also recommend capturing all network traffic sent to INetSim with a sniffer like netsniff-ng. This way you’ll get PCAP files with traffic from INetSim’s cleartext services (like DNS and HTTP) as well.
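A sketch of such a capture is shown below, here using tcpdump with hourly file rotation instead of netsniff-ng; the interface name and output directory are assumptions.

sudo mkdir -p /var/log/inetsim-pcap
# capture everything on the offline interface, rotating to a new file every 3600 seconds
# (-Z root keeps the permissions needed to write the rotated files to a root-owned directory)
sudo tcpdump -i enp0s8 -s0 -G 3600 -Z root -w /var/log/inetsim-pcap/inetsim_%Y-%m-%d_%H%M%S.pcap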

PCAP or it didn’t happen!

Credits

I'd like to thank Thomas Hungenberg and Patrick Desnoyers for providing valuable feedback for this blog post!

Posted by Erik Hjelmvik on Monday, 09 December 2019 08:40:00 (UTC/GMT)

Tags: #PolarProxy #HTTPS #SMTPS #HTTP #SMTP #DNS #Malware #TLS #PCAP #tutorial

Short URL: https://netresec.com/?b=19Ce12f


CapLoader 1.8 Released

Image: CapLoader 1.8

We are happy to announce the release of CapLoader 1.8 today!

CapLoader is primarily used to filter, slice and dice large PCAP datasets into smaller ones. This new version contains several new features that improve this filtering functionality even further. To start with, the “Keyword Filter” can now be used to filter the rows in the Flows, Services or Hosts tabs using regular expressions. This enables the use of matching expressions like these:

  • amazon|akamai|cdn
    Show only rows containing any of the strings “amazon” “akamai” or “cdn”.
  • microsoft\.com\b|windowsupdate\.com\b
    Show only servers with domain names ending in “microsoft.com” or “windowsupdate.com”.
  • ^SMB2?$
    Show only SMB and SMB2 flows.
  • \d{1,3}\.\d{1,3}\.\d{1,3}\.255$
    Show only IPv4 addresses ending with “.255”.

For a reference on the full regular expression syntax available in CapLoader, please see Microsoft’s regex “Quick Reference”.

One popular workflow supported by CapLoader is to divide all flows (or hosts) into two separate datasets, for example one “normal” and one “malicious” set. The user can move rows between these two sets, where only one set is visible while the rows in the other set are hidden. To switch which dataset is visible and which is hidden, the user clicks the [Invert Hiding] button (or uses the [Ctrl]+[Tab] key combination). With this new release we’ve also made the “Invert Hiding” functionality available by clicking the purple bar, which shows the number of rows present in the currently viewed set.

Image: CapLoader Invert Hiding (animated GIF)

Readers with a keen eye might also notice that the purple bar charts are now accompanied by a number, indicating how many rows are visible after each filter is applied. The available filters are: Set Selection, BPF and Keyword Filter.

NetFlow + DNS = Great Success!

CapLoader’s main view presents the contents of the loaded PCAP files as a list of netflow records. Since the full PCAP is available, CapLoader also parses the DNS packets in the capture files in order to enrich the netflow view with hostnames. Recently PaC shared a great idea with us: why not show how many failed DNS lookups each client performs? This would enable generic detection of DGA botnets without using blacklists. I’m happy to announce that this great idea made it directly into this new release! The rightmost column in CapLoader’s Hosts tab, called “DNS_Fails”, shows what percentage of a client’s DNS requests have resulted in an NXDOMAIN or SERVFAIL response.

Image: CapLoader 1.8

Two packet capture files are loaded into CapLoader in the screenshot above; one PCAP file from a PC infected with the Shifu malware and one PCAP file with “normal” traffic (thanks @StratosphereIPS for sharing these capture files). As you can see, one of the clients (10.0.2.107) has a very high DNS failure ratio (99.81%). Unsurprisingly, this is also the host that was infected with Shifu, which uses a domain generation algorithm (DGA) to locate its C2 servers.
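CapLoader computes this ratio automatically, but a rough approximation of the same metric can be produced from a capture file with tshark and awk. This is just a sketch: capture.pcap is a placeholder file name, rcode 2 is SERVFAIL, rcode 3 is NXDOMAIN, and the destination IP of a DNS response is assumed to be the client that sent the query.

tshark -r capture.pcap -Y "dns.flags.response == 1" -T fields -e ip.dst -e dns.flags.rcode | \
  awk '{tot[$1]++; if ($2 == 2 || $2 == 3) fail[$1]++} END {for (h in tot) printf "%s %.2f%%\n", h, 100*fail[h]/tot[h]}'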

Apart from parsing A and CNAME records from DNS responses CapLoader now also parses AAAA DNS records (IPv6 addresses). This enables CapLoader to map public domain names to hosts with IPv6 addresses.

Additional Updates

The new CapLoader release also comes with several other new features and updates, such as:

  • Added urlscan.io service for domain and IP lookups (right-click a flow or host to bring up the lookup menu).
  • Flow ID coloring based on 5-tuple, and clearer colors in timeline Gantt chart.
  • Extended the default flow timeout from 10 minutes to 2 hours for TCP flows.
  • Changed the flow timeout for non-TCP flows to 60 seconds.
  • Upgraded to .NET Framework 4.7.2.

Updating to the Latest Release

Users who have previously purchased a license for CapLoader can download a free update to version 1.8 from our customer portal. All others can download a free 30 day trial from the CapLoader product page (no registration required).

Credits

We’d like to thank Mikael Harmark, Mandy van Oosterhout and Ulf Holmström for reporting bugs that have been fixed in this release. We’d also like to thank PaC for the DNS failure rate feature request mentioned in this blog post.

Posted by Erik Hjelmvik on Tuesday, 28 May 2019 10:45:00 (UTC/GMT)

Tags: #CapLoader #NetFlow #regex #DNS #DGA #Stratosphere

Short URL: https://netresec.com/?b=1950482


Domain Whitelist Benchmark: Alexa vs Umbrella

Image: Alexa vs Umbrella

In November last year Alexa admitted in a tweet that they had stopped releasing their CSV file with the one million most popular domains.

Yes, the top 1m sites file has been retired

Members of the Internet measurement and infosec research communities were outraged, surprised and disappointed, since this domain list had become the de facto tool for evaluating the popularity of a domain. As a result, Cisco Umbrella (previously OpenDNS) released a free top 1 million list of their own in December that same year. However, by then Alexa had already announced that their “top-1m.csv” file was back up again.

The file is back for now. We'll post an update before it changes again.

The Alexa list was unavailable for only about a week, but this was enough for many researchers, developers and security professionals to move to alternative lists, such as the one from Umbrella. This move was perhaps fueled by Alexa saying that the “file is back for now”, which hints that they might decide to remove it again later on.

We’ve been leveraging the Alexa list for quite some time in NetworkMiner and CapLoader in order to do DNS whitelisting, for example when doing threat hunting with Rinse-Repeat. But we haven’t made the move from Alexa to Umbrella, at least not yet.


Malware Domains in the Top 1 Million Lists

Threat hunting expert Veronica Valeros recently pointed out that there are a considerable number of malicious domains in the Alexa top one million list.

Researchers using Alexa top 1M as legit, you may want to think twice about that. You'd be surprised how many malicious domains end there.

I often recommend that analysts use the Alexa list as a whitelist to remove “normal” web surfing from their PCAP dataset when doing threat hunting or network forensics. And, as previously mentioned, both NetworkMiner and CapLoader make use of the Alexa list in order to simplify domain whitelisting. I therefore decided to evaluate just how many malicious domains there are in the Alexa and Umbrella lists.

hpHosts EMD (by Malwarebytes)

                                            Alexa    Umbrella
Whitelisted malicious domains:               1365        1458
Percent of malicious domains whitelisted:    0.89%       0.95%

Malware Domain Blocklist

                                            Alexa    Umbrella
Whitelisted malicious domains:                 84          63
Percent of malicious domains whitelisted:    0.46%       0.34%

CyberCrime Tracker

                                            Alexa    Umbrella
Whitelisted malicious domains:                 15          10
Percent of malicious domains whitelisted:    0.19%       0.13%

The results presented above indicate that the Alexa and Umbrella lists contain roughly the same number of malicious domains. The percentages also reveal that using Alexa or Umbrella as a whitelist, i.e. ignoring all traffic to the top one million domains, might result in ignoring up to 1% of the traffic going to malicious domains. I guess this is an acceptable number of false negatives, since techniques like Rinse-Repeat Intrusion Detection aren’t intended to replace traditional intrusion detection systems; they are meant to be used as a complement in order to hunt down the intrusions that your IDS failed to detect. Working on a reduced dataset containing 99% of the malicious traffic is an acceptable price to pay for having removed all the “normal” traffic going to the one million most popular domains.


Sub Domains

One significant difference between the two lists is that the Umbrella list contains subdomains (such as www.google.com, safebrowsing.google.com and accounts.google.com) while the Alexa list only contains main domains (like “google.com”). In fact, the Umbrella list contains over 1800 subdomains for google.com alone! This means that the Umbrella list in practice contains fewer main domains than the one million main domains in the Alexa list. We estimate that roughly half of the entries in the Umbrella list are redundant if you are only interested in main domains. However, having subdomains can be an asset if you need to match the full domain name rather than just the main domain name.
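If you want to reproduce a rough estimate of how many unique main domains the Umbrella list actually contains, a shell one-liner like the one below can be used. This is a naive sketch that treats the last two labels as the main domain (so multi-label suffixes like co.uk are under-counted) and assumes a downloaded top-1m.csv in “rank,domain” format.

# count unique "main domains" in the list (naive: last two labels only)
cut -d, -f2 top-1m.csv | tr -d '\r' | awk -F. 'NF>1 {print $(NF-1)"."$NF}' | sort -u | wc -l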


Data Sources used to Compile the Lists

Image: The Alexa Extension for Firefox

The two lists are compiled in different ways, which can be important to be aware of depending on what type of traffic you are analyzing. Alexa primarily receives web browsing data from users who have installed one of Alexa’s many browser extensions (such as the Alexa browser toolbar shown above). They also gather additional data from users visiting web sites that include Alexa’s tracker script.

Cisco Umbrella, on the other hand, compiles its data from “the actual world-wide usage of domains by Umbrella global network users”. We’re guessing this means building statistics from DNS queries sent through the OpenDNS service, which was recently acquired by Cisco.

This means that the Alexa list might be better suited if you are only analyzing HTTP traffic from web browsers, while the Umbrella list probably is the best choice if you are analyzing non-HTTP traffic or HTTP traffic that isn’t generated by browsers (for example HTTP API communication).


Other Quirks

As noted by Greg Ferro, the Umbrella list contains test domains like “www.example.com”. These domains are not present in the Alexa list.

We have also noticed that the Umbrella list contains several domains with non-authorized gTLDs, such as “.home”, “.mail” and “.corp”. The Alexa list, on the other hand, only seems to contain real domain names.
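A quick way to count such entries in a downloaded list is a simple grep (a sketch that only looks for the three example gTLDs mentioned above):

# count list entries ending in .home, .mail or .corp
cut -d, -f2 top-1m.csv | tr -d '\r' | grep -cE '\.(home|mail|corp)$'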


Resources and Raw Data

Both the Alexa and Cisco Umbrella top one million lists are CSV files named “top-1m.csv”, which can be downloaded from Alexa’s and Cisco Umbrella’s respective sites.

The analysis results presented in this blog post are based on top-1m.csv files downloaded from Alexa and Umbrella on March 31, 2017. The malware domain lists were also downloaded from the three respective sources on that same day.

We have decided to share the “false negatives” (malware domains that were present in the Alexa and Umbrella lists) for transparency. You can download the lists with all false negatives from here:
https://www.netresec.com/files/alexa-umbrella-malware-domains_170331.zip


Hands-on Practice and Training

If you wanna learn more about how a list of common domains can be used to hunt down intrusions in your network, then please register for one of our network forensic trainings. The next training will be a pre-conference training at 44CON in London.

Posted by Erik Hjelmvik on Monday, 03 April 2017 14:47:00 (UTC/GMT)

Tags: #Alexa #Umbrella #domain #DNS #malware

Short URL: https://netresec.com/?b=1743fae


CapLoader 1.3 Released

Image: CapLoader logo

A new version of our heavy-duty PCAP parser tool CapLoader is now available. There are many new features and improvements in this release, such as the ability to filter flows with BPF, domain name extraction via a passive DNS parser, and matching of domain names against a local whitelist.


Filtering with BPF

The main focus in the work behind CapLoader 1.3 has been to fully support the Rinse-Repeat Intrusion Detection methodology. We've done this by improving the filtering capabilities in CapLoader. For starters, we've added an input filter, which can be used to specify IP addresses, IP networks, protocols or port numbers to be parsed or ignored. The input filter uses the Berkeley Packet Filter (BPF) syntax and is designed to run really fast. So if you wanna analyze only HTTP traffic you can simply write “port 80” as your input filter to have CapLoader only parse and display flows going to or from port 80. We have also added a display filter, which, unlike Wireshark's, also uses BPF. Thus, once a set of flows is loaded, one can easily apply different display filters, like “host 194.9.94.80” or “net 192.168.1.0/24”, to get different views of the parsed data.

Image: CapLoader with input filter "port 80 or port 443" and display filter "not net 74.125.0.0/16".

The main differences between the input filter and display filter are:

  • Input filter is much faster than the display filter, so if you know beforehand what ports, protocols or IP addresses you are interested in then make sure to apply them as an input filter. You will notice a delay when applying a display filter to a view of 10.000 flows or more.
  • In order to apply a new input filter CapLoader has to reload all the opened PCAP files (which is done by pressing F5). Modifying display filters, on the other hand, only requires you to press Enter or hit the “Apply” button.
  • Previously applied display filters are accessible in a drop-down menu in the GUI, but no history is kept of previous input filters.


NetFlow + DNS == true

The “Flows” view in CapLoader gives a great overview of all TCP, UDP and SCTP flows in the loaded PCAP files. However, it is usually not obvious to an analyst what every IP address is used for. We have therefore added a DNS parser to CapLoader, so that all DNS packets can be parsed in order to map IP addresses to domain names. The extracted domain names are displayed for each flow, which is very useful when performing Rinse-Repeat analysis in order to quickly remove “known good servers” from the analysis.


Leveraging the Alexa top 1M list

As we've shown in our previous blog post “DNS whitelisting in NetworkMiner”, using a list of popular domain names as a whitelist can be an effective method for finding malware. We often use this approach in order to quickly remove lots of known good servers when doing Rinse-Repeat analysis of large datasets.

Therefore, just as we did for NetworkMiner 1.5, CapLoader now includes Alexa's list of the 1 million most popular domain names on the Internet. All domain names parsed from DNS traffic are checked against the Alexa list. Domains present in the whitelist are shown in CapLoader's “Server_Alexa_Domain” column. This makes it very easy to sort on this column in order to remove (hide) all flows going to “normal” servers on the Internet. After removing all those flows, what you're left with is pretty much just:

  • Local traffic (not sent over the Internet)
  • Outgoing traffic to either new or obscure domains

Manually going through the remaining flows can be very rewarding, as it can reveal C2 traffic from malware that has not yet been detected by traditional security products like anti-virus or IDS.

Image: CapLoader with malicious flow to 1.web-counter[.]info (Miuref/Boaxxe Trojan) singled out due to missing Alexa match.

Many new features in CapLoader 1.3

The new features highlighted above are far from the only additions made to CapLoader 1.3. Here is a more complete list of improvements in this release:

  • Support for “Select Flows in PCAP” to extract and select 5-tuples from a PCAP-file. This can be a Snort PCAP with packets that have triggered IDS signatures. This way you can easily extract the whole TCP or UDP flow for each signature match, instead of just trying to make sense of one single packet per alert.
  • Improved packet carver functionality to better carve IP, TCP and UDP packets from any file. This includes memory dumps as well as proprietary and obscure packet capture formats.
  • Support for SCTP flows.
  • DNS parser.
  • Alexa top 1M matching.
  • Input filter and display filter with BPF syntax.
  • Flow Producer-Consumer Ratio (PCR).
  • Flow Transcript can be opened simply by double-clicking a flow.
  • Find form updated with option to hide non-matching flows instead of just selecting the flows that matched the keyword search criteria.
  • New flow transcript encoding with IP TTL, TCP flags and sequence numbers to support analysis of Man-on-the-Side attacks.
  • Faster loading of previously opened files, MD5 hashes don't need to be recalculated.
  • A selected set of flows in the GUI can be inverted simply by right-clicking the flow list and selecting “Invert Selection” or by hitting Ctrl+I.


Downloading CapLoader 1.3

All these new features, except for the Alexa lookup of domain names, are available in our free trial version of CapLoader. So to try out these new features in CapLoader, simply grab a trial download here:
https://www.netresec.com/?page=CapLoader#trial (no registration needed)

All paying customers with an older version of CapLoader can grab a free update to version 1.3 from our customer portal.

Posted by Erik Hjelmvik on Monday, 28 September 2015 07:30:00 (UTC/GMT)

Tags: #CapLoader #BPF #Berkeley Packet Filter #Rinse-Repeat #DNS #Alexa #PCAP #NetFlow #Malware #C2

Short URL: https://netresec.com/?b=15914E3


NetworkMiner 1.6 Released

We've released version 1.6 of NetworkMiner today!

Image credits: Confetti in Toronto by Winnie Surya

The new features in NetworkMiner 1.6 include:

  • Drag-and-Drop
    Reassembled files and images can be opened with external tools by drag-and-dropping items from NetworkMiner's Files or Images tabs onto your favorite editor or viewer.

  • Email extraction
    Improved extraction of emails and attachments sent over SMTP.

  • DNS analysis
    Failed DNS lookups that result in NXDOMAIN and SERVFAIL are displayed in the DNS tab along with the flags in the DNS response.

  • Live sniffing
    Improved live sniffing performance.

  • PCAP-over-IP
    Remote live sniffing enabled by bringing the PCAP-over-IP feature into the free open source version of NetworkMiner.


Identifying Malware DNS lookups

Image: NetworkMiner Professional 1.6 with DNS traffic from the Kuluoz-Asprox botnet (PCAP file available via Contagio)

Note the NXDOMAIN responses and “No” in the Alexa top 1 million column in the screenshot above; these domains are probably generated by a domain generation algorithm (DGA).

Live Sniffing with Pcap-over-IP

The PCAP-over-IP functionality also enables live sniffing on non-Windows machines, simply by running tcpdump (or dumpcap) together with netcat like this:

# tcpdump -i eth0 -s0 -U -w - | nc localhost 57012

For more information about how to run NetworkMiner in Linux, please read our HowTo install NetworkMiner in Ubuntu Fedora and Arch Linux blog post.

To receive the Pcap-over-IP stream in NetworkMiner, simply press Ctrl+R and select a TCP port.

Image: NetworkMiner Pcap-over-IP
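If the sniffer runs on a different machine than NetworkMiner, simply point netcat at that machine instead of localhost. A sketch is shown below; 192.168.1.2 is an assumed IP address of the machine running NetworkMiner, and 57012 is the TCP port selected in the Pcap-over-IP dialog.

# tcpdump -i eth0 -s0 -U -w - | nc 192.168.1.2 57012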

For more information about this feature please see our previous blog post about the PCAP‑over‑IP feature.

NetworkMiner Professional

The professional version of NetworkMiner additionally contains the following improvements to the command line tool NetworkMinerCLI:

  • Enabled reading of PCAP and PcapNG data from standard input (STDIN)
  • Full support for PCAP-over-IP
  • More detailed DNS logging in NetworkMinerCLI's CSV export of DNS responses

The ability to read PCAP data from STDIN with NetworkMinerCLI makes it really simple to do live extraction of emails and email attachments. Here's an example showing how to do live SMTP extraction in Linux:

# tcpdump -i eth0 -s0 -w - port 25 or 587 | mono NetworkMinerCLI.exe -r - -w /var/log/smtp_extraction/

The syntax for extracting emails and attachments in Windows is very similar:

C:\>dumpcap.exe -i 1 -f "port 25 or 587" -w - | NetworkMinerCLI.exe -r -

The TCP ports 25 and 587, which are used in the capture filter above, are the standard port numbers for SMTP. In order to do live extraction of files sent over HTTP, simply use “port 80” as capture filter instead. Likewise, X.509 certificates can also be extracted from HTTPS sessions simply by using “port 443” as capture filter.
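For example, live extraction of files sent over HTTP could look something like this in Linux (a sketch following the same pattern as above; the output directory is an assumption):

# tcpdump -i eth0 -s0 -w - port 80 | mono NetworkMinerCLI.exe -r - -w /var/log/http_extraction/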

Download NetworkMiner 1.6

The most recent release of the free (open source) version of NetworkMiner can be downloaded from SourceForge or our NetworkMiner product page. Paying customers can download an update for NetworkMiner Professional from our customer portal.

Credits

We would like to thank Dan Eriksson (FM CERT) and Lenny Hansson (Danish GovCERT) for submitting bug reports and feature requests.

Posted by Erik Hjelmvik on Monday, 16 June 2014 11:00:00 (UTC/GMT)

Tags: #Netresec #NetworkMiner #Professional #SMTP #Extract #DNS #PCAP-over-IP

Short URL: https://netresec.com/?b=1463B56


DNS whitelisting in NetworkMiner

Image: Stack of Papers by Jenni C

One of the new features in NetworkMiner Professional 1.5 is the ability to check if domain names in DNS requests/responses are “normal” or malicious ones. This lookup is performed offline using a local copy of Alexa's top 1 million domain name list.

We got the idea for this feature from Jarno Niemelä's great presentation titled “Making Life Difficult for Malware”. Despite working for F-Secure, Jarno presents several smart ideas for avoiding malware infections without having to install an AV product.

One of Jarno's slides contains the following suggestions:

Block Traffic To Sites Your Users Don’t Go To

Block subdomain hosting TLDs
  • co.cc, co.tv, ce.ms, rr.nu, cu.cc, cz.cc, vv.cc, cw.cm, cx.cc, etc
Block domains that provide dynamic DNS
  • *dyndns*, *no-ip*, 8866.org, thescx.info, 3322.org, sock8.com
Block file sharing sites, some malware use them
  • fileleave.com, dropbox.com, rapidshare.com, megafiles.com
For strict policy, allow DNS resolving only to Alexa top 1M[1]
  • Tip: Instead of null routing domains set up landing page
  • Either with a link that allows domain or IT ticket

Preventing users from visiting sites outside of the top 1 million websites (according to Alexa) sounds a bit harsh. In fact, we at Netresec just recently made it into the top 1M list (the current rank for netresec.com is 726 922). There are also many good and legit sites that are not yet on this list. Our idea is, however, to give analysts a heads up on queried DNS names that are not on the top 1M list by displaying this information in NetworkMiner's DNS tab.

Image: NetworkMiner Professional 1.5 with DNS tab showing Alexa result for office.windowupdate.com

The screenshot above contains a lookup for the domain “office.windowupdate.com” (note the missing “s” in “windows”). This domain name was previously used by the Lurk C2 protocol (see Command Five's report “Command and Control in the Fifth Domain” for more details). The “Alexa Top 1M” column in NetworkMiner's DNS tab indicates whether or not the domain name is a well known domain. The malicious “office.windowupdate.com” is marked with “No”, while the proper “www.update.microsoft.com” is indeed on the list. It is, however, important to note that NetworkMiner only checks the second-level domain, i.e. in this case “windowupdate.com” and “microsoft.com”.
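The same kind of second-level-domain lookup can be reproduced manually against a downloaded top-1m.csv, for example with a few lines of shell. This is just a sketch; it treats the last two labels as the second-level domain and lets the dots in the domain act as regex wildcards, which is good enough for a quick check.

domain=office.windowupdate.com
# naive second-level domain: the last two labels of the queried name
sld=$(echo "$domain" | awk -F. '{print $(NF-1)"."$NF}')
tr -d '\r' < top-1m.csv | grep -qE "^[0-9]+,${sld}$" && echo "$sld is in the Alexa top 1M" || echo "$sld is NOT in the Alexa top 1M"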

The DNS whitelisting technique can also come in handy when dealing with malware that employs domain generation algorithms (DGAs) (see the Damballa blog for additional info regarding DGAs). It is probably safe to say that these auto-generated domains should never show up in the Alexa Top 1M list.

Posted by Erik Hjelmvik on Wednesday, 02 October 2013 22:30:00 (UTC/GMT)

Tags: #Netresec #DNS #domain #DGA #malware #Alexa

Short URL: https://netresec.com/?b=13A66EB


Extracting DNS queries

There was recently a question on the Wireshark users mailing list about “how to get the query name from a dns request packet with tshark”. This is a problem that many network analysts run into, so I decided to write a blog post instead of just replying to the mailing list.

Note: the pcap file used in this blog post is from the DFRWS 2009 Challenge.

Who queried for a particular domain?

Tshark can easily be used in order to determine who queried for a particular domain, such as google.com, by using the following command:

tshark -r nssal-capture-1.pcap -T fields -e ip.src -e dns.qry.name -R "dns.flags.response eq 0 and dns.qry.name contains google.com"
137.30.123.78 google.com
137.30.123.78 www.google.com
137.30.123.78 id.google.com
137.30.123.78 images.google.com
137.30.123.78 tbn2.google.com
137.30.123.78 tbn0.google.com
137.30.123.78 tbn2.google.com
137.30.123.78 tbn1.google.com
137.30.123.78 tbn3.google.com
137.30.123.78 tbn3.google.com

List all queries

A list of ALL queries can be built with the same command, but without filtering on a particular domain:

tshark -r nssal-capture-1.pcap -T fields -e ip.src -e dns.qry.name -R "dns.flags.response eq 0"
137.30.123.78 fp.ps3.us.playstation.com
137.30.123.78 cmt.us.playstation.com
137.30.123.78 google.com
137.30.123.78 www.google.com
137.30.123.78 www.mardigrasday.com
137.30.123.78 pagead2.googlesyndication.com
137.30.123.78 googleads.g.doubleclick.net
137.30.123.78 www.google-analytics.com
137.30.123.78 mardigrasday.makesparties.com
137.30.123.78 images.scanalert.com
137.30.123.78 a248.e.akamai.net
137.30.123.78 ssl-hints.netflame.cc
...

DNS lists in NetworkMiner

There is a DNS tab in NetworkMiner, which displays a nice list of all DNS queries and responses in a pcap file. Loading the same nssal-capture-1.pcap into NetworkMiner generates the following list:


Image: DNS tab with nssal-capture-1.pcap loaded

NetworkMiner Professional also has the ability to export this data to a CSV file. The command line tool NetworkMinerCLI can also generate such a CSV file without a GUI, which is perfect if you wanna integrate it in a customized script.
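Such an export could presumably be generated like this, using the same -r and -w options shown in the NetworkMiner 1.6 post above (a sketch; the output directory is an assumption, and the DNS CSV file ends up in that directory):

mono NetworkMinerCLI.exe -r nssal-capture-1.pcap -w /tmp/networkminer-output/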

Posted by Erik Hjelmvik on Sunday, 17 June 2012 17:45:00 (UTC/GMT)

Tags: #domain #tshark #pcap

Short URL: https://netresec.com/?b=126C5CB
