Don’t Forget about Domain Trusts

I was recently talking to an organization about their security posture, and nearly everything I recommended to them they had already implemented, and then some. The audits I conducted for them confirmed what they were saying. I must say, I was thoroughly impressed. There was, however, one gray area that stood out to me, and that was domain trusts. In their eyes they didn't have any, but the Domain Controller displayed otherwise.

I’m sure everyone knows how to check via the GUI but did you also know it can be done through PowerShell? If not, let’s proceed.

From a Domain Controller or a system with the Remote Server Administration Tools (RSAT) installed, we can utilize the Active Directory module, which contains the Get-ADTrust cmdlet. To view trusts, we can do the following:
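A minimal example; the Select-Object is just there to surface the properties discussed below:

Import-Module ActiveDirectory
Get-ADTrust -Filter * | Select-Object Name, Source, Target, Direction, TrustType, TrustAttributes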

From the above, we see the trust is with the Multiverse domain. We can also see that the direction is bidirectional, meaning it is a two-way trust. It is also non-transitive, as noted by the value of one in the TrustAttributes property.

We can also get this same information using WMI, which we will use on the same server. To do so, we can do the following:
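A sketch using the Active Directory WMI provider that exists on domain controllers (namespace and class name as I recall them, so verify on your DC):

Get-WmiObject -Namespace 'root\MicrosoftActiveDirectory' -Class Microsoft_DomainTrustStatus |
    Select-Object TrustedDomain, TrustDirection, TrustAttributes, TrustType

# TrustDirection: 1 = inbound, 2 = outbound, 3 = two-way
# TrustAttributes: 1 = non-transitive, 4 = quarantined (SID filtering), 8 = forest transitive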

A simple script for this can be found HERE.

From the above, we see the TrustAttributes property again, along with a TrustedDomain property, which depicts the name of the domain we have a trust with. In addition, we see the TrustDirection property with a value of three, which indicates two-way.

For future reference, the meanings of each available value in TrustAttributes and TrustDirection are below.

Hidden Gems in McAfee ePO Audit Logs

There is no shortage of organizations these days running McAfee's ePolicy Orchestrator (ePO) in an effort to combat maliciousness. Much like any endpoint security platform, it has its strengths and weaknesses. One of the great features of the application is its audit log, which contains authentication information, including any supplied usernames, as shown below.

As weird as it may seem, sometimes people will type in their username AND their password on the same line and submit them for authentication, not realizing the mistake. Well, McAfee ePO logs that information as is. In the example below, the password is likely ‘PA$$word1337’.

This happens more often than you may think, and users with the applicable permissions can see this information in the audit logs. From a blue team perspective, being cognizant of the likelihood of this artifact is vital. From a red team perspective, this is easy pickings. Not only will it likely get you into the application as the user whose information is in the logs, but that user could also have more rights than the account you are currently using. In addition, whatever password shows up in the log is likely to also be that user's password for their local or domain account, or possibly for other applications on the network.
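If you export the audit log from the ePO console to a CSV, a quick PowerShell pass can surface entries where the user-name field looks like it holds more than a username. This is only a sketch; the file name and column headers are illustrative and depend on how the log was exported.

Import-Csv .\epo_audit_log.csv |
    Where-Object { $_.'User Name' -match '\s' -or $_.'User Name'.Length -gt 30 } |
    Select-Object 'Start Time', 'User Name', 'Action'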

Finding Reflective DLL Injections

DLL injections that originate from a malicious DLL written to disk are commonly detected by any decent AV product. Detecting reflective DLL injection, however, is not as straightforward. Malware injected directly into a process using reflective DLL injection typically will not exist on disk. A co-worker of mine developed a tool called Evil Inject Finder (EIF), which is designed to help you find those evil injections! Administrative rights are currently necessary to adequately examine the memory of running processes, and some memory pages will be unreadable if they belong to processes the OS marks as protected, such as LSASS.

The example below demonstrates using EIF with a signature file to find injects in all processes on the system. A meterpreter has been loaded into MicrosoftEdge using reflective injection.

I wanted to run EIF on remote systems but it didn’t have that capability so I developed EIF_Parser, which provides the following capabilities:

  • Executes Evil Inject Finder (EIF) on a remote system or systems
  • Retrieves the data gathered by EIF on remote systems
  • On the local system, presents only the processes with ‘yes’ in the MZ or DOS column
  • Logs systems that are not accessible, for one reason or another

The tools can be found at the below links:

Hunting Self-signed Certificates

Self-signed certificates could be indicative of malicious behavior on a system, and being able to identify them is a key task in responding to an incident. Having self-signed certificates in an environment isn't always a bad thing, but not being able to identify them and their purpose is! Taking to PowerShell, we can search for them with the help of the Cert: PSDrive. A key indication of a self-signed certificate is the issuer and subject being the same. It is also worth noting that a CA may issue a certificate to itself to support key rollover or changes in certificate policies.
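A minimal sketch of the idea, comparing issuer to subject across the local machine store:

Get-ChildItem -Path Cert:\LocalMachine -Recurse |
    Where-Object { $_.Issuer -and ($_.Issuer -eq $_.Subject) } |
    Select-Object Subject, Issuer, Thumbprint, NotAfter, PSParentPath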

This and more PowerShell scripts can be found on my GitHub.

Hashes of All Running Processes

A great starting point for anyone analyzing a system is the running processes. Taking the time to retrieve not only the command line of each process but also its parent process will enable you to find outliers. Taking it a step further, retrieving the hash of each process's binary expands your aperture substantially, especially when you are able to group and stack those hashes against those from other machines. With that in mind, I've written a simple little script that will get the hashes of all running processes.

Link to code:
https://github.com/WiredPulse/PowerShell/blob/master/Processes%20and%20Services/Get-ProcessHash.ps1
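The linked script is the workhorse; the one-liner below is just a minimal sketch of the idea (Get-FileHash requires PowerShell 4 or later):

Get-Process | Where-Object { $_.Path } |
    Select-Object Name, Id, Path,
        @{ Name = 'SHA256'; Expression = { (Get-FileHash -Path $_.Path -Algorithm SHA256).Hash } }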

PowerShell Cheat Sheet

I recall when I started out in PowerShell, coming from Python. Some aspects of the language I was able to pick up rather quickly, while other aspects took some time. I found myself writing down notes until I was able to remember them on my own. Reminiscing on that inspired me to develop a cheat sheet for others who are aspiring to make the jump. The cheat sheet is not an all-encompassing list, but it touches on most of the important areas of the language to get a person started.

PS Cheat Sheet

Find Malicious Versions of CCleaner

In light of the recent discovery of the malicious versions of CCleaner and the millions affected, it felt like a great time to write some PowerShell scripts that let a person identify whether a malicious version of CCleaner is on a system and, if so, provide a method to delete the software.

The below checks a local machine for the malicious versions of CCleaner.
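A minimal sketch of the check; the install path is an assumption, and 5.33 is the 32-bit release that was reported as trojanized:

$exe = 'C:\Program Files\CCleaner\CCleaner.exe'
if (Test-Path $exe) {
    $version = (Get-Item $exe).VersionInfo.ProductVersion
    if ($version -like '5.33*') { "$env:COMPUTERNAME - potentially malicious CCleaner $version" }
    else { "$env:COMPUTERNAME - CCleaner $version (not the known-bad release)" }
}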

Using PS Remoting, the below allows you to get a list of systems with the infected versions.
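A sketch of the remote variant using Invoke-Command; the system list and install path are illustrative:

Invoke-Command -ComputerName (Get-Content .\systems.txt) -ScriptBlock {
    $exe = 'C:\Program Files\CCleaner\CCleaner.exe'
    if ((Test-Path $exe) -and ((Get-Item $exe).VersionInfo.ProductVersion -like '5.33*')) {
        $env:COMPUTERNAME
    }
}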

Using PS Remoting, the below allows you to remove the infected versions of CCleaner.

Using WMI, the below allows you to look for the infected versions. It also writes a log of infected and non-infected machines and deletes the software from the infected machines.

 

 

Determining WinRM connections to a Machine

PSRemoting is an awesome feature in Microsoft Windows that serves as an SSH-like function. In Server 2012 and newer, it is enabled by default. You will, however, need to enable the feature on any client system you want to use it on. Some organizations feel that having the service enabled throughout the enterprise is more of a burden than something that will increase productivity. Most of those concerns stem from not knowing who is connecting, or is connected, to their systems. Luckily, there is a built-in cmdlet that should ease the worrying.

With suitable rights on a system, we can use the below to see who is connected to our system.
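One built-in way to do this (and likely the cmdlet meant here) is Get-WSManInstance, which can enumerate the active WinRM shells on the box:

Get-WSManInstance -ResourceURI Shell -Enumerate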

Below are the results.

To clean this up a little, we can do the following:
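For instance, keeping just the fields of interest (property names can vary slightly by OS version):

Get-WSManInstance -ResourceURI Shell -Enumerate |
    Select-Object Owner, ClientIP, State, ShellRunTime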

Our results are shown below and are a little easier to understand.

You could easily set this up on a recurring schedule and output the results to a file for further analysis.

Base64 with PowerShell

All too often I find myself on a Windows system and need to either encode or decode base64. Rather than using an online service, installing a program, or going to a *nix based system, I took to PowerShell. In PowerShell, we can use .NET to accomplish this.

Encoding:
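For example, to encode a string (the text is illustrative):

[Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes('Secret message'))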

The result is this base64 encoded text:

Decoding:
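And to turn the Base64 produced above back into the original string:

[System.Text.Encoding]::UTF8.GetString([Convert]::FromBase64String('U2VjcmV0IG1lc3NhZ2U='))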

 

Getting hashes with Microsoft’s File Checksum Integrity Verifier (FCIV)

Are you responding to an incident? Are you trying to hash particular portions of the disk for comparison with known-good hashes? Are you questioning whether to trust the binaries on the possibly compromised system in order to get said hashes? Well, have no fear: Microsoft has a portable program called File Checksum Integrity Verifier (FCIV) that can help, and it can be downloaded here. Since it comes from Microsoft, it will be signed by them as well.

The portable program can be executed from a CD/DVD, flash drive, or network share. With it, we can get MD5 and/or SHA1 hashes of files on a system, either printing them to the screen or outputting them to another file or database. With this in mind, we can feel better about our capture source and method and easily save the day! We can accomplish this using the below.

Recursively capturing hashes from a specified directory.
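For example, something along these lines hashes a directory tree and writes the results to an XML database (paths are illustrative; run fciv.exe /? for the full option list):

E:\fciv.exe C:\Windows\System32 -r -both -xml E:\system32_hashes.xml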


Get Registry Hives and Keys Remotely

Talking with a buddy of mine, the conversation about retrieving registry hives and keys remotely came up. He was initially looking for something he could use and eventually sided with an open-source program from the web. I tested said program as well, and it for the most part did what it said it would. In the end, though, that is just another product I would be adding to someone's network. With that said, I took to PowerShell! I ended up wrapping reg.exe in PowerShell to export the hives and keys. I then needed a workhorse to execute this remotely, and that's where WMI came in. I used it to start a process call against a supplied list of systems and, once complete, Get-ChildItem is used to pull the .reg files back to my system. The code can be found HERE.
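A minimal sketch of that approach (not the linked script itself); the hive, paths, and the crude sleep are illustrative, and it assumes admin rights plus a reachable C$ share:

$computers = Get-Content .\systems.txt
foreach ($computer in $computers) {
    # Launch reg.exe on the remote box via WMI to export the SOFTWARE hive
    Invoke-WmiMethod -ComputerName $computer -Class Win32_Process -Name Create `
        -ArgumentList 'reg.exe export HKLM\SOFTWARE C:\Windows\Temp\software.reg /y'
    Start-Sleep -Seconds 60   # give the export time to finish
    Copy-Item "\\$computer\C$\Windows\Temp\software.reg" ".\$computer-software.reg"
}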

Finding Passwords in Text Files with PowerShell

Using PowerShell, we can look in text files for strings that fit the criteria of a password and return the potential password, file path, and line number. The search criteria use a regular expression and look for at least four characters but no more than 15, with at least one uppercase letter, one lowercase letter, one number, and one special character. The data is returned in an XML file and is best read back into PowerShell using Out-GridView (my fav.). The code is on my GitHub located HERE.
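The linked script is the full version; a stripped-down sketch of the same idea looks roughly like this (the search root and regex are illustrative):

$regex = '(?=\S*[a-z])(?=\S*[A-Z])(?=\S*\d)(?=\S*[^\w\s])\S{4,15}'
Get-ChildItem -Path C:\Data -Recurse -Filter *.txt |
    Select-String -Pattern $regex |
    Select-Object Path, LineNumber, Line |
    Export-Clixml .\possible_passwords.xml

Reading it back in is then just Import-Clixml .\possible_passwords.xml | Out-GridView.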

Mass Import of McAfee Firewall Domains to block

As of late, I've been experimenting more and more with the McAfee HIPS firewall managed through McAfee ePO. So far, I think it is decent. It is at least stateful, so that's a plus. The firewall has a feature to block domains, but using the GUI you can only add them one at a time. There is an option to import them, but that requires the list to be in a format McAfee can understand. Thinking outside the box, I decided to put an entry in the firewall and export that policy in order to get a feel for the structure. Once I did that, I was able to take a list of domains from www.malwaredomains.com, change some formatting in their file, and fit it into the McAfee format. The result is a perfectly formatted firewall policy ready to import. The workhorse of it all: PowerShell!

In my testing with www.malwaredomains.com, I imported over 14,000 entries, and while McAfee HIPS took it, I don't think it can handle that much, as the server became incredibly slow. Nonetheless, you could now take my script, make some minor adjustments, and use it with your malware-domain listing of choice. Since we are on the subject, below are a few other sites that are good sources as well.

The code, by the way, is on my github at https://github.com/WiredPulse/PowerShell/tree/master/McAfee

Sinkhole Domains Using DNS with the help of PowerShell

Thanks to Jason Fossen, there was no need to create a PowerShell script to input domains to sinkhole. He had already created one called Sinkhole-DNS.ps1 (located here). One of the options in the script is to read in a file with domains listed. I like to frequent www.malwaredomains.com for my listing of bad domains and wanted to use that to feed the PowerShell script, so I fired up ISE and began knocking out some code. In the end, I developed a script, located here, that downloads domains.zip from www.malwaredomains.com, uncompresses it, pulls out only the domain names, and outputs them to a file. The file can then be used with the Sinkhole-DNS.ps1 script to import the domains into DNS. The syntax for this is shown below.
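The download-and-extract portion looks roughly like the sketch below; the URL and the way the domain is plucked out of each line are assumptions, so adjust them to the current file layout.

Invoke-WebRequest -Uri 'http://mirror1.malwaredomains.com/files/domains.zip' -OutFile .\domains.zip
Expand-Archive -Path .\domains.zip -DestinationPath .\domains -Force
Get-Content .\domains\domains.txt |
    Where-Object { $_ -and ($_ -notmatch '^#') } |
    ForEach-Object { ($_ -split "`t") | Where-Object { $_ -match '^[\w.-]+\.[a-z]{2,}$' } | Select-Object -First 1 } |
    Set-Content .\bad_domains.txt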

Blocking DNS Tunneling at the Host

There was a time when malware, at an alarming rate, would use some unique port that wasn't used by other services, usually an ephemeral port. These days, though, more and more malware is using ports that are commonly open outbound on a system, such as 22, 25, 53, 80, and 443. The one we will focus on is 53, which is registered for DNS, and on blocking an abuse of it commonly known as DNS tunneling.

It is best to think through which systems actually need to be contacted for DNS services. Once you have that in mind, we can begin blocking any other DNS traffic that might be malicious. We will do this with the Windows Firewall.

The configuration to block port 53 traffic to a specific IP address, and to every IP address except one, can be found HERE.
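As a sketch of the "every IP address except one" case using the built-in NetSecurity cmdlets (Windows 8/Server 2012 and later): the resolver address 10.0.0.53 and the surrounding ranges are illustrative, and because block rules win over allow rules in Windows Firewall, the exclusion is expressed in the rule's own address scope.

New-NetFirewallRule -DisplayName 'Block outbound DNS except approved resolver' `
    -Direction Outbound -Protocol UDP -RemotePort 53 `
    -RemoteAddress '1.0.0.0-10.0.0.52', '10.0.0.54-223.255.255.255' -Action Block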

Understand that DNS traffic can be routed through another port. Unfortunately, the Windows Firewall won't be able to help with detecting that.

Under The Wire!

Under The Wire, the PowerShell gaming server, is now web based and can be accessed at www.underthewire.tech. There, you will find directions for accessing our servers using your own instance of PowerShell. To date, we have two games that are live, with another in production.

Parsing Registry files with RegRipper

The registry of a system contains a lot of good data that can be used for forensic analysis. Parsing that data from dead-box forensics (a bit image) using RegRipper (rip.pl) will provide you with a lot of useful information. RegRipper is an automated hive parser that can parse the forensic contents of the SAM, SECURITY, SYSTEM, SOFTWARE, and NTUSER.DAT hives it is pointed at. You can even use it to forensically mine the contents of restore-point registry files. RegRipper utilizes plugins, and aside from the default ones included with the installation, more are available online. The program is available for use on Linux or Windows, and the Windows variant includes a GUI.

Rip.pl can be invoked by pointing -r at the hive file you would like to mine forensically. You also need to tell RegRipper the type of hive it is (sam, security, software, system, ntuser). Hives can be found at C:\Windows\System32\config, and ntuser.dat is located at the root of each user's profile. Once RegRipper is installed on your system, you can use the below syntax and useful options to get started.

# rip.pl -r <hive file> -f <hive type>
[Useful Options]
-r   Registry hive file to parse
-f   Hive type (sam, security, software, system, ntuser)
-l   List all plugins
-h   Help
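For example (paths illustrative), parsing an exported SYSTEM hive and saving the report:

# rip.pl -r /cases/registry/SYSTEM -f system > /cases/reports/system_report.txt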

No Need to Unzip, Just Use Zcat or Zgrep

There will be times when you encounter a compressed file and want to quickly parse it without having to unzip it first. When that time comes, zcat and zgrep will be your saviors. The usage of both is very straightforward, but there are man pages for both for further reading. Basic usage of the two is depicted below.

Display the contents of a zipped file
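For example (file name illustrative):

zcat access.log.gz | less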

Search for specific characters/words in zipped files.
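For example, a case-insensitive search across several compressed logs (pattern and file names illustrative):

zgrep -i 'failed password' auth.log.*.gz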

Analyzing Various Memory Capture Formats

In a world where there are so many choices for capturing memory and analyzing it, I felt there would be some benefit in compiling a list for quick reference.

FTK Imager
– Outputs to .mem
– Can be analyzed in Volatility
vol.py -f <capture.mem> --profile=<profile> <plugin>

VMware (.vmem)
– A .vmem and a .vmss file are created when a VM is suspended
– Can be analyzed with Volatility (the .vmem and .vmss files have to be in the same directory)
vol.py -f <capture.vmem> --profile=<profile> <plugin>

DumpIt
– Outputs to .raw
– Can be analyzed in Volatility
vol.py -f <capture.raw> --profile=<profile> <plugin>

Hibernation file (hiberfil.sys)
– The file is created when a system is put into hibernation mode
– Located at the root of C:\
– The file needs to be converted before use. It can be converted to .img using Volatility:
vol.py -f hiberfil.sys --profile=<profile> imagecopy -O <converted.img>
– After conversion to .img, it can be analyzed in Redline or Volatility
vol.py -f <converted.img> --profile=<profile> <plugin>

Mandiant Memoryze
– Outputs to .img
– Can be analyzed in Redline or Volatility
vol.py -f <capture.img> --profile=<profile> <plugin>

Crash Dumps
– Extension will be .dmp
– Will be written to C:\Windows\Minidump or C:\Windows by default
– Dumps can be forced by adding a REG_DWORD value named CrashOnCtrlScroll with data 0x01 at HKEY_LOCAL_MACHINE\System\CurrentControlSet\Services\kbdhid\Parameters. After rebooting the machine, hold down the rightmost CTRL key and press the SCROLL LOCK key twice
– Can be analyzed with Volatility
vol.py -f <crash.dmp> --profile=<profile> <plugin>
– Can be analyzed in Redline but must be converted to .img first
vol.py -f <crash.dmp> --profile=<profile> imagecopy -O <converted.img>

Application Whitelisting with Applocker

If you are part of defending an infrastructure, then you know defense-in-depth is the name of the game. The more detection systems you can employ to detect anomalies or malicious actions, the better your chances of keeping the network safe. One of many ways to aid in this endeavor is application whitelisting. While there are many different types of application whitelisting, I'd like to focus on Windows AppLocker.

AppLocker does have some cons. For starters, the feature is only available on Windows 7 / Server 2008 R2 and later, and on the client side it is limited to the Enterprise and Ultimate editions of Windows 7. For XP and Vista, you can use Software Restriction Policies to provide some defense, but not as much as AppLocker would provide.

To configure it, go to Administrative Tools and open the Group Policy Management console on your Domain Controller. Once you are there, right-click the Default Domain Policy and click Edit to open the Group Policy Management Editor. Navigate to Computer Configuration > Policies > Windows Settings > Security Settings > Application Control Policies > AppLocker. You should see the below screen. Take note of the options available in the right pane after clicking on AppLocker, and of those shown below in the left pane.

Applocker

The first thing we want to do is set up rule enforcement by clicking on “Configure Rule Enforcement” in the middle of the right pane. From there, we have the option of audit only (logging) or enforcement when a rule in that category is triggered.

The next thing we need to do is add the rules, which are the options under Applocker in the left pane. Below is a quick breakdown of those categories.

Executable Rules: This will contain the rules which apply to executable files.
Windows Installer Rules: This will contain the rules which apply to Windows Installer packages with .msi and .msp extensions.
Script Rules: This will contain the rules which apply to script files with .ps1, .cmd, .vbs, .bat, and .js extensions.

Now that we have AppLocker configured, we need to turn on the Application Identity service across the domain; to do that, we will use a GPO. The path for that setting is Computer Configuration > Windows Settings > Security Settings > System Services > Application Identity.
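If you prefer to seed the rule set from a known-good reference machine, the AppLocker PowerShell cmdlets can generate starter rules for you; a sketch following the documented pattern (the scanned directory and output path are illustrative):

Get-ChildItem 'C:\Program Files' -Recurse -Include *.exe |
    Get-AppLockerFileInformation |
    New-AppLockerPolicy -RuleType Publisher, Hash -User Everyone -Optimize -Xml |
    Out-File .\applocker_starter_policy.xml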

Network Grep for the Folks Who Love to Grep!

Network grep (ngrep) is a great program that allows you to search and filter network packets rather quickly. It bears some resemblance to the well-known Linux grep program. Ngrep can analyze live traffic or saved pcaps. The man pages for ngrep are rather straightforward. Ngrep currently recognizes IPv4/6, TCP, UDP, ICMPv4/6, and IGMP. The program also understands regular expressions and hex expressions, which is a huge benefit. In the simplest terms, ngrep applies the most common features of grep at the network layer. A few key switches that I typically use are below, but a full list can be found in the man pages.

-q | ‘Quiets’ the output by printing only packet headers and relevant payloads
-t | Print a timestamp every time there is a match
-i | Ignore case
-I | Read in a saved pcap
-w | Match the expression as a whole word (regex)
-W byline | Linefeeds are printed as linefeeds, making the output prettier and more legible
-s | Set the BPF capture length

Below are a few examples of common usages of ngrep.

This command will query all interfaces and protocols for a string match of ‘HTTP’.
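For instance (the "any" pseudo-interface is Linux-specific):

ngrep -d any -q 'HTTP'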

If you have a network capture file in .pcap format, use -I $FILE to filter the capture instead of a network interface. This can be handy, for example, if you have a record of a networking event and you need to do a quick analysis.
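For example (file name illustrative):

ngrep -q -I capture.pcap 'HTTP'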

The reverse of the above command: using the -O flag will filter against a network interface and copy the matched packets into a capture file in .pcap format.
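For example (interface and file name illustrative):

ngrep -q -d eth0 -O matches.pcap 'HTTP'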

Search for .exe
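For instance (the backslash keeps the dot literal):

ngrep -q -i '\.exe'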

Monitor for current email transactions and print the addresses.
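One common approach is to watch SMTP for the relevant headers:

ngrep -q -i 'rcpt to|mail from' tcp port 25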

This will grab the usernames and passwords of FTP sessions.
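For example:

ngrep -q -i 'user|pass' tcp port 21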

Capture network traffic incoming to the eth0 interface and show the parameters following HTTP GET or POST methods.
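Something like:

ngrep -q -d eth0 '^GET |^POST ' tcp and port 80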

Monitor all traffic on your network using port 80 with a source IP of 12.34.56.78
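For example (an empty match expression lets the BPF filter do the work):

ngrep -q -d any '' port 80 and src host 12.34.56.78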

Monitor all traffic on your network using port 80 with a source IP of 12.34.56.78 and destination of 98.76.54.32
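For example:

ngrep -q -d any '' port 80 and src host 12.34.56.78 and dst host 98.76.54.32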

Search for the word "login" traversing port 23 using regex.
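For example:

ngrep -q -i 'login' port 23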

The match expression can be combined with a pcap filter. For example, suppose we wanted to look for DNS traffic mentioning cyberfibers.com
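Something like:

ngrep -q 'cyberfibers.com' udp port 53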

Berkeley Packet Filter (BPF) adds to the flexibility of ngrep. BPF specifies a rich syntax for filtering network packets based on information such as IP address, IP protocol, and port number.

IP address
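For example (address illustrative):

ngrep -q '' host 192.168.1.1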

IP protocol
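For example:

ngrep -q '' icmp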

Port number
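For example (port illustrative):

ngrep -q '' port 443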

For even more granularity, you can combine primitives using the boolean connectives and, or, and not to really specify what you're looking for.
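For instance (address illustrative):

ngrep -q '' host 192.168.1.1 and tcp and not port 22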

Extracting Data with Bulk Extractor

When it comes to forensics, styles and methodologies vary from person to person (or organization to organization). Some methods take longer than others, and results may vary. One tool/technique that I lean on time and time again is Bulk Extractor. Bulk Extractor is a program that enables you to extract key information from digital media. Its usage is valuable no matter the type of case you may be working. A list of the types of information it can extract is depicted on their webpage at https://github.com/simsong/bulk_extractor/wiki/Testing.

There are Windows and Linux variants of the program, both capable of running from the command line or a GUI. It is 4-8 times faster than other tools like EnCase or FTK due to its multi-threading. The program is capable of handling image files, raw devices, or directories. After it completes, it outputs its findings to an .xml file, which can be read back into Bulk Extractor for analysis. The output will look similar to the below.
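A minimal command-line run looks something like this (paths illustrative; point -o at a new or empty output directory):

bulk_extractor -o /cases/be_output /cases/suspect_disk.raw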

Bulk_Extractor

The scanners that you selected to run against your image file will each output to a report in the Reports column. Not all scanners generate their own report, as some bucket the information they find into another report. The chart above can help you determine where a scanner will output. Also, when a selected scanner doesn't return any suitable data, you will not see a report for it. When you select a report, its findings appear in the middle column. From there, you can type in strings to search for or just scroll down to view the data. If you want to dig further into the data, just click on one of the findings in the middle column and more output will appear in the image column all the way to the right. The image column by default displays the text and the location of the data in the image file. There is an option, though, to change the image output from text to hex.


Analyzing Memory in the Wonderful World of Redline

Redline is one of a few memory capture/analyzer programs that I keep in my toolkit. The way it works is that the software only needs to be installed on the system you will analyze the data on. From there, you configure the options you want, including grabbing a copy of live memory, and save the custom collector script from Redline. With said script in hand, you then run it on the system whose memory will be captured. The output is saved to the location the script was run from. With the output data in hand, you then take it back to your analyst system and import it into Redline.

Upon importing the data, you will be presented with a few options to aid you in your investigation, and after that, you will be presented with a GUI view of said data. Redline will also provide an MRI (Malware Risk Index) score based on multiple factors, and if it considers something to be bad or very suspicious, it will have a red circle next to it, as seen below.

redline

Redline has the ability to analyze memory captures from Memoryze and other captures in .img format. I have tried other memory capture formats with 50/50 results.

Another Layer of Defense… Microsoft Baseline Security Analyzer (MBSA)

Once installed, you can use the program via the GUI or command line. If utilizing the GUI, it is very straightforward as there are only three options available (scan a computer, scan multiple computers, and view existing security reports).

At the conclusion of a scan, a report will be produced at which time you will be presented with an overall assessment and a breakdown of each category analyzed. The score is broken down into four categories, which are depicted below.

• Green checkmark — check passed
• Yellow exclamation — check failed (non-critical)
• Red “X” — check failed (critical)
• Blue “I” — additional information

An additional benefit is that the program depicts what was scanned, the result details, and how to fix the problem. While MBSA shouldn't be the only defense a user has on their system, it should definitely be in their arsenal.

When a scan is performed, the program reaches out to the Internet to get the latest information in order to accurately depict the state of the system. There may be cases where an Internet connection is not feasible, and in that case, you can use MBSA offline. An offline assessment can only report against the information it had as of the last time it scanned with Internet access. To use MBSA offline yet still have updated information, you can air-gap a few files over to the system doing the scanning. The files needed for an offline assessment are:

• Security update catalog (wsusscn2.cab), available from the Microsoft website: http://go.microsoft.com/fwlink/?LinkID=74689.
• Windows Update Redistribution Catalog (wuredist.cab) at http://update.microsoft.com/redist/wuredist.cab.
• Authorization catalog (muauth.cab) for Windows Update site access, available from the Microsoft website or by examining the contents of the wuredist.cab file at http://update.microsoft.com/redist/wuredist.cab.
• Windows Update Agent standalone installers (if not already installed). The latest versions are available by examining the contents of the wuredist.cab file at http://update.microsoft.com/redist/wuredist.cab.
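For the command-line variant, an offline scan against a copied-down catalog looks roughly like the following; the target address and paths are illustrative, and switches vary a bit by MBSA version, so confirm with mbsacli.exe /?.

mbsacli.exe /target 192.168.1.25 /catalog C:\MBSA\wsusscn2.cab /nd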


Linux Secure Copy (SCP)

SCP is a must for quickly transferring files in native environments. In order to interact with a Windows machine, an SSH server is needed on that system, but you may be able to get around that by specifying a different port.

Below are a few examples of how it can help you in your daily work.

Copy the file “some_data.txt” from a remote host to the local host
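For example (usernames, hosts, and paths in these examples are illustrative):

scp your_username@remote_host:/some/remote/directory/some_data.txt /local/directory/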

Copy the file “some_data.txt” from the local host to a remote host
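And the other direction:

scp /local/directory/some_data.txt your_username@remote_host:/some/remote/directory/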

Copy the directory “some_dir” from the local host to a remote host’s directory “data”
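The -r flag handles the recursion:

scp -r /local/directory/some_dir your_username@remote_host:/some/remote/directory/data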

Copy the file “data.txt” from remote host “sys_1” to remote host “sys_2”

scp your_username@sys_1:/some/remote/directory/data.txt your_username@sys_2:/some/remote/directory/

Copying the files “data.txt” and “more_data.txt” from the local host to your home directory on the remote host
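For instance:

scp data.txt more_data.txt your_username@remote_host:~/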

Copy the file “data.txt” from the local host to a remote host using port 2264
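Note that scp uses a capital -P for the port:

scp -P 2264 data.txt your_username@remote_host:/some/remote/directory/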

Copy multiple files from the remote host to your current directory on the local host
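For example:

scp your_username@remote_host:/some/remote/directory/\{data.txt,more_data.txt\} .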

Search Exchange 2010 Mailboxes

NOTE: The user you run the script with must have the “Discovery Management” RBAC Role.

This script will search all mailboxes for email with attachments named “document1” and “document2”, regardless of the file extension. The script will then copy the email messages to the “admin.mailbox” mailbox in a folder called “Search_07102014”. Once the script is complete, open “admin.mailbox” in Outlook and you'll see the “Search_07102014” folder under the Inbox containing all the results.
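The script itself isn't reproduced here, but its core is the Search-Mailbox cmdlet; a sketch along those lines, using the mailbox, folder, and attachment names from the description above:

Get-Mailbox -ResultSize Unlimited |
    Search-Mailbox -SearchQuery 'attachment:"document1" OR attachment:"document2"' `
        -TargetMailbox 'admin.mailbox' -TargetFolder 'Search_07102014' -LogLevel Full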

This modification will search for all “.doc” and “.pdf” files and copy them to the same mailbox and folder.
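The same sketch with the query swapped for extensions (KQL wildcard handling can be finicky, so test against your environment):

Get-Mailbox -ResultSize Unlimited |
    Search-Mailbox -SearchQuery 'attachment:".doc" OR attachment:".pdf"' `
        -TargetMailbox 'admin.mailbox' -TargetFolder 'Search_07102014' -LogLevel Full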

To search for keywords use this modification.
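And for keywords, the query is just the terms themselves (keywords illustrative):

Get-Mailbox -ResultSize Unlimited |
    Search-Mailbox -SearchQuery '"acquisition" OR "merger"' `
        -TargetMailbox 'admin.mailbox' -TargetFolder 'Search_07102014' -LogLevel Full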

Under the Wire v2

I just posted v2 of Under the Wire, which contains an additional 5 levels for Century. V2 can be found at the link on the right-hand side of the screen or here.

This release will be the last one containing Century. The next variation that the team and I will be working on will be called Cyborg. It will have the same feel as Century but will focus primarily on Active Directory, DNS, DHCP, and a few other random areas, totaling somewhere around 20 to 25 levels (like Century).

I hope you enjoy the additional 5 levels of Century and stay tuned for the release of Cyborg within Under the Wire.

Traffic Generators

These tools will generate traffic and transmit it, retransmit traffic from a capture file, perhaps with changes, or permit you to edit traffic in a capture file and retransmit it.

• Bit-Twist includes bittwist, to retransmit traffic from a capture file, and bittwiste, to edit a capture file and write the result to another file (GPL, BSD/Linux/OSX/Windows)

• Cat Karat is an easy packet-generation tool that allows you to build custom packets for firewall or target testing and has integrated scripting ability for automated testing. (Windows)

• D-ITG (Distributed Internet Traffic Generator) is a platform capable of producing traffic at the packet level, accurately replicating appropriate stochastic processes for both IDT (Inter-Departure Time) and PS (Packet Size) random variables (exponential, uniform, cauchy, normal, pareto, …).

• epb (ethernet package bombardier) is a simple CLI tool for generating/converting ethernet packets from plain text/pcap/netmon/snoop files. (BSD like, Linux/Unix)

• Mausezahn is a free fast traffic generator written in C which allows you to send nearly every possible and impossible packet.


Unzip a file that is zipped many times

This script is used for unzipping password-protected zipped files nested inside of a zipped file. I developed it because it seems like every capture-the-flag I do has a scenario where this could be used.
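The script itself lives in the repo mentioned below; the rough shape of the loop is something like this sketch, which assumes every layer uses the same password and contains a single file:

#!/bin/bash
# Usage: ./unzip_layers.sh <password> <outermost.zip>   (names are illustrative)
password="$1"
archive="$2"
while file "$archive" | grep -q 'Zip archive'; do
    next=$(unzip -Z1 "$archive" | head -n 1)   # name of the file inside this layer
    unzip -o -P "$password" "$archive" -d .
    archive="$next"
done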

This Bash script can be found in my script repo on the right-hand side of the screen.

PowerShell Web Server for Raw Text Transmission

This script will create a temporary web server on the local system that listens on the host IP and a specified port. You will then be able to post some raw data that will be accessible on the network. When running the script, you will be asked what port to listen on and what raw data to post. This script does not support posting files or folders.

The raw data can be accessed one of three ways.

Option 1: PowerShell — Use the below syntax to view it on the screen. It will be in the raw content section.
Invoke-WebRequest http://<IP_Address>:<port>/default

Option 2: PowerShell — Use the below syntax to save the data to a local file.
Invoke-WebRequest http://<IP_Address>:<port>/default -OutFile downloaded_data.txt

Option 3: Internet browser — Use the below syntax to view it in the browser.
http://<IP_Address>:<port>/default

This PowerShell script can be found in my script repo on the right-hand side of the screen.

PowerShell Web Server for File Transmission

This script will deploy a temporary web server on the local system that listens on a port of your choice. Once it is listening, you will be able to transfer .txt and .html files from the directory the script is run from (not the directory it is located in). The web server will continue to run as long as the script is running.

To execute, run the script and when prompted, input a port to listen on. To access the system and the data in the directory that the script ran from, use the below syntax from another system.

Invoke-WebRequest http://<IP_Address>:<port>/file_in_dir.txt -OutFile downloaded_data.txt

Example: “Invoke-WebRequest http://192.168.1.1:8001/passwords.txt -OutFile passwords.txt”

This PowerShell script can be found in my script repo on the right-hand side of the screen.

PowerShell Network Connection Monitor Script

This script displays the current TCP/IP connections for a local or remote system, to include the PID, process name, port, and current state (listening, established, etc.). If a connection is not yet established, the port number is shown as an asterisk (*). It will also take the initial output and save it to old_state.txt, sleep for a period of time of your choosing, then run again and output to new_state.txt. It will then compare the two files and print the differences to the screen. Both files are saved to the directory the script was run from (not the directory it is located in). It will continue this process until the script is stopped.

This PowerShell script can be found in my script repo on the right-hand side of the screen.

PowerShell Remote Process Termination

Ever remotely executed a program on another system, only to have the process fail to exit and remain an active process on the user's system? No matter the cause or what your purpose on the system is, that is never a good thing. We can quickly fix the issue with PowerShell. To do so, we can use a script in which we supply the hostname or IP along with the name of the process.
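The script referenced below does the heavy lifting; the core idea is roughly this (computer and process names are illustrative):

$computer = 'WKSTN01'
$process  = 'stuck_tool.exe'
Get-WmiObject -Class Win32_Process -ComputerName $computer -Filter "Name='$process'" |
    ForEach-Object { $_.Terminate() | Out-Null }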

This PowerShell script can be found in my script repo on the right-hand side of the screen.

Disconnect… Making the Internet Safer and More Private One Connection at a Time

Have you ever been browsing the web for a good or service and noticed that a totally unrelated site suggests the very same or similar items you were previously searching for? What about browsing the web and having it take forever to load a page? Did you know that some websites not only see what you are doing but also where your physical location is? Or that some ads contain malware? If you are like most people, you may have answered no to some or all of those questions, but now that you know, now what? Well, the open-source Disconnect plug-in, available for Google Chrome and Mozilla Firefox, could help you tremendously in stopping the aforementioned from occurring. Disconnect prides itself on making the Internet safe and private while increasing browsing speeds.

So how does it work? Well, after installation, a Disconnect icon will be visible in your toolbar. Clicking on it will bring up the menu as shown below.

Disconnect_1


Detecting Alternate Data Streams with PowerShell and DOS

Alternate Data Streams (ADS) are nothing new, and there are a few ways to detect them within an NTFS filesystem. My tools of choice for detecting ADS are LADS (List Alternate Data Streams) by Frank Heyne and SysInternals' Streams, both of which work rather well. My issue, though, is that I, much like the customer, normally want to limit the number of external tools added to their systems. Native to Microsoft Windows, we have two ways to do some detection, though one provides a little more capability than the other. Let's check them both out.

The DOS way, depicted below, will recursively search a directory (/s), list alternate data streams (/r), and then look for the string ":$DATA".
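Something along these lines, run from the directory you want to sweep:

dir /s /r | find ":$DATA"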

The PowerShell way is depicted below. Be advised that the cmdlet used goes back as far as version 2, but the -Stream parameter was not available until version 3.
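A sketch of the PowerShell approach, filtering out the default :$DATA stream that every file carries:

Get-ChildItem -Recurse | ForEach-Object { Get-Item -Path $_.FullName -Stream * -ErrorAction SilentlyContinue } |
    Where-Object { $_.Stream -ne ':$DATA' } |
    Select-Object FileName, Stream, Length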

If you just executed these commands, you probably noticed that a number of files popped up matching the search.

WMI on Linux

WMI is a great way to query Windows systems without being too intrusive. As of late, I have been dealing with it more and more. Typically, I use a Windows system to query another Windows system, but the lack of speed inherent to the Windows OS always has me searching for better ways to complete simple tasks. I quickly turned to Linux, as its speed is one of many great features of the OS. Using WMI from Linux is achievable, although many may not know it. Getting started is pretty simple; to do so, check out the below.

1. Install the repo (CentOS 6 or newer).
[nando@localhost home]$ rpm -Uvh http://www6.atomicorp.com/channels/atomic/centos/6/x86_64/RPMS/atomic-release-1.0-19.el6.art.noarch.rpm

2. Install WMIC from the repository.
[nando@localhost home]$ yum -y install wmi

Some common queries and what they grab are below.
wmic -U admin%admin1234 //192.168.2.2 "SELECT CommandLine,Name,ProcessId FROM Win32_Process"

wmic -U admin%admin1234 //192.168.2.2 "SELECT * FROM Win32_ComputerSystem"

Under the Wire… Windows Shell War Gaming

My boss and I had a conversation a few months ago regarding Over the Wire, a Linux war-gaming server. The conversation revolved around how it was a great tool for those trying to build strength in Linux. From that conversation came the thought: why wasn't there a Windows variant focusing on the command line? From that thought came Under the Wire.

Under the Wire is a Windows Server 2008R2 Core system. The war game focuses on the Windows command line and the hope is that it helps people hone their skills or gain a better understanding for some of the things that can be done with a Windows shell.

It’s not expected for anyone to know everything they will encounter in this game, so please don’t panic, as the purpose of the game is to learn.

The object of the game is to use the hints for each level to find the password for the following level. For example, the password for level 2 is somewhere in level 1 and the password for level 3 is somewhere in level 2. That is the case for all levels, with level 20 being the last one. Once you have successfully logged into level 20, you have successfully completed the game.

The VM, instructions, and change log can be found here -> Under the Wire

General Notes:
• All passwords are lowercase regardless of how they may appear on the screen.

• The username for logging in will be century plus the corresponding level number. For example, the username for level 1 is century1 and for level 2, it would be century2 and so forth.

• The default shell is PowerShell, but you can switch to the command prompt if you want. You can easily switch back and forth by typing cmd or powershell in the shell. If you wish to have multiple shells open, you can achieve that by doing the below.
1. Type taskmgr in the shell
2. Hit File > New Task (Run…)
3. Type powershell or cmd

• You may find that while trying to accomplish a level using one shell it may render an access denied error. If that is the case, please just use the other shell. During testing, at least one of the shells worked for every level.

• You may be warned that this isn’t a genuine copy of windows. That alert is due to not having a product key and the trial period expiring. It does not hinder the game in any way other than the warning popping up. If it appears, simply exit out of it and continue on.

• Some things that may help you in the game are below.
– The Internet
– Get-help
– /?
– The Tab key will help with finishing out commands

Pushpin… Taking Reconnaissance to Another Level

If you are on the offensive side, part of your strategy encompasses reconnaissance at some point. If you are on the defensive side, there is still reconnaissance to be done in order to see what is available about you. Well, a great tool to add to your tool bag is Recon-ng, as it makes the recon process simple and seamless. An awesome feature of the program is Pushpin. Pushpin allows you to utilize APIs and grid coordinates in order to display any postings within a designated area. This capability is incredible and could be used for a number of reasons. In any case, a list of the currently supported APIs can be found at https://bitbucket.org/LaNMaSteR53/recon-ng/wiki/Usage%20Guide. In most cases, you will have to register with the site you are trying to get an API key for. Some of the APIs include Twitter, YouTube, LinkedIn, and Instagram. Also, the program has a Metasploit-type feel, so if you are comfortable with that, you will do just fine. The source code can be found at https://bitbucket.org/LaNMaSteR53/recon-ng/src.

To give you a feel for how simple it is, I’ll walk through running the program with Twitter APIs and we will use the Georgia Dome in Atlanta as our area of interest. We will start at the point following installation.

Shipping Windows logs to Logstash via Nxlog

In order to correlate the logs of your systems, you are either going to have to manually upload them to your correlation system or set up an automated way. Nxlog is one of a few agents that will enable automated shipping of logs. I particularly like it because it is light on the system and not a pain to set up. Below are the steps to get you going. I will be shipping the logs in JSON format; there are many formats available, so do the research on which one satisfies your needs. The configuration we will use transports the logs over port 3515, so you will need to ensure that port is open.

1. Navigate to http://nxlog.org/products/nxlog-community-edition/download and download the .msi version for Windows.

2. Install the downloaded .msi using the default options.

3. After installation is complete, open the configuration file located at C:\program files (x86)\nxlog\conf\nxlog.conf.

4. Replace the contents of the file with the below. The only thing you need to change is the IP address 111.111.111.111, which should be replaced with the IP of your Logstash server.
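The original config isn't reproduced here, but a minimal nxlog.conf along these lines forwards the Windows event log as JSON over TCP 3515 (keep the ROOT/ModuleDir defines from the stock file if your install relies on them; the module names are the standard community-edition ones):

<Extension json>
    Module  xm_json
</Extension>

<Input eventlog>
    Module  im_msvistalog
</Input>

<Output logstash>
    Module  om_tcp
    Host    111.111.111.111
    Port    3515
    Exec    to_json();
</Output>

<Route eventlog_to_logstash>
    Path    eventlog => logstash
</Route>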

ELK, the free alternative to Splunk

Installation of ELK is not too bad. There are a few guides online that walk through the process, but you will be hard-pressed to find one that covers it all the way through. Some great links to help with this endeavor are:

https://www.digitalocean.com/community/tutorials/how-to-install-elasticsearch-logstash-and-kibana-4-on-ubuntu-14-04

https://www.ddreier.com/setting-up-elasticsearch-kibana-and-logstash/

http://www.networkassassin.com/elk-stack-for-network-operations-reloaded/

https://www.elastic.co

For those who are inclined to install ELK in Windows, these sites are pretty useful.

https://community.ulyaoth.net/threads/how-to-install-logstash-on-a-windows-server-with-kibana-in-iis.17/

http://girl-germs.com/?p=438

ELK stack, what is that?

In a previous post I did a comparison of ELK and Splunk. I will take a few minutes here to explain what ELK is. The ELK stack (Elasticsearch, Logstash, Kibana) is simply amazing. Each program making up ELK brings its own uniqueness, and each is a vital part of making the whole thing work. Elasticsearch provides the search capability for Kibana. Logstash is the receiver of all the logs being ingested into ELK. Kibana is the visual portion of the stack, allowing for searching, correlation, and dashboards. The picture below brings it all together for us.

ELK Pic

Bare Monkey (Volatility)

I've been working on Bare Monkey for a few months now. Bare Monkey takes a Windows memory capture, runs it against all Volatility plugins, and outputs the results to text files. Afterwards, it deletes the generated files that are empty and compresses the files that are left. It also creates a tarball and an MD5 hash. The README and code can be found on my GitHub at www.github.com/wiredpulse/BareMonkey.

You will have to change the extension to .sh and chmod 755.

Some of the benefits of the program are that Volatility will no longer be needed after the program runs, you can analyze the output with a text editor, and grep through the data rather quickly.

Splunk vs. ELK Stack

When conversing about log collection and correlation at the enterprise level, Splunk usually comes up in the conversation. While I am an avid Splunk fan, outside of the free version it can be a little expensive. ELK (Elasticsearch, Logstash, and Kibana) is very comparable to Splunk, in my opinion. Through my research and hands-on experience with the two, I've formulated the below thoughts and comparison.

 

Cost (Monetarily):

Splunk: Free up to 500MB a day. The paid version has unlimited indexing per day.

ELK: Free. There is a newer paid version that comes with support.

 

Cost (Time):

Splunk: One could have it up and running rather quickly. The amount of time already spent on (more…)

Converting a DD image into a VM – pt. 2

This is part 2 of the tutorial on converting a DD image into a VM. The below instructions pick up from the point where one already has a DD image, unzipped and uncompressed. To finish the task, please read on.

1. Copy the target_image from your linux forensics system to your Windows forensics system

2. To convert the raw file into a virtual machine using Live View, change the extension of the target_image raw file to .dd

3. Create a folder on the desktop of your Windows forensics system for which we will put the VM after conversion.

4. Open the Live View 0.8 shortcut on the desktop

5. When the program opens, make the following changes. Once complete, your screen should look like the below.

– Ram size: 1024 (default is 512)

– Operating system: Linux


Converting a DD image into a VM – pt. 1

 

A good buddy of mine introduced me to LiveView, which creates virtual machines from DD images. There are a number of other programs out there that can do the same thing, but they didn't seem as smooth as LiveView.

One may be wondering what the need for all of this is. Well, let's say you are inspecting a suspected or known compromised system. Good practice is to do nothing (or at least as little as possible) to the system in question. In order to preserve the system and get an image to work off of, we can make a DD (binary) image. From there, we can use LiveView to convert the DD image into a working virtual machine. From there, one can get a memory capture and/or begin any other forensics on the system without affecting the original hard drive. LiveView can be found at http://www.cert.org/digital-intelligence/tools/liveview.cfm. You will need to install it on your Windows forensics system prior to continuing.

 

Below are the instructions my buddy put together for using the software.

1. Access the target from the forensics system (linux) using SSH

2. Elevate privileges


Collaboration with Elog

Elog is a great program used for collaboration in a LAN or WAN environment. It's very simple to use and easily customizable. This program is ideal for sharing notes or analyzing data while ensuring everyone else knows what is going on. There is an email function as well, and the ability to export and import notes/data if desired. The program can be downloaded here: https://midas.psi.ch/elog/index.html. Below are some of the things I did for customization:

Alter the look of the program, it’s a .css written in html — /usr/local/elog/themes/default/default.css

Removed the word ‘demo’ from the URL and from the page and changed it to something else — /usr/local/elog/elogd.cfg

Add/adjust the fields of the form — config option listed on the menu bar of the program

Log transcript — /usr/local/elog/logbooks

After you adjust any of these, you are going to restart the elogd service and reload apache.

Splitting up a Large VM for Easier Transmission

Here is the scenario: you have a VM that you want to transfer to another system over the Internet. The VM, in its entirety, is too big to transfer as is. So what do we do? Well, we could convert the .vmx into an .ova and then split it into a few manageable pieces for transport. Once on the distant end, we can easily put it all back together using the steps outlined here: https://communities.vmware.com/message/2244209. Below are very generic steps to achieve this.

1. Convert the VM's .vmx to .ova in a terminal
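With VMware's ovftool on the path, that conversion is a one-liner (file names illustrative):

ovftool your_vm.vmx your_vm.ova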

2. Use the 'split' command to break the .ova into manageable sizes (I usually do mine in 550 MB (550000000 byte) chunks). In this case, the command would be 'split -b 550000000 your_vm.ova vm_brokedown'

3. Transfer the smaller files to the destination

4. In terminal on the distant end, type cat vm_brokedown* > your_vm.ova

5. Import the ova into the Hypervisor of your choice.

Renaming a Linux NIC interface

You may be wondering why this is even a topic of discussion. Well, certain Linux distros such as CentOS come with the main interface as eth0. For me, it’s not as big of a deal. The concern comes in when I am developing baselines and distributing them back into the community. The more I can do to ensure that things look the same across the distros, the better. In order to rename the interface, one can do the below.

1. Open a terminal and ensure you are Root.

2. Get the MAC and current listing of the interface. Be sure to make note of the MAC for a future step.
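For example, either of these will show the interface along with its MAC:

ip link show
ifconfig -a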


Memory Capture with FTK Imager

I previously wrote about using DumpIt for Windows memory captures. If all you need from a system is a memory capture, it fits the bill rather well, though it has at times given me issues grabbing more than 8 GB of memory. Nonetheless, what if you need to do more? Let's say you also need to get a binary image; DumpIt can't help you there. FTK Imager will do both and more. Today I'll speak on the memory capture piece and will visit the binary image capture at a later time. To get a capture, follow the very simple directions below.

1. Download FTK Imager from their official site at http://accessdata.com/product-download.

2. Once downloaded and installed, open the program.

3. Click ‘File’ and select ‘Capture Memory’ as depicted in the below picture.

ftk-1


The One Page Linux Manual

For those trying to learn Linux, it can be a daunting task. There are a plethora of resources online and built into the OS. For those just looking for something "light" and tangible, I recommend the one-page Linux manual. It fits the bill for the most part (although it's actually two pages).

The One Page Linux Manual

Creating a share on Linux and accessing via Windows

There are many times when there is data on a Linux system that needs to be moved to another system like Windows. Well, the question is how do you do that? The method that I have found to be the easiest is to use Samba. Below are the steps to achieve the overall intent.

1) Install Samba. The below syntax is for Debian based systems. For RPM, do “yum install samba”
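On Debian-based systems, that is:

apt-get install samba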

2) Configure a username and password that will be used to access the share. In this case, the user I will use is john as he is already a user on my system.
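That is typically done with smbpasswd:

smbpasswd -a john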


To broadcast a SSID or not to broadcast a SSID, that is the question

For some, wireless security and securing one's home network can mean a number of things. Some people feel that disabling the broadcast of their SSID gives them an extra layer of security. Depending on the context of the conversation, I can somewhat see their perspective. From my standpoint, I will most of the time disagree with disabling SSID broadcasting, mainly due to the commercial tools available that will decloak a hidden SSID and reveal it. My professional opinion is that everyone who does this is just trying to protect their network, but it does intrigue me as to what is so important that they are trying to secure. With that said, hiding the SSID could draw people to your network just to figure that out.

Everything we do in life has risks. So the risks here are: 1) broadcast your SSID, blend in, and hope not to be attacked, or 2) don't broadcast and take the chance that no one uses the tools that identify cloaked networks.

Jump Bag Stuff

Wifi-Pineapple – https://hakshop.myshopify.com/products/wifi-pineapple?variant=81044992

PWN Plug – https://www.pwnieexpress.com/product/pwn-plug-elite/

Read-Only Flash Drive – http://www.kanguru.com/storage-accessories/flash-blu2.shtml

SmartSniff – http://www.nirsoft.net/utils/smsniff.html

Parse and Extract PST and OST Mailboxes

Libpff is a powerful mail examination tool. It will allow you to examine and extract data without having to attach the PST to Outlook, and it has the ability to view emails that are encrypted. In my example below, I will be using the tool via the SANS SIFT workstation, as it is already installed there. If you want to use the program on a different distribution, the source code can be found at https://github.com/libyal/libpff. While I have an example below of parsing the information, I encourage you to check out the man pages, as they are pretty short and straightforward.

Note: the PST I am using is called target_pst.pst

1) Export the PST.
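With libpff, the export is handled by pffexport:

pffexport target_pst.pst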

2) Verify that a target.pst.export, target.pst.orphans, and target.pst.recovered directory are now present.

Parsing Metadata with ExifTool

It's one thing to have a piece of data, but it's another to be able to get the metadata about said data. ExifTool (http://www.sno.phy.queensu.ca/~phil/exiftool/) is a tool that allows just that. It's command-line based, but there is a GUI version as well called pyExifToolGUI (https://hvdwolf.github.io/pyExifToolGUI/). The tool not only allows you to read the metadata but also to change it, if necessary. A person could also add his or her own custom tags. Below is an example of using the program.

Note: My JPG file name is called pic11.jpg

1) Examine the file using ExifTool
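Which is as simple as pointing the tool at the file:

exiftool pic11.jpg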


Windows Memory Capture using DumpIt

One of the simplest tools for capturing memory from a Windows system is DumpIt. The program is very portable and saves the capture to wherever the program is run from. Most people will run it from a flash drive, but depending on your company's security policy that may not be an option, so one can run it from a network share as well. It is advised not to save the program to, or run it from, the system you want to capture. I was going to document the steps, but there is no need; it is just that simple. Below is the link for the software, and if need be, there is a video depicting the steps.

http://www.moonsols.com/2011/07/18/moonsols-dumpit-goes-mainstream/

Memory Capture via Hibernation File

If you are having a hard time getting a memory capture using commercial tools, have no fear: Microsoft to the rescue! Starting with Windows 2000, each version of Windows has supported OS hibernation. When you put a system into hibernation, it creates a hiberfil.sys file at the root of the filesystem (in most cases, C:\). That in itself is a capture of memory. The only problem is that you can't just right-click and copy the file, as it is locked. You could possibly copy it by booting into safe mode (I haven't tried it), slaving the hard drive to another system and copying it that way, or using a third-party program. The one I recommend is X-Ways WinHex. There is a free version of the software, but due to the size of the hibernation file, you will need the licensed version, which costs $222.

Assuming you have the licensed version, below are the steps to copy the hibernation file.

1) Verify there is a hiberfil.sys file on the root of your filesystem (most likely c:\). If the file is not there, ensure hibernation is enabled and then put your system into hibernation. Once powered off, turn it back on and check again.


Display Credentials For All Previous Wireless Networks Connected To

I was at a friend's house and needed to connect my laptop to his network. My friend was reluctant to give me the password to his network and decided to type it in himself. In his mind, he was just doing his part to provide some security to his home network, so I don't blame him, but it did spark my curiosity as to whether there was a way to pull the password. So I fired up PowerShell and began pegging away. The below code will return the SSID and password for every wireless network the computer it is run on has connected to.
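A sketch of one way to do it, wrapping netsh output in PowerShell (may require an elevated prompt, and this is not necessarily the exact code the post refers to):

(netsh wlan show profiles) | Select-String ':\s(.+)$' | ForEach-Object {
    $ssid = $_.Matches.Groups[1].Value.Trim()
    $key  = (netsh wlan show profile name="$ssid" key=clear) |
            Select-String 'Key Content\s+:\s(.+)$' |
            ForEach-Object { $_.Matches.Groups[1].Value.Trim() }
    [pscustomobject]@{ SSID = $ssid; Password = $key }
}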

Determining what profile to use when analyzing Windows memory in Volatility

There is no need to guess or experiment with different profiles; let Volatility figure that out for you. In testing, this worked with all formats that Volatility supports. If you were the one who did the memory dump, or if the file was labeled with OS information, this wouldn't be a concern or a needed step. To let the magic happen, follow the below.

This analyzes the memory capture metadata and displays which profile is suggested to be used.
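The plugin in question is almost certainly imageinfo, which prints a "Suggested Profile(s)" line (file name illustrative):

vol.py -f memory_capture.img imageinfo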

The output will be something similar to this:

Forensics Posters

Anybody getting into forensics knows it's like putting on a pair of glasses and seeing things in a whole new light. Part of being able to identify bad or evil is being able to identify normal. In my opinion, SANS did a pretty good job depicting some common things to look for when beginning the forensics process. The posters can be found at the below link.

http://digital-forensics.sans.org/blog/2014/03/26/finding-evil-on-windows-systems-sans-dfir-poster-release

Building a profile for Volatility

After capturing Linux memory using LiME (or your program of choice), we can analyze it using Volatility. In order to do so, you will need to build a profile for Volatility to use. The profile is based on the kernel/version of the system the memory capture was taken from. The maintainers of the Volatility Project have a repo of pre-built profiles on their page at https://github.com/volatilityfoundation/profiles/tree/master/Linux. Carnegie Mellon University also has pre-built profiles, located at https://forensics.cert.org.
In order to build a profile, follow the below instructions. For this demo, I am using a Kali 1.0.9 (Debian) system to build my profile and an Ubuntu system to do the analysis on.

1) Install dwarfdump. On RedHat(Fedora)-based systems, this can be done by typing ‘yum install dwarfdump’
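On Debian-based systems such as the Kali box used here, the equivalent should be (package name assumed to match the Debian repos):

apt-get install dwarfdump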

2) Download the necessary source code to compile the module.dwarf file

3) Change directory into the newly created vol-mem-profile directory


Linux Memory Capture with LiME

When doing forensics, grabbing a capture of the live memory is vital. There are a few different programs out there to accomplish the task but in my testing, I felt LiME was the best choice. It wasn’t intrusive at all on the system and was pretty straightforward. Once I compiled it, I loaded it up on my flash drive and on I went. Below are the steps I took to achieve it all.

Notes: I am using a Kali system and will be moving the compiled LiME program to the target using a flash drive.

1) Make a directory for LiME.

2) Change Directory into the newly created lime directory.

