Terminal - Yara Analysis
Just by the Christmas tree in Santa’s lobby I’ll find Fitzy Shortstack and the Yara Analysis Cranberry Pi:
Hiya, I’m Fitzy Shortstack!
I was just trying to learn a bit more about YARA with this here Cranberry Pi terminal.
I mean, I’m not saying I’m worried about attack threats from that other con next door, but…
OK. I AM worried. I’ve been thinking a bit about how malware might bypass YARA rules.
If you can help me solve the issue in this terminal, I’ll understand YARA so much better! Would you please check it out so I can learn?
And, I’ll tell you what – if you help me with YARA, I’ll give you some tips for Splunk!
I think if you make small, innocuous changes to the executable, you can get it to run in spite of the YARA rules.
The terminal presents a challenge centered around this introduction:
This critical application is supposed to tell us the sweetness levels of our candy
manufacturing output (among other important things), but I can't get it to run.
It keeps saying something something yara. Can you take a look and see if you
can help get this application to bypass Sparkle Redberry's Yara scanner?
If we can identify the rule that is triggering, we might be able to change the program
to bypass the scanner.
We have some tools on the system that might help us get this application going:
vim, emacs, nano, yara, and xxd
The children will be very disappointed if their candy won't even cause a single cavity.
The terminal has an application called
the_critical_elf_app, and when I try to run it, a Yara rule name is printed instead:
The implication is that before it runs, it’s scanned by a set of rules, and if any hit, it stops.
The terminal promises that
yara is installed, but it’s broken and can’t be accessed by a non-root user. This video walks through the solution, and at the end, I’ll upload
yara to the terminal and show how much easier it makes the challenge:
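Fitzy’s “small, innocuous changes” idea can be sketched in the shell. Everything in this example is invented for illustration (the file name, the string, and the offset are not from the terminal): a YARA rule matching a literal ASCII string stops firing once a single byte’s case is flipped, yet a change that small rarely affects how a program runs.

```shell
#!/bin/sh
# Hypothetical sketch: a rule matching the literal string "candycane"
# no longer matches after one byte of the file changes case.
# File name, string, and offset are invented for illustration.
printf 'hello candycane world' > sample.bin

# xxd shows the string starts at byte offset 6 here; overwrite that
# single 'c' with 'C' in place (seek=6, one byte, no truncation):
printf 'C' | dd of=sample.bin bs=1 seek=6 conv=notrunc 2>/dev/null

# A case-sensitive match for the original string now fails:
grep -q candycane sample.bin && echo 'still matches' || echo 'bypassed'
# prints "bypassed"
```

Comparing `xxd sample.bin` before and after confirms only one byte changed; the same trick, applied to a byte the program doesn’t depend on, is what lets an executable slip past a string-based rule.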
Fitzy has some hints about Splunk:
Thanks - you figured it out!
Let me tell you what I know about Splunk.
Did you know Splunk recently added support for new data sources including Sysmon for Linux and GitHub Audit Log data?
Between GitHub audit log and webhook event recording, you can monitor all activity in a repository, including common git commands such as git status, and you can also see cloned GitHub projects. There’s a lot of interesting stuff out there. Did you know there are repositories of code that are Darn Vulnerable?
Sysmon provides a lot of valuable data, but sometimes correlation across data types is still necessary.
Sysmon network events don’t reveal the parent process ID, for example. Fortunately, once you have a process ID, you can pivot with a query to investigate process creation events.
Sometimes Sysmon data collection is awkward. Pipelining multiple commands generates multiple Sysmon events, for example.
Did you know there are multiple versions of the Netcat command that can be used maliciously?
nc.openbsd, for example.
In the badge, three Splunk-related hints unlock:
- Between GitHub audit log and webhook event recording, you can monitor all activity in a repository, including common git commands such as git status, and you can also see cloned GitHub projects.
- Sysmon network events don’t reveal the parent process ID, for example. Fortunately, once you have a process ID, you can pivot with a query to investigate process creation events.
- Did you know there are multiple versions of the Netcat command that can be used maliciously?
nc.openbsd, for example.
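The pivot Fitzy describes can be sketched as a pair of Splunk searches. The index, sourcetype, source, and field names below follow common Sysmon-for-Linux conventions and are assumptions, not the terminal’s actual data; `<id from first search>` is a placeholder. The first search lists network connection events (Sysmon EventCode 3), which carry a process ID but no parent; the second pivots that ID into process creation events (EventCode 1), which do record the parent.

```
index=main sourcetype=journald source=Journald:Microsoft-Windows-Sysmon/Operational
    EventCode=3 process_name=nc.openbsd
| table _time process_id process_name dest_ip dest_port

index=main sourcetype=journald source=Journald:Microsoft-Windows-Sysmon/Operational
    EventCode=1 process_id=<id from first search>
| table _time process_id parent_process_id parent_process_name CommandLine
```

Once the parent process ID is known, the same pivot can be repeated on `parent_process_id` to find sibling processes launched by the same parent.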
Angel Candysalt is with the Splunk terminal in Santa’s Great Room:
Greetings North Pole visitor! I’m Angel Candysalt!
A euphemism? No, that’s my name. Why do people ask me that?
Anywho, I’m back at Santa’s Splunk terminal again this year.
There’s always more to learn!
Take a look and see what you can find this year.
With who-knows-what going on next door, it never hurts to have sharp SIEM skills!
Each year there’s a Splunk challenge that requires answering a series of questions based on data loaded into a Splunk instance. This year’s starts with this introduction:
This video shows how I found each of the answers:
The answers to the questions follow:
|Capture the commands Eddie ran most often, starting with git. Looking only at his process launches as reported by Sysmon, record the most common git-related CommandLine that Eddie seemed to use.
|Looking through the git commands Eddie ran, determine the remote repository that he configured as the origin for the ‘partnerapi’ repo. The correct one!
|Eddie was running Docker on his workstation. Gather the full command line that Eddie used to bring up the partnerapi project on his workstation.
|docker compose up
|Eddie had been testing automated static application security testing (SAST) in GitHub. Vulnerability reports have been coming into Splunk in JSON format via GitHub webhooks. Search all the events in the main index in Splunk and use the sourcetype field to locate these reports. Determine the name of the vulnerable GitHub repository that the elves cloned for testing and document it here. Inspect the repository.name field in Splunk.
|Another elf started gathering a baseline of the network activity that Eddie generated. Start with their search and capture the full process_name field of anything that looks suspicious.
|Uh oh. This documentation exercise just turned into an investigation. Starting with the process identified in the previous task, look for additional suspicious commands launched by the same parent process. One thing to know about these Sysmon events is that Network connection events don’t indicate the parent process ID, but Process creation events do! Determine the number of files that were accessed by a related process and record it here.
|Use Splunk and Sysmon Process creation data to identify the name of the Bash script that accessed sensitive files and (likely) transmitted them to a remote IP address.
On entering the last answer, the following pops up: