Advent of Cyber 2023

December is here, and this year it feels different—it's my first December with my newfound interest in cybersecurity. To broaden my understanding of the various fields, I took on TryHackMe's 'Advent of Cyber' - 24 days of Christmas-themed CTFs on:

  • Penetration testing
  • Security operations and engineering (DevSecOps)
  • Digital forensics and incident response
  • Machine learning
  • Malware analysis

Below are some of my thoughts and learnings from this event.

Day 1: Machine Learning Chatbots - Security Concerns and Defenses

Learning Objectives: Understanding natural language processing, prompt injection attacks, and defense mechanisms.

I explored how AI chatbots function through natural language processing (NLP) and identified the vulnerabilities they face, particularly prompt injection attacks. These attacks manipulate the chatbot’s responses by inserting specific queries. To defend against such vulnerabilities, I learned about using well-constructed system prompts and implementing AI-assisted interceptors to prevent malicious inputs.

Day 2: Log Analysis with Data Science

Learning Objectives: Introduction to data science in cybersecurity, and key libraries like Pandas and Matplotlib.

This challenge focused on applying data science principles to cybersecurity tasks. I worked with Python and learned to use Pandas for data manipulation and Matplotlib for data visualization. By analysing log data, I gained insights into network activities and identified potential security threats.
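The kind of analysis this day covered can be sketched as follows. The log format and column names below are invented for illustration, not the challenge's actual dataset:

```python
import io
import pandas as pd

# Hypothetical packet-capture log; the columns are assumptions for this sketch.
raw = io.StringIO(
    "PacketNumber,Timestamp,Source,Destination,Protocol\n"
    "1,05:49:56,10.10.1.4,10.10.1.7,HTTP\n"
    "2,05:50:01,10.10.1.5,10.10.1.7,TCP\n"
    "3,05:50:03,10.10.1.4,10.10.1.9,HTTP\n"
    "4,05:50:07,10.10.1.6,10.10.1.7,DNS\n"
)
df = pd.read_csv(raw)

# Summarise packets per protocol -- the kind of series you would then
# visualise with Matplotlib.
counts = df["Protocol"].value_counts()
print(counts["HTTP"])   # HTTP appears twice in this toy capture

# Filter for a single talkative host.
talker = df[df["Source"] == "10.10.1.4"]
print(len(talker))      # 2
```

From there, `counts.plot(kind="bar")` would hand the series to Matplotlib for the visualisation step.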

Day 3: Brute-Forcing with Hydra

Learning Objectives: Understanding password complexity, generating password combinations with Crunch, and using Hydra for brute-force attacks.

The third day’s challenge delved into password security. I learned how the complexity of passwords impacts the feasibility of brute-force attacks. Using tools like Crunch to generate password lists and Hydra to automate the brute-forcing process, I gained practical experience in testing the strength of passwords and understanding the importance of robust password policies.
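At its core, Crunch simply enumerates every combination over a character set. A minimal Python sketch of that idea (the charset and length here are made up, and real lists grow exponentially with length):

```python
import itertools

# Miniature version of what Crunch does: enumerate every candidate of a
# fixed length over a chosen character set.
charset = "abc123"
length = 3
wordlist = ["".join(p) for p in itertools.product(charset, repeat=length)]

print(len(wordlist))   # 6**3 = 216 candidates
print(wordlist[0])     # 'aaa'
```

A list like this would then be fed to Hydra, e.g. `hydra -l user -P wordlist.txt ssh://target`, which is why even one extra character in a password multiplies the attacker's work.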

Day 4: Custom Wordlist Generation with CeWL

Learning Objectives: Understanding CeWL, its capabilities, and how to leverage it for generating custom wordlists.

This day focused on CeWL, a tool that spiders websites to generate wordlists from their content. I learned to create tailored wordlists for brute-forcing login pages or uncovering hidden directories by extracting words from a site's HTML, URLs, and content. By using CeWL and wfuzz together, I was able to successfully brute-force a login portal and understand the power of context-specific wordlists in penetration testing.
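CeWL's core idea is simple enough to imitate in a few lines. The HTML below is invented, and the minimum word length mirrors CeWL's `-m` option:

```python
import re
from collections import Counter

# Toy imitation of CeWL: strip tags, then harvest words of a minimum
# length from the remaining page text.
html = """<html><body>
<h1>Santa's Workshop Portal</h1>
<p>Elves manage workshop inventory and sleigh logistics here.</p>
</body></html>"""

text = re.sub(r"<[^>]+>", " ", html)        # drop HTML tags
words = re.findall(r"[A-Za-z]{5,}", text)   # min length 5, like cewl -m 5
wordlist = [w for w, _ in Counter(words).most_common()]

print(wordlist[:3])
```

The real tool also follows links to a configurable depth and can pull words from metadata, but the output is the same: a wordlist drawn from the target's own vocabulary.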

Day 5: Reverse Engineering with DOS

Learning Objectives: Navigating legacy systems, understanding DOS, and learning about file signatures and magic bytes.

I delved into the Disk Operating System (DOS) and its modern-day counterparts, the Windows Command Prompt and PowerShell. This challenge highlighted the importance of understanding file management, directory structures, and command syntax. I also learned about the significance of file signatures and magic bytes in data recovery and file system analysis, restoring a backup file by identifying and correcting its magic bytes.
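The magic-byte repair can be sketched like this. The PNG signature is real; the "corrupted" blob is invented for illustration (the challenge itself involved a different file type):

```python
# Repairing a file whose magic bytes were tampered with, as you would in a
# hex editor.
PNG_MAGIC = bytes.fromhex("89504E470D0A1A0A")   # genuine PNG signature

corrupted = bytes.fromhex("DEADBEEFDEADBEEF") + b"...rest of the file..."

# Overwrite the mangled header with the correct signature.
repaired = PNG_MAGIC + corrupted[len(PNG_MAGIC):]

print(repaired[:8].hex())   # 89504e470d0a1a0a
```

Because tools like `file` identify content by these leading bytes rather than the extension, restoring the signature is often all it takes to make a "broken" backup readable again.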

Day 6: Memory Corruption - Memories of Christmas Past

Learning Objectives: Understanding memory safety in programming languages, buffer overflows, and exploiting memory corruption.

I learned about the risks of memory corruption in certain programming languages, particularly how variables might overflow into adjacent memory and corrupt it. By exploiting a simple buffer overflow, I was able to manipulate memory directly, leading to unintended behavior in the application. This practical experience underscored the importance of secure coding practices to prevent such vulnerabilities.
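Python itself is memory-safe, so the following can only *simulate* the bug: two "variables" laid side by side in one bytearray, with a copy routine that performs no bounds check (like `strcpy` into a fixed C buffer). The field names and sizes are invented:

```python
memory = bytearray(16)
BUF, FLAG = slice(0, 8), slice(8, 16)     # 8-byte input buffer, then a flag

memory[FLAG] = b"GUEST\x00\x00\x00"       # adjacent variable: access level

def unsafe_copy(data: bytes) -> None:
    memory[0:len(data)] = data            # writes past BUF if data > 8 bytes

unsafe_copy(b"A" * 8 + b"ADMIN\x00\x00\x00")   # 16 bytes into an 8-byte buffer
print(bytes(memory[FLAG])[:5])                 # b'ADMIN' -- neighbour corrupted
```

In a real unsafe language the spill could overwrite return addresses rather than a flag, which is what turns an overflow into code execution.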

Day 7: Log Analysis - ‘Tis the Season for Log Chopping!

Learning Objectives: Revisiting the importance of log files, understanding proxy logs, and building Linux command-line skills for log analysis.

This challenge focused on analysing proxy logs to uncover potential security incidents. I honed my Linux command-line skills, learning to use commands like cat, grep, cut, and sort to parse and analyse log entries. By identifying suspicious domains and extracting meaningful information from log files, I was able to pinpoint potential security threats and retrieve exfiltrated data.
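The same `cut | sort | uniq -c` style pipeline translates directly to Python. The proxy log lines below are fabricated to show the shape of the analysis:

```python
from collections import Counter

# Invented Squid-style proxy log: timestamp, client IP, method, URL, status.
log = """\
1702212340 10.10.0.5 GET http://example.com/index.html 200
1702212345 10.10.0.5 GET http://suspicious.xyz/beacon 200
1702212350 10.10.0.6 GET http://example.com/about.html 200
1702212355 10.10.0.5 GET http://suspicious.xyz/beacon 200
"""

domains = Counter()
for line in log.splitlines():
    url = line.split()[3]             # like: cut -d ' ' -f4
    domains[url.split("/")[2]] += 1   # keep just the host

# Equivalent of: sort | uniq -c | sort -nr
for host, n in domains.most_common():
    print(n, host)
```

A domain that beacons at regular intervals, as `suspicious.xyz` does here, is exactly the pattern that flagged the exfiltration in the challenge.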

Day 8: Disk Forensics - Have a Holly, Jolly Byte!

Learning Objectives: Using FTK Imager to analyse and recover digital artifacts, verifying drive integrity, and understanding forensic analysis tools.

I delved into disk forensics using FTK Imager, a tool for acquiring and analysing computer data while preserving its integrity. I learned to navigate the user interface, preview file content, and recover deleted files. By analysing digital artifacts and verifying the integrity of evidence, I uncovered critical information such as the malware C2 server and hidden files, highlighting the importance of thorough forensic analysis in cybersecurity.

Day 9: Malware Analysis - She Sells C# Shells by the C2shore

Learning Objectives: Safe malware analysis, .NET binaries, using dnSpy for decompilation, and building a methodology for source code analysis.

I analysed a malware sample in a sandbox environment to prevent harm. Using dnSpy, I decompiled the malware written in C# and examined its code. The malware communicated with a command and control (C2) server using HTTP requests to execute commands and report results. I identified key behaviors like sleeping, executing shell commands, and implanting binaries, and found the decryption key used for C2 data.
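The decryption step can be sketched as below. The XOR routine and key are stand-ins for the idea of a hard-coded key, not the sample's actual cipher:

```python
# Key-based decryption of C2 traffic, sketched with a repeating-key XOR.
def xor_decrypt(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = b"hypothetical-key"                      # NOT the sample's real key
ciphertext = xor_decrypt(b"whoami", key)       # XOR is its own inverse
print(xor_decrypt(ciphertext, key).decode())   # whoami
```

Finding such a key in decompiled source is valuable because it lets an analyst decode captured C2 traffic and reconstruct exactly what commands the operator sent.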

Day 10: SQL Injection - Inject the Halls with EXEC Queries

Learning Objectives: Identifying and exploiting SQL injection vulnerabilities, using stacked queries for remote code execution, and understanding PHP's role in web development.

I identified SQL injection vulnerabilities in a PHP-based web application. By using stacked queries, I enabled the xp_cmdshell stored procedure on the SQL Server, allowing for remote code execution. I downloaded a reverse shell payload using certutil.exe and established a connection back to my system. After gaining control of the server, I restored the defaced website and retrieved several flags.
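At the string level, the stacked query works like this. The query shape and parameter are illustrative, not the challenge application's exact code:

```python
# Vulnerable pattern: user input concatenated straight into the SQL,
# which is what made the PHP page injectable.
def build_query(user_input: str) -> str:
    return f"SELECT * FROM products WHERE name = '{user_input}'"

payload = "'; EXEC xp_cmdshell 'whoami'; --"
query = build_query(payload)
print(query)
# The leading '; closes the intended statement, EXEC xp_cmdshell runs an OS
# command as a second (stacked) statement, and -- comments out the stray quote.
```

Parameterised queries (prepared statements) remove the vulnerability entirely, because user input is bound as data and never spliced into the SQL text.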

Day 11: Active Directory - Jingle Bells, Shadow Spells

Learning Objectives: Understanding Active Directory (AD), Windows Hello for Business (WHfB), exploiting GenericWrite privileges, and conducting a Shadow Credentials attack.

I explored Active Directory, a centralised authentication system used in Windows environments, and learned how WHfB replaces passwords with cryptographic keys. To exploit the GenericWrite privilege, I identified write capabilities using PowerView and used Whisker to simulate device enrollment, updating the msDS-KeyCredentialLink. With Rubeus, I obtained a TGT and NTLM hash for the vulnerable user, and used Evil-WinRM for a pass-the-hash attack, gaining access to the Administrator’s desktop.

Day 12: Defence in Depth - Sleighing Threats, One Layer at a Time

Learning Objectives: Defence in Depth, endpoint hardening, and a simple Boot2Root methodology.

In this challenge, I addressed poor security practices where a server was vulnerable by design. Here’s how I elevated privileges and secured the system:

Exploitation steps:
  1. Web Shell: Used the Script Console to execute a Groovy script and establish a reverse shell.
  2. User Escalation: Found backup.sh with credentials for tracy in /opt/scripts and used SSH to log in as tracy.
  3. Root Access: Discovered tracy had sudo privileges (sudo su), gaining root access.
Defence in Depth:
  1. Least Privilege: Removed tracy from the sudo group to limit permissions.
  2. SSH Hardening: Disabled password-based SSH logins to prevent lateral movement using compromised credentials.
  3. Stronger Password Policies: Addressed weak passwords and discouraged storing plaintext credentials in scripts.
  4. Zero Trust: Reverted Jenkins config to restrict platform access, promoting zero trust.

Day 13: Intrusion Detection - To the Pots, Through the Walls

Learning Objectives: Incident analysis through the Diamond Model, defensive strategies, firewall rules, and honeypot setup.

In this challenge, I learned to use the Diamond Model for security analysis, which includes Adversary, Victim, Infrastructure, and Capability. I applied defensive strategies such as threat hunting and vulnerability management to strengthen the organization's security posture.

Defensive Infrastructure:
  1. Firewalls: Implemented stateful inspection firewalls using ufw to control network traffic and prevent unauthorized access.
  2. Honeypots: Set up a honeypot using PenTBox to lure attackers and gather intelligence on their behaviors, reducing the risk to actual assets.

Day 14: Machine Learning - The Little Machine That Wanted to Learn

Learning Objectives: Understanding machine learning, basic structures and algorithms, and using neural networks to predict defective toys.

In this challenge, I delved into machine learning (ML) and its various structures like genetic algorithms, particle swarm optimisation, and neural networks. I focused on neural networks, which mimic how neurons work in the brain and can be trained to provide correct transformations.

Application:
  • Built a neural network using Python and libraries such as NumPy, Pandas, and scikit-learn.
  • Trained the network on a dataset of toy measurements to predict defective toys.
  • Achieved over 90% accuracy, which revealed the flag.
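The steps above can be shrunk to a single trained neuron. The "measurements" and the defect rule (features summing past 1.0) are synthetic, and the real task used a fuller multi-layer network, but the gradient-descent loop is the same idea at its smallest:

```python
import math
import random

random.seed(1)

def sigmoid(z: float) -> float:
    return 1.0 / (1.0 + math.exp(-z))

# Synthetic toy measurements with a clean margin between classes.
data = []
while len(data) < 200:
    x1, x2 = random.random(), random.random()
    if abs(x1 + x2 - 1.0) < 0.1:
        continue
    data.append(((x1, x2), 1.0 if x1 + x2 > 1.0 else 0.0))

w1 = w2 = b = 0.0
lr = 0.5
for _ in range(1000):                 # epochs of plain stochastic gradient descent
    for (x1, x2), y in data:
        p = sigmoid(w1 * x1 + w2 * x2 + b)
        err = p - y                   # gradient of the log loss
        w1 -= lr * err * x1
        w2 -= lr * err * x2
        b -= lr * err

accuracy = sum(
    (sigmoid(w1 * x1 + w2 * x2 + b) > 0.5) == (y == 1.0)
    for (x1, x2), y in data
) / len(data)
print(accuracy)   # near 1.0 on this cleanly separated data
```

Libraries such as scikit-learn wrap exactly this loop (plus hidden layers and better optimisers) behind classes like `MLPClassifier`.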

Day 15: Machine Learning - Jingle Bell SPAM: Machine Learning Saves the Day!

Learning Objectives: Understanding the steps in a Machine Learning (ML) pipeline, ML classification and training models, dataset splitting, model preparation, and evaluation.

In this challenge, I helped build a spam email detector using machine learning. The process involved training and testing on a provided dataset, choosing an appropriate ML classification algorithm, identifying the features that drive the model's decisions, and finally evaluating the model's effectiveness against the dataset. In the end, I identified three spam emails, one of which contained the flag.
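The classification step can be sketched with a back-of-the-envelope word-likelihood model in the spirit of naive Bayes. The tiny training set here is invented; the challenge used a proper library model on a real dataset:

```python
from collections import Counter

spam = ["win free prize now", "free money click now", "claim your prize"]
ham  = ["meeting at noon", "lunch tomorrow", "project update attached"]

spam_words = Counter(w for m in spam for w in m.split())
ham_words  = Counter(w for m in ham for w in m.split())

def classify(message: str) -> str:
    # Laplace-smoothed word likelihood ratio -- the core of naive Bayes.
    score = 1.0
    for w in message.split():
        score *= (spam_words[w] + 1) / (ham_words[w] + 1)
    return "spam" if score > 1.0 else "ham"

print(classify("free prize now"))      # spam
print(classify("project meeting"))     # ham
```

The same pipeline stages from the challenge apply here too: split data into train/test sets, extract features (word counts), fit, then evaluate on held-out messages.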

Day 16: Machine Learning - Can't CAPTCHA this Machine!

Learning Objectives: Complex neural network structures, convolutional neural networks (CNNs), optical character recognition, and integrating neural networks into red team tooling.

In this challenge, I explored Convolutional Neural Networks (CNNs) and how they can be used for feature extraction and optical character recognition (OCR). By building a CAPTCHA-cracking CNN, I automated the process of solving CAPTCHAs by integrating the CNN into a brute force script.

Process:
  1. Feature Extraction: Used CNN to automatically select important features from the provided data and process them through a neural network.
  2. Training the CNN: Utilised Attention OCR for training, which reads one character at a time using sliding windows.
  3. Brute Forcing the Admin Panel: Created a script to solve CAPTCHAs and brute force login attempts.

Day 17: Traffic Analysis - I Tawt I Taw A C2 Tat!

Learning Objectives: Understanding network traffic data formats, differences between full packet captures and network flows, processing network flow data, using the SiLK tool suite, and hands-on experience in network flow analysis.

In this challenge, I gained insights into network traffic analysis and learned to differentiate between full packet captures and network flows. I also explored the SiLK tool suite, which is essential for analysing network flows.

Key techniques and tools:
  1. rwfileinfo: Used to overview file information and discover high-level details of network flow files.
  2. rwstats: Utilised for generating quick statistics and identifying anomalies within network traffic data.
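Conceptually, rwstats is an aggregation over flow records. A sketch of that idea with fabricated flow tuples:

```python
from collections import Counter

# (src_ip, dst_ip, dst_port, bytes) -- fields typical of a network flow
# record; the values are made up.
flows = [
    ("10.0.0.5", "203.0.113.9", 443, 120000),
    ("10.0.0.6", "203.0.113.9", 443, 3000),
    ("10.0.0.5", "198.51.100.2", 8080, 950000),
    ("10.0.0.7", "203.0.113.9", 53, 400),
]

# Rank "top talkers" by total bytes sent, as rwstats would.
bytes_by_src = Counter()
for src, _dst, _port, nbytes in flows:
    bytes_by_src[src] += nbytes

top_talker, total = bytes_by_src.most_common(1)[0]
print(top_talker, total)   # 10.0.0.5 1070000
```

This is the trade-off flows make: no packet payloads, but summaries cheap enough to compute over months of traffic, which is where anomalies like a single host moving a megabyte to an odd port stand out.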

Day 18: Eradication - A Gift That Keeps on Giving

Learning Objectives: Identifying CPU and memory usage in Linux, killing unwanted processes, finding persistence mechanisms, and permanently removing persistent processes.

In this challenge, I learned to identify and manage high CPU usage processes in Linux. Here's how I tackled a persistent process:

  1. Identified the Process: Used the top command to find a process named 'a' consuming 100% CPU.
  2. Attempted to Kill the Process: Used sudo kill, but the process respawned with a new PID.
  3. Checked Cronjobs: Verified user and root cronjobs but found no trace.
  4. Checked Running Services: Used systemctl to list enabled services and identified a suspicious service running the 'a' process.
  5. Stopped and Disabled the Service: Used sudo systemctl stop a-unkillable.service to stop the service and confirmed that the process was terminated.
  6. Removed Service Files: Deleted service files from /etc/systemd/system/ and reloaded the service configurations with systemctl daemon-reload.

This successfully eradicated the persistent process, normalising CPU usage and eliminating the unwanted service.

Day 19: Memory Forensics - CrypTOYminers Sing Volala-lala-latility

Learning Objectives: Understanding memory forensics, volatile data, memory dumps, Volatility tool, and Volatility profiles.

In this challenge, I learned to perform memory forensics, which involves examining a computer's volatile memory (RAM) to uncover digital evidence.

Volatility tool

Volatility is a command-line tool for analysing memory dumps: it can list active and recently closed network connections, enumerate running processes, recover command-line history, and extract suspicious binaries for further analysis.

Analysis
  1. Running Processes: Used the linux_pslist plugin to examine running processes in the memory dump.
  2. Process Extraction: Extracted suspicious programs for further analysis.
  3. MD5 Hash Calculation: Provided MD5 hashes of extracted binaries for threat intelligence.
  4. Persistence Mechanisms: Checked for cronjobs to identify persistence tactics used by malicious actors.
  5. File Extraction: Used the linux_enumerate_files plugin to review files of interest and understand how persistence mechanisms were implemented.
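Step 3 above in miniature: hashing an extracted binary so its MD5 can be searched against threat-intelligence sources. The bytes below stand in for a file you would read with `open(path, "rb").read()`:

```python
import hashlib

extracted = b"\x7fELF" + b"\x00" * 60   # fake ELF-like blob for illustration

md5 = hashlib.md5(extracted).hexdigest()
print(md5)   # 32 hex characters, ready to paste into a TI lookup
```

A hit on a known-bad hash turns an anonymous binary into an identified malware family in one lookup, which is why hashing is usually the first triage step after extraction.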

This challenge provided me with hands-on experience in analysing memory dumps and identifying potential indicators of compromise (IOCs).

Day 20: DevSecOps - Advent of Frostlings

Learning Objectives: Poisoned pipeline execution, securing CI/CD pipelines, secure software development lifecycles (SSDLC), and CI/CD best practices.

In this challenge, I learned about CI/CD (Continuous Integration and Continuous Delivery) and how to secure these pipelines to prevent attacks such as poisoned pipeline execution (PPE). DevSecOps integrates security into CI/CD, ensuring consistency and threat reduction throughout the software development lifecycle (SDLC).

Investigations
  • Merge Requests: Checked for any attempts to merge code changes.
  • Job Logs: Reviewed job logs to identify triggered workflows and running jobs.
  • Pipeline Logs: Examined pipeline logs to detect anomalies.
  • Restored Original Code: Replaced the compromised .gitlab-ci.yml file with the original code to secure the pipeline.

Day 21: DevSecOps - Yule be Poisoned: A Pipeline of Insecure Code!

Learning Objectives: Understanding larger CI/CD environments, exploring indirect poisoned pipeline execution (PPE), and applying CI/CD exploitation knowledge.

In this challenge, I learned about CI/CD environments and how indirect poisoned pipeline execution (PPE) can be used to exploit them. Jenkins, a self-hosted automation server, was the primary platform used for handling pipeline builds; remote platforms like Travis CI serve a similar purpose.

Exploitation
  • Main/Branch Protection: Direct changes to the Jenkinsfile were rejected due to branch protection.
  • Makefile Modification: Successfully modified the Makefile in the gift-wrapper repository to include malicious commands, which get executed each time the pipeline runs.

Day 22: SSRF - Jingle Your SSRF Bells: A Merry Command & Control Hackventure

Learning Objectives: Understanding server-side request forgery (SSRF), different types of SSRF, prerequisites for exploiting the vulnerability, attack mechanics, exploitation techniques, and mitigation measures.

In this challenge, I delved into SSRF, a vulnerability that allows attackers to trick web applications into making unauthorised requests to internal or external resources on the server's behalf.

Exploitation
  • Took advantage of inadequate input validation by changing the URL parameter on the login page to fetch arbitrary files.
  • Located admin credentials in the config.php file.
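The standard mitigation is to validate the parameter against an allowlist before the server fetches anything. A sketch of that check (the scheme list and internal hostname are hypothetical):

```python
from urllib.parse import urlparse

ALLOWED_HOSTS = {"cards.example.internal"}   # hypothetical internal host

def is_safe_target(url: str) -> bool:
    # Reject anything that is not http(s) to an explicitly allowed host.
    parsed = urlparse(url)
    return parsed.scheme in ("http", "https") and parsed.hostname in ALLOWED_HOSTS

print(is_safe_target("http://cards.example.internal/santa.png"))  # True
print(is_safe_target("http://127.0.0.1/config.php"))              # False
```

Allowlisting hosts is stronger than blocklisting addresses like 127.0.0.1, since attackers can often reach internal targets through redirects, DNS tricks, or alternate IP encodings.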

Day 23: Coerced Authentication - Relay All the Way

Learning Objectives: Basics of network file shares, NTLM authentication, NTLM authentication coercion attacks, using Responder for attacks, and forcing authentication coercion using lnk files.

In this challenge, I explored NTLM authentication and how attackers can perform authentication coercion attacks to uncover sensitive information.

Exploitation
  • Used Responder to capture NTLMv2-SSP hashes from network file shares, including the Administrator user's hash.
  • Used John the Ripper to crack the captured hash and recover the password.
  • Used evil-winrm to log in remotely to the target system as Administrator, gaining terminal access to the remote desktop.

Day 24: Mobile Analysis - You Are on the Naughty List, McGreedy

Learning Objectives: Collecting digital evidence, challenges with modern smartphones, and using Autopsy Digital Forensics with an actual Android image.

In this challenge, I explored digital forensics, focusing on collecting evidence from digital devices like smartphones.

Analysis
  • Placed the phone in a Faraday bag to prevent remote data wiping.
  • Used adb backup -all -f android_backup.ab to create a logical image of the phone.
  • Imported the image into Autopsy Digital Forensics for detailed examination.

This challenge provided me with hands-on experience in collecting and analysing digital evidence from smartphones using digital forensic tools.

Reflections

Overall, I thoroughly enjoyed AOC2023 - it was my first Advent of Cyber/Code and I'm deeply proud of myself for embracing the challenges the TryHackMe team had crafted. I definitely learned more than I anticipated and feel even more inspired to dive further down the rabbit hole of cybersecurity.