Integrating Veeam Backup & Replication with Sleuthkit Autopsy (Part 1)
In today's digital age, data security and forensic analysis play a crucial role in investigating incidents and ensuring the integrity of digital assets. As organizations increasingly rely on backup solutions like Veeam Backup & Replication for data protection, the need for seamless integration with forensic analysis tools becomes paramount.
Building on previous discussions: you might have triggered the Veeam incident API from your SIEM/XDR once an IOC was identified in an active attack. The incident API would capture an immediate backup image of the compromised server, preserving data before encryption. Using Veeam's built-in YARA engine, you could then scan the historical backup chain for known malicious tools used in the kill chain, pre-encryption or pre-exfiltration.
The identified backups can be mounted with Veeam without recovering the data, providing a sandbox for forensic analysis. Whether you're in law enforcement or incident response, Autopsy can be used to run multiple ingest modules against these backup images to build a timeline and a case around a cyber incident or individual.
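For readers who want to picture that SIEM/XDR trigger, here is a minimal sketch of what such a playbook step might send. It is illustrative only: the /api/v1/malwareDetection/events endpoint, the x-api-version value, and the event fields are assumptions based on the Veeam B&R 12.x REST API, and the host, token, and machine details are placeholders.

import requests

# Hypothetical SIEM/XDR playbook step: report an IOC to the Veeam incident API.
# Endpoint path, API version header and payload fields are assumptions for
# Veeam B&R 12.x; the bearer token is assumed to have been obtained already
# (token acquisition is sketched later in this post).
veeam = "https://10.0.0.1:9419"
headers = {
    "x-api-version": "1.1-rev0",
    "Authorization": "Bearer <access_token>",
}
event = {
    "detectionTimeUtc": "2024-01-01T12:00:00Z",
    "machine": {"fqdn": "compromised-server.lab.local"},
    "details": "IOC matched by SIEM correlation rule",
    "engine": "SIEM/XDR",
}
response = requests.post(
    f"{veeam}/api/v1/malwareDetection/events",
    headers=headers,
    json=event,
    verify=False,  # lab setup with a self-signed certificate
)
print(response.status_code)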
Introduction:
In this blog post, we'll explore how to automate the process of making backup data from Veeam available for analysis in Autopsy. By integrating these two tools, we can streamline forensic investigations and expedite incident response, without having to use the compromised production server or recover a copy of it. Instead, we will use Veeam File-Level Recovery (FLR) to mount the backup filesystem and present it as a data source to Autopsy in an automated process.
Requirements:
- Autopsy installed on the server used for forensic analysis: Autopsy (sleuthkit.org)
- Python installed, along with the following libraries installed via pip:
pip install requests tqdm
The other libraries (os, subprocess, and shutil) are part of the Python standard library and do not need separate installation.
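For reference, the script's imports map to those two pip packages plus the standard library modules mentioned above (a minimal sketch):

# Standard library: ships with Python, no installation required.
import os
import shutil
import subprocess

# Third-party: installed with `pip install requests tqdm`.
import requests
from tqdm import tqdm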
Find the script here: mritsurgeon/autopsy (github.com)
Setup:
Before we dive into the integration process, let's set up the necessary variables. Set these variables before running the script:
# Veeam Backup & Replication server IP/FQDN address
veeam_server = "10.0.0.1"
# Server IP/FQDN address where the backup will be mounted
mount_server = "10.0.0.1"
# Username for accessing the mount server over the administrative ($) share
username = f"administrator@{mount_server}"
# Password for accessing the mount server over the administrative ($) share
password = "Veeam123"
# Veeam API user used to obtain the OAuth token
api_user = "administrator"
# Veeam API password used to obtain the OAuth token
api_password = "Veeam123"
# Identifier for the case in Autopsy
case_number_ID = "x12jz"
# Path to the Autopsy installation directory
autopsy_path = r"C:\Program Files\Autopsy-4.20.0\bin"
# Folder where the restored data will be mounted for analysis on the local server, and where case files are stored
triage_folder = r'C:\triage'
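With those variables in place, the first call the script makes is for an OAuth access token. Below is a minimal sketch of that request; the /api/oauth2/token path, the default port 9419, and the x-api-version header value are assumptions based on the Veeam Backup & Replication 12.x REST API.

import requests

# Sketch only: endpoint path, port and API version header are assumptions
# for the Veeam B&R 12.x REST API.
def get_access_token(veeam_server, api_user, api_password):
    response = requests.post(
        f"https://{veeam_server}:9419/api/oauth2/token",
        headers={"x-api-version": "1.1-rev0"},
        data={
            "grant_type": "password",
            "username": api_user,
            "password": api_password,
        },
        verify=False,  # lab setup with a self-signed certificate
    )
    response.raise_for_status()
    return response.json()["access_token"]

access_token = get_access_token(veeam_server, api_user, api_password)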
How it Works:
The integration script follows these steps:
- Obtain Access Token: The script obtains an access token from the Veeam API using the provided credentials.
- Fetch Restore Point Data: It fetches the latest restore point data from the Veeam backup.
- Select Restore Point: The latest restore point, filtered on its allowed operations, is selected for the mount.
- Initiate Mount: The script initiates the FLR process.
- Wait for Data Availability: After the mount, it waits for up to 2 minutes to ensure the data is available on the mount server, checking for the presence of Volume1.
- Symlink Data: It mounts the backup data on the network path and creates symbolic links in the triage folder on the local machine running Autopsy.
- Launch Autopsy: Finally, it constructs and executes a command to launch Autopsy and create a case with the backup data as the data source for forensic analysis (a condensed sketch of these steps follows below).
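To make the flow concrete, here is a condensed sketch of steps 2 to 6, continuing from the variables and the token sketch above. Treat it as an outline rather than the finished script: the restore point and FLR endpoints, their payload fields, and the C:\VeeamFLR mount location are assumptions based on Veeam B&R 12.x behaviour, and error handling is left out.

import os
import subprocess
import time

import requests

base_url = f"https://{veeam_server}:9419"
headers = {"x-api-version": "1.1-rev0", "Authorization": f"Bearer {access_token}"}

# 2. Fetch restore points, newest first (sort parameter names are assumptions).
restore_points = requests.get(
    f"{base_url}/api/v1/restorePoints",
    headers=headers,
    params={"orderColumn": "CreationTime", "orderAsc": "false"},
    verify=False,
).json()["data"]

# 3. Select the newest restore point whose allowed operations permit a
#    file-level restore (property and value names are assumptions).
restore_point = next(
    rp for rp in restore_points
    if "StartFileLevelRestore" in rp.get("allowedOperations", [])
)

# 4. Initiate the FLR mount on the mount server (endpoint and payload assumed).
requests.post(
    f"{base_url}/api/v1/restore/flr",
    headers=headers,
    json={"restorePointId": restore_point["id"], "type": "Windows"},
    verify=False,
)

# 5. Authenticate to the mount server's administrative share, then wait for the
#    mounted volume to appear. Veeam FLR typically presents the backup volumes
#    under C:\VeeamFLR\<vm_name>\Volume<N> on the mount server (assumption).
admin_share = rf"\\{mount_server}\c$"
subprocess.run(["net", "use", admin_share, password, f"/user:{username}"], check=True)

flr_root = os.path.join(admin_share, "VeeamFLR")
vm_mount = None
for _ in range(24):  # poll for up to ~2 minutes
    if os.path.isdir(flr_root):
        for entry in os.listdir(flr_root):
            if os.path.isdir(os.path.join(flr_root, entry, "Volume1")):
                vm_mount = os.path.join(flr_root, entry)
                break
    if vm_mount:
        break
    time.sleep(5)

# 6. Symlink each mounted volume into the local triage folder so Autopsy can
#    add it as a data source (creating symlinks on Windows needs an elevated prompt).
os.makedirs(triage_folder, exist_ok=True)
for volume in os.listdir(vm_mount):
    os.symlink(
        os.path.join(vm_mount, volume),
        os.path.join(triage_folder, volume),
        target_is_directory=True,
    )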
Usage:
Using the integration script is straightforward:
- Set the required variables at the beginning of the script.
- Run the script using Python.
FLR mounting the VM file system
- Autopsy will be launched automatically after the mount process is completed.
- You will see Autopsy running in command-line mode:
Looking at the command line, we can see where it is in the process:
You can also see a triage folder in the root of C:
In your triage folder you will notice:
the symlink to the FLR filesystem mount of the backup being investigated
&
the case folder for your investigation
Once complete, you will see the script at this point:
You will also have a new window open in Autopsy:
Select Open Case, and choose the newly created case folder in your local triage folder.
Once opened, you will see the volumes of the server we want to investigate:
** To improve the speed of the command-line run, I left off the ingest module parameter; the command the script builds looks roughly like the sketch below.
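For reference, this is a sketch of the launch command, using the case_number_ID, triage_folder and autopsy_path variables set earlier. The flag names are taken from Autopsy's command-line ingest documentation, but treat them as assumptions and check them against your Autopsy version; adding --runIngest back in enables ingest during the command-line run at the cost of speed.

import os
import subprocess

# Sketch: launch Autopsy in command-line mode to create the case and attach the
# symlinked backup volumes as a data source. Flag names may differ per version.
autopsy_exe = os.path.join(autopsy_path, "autopsy64.exe")
command = [
    autopsy_exe,
    "--createCase",
    f"--caseName={case_number_ID}",
    f"--caseBaseDir={triage_folder}",
    "--addDataSource",
    f"--dataSourcePath={triage_folder}",   # folder holding the symlinked volumes
    # "--runIngest",        # left off here to keep the command-line run fast
    # "--generateReports",  # optional: build the case report automatically
]
subprocess.run(command, check=True)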
Ingest modules are Autopsy plug-ins; each ingest module is designed to analyse and retrieve specific data from the data source, in this case our backups.
Autopsy ingest modules are specialized tools within the
Autopsy digital forensics platform designed to analyze and extract information
from various types of digital evidence. These modules help forensic examiners
efficiently process and interpret data during investigations. Here's a brief
explanation of some:
- Recent Activity: Shows recent user activity within the last seven days, aiding in understanding recent actions on a system.
- Hash Lookup: Calculates file hash values and checks them against known databases to identify potentially malicious files.
- File Type Identification: Determines the file type based on internal signatures, helping to identify file formats even if they have been renamed or have misleading extensions.
- Extension Mismatch Detector: Identifies files with extensions that do not match their detected file type, potentially indicating attempts to hide data.
- Embedded File Extraction: Extracts files from archive formats to analyze their contents, facilitating keyword searches and hash lookups within compressed files.
- Picture Analyzer: Extracts metadata from images using the EXIF format, providing information such as camera settings, geolocation, and timestamps.
- Keyword Search: Allows manual text searches within the data source for specific keywords or phrases.
- Email Parser: Parses and analyzes email communications, assisting in investigations involving email evidence.
- Encryption Detection: Identifies encrypted files using entropy calculations and specialized tests, helping to flag potentially encrypted data.
- Interesting Files Identifier: Enables the identification and flagging of files or directories deemed significant by the examiner.
- Central Repository: Facilitates cross-case and cross-data source artifact matching, aiding in correlation and analysis across multiple cases or data sources.
- PhotoRec Carver: Recovers files from unallocated space on a storage device, useful for retrieving deleted or lost files.
- Virtual Machine Extractor: Adds virtual machines found in the data source as new data sources for further analysis.
- Data Source Integrity: Verifies the integrity of the data source by checking hashes or calculating them if unavailable.
- Drone Analyzer: Analyzes files from the internal SD card of a drone, assisting in investigations involving drone-related incidents.
- iOS Analyzer: Utilizes external modules to analyze iOS logs, events, and property list (plist) files for insights into iOS devices.
- Android Analyzer: Analyzes SQLite and other files from Android devices, aiding in investigations involving Android-based evidence.
- GPX Parser: Imports GPS data from GPX files, useful for mapping and geolocation analysis.
These modules provide forensic examiners with powerful tools to uncover, analyze, and interpret digital evidence effectively during investigations.
So let's run the ingest modules against our data:
Right-click on our server (in my example: Tiny VM).
We select the ingest modules we want to use for our investigation:
You can see the ingest running here:
This can take some time:
It has started to populate; I can see a lot of URLs and emails, as well as encrypted files. I've also tagged 2 Python scripts that I want to come back to and analyze.
More ingest modules that are not included out of the box can be added, for things like browser passwords, cookies, history, and other metadata.
You can find more here:
Autopsy 3rd Party Modules - SleuthKitWiki
&
autopsy/thirdparty at develop · sleuthkit/autopsy (github.com)
Finally, we can generate a case report (I've done no customization):
Conclusion:
By automating the integration between Veeam and Autopsy, organizations can streamline their forensic analysis processes and enhance their incident response capabilities. This integration empowers cybersecurity professionals to investigate incidents more efficiently and effectively, ultimately strengthening the organization's overall security posture by leveraging the data that Veeam already holds. This includes the scenario where a hacker or bad actor tried to clean up after the event; we could still find such evidence in older backup images over time.
This is the process of using backup images as the forensic source, without having to recover the backup image, and while protecting the backup image, since it always remains in a read-only state.
Leave your thoughts and comments, and I'll reply as soon as possible. Thank you in advance for your views.