Automated Collection

Consistently Collect Data from Remote Hosts

The first step in an effective first response is to ensure that you collect the right data.  Unlike many commercial tools, Cyber Triage does not require an agent to be installed on a live system.  Instead, the collection tool is pushed out as needed or run from a USB drive.

Its targeted collection approach saves time: it copies only the most important data from the system, so the user does not need to make a forensic image of the entire drive.


Collection tool properties:

  • Runs on Microsoft Windows XP and all newer versions.
  • Requires no installation on target systems. It is pushed to live systems as needed or can run directly from a USB drive.
  • Contained in a single executable, which makes it easy to deploy.
  • Collection can be started manually or automated from a SIEM or other workflow tool using our REST API.
  • Analyzes disk images in raw or E01 formats.
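
As a rough illustration of how a SIEM or workflow tool might kick off a collection, the sketch below builds a JSON request body.  The endpoint shape and field names here are assumptions for illustration only, not the documented Cyber Triage REST API.

```python
import json

def build_collection_request(hostname, incident_id):
    """Build a JSON payload a SIEM integration might POST to start a
    collection.  Field names are hypothetical, not the real API schema."""
    payload = {
        "target": hostname,
        "incident": incident_id,
        "collect": ["volatile", "file_metadata", "targeted_files"],
    }
    return json.dumps(payload)

# Example: queue a collection for one host tied to an incident.
request_body = build_collection_request("WEB-SRV-01", "INC-2024-007")
```

A real integration would POST this body to the Cyber Triage server with appropriate authentication.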

Performs targeted file collection through:

  • Registry analysis identifying startup programs, drivers, services, and programs that were run.
  • Reviewing scheduled tasks.
  • Event log and registry analysis for login and remote desktop activity.
  • Using The Sleuth Kit® forensics library, which makes the collection less vulnerable to typical rootkits and avoids modifying file access times.

Additional collected data includes:

  • Volatile data (running processes, open ports, logged-in users, active network connections, DNS cache, etc.)
  • Metadata for all files, to support timeline, blacklist, and indicator-of-compromise analysis.
  • A file system scan for indicators of data exfiltration and suspicious executables.
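
To give a feel for what a scan for suspicious executables can involve, here is a minimal sketch of one common technique: flagging files that begin with the "MZ" PE/DOS magic bytes but carry a non-executable extension.  This is an illustrative heuristic, not Cyber Triage's actual implementation.

```python
import os
import tempfile

def looks_like_windows_executable(path):
    """True if the file's content starts with the 'MZ' magic bytes that
    open every Windows PE executable, regardless of its extension."""
    with open(path, "rb") as f:
        return f.read(2) == b"MZ"

def scan_for_hidden_executables(root):
    """Walk a directory tree and return files that are executables by
    content but lack a typical executable extension (a masquerading trick)."""
    hits = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            full = os.path.join(dirpath, name)
            if looks_like_windows_executable(full) and \
                    not name.lower().endswith((".exe", ".dll", ".sys")):
                hits.append(full)
    return hits

# Demo on a throwaway directory: an executable disguised as a JPEG.
demo = tempfile.mkdtemp()
with open(os.path.join(demo, "holiday.jpg"), "wb") as f:
    f.write(b"MZ\x90\x00 fake PE header")
with open(os.path.join(demo, "notes.txt"), "wb") as f:
    f.write(b"plain text")
found = scan_for_hidden_executables(demo)
```

Production tools combine many such signals (entropy, signatures, packer detection) rather than relying on a single magic-byte check.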

Fully Automated Analysis

Automatically Identify Known Bad and Suspicious Items

After data is collected from the target system, it is stored in a central database and analyzed.  Cyber Triage immediately gets to work, looking for data that is similar to indicators and evidence from past incidents.

Cyber Triage automatically looks for the evidence that an experienced responder would examine first.  It flags suspicious processes and startup items and sends all collected files for malware analysis.  Any high-threat items found are shown to the user.


Automated analysis techniques will find:

  • Files containing malware, based on results from multiple OPSWAT Metascan engines.
  • Known bad files and other items based on IOCs and blacklists.
  • Windows processes that were tampered with, detected by verifying the parent process hierarchy and owner.
  • Programs and scheduled tasks that were run from uncommon locations.
  • Startup programs, services, or drivers that are in uncommon locations or are not signed.
  • Processes with names that are too similar to normal Windows processes.
  • Processes that could have been exploited and are now running command prompts.
  • Active network connections to uncommon remote ports.
  • Listening ports on uncommon local ports.
  • Remote desktop connections with suspicious users and settings.
  • User accounts with abnormal behaviors and failed logins.
  • Executable files hidden in NTFS Alternate Data Streams.
  • Executable files that have suspicious structure and settings.
  • Encrypted archive files that could be from data exfiltration.
  • And more.

In addition, Cyber Triage will ignore known good operating system and application files based on MD5 hash values from the NIST National Software Reference Library (NSRL), which reduces the amount of data that needs to be analyzed and reviewed.
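
The hash-based filtering idea can be sketched as follows.  The known-good set here is a stand-in; in practice it would be loaded from the NSRL reference data set.

```python
import hashlib

def md5_of(data: bytes) -> str:
    """Hex MD5 digest of a byte string."""
    return hashlib.md5(data).hexdigest()

# Stand-in for a known-good hash set (NSRL in practice).
known_good = {md5_of(b"contents of a stock OS file")}

def needs_review(file_bytes: bytes) -> bool:
    """Skip files whose MD5 matches the known-good set; review the rest."""
    return md5_of(file_bytes) not in known_good

flagged = needs_review(b"unknown binary blob")          # not in the set
skipped = not needs_review(b"contents of a stock OS file")  # known good
```

Filtering out stock files this way can eliminate the bulk of a typical system's contents before any deeper analysis runs.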

Partially Automated Analysis

Help the User Find Anomalous Data

Every host is different because each user has different usage patterns and technical expertise.  When responding to an incident, the responder needs to make decisions about each host. Cyber Triage helps them make those decisions.


When reviewing data, Cyber Triage enables you to make a decision by identifying:

  • Suspicious items based on the automated analysis techniques.
  • Whitelisted items that were marked as good in a previous session.
  • Blacklisted items that were found to be bad in a previous session.
  • How common or rare an item is based on how often it was found in previous sessions.
  • Common data among hosts that are grouped into an incident.
  • What changed on the host since the last time it was analyzed.
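
The "how common or rare an item is" signal can be sketched as a simple prevalence score across past sessions.  The session data below is fabricated for the example; it is not how Cyber Triage stores its history.

```python
from collections import Counter

# Fabricated history: the set of items observed in each past session.
sessions = [
    {"svchost.exe", "explorer.exe", "updater.exe"},
    {"svchost.exe", "explorer.exe"},
    {"svchost.exe", "explorer.exe", "dropper.tmp"},
]

# Count how many sessions each item appeared in.
counts = Counter(item for session in sessions for item in session)

def prevalence(item):
    """Fraction of past sessions containing the item; rare items
    (low prevalence) deserve closer review."""
    return counts[item] / len(sessions)

# Items seen in fewer than half of past sessions stand out as rare.
rare = [item for item in counts if prevalence(item) < 0.5]
```

Items present on every host fade into the background, while one-off items rise to the top of the review queue.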