Panther Knowledge Base

Can I use the Panther API to find how many enabled detections have fired recently?

QUESTION

Can I use the Panther API to determine how many enabled detections have fired in the last 30 days?

ANSWER

While Panther does offer a REST API for interacting with your detections, it does not currently provide a single call that reports how many enabled detections have fired in the last 30 days. However, you can answer this question with common command-line tools and a Data Lake query, either in your Panther Console or in Python via the API.

To answer this question, first build a list of the detections that are enabled in your environment, and then use that list to search for alert counts per detection.

Step 1: Find the number of enabled detections

CLI instructions

Each Panther detection is represented under the hood by a YAML file with an Enabled field set to true or false. You can therefore find the detections that are enabled in your environment by running a quick grep for `Enabled: true` over your Panther detections directory.

  • If your team manages detections via a remote repository, you can run the command over your local panther-analysis clone. That command might look like this:
    grep -rl "Enabled: true" panther-analysis | xargs awk '/RuleID/ {print $2}'

 

Alternate instructions: Download detections from the Panther Console

Alternatively, if you manage your detections in the console, you can download all of your console detections and run this locally:

  1. In the Panther Console, navigate to Build > Bulk Uploader.
  2. Click Download all entities, and then unzip the file.
  3. In your terminal, navigate to the directory where you unzipped the folder, and run the same command as above, passing in all-entities as the input:
    grep -rl "Enabled: true" all-entities | xargs awk '/RuleID/ {print $2}'

Running the command will output a list of the rule IDs of all detections that are enabled in your environment.
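
If you would rather do this step in Python than with grep, a minimal sketch along the following lines could walk your detections directory and collect the enabled rule IDs. This is an illustrative, hypothetical helper (not part of panther-analysis); it assumes PyYAML is installed and that your detection metadata files use the Enabled and RuleID fields described above.

    # collect_enabled_rules.py -- hypothetical helper, not part of panther-analysis.
    # Prints the RuleID of every detection whose YAML metadata has Enabled: true.
    # Assumes PyYAML is installed (pip install pyyaml).
    import sys
    from pathlib import Path

    import yaml


    def enabled_rule_ids(detections_dir: str) -> list[str]:
        rule_ids = []
        for pattern in ("*.yml", "*.yaml"):
            for path in Path(detections_dir).rglob(pattern):
                try:
                    spec = yaml.safe_load(path.read_text())
                except yaml.YAMLError:
                    continue  # skip files that are not valid YAML
                if isinstance(spec, dict) and spec.get("Enabled") and "RuleID" in spec:
                    rule_ids.append(spec["RuleID"])
        return rule_ids


    if __name__ == "__main__":
        # e.g. python collect_enabled_rules.py panther-analysis   (or all-entities)
        for rule_id in enabled_rule_ids(sys.argv[1]):
            print(rule_id)

Run it against your panther-analysis clone or your unzipped all-entities folder to get the same list the grep pipeline produces.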

 

Step 2: Find alert counts per detection

Now that you have the list of rule IDs, you can query the panther_views.public.all_rule_matches table for a count of all entries matching those rule IDs over the past 30 days.

  • Use a query similar to the following (a short sketch for building this query from your Step 1 rule-ID list appears right after this bullet):

    SELECT p_rule_id, count(*)
    FROM panther_views.public.all_rule_matches
    WHERE p_occurs_since('30 days')
      AND p_rule_id IN ('ruleID_1', 'ruleID_2', 'ruleID_3'...)
    GROUP BY p_rule_id
    ORDER BY 2 ASC
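
If you already have the rule-ID list from Step 1 in Python, a small, hypothetical helper like the one below (plain string formatting, no Panther-specific libraries) can assemble that query for you to paste into Data Explorer:

    # build_query.py -- hypothetical helper for assembling the Step 2 SQL string.
    def build_rule_match_query(rule_ids: list[str], days: int = 30) -> str:
        """Return a Data Lake query counting rule matches per rule ID over `days` days."""
        quoted = ", ".join(f"'{rule_id}'" for rule_id in rule_ids)
        return (
            "SELECT p_rule_id, count(*) "
            "FROM panther_views.public.all_rule_matches "
            f"WHERE p_occurs_since('{days} days') "
            f"AND p_rule_id IN ({quoted}) "
            "GROUP BY p_rule_id ORDER BY 2 ASC"
        )


    # Example:
    # print(build_rule_match_query(["Okta.APIKeyRevoked", "Okta.APIKeyCreated"]))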

 

If you run the SQL query inside Panther's Data Explorer, or construct it via Search, you should see a table of results like this:

p_rule_id                  count(*)
Okta.APIKeyRevoked         2
Okta.APIKeyCreated         2
Panther.User.Modified      2
Panther.SAML.Modified      3

Alternate solution

Alternatively, to accomplish these two steps programmatically, you can create a modified version of the Python script found here: Data Lake Queries > Execute a data lake (Data Explorer) Query. You will need your Panther API endpoint and your Panther API key. See How to use Panther's API for instructions on how to obtain both.
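
As a rough sketch of what such a modified script could look like, the snippet below issues the Step 2 query through Panther's GraphQL API using the requests library. The executeDataLakeQuery/dataLakeQuery operations, the X-API-Key header, and the status values are assumptions based on the linked Data Lake Queries documentation; verify them against the linked example before relying on this.

    # run_report.py -- illustrative sketch only; adapt the example from the linked docs.
    # Assumes the PANTHER_API_URL and PANTHER_API_KEY environment variables are set
    # (see "How to use Panther's API"), and that the GraphQL operations below match
    # your Panther version.
    import os
    import time

    import requests

    API_URL = os.environ["PANTHER_API_URL"]          # your Panther GraphQL API endpoint
    HEADERS = {"X-API-Key": os.environ["PANTHER_API_KEY"]}

    EXECUTE = """
    mutation Issue($sql: String!) {
      executeDataLakeQuery(input: { sql: $sql }) { id }
    }
    """

    POLL = """
    query Results($id: ID!) {
      dataLakeQuery(id: $id) {
        status
        results { edges { node } }
      }
    }
    """


    def run_data_lake_query(sql: str) -> list[dict]:
        """Issue a Data Lake query, poll until it finishes, and return the result rows."""
        resp = requests.post(API_URL, json={"query": EXECUTE, "variables": {"sql": sql}}, headers=HEADERS)
        resp.raise_for_status()
        query_id = resp.json()["data"]["executeDataLakeQuery"]["id"]

        while True:
            poll = requests.post(API_URL, json={"query": POLL, "variables": {"id": query_id}}, headers=HEADERS)
            poll.raise_for_status()
            result = poll.json()["data"]["dataLakeQuery"]
            status = result["status"].lower()        # assumed status values; check the docs
            if status == "succeeded":
                return [edge["node"] for edge in result["results"]["edges"]]
            if status != "running":
                raise RuntimeError(f"query ended with status {result['status']}")
            time.sleep(2)


    if __name__ == "__main__":
        # Plug in the enabled rule IDs gathered in Step 1.
        rule_ids = ["Okta.APIKeyRevoked", "Okta.APIKeyCreated", "Panther.User.Modified"]
        quoted = ", ".join(f"'{r}'" for r in rule_ids)
        sql = (
            "SELECT p_rule_id, count(*) FROM panther_views.public.all_rule_matches "
            f"WHERE p_occurs_since('30 days') AND p_rule_id IN ({quoted}) "
            "GROUP BY p_rule_id ORDER BY 2 ASC"
        )
        for row in run_data_lake_query(sql):
            print(row)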

See list-panther-detections for a complete end-to-end example of how to run this query programmatically.

The linked example can be run like this:

python list_detections.py <your-detections-directory-here>

[{'count(*)': 2, 'p_rule_id': 'Panther.User.Modified'}, {'count(*)': 2, 'p_rule_id': 'Okta.APIKeyCreated'}, {'count(*)': 2, 'p_rule_id': 'Okta.APIKeyRevoked'}, {'count(*)': 3, 'p_rule_id': 'Panther.SAML.Modified'}, {'count(*)': 4, 'p_rule_id': 'AWS.EC2.SecurityGroupModified'}, {'count(*)': 5, 'p_rule_id': 'Panther.Detection.Deleted'}, {'count(*)': 20, 'p_rule_id': 'AWS.IAM.PolicyModified'}, {'count(*)': 22, 'p_rule_id': 'AWS.CloudTrail.IAMAnythingChanged'}, {'count(*)': 47, 'p_rule_id': 'Standard.BruteForceByIP'}, {'count(*)': 9224, 'p_rule_id': 'AWS.CloudTrail.UnauthorizedAPICall'}]
10 detections fired over the past 30 days!