Data Sources
Use the search bar above or navigate the categories below to find articles about Data Sources.
For setup instructions, check out the Panther documentation on Data Sources.
Cloud Accounts
- Cloud Security logs have incorrect label and id values in Panther
- Does Panther support Cloud Security Scanning for platforms other than AWS?
- For Self-hosted accounts, can I remove panther-account from my Cloud Accounts?
- How can I query all my distinct EC2 instances or EKS clusters in Panther?
- How do I list all my connected cloud resources in Panther?
- How do I onboard multiple AWS accounts all at once as Cloud Accounts in Panther?
- How to downgrade or turn off alerts for a specific AWS account within Panther
- My cloud resources have not updated in Panther
- What is the PantherCloudFormationStackSetExecutionRole and do I need it when setting up cloud accounts in Panther?
- Why does my Cloud Account display "Real Time Scanning Not Enabled" in Panther?
- Why did my Cloud Accounts turn unhealthy?
Custom Logs
- Can a Panther schema accept both strings and arrays in the same field?
- Can I delete or rename a schema in Panther?
- Can I specify multiple accepted data types for a log schema field in Panther?
- Can I use multiple Panther indicators for a single field in a schema?
- Can I use multiple timestamp formats in one schema in Panther?
- Can I use Panther's fastmatch in a custom schema for timestamps with spaces?
- Can I use the native parameter in a custom schema in Panther?
- Can Panther authenticate a Stripe-Signature using the HMAC auth method?
- Can Panther parse logs in ORC format from Apache Hive?
- Does Panther have native support for Google Workspace Admin Alerts?
- Does Panther support logs in parquet format?
- Does Panther support parsing nanoseconds for timestamps in custom logs?
- Does Panther support the %-S code from the strftime format for a custom Microsoft schema?
- Error 'Source xx did not pass configuration check' when trying to create a new Azure Blob Storage log source in Panther
- Error 'unexpected format: "%N" found in 10 byte' when running pantherlog parse
- How can I ingest log events into Panther if they contain duplicate field names?
- How can I write multiple pantherlog tests for a schema?
- How does Panther match custom schemas if multiple schemas are used?
- How does the "validate" attribute work in Panther custom schemas?
- How do I change a Custom Schema field type in Panther?
- How do I download a newer version of pantherlog?
- How do I exclude a schema test from a group of tests with Pantherlog?
- How do I infer sample CloudWatch Log Events and/or JSON Array Events in Panther?
- How do I resolve a "DecodeTime: failed to parse" error for a custom schema in Panther?
- How do I resolve pantherlog errors when I try to run multiple schema tests?
- How do I resolve the Panther tool error "cannot be opened because the developer cannot be verified"?
- How is the field p_event_time populated in my custom schema in Panther?
- How to add an unsupported log source to Panther and request new log sources
- How to resolve "EventTime: DecodeTime" parsing error when testing schemas with pantherlog
- How to resolve "Failed to infer schema... error found in #1 byte" when inferring schema in Panther
- Invalid memory address or nil pointer dereference error in Pantherlog
- Is it possible to extract a nested field while ingesting logs to Panther?
- Is there a maximum size limit on data that Panther ingests?
- I get the classification error "wrong number of fields" in my Panther Console while ingesting my logs as CSV data
- Pantherlog test fails with CSV input
- Troubleshooting CLI errors with "pantherlog parse"
- What is the native parser in Panther-provided schemas?
- When adding or removing fields from a custom schema in Panther, what happens to the corresponding columns in the data lake?
- Why are all my incoming logs matching only one schema?
- Why can’t I find logs in the data lake after ingesting data using a custom schema in Panther?
- Why do I see "schema update is not backwards compatible" when updating a schema in Panther?
Data Transports
- "Failed to update source" error when adding IAM role to my Panther SQS log source
- AWS Kinesis Firehose Delivery Streams combine data into one line in S3. How can Panther ingest the logs?
- Can I configure AWS managed Kafka to send data to Panther?
- Can I create multiple Panther log sources from one S3 bucket?
- Can I delete log objects from my source S3 bucket after Panther ingests them?
- Can I partition buckets by their log stream name with a CloudWatch log source in Panther?
- Can I rename my SNS topic from panther-notifications-topic to something different?
- Can I send logs to Panther through a webhook?
- Can Panther ingest compressed data?
- Choosing the best method to ingest GitHub audit logs into Panther
- Does Panther add logs from my S3 bucket that existed before I started using Panther?
- How can I change the waiting period for my Panther Log Source drop-off alarm?
- How can I get Panther to ingest old data from an S3 bucket?
- How can I ingest GuardDuty findings via CloudWatch instead of S3 in Panther?
- How can I share an S3 bucket from other SIEMs with Panther?
- How come no data is coming in for a new S3 log source in Panther?
- How do I configure an S3 log source in Panther with a prefix exclusion or inclusion?
- How do I get copies of logs from Panther S3 buckets into my own AWS account S3 buckets?
- How do I get my events on separate lines when using AWS EventBridge with S3 and Panther?
- How do I resolve "AccessDenied" key errors when ingesting logs to Panther via S3?
- How do I resolve the error "failure to download encrypted files from S3" while ingesting CloudTrail logs in Panther?
- How do I set up a Panther SQS data transport that connects to EventBridge?
- How many prefix filters can I add to an S3 log source?
- How to modify the time Panther waits before sending log source health alarms
- How to reduce log source health alerts for low-volume log sources
- How to solve "Source experienced errors recently while trying to access S3 objects" for Panther Log Source
- Missing data when ingesting data into the webhook HTTP endpoint in Panther
- No data flow or errors after creating IAM role manually for S3 source in Panther
- Panther Log Source error: "Bucket notifications are not properly configured"
- SNS topic not working for an SQS source created in the Panther Console
- What IP does Google see when Panther pulls logs from a GCS bucket?
- What is the ARN of my Panther SQS queue log source?
- What is the default retention period of the panther-input-data-notifications-queue?
- Why are S3 objects being overwritten while Panther’s log processing is reading them?
- Why do I get an error when trying to ingest zst-compressed files in Panther?
- Why is my SNS topic stuck in a "pending confirmation" state for the SQS confirmation for Panther?
Supported Logs
- After fixing an unhealthy log source, why do I still get an error banner in the Panther Console?
- Can I backfill the logs of a new log source into Panther?
- Can I exclude logs from ingestion into Panther?
- Can I integrate with Google BigQuery API to query Gmail logs from Panther?
- Can I pause ingestion of a log source in Panther?
- Can I view log source integration API tokens in plaintext in Panther?
- Can Panther filter sensitive fields such as passwords out of incoming logs?
- Can Panther perform deduplication on incoming log events?
- Does my log source overview in the Panther Console report raw or uncompressed ingested log volume?
- Does Panther allow logs to be overwritten or does it append only?
- Does Panther have a native integration for 1Password auditevents logs?
- Does Panther have out-of-the-box support for ingesting Google Alerts?
- Does Panther support ingesting events from CyberHaven?
- Error "event exceeds maximum size: event at offset 1 is larger than xx Bytes" on Panther AWS Config History log source
- Error "event exceeds maximum size: event at offset 1 is larger than xx Bytes" on Panther CloudTrail log source
- Error "OrganizationDomain invalid, failed to satisfy the condition: tinesDomain" when trying to set up a Tines log source in Panther
- Error 'failed to parse integer: strconv.Atoi parsing ...' when processing AWS S3 Server Access logs in Panther
- How are IP addresses normalized and stored in Panther?
- How can I adapt my custom Panther CrowdStrike detections and queries to work with the Crowdstrike.FDREvent log type?
- How can I exclude logs from my Panther GCP integration?
- How can I identify recently deleted log sources in the "Ingestion By Log Source" graph in the Panther Dashboard?
- How can I ingest AWS EKS logs to only one log source in Panther?
- How can I restrict Panther user access to log types?
- How can I set up multiple CloudTrail log sources in Panther?
- How do I add new AWS CloudTrail log sources to Panther when the original does not have a prefix?
- How do I ingest IP addresses from GitHub Audit Logs into Panther?
- How do I resolve "Organization not found" when ingesting GitHub audit logs in Panther?
- How do I resolve the error "authentication failed with HTTP status code 500: unable to authenticate" when onboarding Salesforce logs to Panther?
- How do I resolve the error "Field validation failed on the 'required' tag" in Panther?
- How do I resolve the Zendesk log error 403 "You do not have access to this page" in Panther?
- How often does Panther pull logs from log sources?
- How often does Panther try to log in to my Salesforce integration if the password is not valid?
- How to Fix "Invalid Redirect" for Panther's Zoom Integration
- How to make separate Snowflake tables for my Panther log sources that use the same Panther-managed log type and schema
- How to resolve "invalid header" error when trying to ingest AWS VPC flow logs through CloudWatch in Panther
- If I delete a log source in Panther, is its data deleted?
- How do I see who created a log source in Panther?
- Is log ingestion case sensitive in Panther?
- Is SystemLog the only Okta log that Panther pulls for Okta integrations?
- Is there a way to integrate CloudTrail logs in Panther without using event notifications?
- I want to ingest a specific field from my log event. Can I do that in Panther?
- Panther latency ingesting audit logs from Snyk
- Troubleshooting log ingestion issues in Panther
- What inclusion filter should I use in my GCS logging sink to only push Panther-supported GCP logs to Panther?
- What is the difference between the Panther log types GSuite.Reports and GSuite.ActivityEvent?
- When a log source is deleted, does Panther capture the source ID in its audit logs?
- When trying to onboard a new Okta log source in Panther I get an Okta API permission error
- Why are 1Password ItemUsage logs missing in Panther?
- Why are certain ItemUsage 1Password Events not showing up in Panther?
- Why did my Zendesk log source in Panther stop receiving logs?
- Why does my GuardDuty log source say it cannot access a log file in Panther?
- Why does my recently parsed event have an old p_event_time in Panther?
- Why do I experience delays in parsing Google Workspace events within my Panther Console?
- Why do I get the error "failed to read line: gzip decompression failed: flate: corrupt input before offset" on my Lacework log source within Panther?
- Why do I see "no bot scopes required" when onboarding Slack audit logs to Panther?
- Why do I see a "log not CSV" error when trying to test sample data against the AWS.ALB schema in the Panther Console?
- Why do I see a “402 payment required” error while onboarding logs from Atlassian to Panther?
- Why do I see a “ratelimited” error while onboarding Slack logs to Panther?
- Why do I see classification failures for GitHub Audit logs when I am using a Panther-provided schema?
- Why do I see high latency on some of my log types in Panther?
- Why do Panther API requests keep blocking me from resetting my Salesforce security token?
- Why haven't I received Salesforce Logs in Panther for the past 24 hours?
- Will there be a gap in the logs if permissions fail in a native Panther log source?