Troubleshooting guide for Panther's parsing errors

Last updated: November 27, 2025

Issue

When trying to ingest some logs, one of the following system errors occurs:

Source [X] experienced errors recently while trying to access S3 objects...invalidstream: log entry is not JSON object

or

"error": "invalid stream: ReadObjectCB: object not ended with }, error found in #10 byte of ...

or

error: parse failed: readObjectStart: expect { or n but found...

In some cases, the payload in the error message may appear compressed or encoded.

Cause

This issue occurs when:

  • The log event is not formatted as valid JSON

  • The log contains invalid or null characters

  • The source double-compresses the payload (e.g., .gz containing already-compressed data)

Resolution

1. Use Raw Event Filters

To automatically exclude improperly formatted events, configure Raw Event Filters in Panther.
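As a local illustration only (not Panther configuration), the effect of such a filter is to drop raw events that do not parse as JSON objects before they reach the parser. A minimal Python sketch of that check:

```python
import json

def is_parseable_json_object(raw: str) -> bool:
    """Keep only lines that parse as a JSON object -- roughly the
    condition a raw event filter would enforce before ingestion."""
    try:
        return isinstance(json.loads(raw), dict)
    except ValueError:
        return False

lines = ['{"user": "alice"}', 'plain text line', '{"ok": true}']
kept = [line for line in lines if is_parseable_json_object(line)]
# 'plain text line' is excluded
```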

2. Inspect the Logs in AWS S3

  1. Log into AWS S3 and download the problematic log file.

  2. Open it in a text editor and check for:

    • Invalid characters. Examples below:

      • NaN: This is not a valid JSON value. If a JSON event contains a bare NaN, parsing fails with the error: "error": "invalid stream: ReadObjectCB: object not ended with }, error found in #1 byte of

    • Formatting anomalies that prevent valid JSON parsing
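Scanning a large downloaded file by eye is error-prone, so a small script can flag the offending lines. This is an illustrative sketch (not a Panther tool); note that Python's json module accepts NaN by default, so the sketch rejects it explicitly to match strict JSON:

```python
import json

def _reject(const: str):
    # json.loads accepts NaN/Infinity by default; reject them explicitly
    raise ValueError(f"non-JSON constant: {const}")

def find_bad_lines(path: str):
    """Return (line_number, reason) for each line that is not a valid JSON object."""
    bad = []
    with open(path, encoding="utf-8", errors="replace") as f:
        for n, line in enumerate(f, start=1):
            line = line.strip()
            if not line:
                continue
            if "\x00" in line:
                bad.append((n, "contains a null character"))
                continue
            try:
                if not isinstance(json.loads(line, parse_constant=_reject), dict):
                    bad.append((n, "valid JSON but not a JSON object"))
            except ValueError as e:
                bad.append((n, f"parse error: {e}"))
    return bad
```

Running this against the downloaded S3 object points you directly at the lines (and reasons) that would trip the parser.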

3. Verify Compression

If the payload appears compressed:

  1. Check the S3 object metadata in AWS Console.

  2. Look for content-encoding: gzip.

    • If the object is already marked as gzip but you still had to decompress the downloaded file manually, the data was compressed twice.

    • Double compression produces malformed data that Panther cannot parse.

Fix: Update the log source configuration so that the data is compressed only once before it is stored in S3.
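Double compression can also be confirmed locally by counting gzip layers: gzip files start with the magic bytes 0x1f 0x8b, so if decompressing once yields data that still starts with those bytes, the payload was compressed twice. A quick sketch:

```python
import gzip

GZIP_MAGIC = b"\x1f\x8b"

def gzip_layers(path: str, max_layers: int = 5) -> int:
    """Count how many gzip layers wrap the file at `path`.
    More than one layer indicates double compression."""
    with open(path, "rb") as f:
        data = f.read()
    layers = 0
    while data[:2] == GZIP_MAGIC and layers < max_layers:
        data = gzip.decompress(data)
        layers += 1
    return layers
```

A result of 2 (or more) confirms the source compressed data that was already compressed.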

4. Pre-process Logs if Needed

If modifying the source isn’t feasible, consider pre-processing logs using tools like Cribl or other filtering solutions to clean or reformat the data before it reaches Panther.
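As a minimal stand-in for such a pre-processing step (a tool like Cribl would do this in-stream rather than on files), the following sketch copies only valid JSON object lines from a source file to a cleaned output file:

```python
import json

def _reject(const: str):
    # reject NaN/Infinity, which Python's json would otherwise accept
    raise ValueError(f"non-JSON constant: {const}")

def clean_log_file(src: str, dst: str) -> int:
    """Copy only lines that parse as strict JSON objects from src to dst.
    Returns the number of lines dropped."""
    dropped = 0
    with open(src, encoding="utf-8", errors="replace") as fin, \
         open(dst, "w", encoding="utf-8") as fout:
        for line in fin:
            stripped = line.strip()
            if not stripped:
                continue
            try:
                obj = json.loads(stripped, parse_constant=_reject)
            except ValueError:
                dropped += 1
                continue
            if not isinstance(obj, dict):
                dropped += 1
                continue
            fout.write(json.dumps(obj) + "\n")
    return dropped
```

The cleaned file can then be uploaded to the S3 bucket Panther reads from, in place of the malformed original.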