Ignore Unparseable JSON with jq
Assuming that each log entry is exactly one line, you can use the -R (--raw-input) option to tell jq to read each line as a raw string, after which you can prepend fromjson? | to your filter: jq will then try to parse each line as JSON and silently discard the ones that fail.
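As a sketch of this approach on a mixed stream (the field name msg is just an example):

```shell
# Mixed stream: one JSON line, one plain-text line.
# -R reads each line as a raw string; `fromjson? | .msg` parses the
# string as JSON and extracts .msg, while lines that are not valid
# JSON produce no output instead of an error.
printf '%s\n' '{"level":"info","msg":"ok"}' 'plain text line' \
  | jq -R 'fromjson? | .msg'
# prints "ok"
```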
I have a log stream where some messages are in JSON format. I want to pipe the JSON messages through jq, and just echo the rest.
The JSON messages each appear on a single line.
Solution: use grep and tee to split the lines into two streams: lines starting with "{" are piped through jq, and the rest are just echoed to the terminal.
kubectl logs -f web-svjkn | tee >(grep -v "^{") | grep "^{" | jq .
or
cat logs | tee >(grep -v "^{") | grep "^{" | jq .
Explanation: tee duplicates the stream into a second pipeline; grep -v prints the non-JSON lines, while the second grep passes only the lines that start with a JSON opening brace on to jq.
There are several Q&As on the FAQ page dealing with the topic of "invalid JSON", but see in particular the Q:
Is there a way to have jq keep going after it hits an error in the input file?
In particular, this shows how to use --seq.
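As a sketch of the --seq approach: jq then expects each entity to be prefixed with the RS (0x1E) control character, and per RFC 7464 a reader skips malformed elements, so one bad entity does not abort the run. The sample data here is made up for illustration:

```shell
# Each entity is prefixed with an RS (0x1E, octal \036) byte; the
# middle one is not valid JSON. With --seq, jq reports the bad entity
# on stderr and continues with the remaining entities.
printf '\036{"a":1}\n\036garbage\n\036{"b":2}\n' \
  | jq --seq -c . 2>/dev/null
```

The output entities are themselves RS-prefixed, since --seq applies to output as well as input.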
However, from the sparse details you've given (SO recommends giving a minimal example), it would seem it might be better simply to use inputs. The idea is to process one JSON entity at a time, using try/catch, e.g.:
def handle: inputs | [., "length is \(length)"];
def process: try handle catch ("Failed", process);
process
Don't forget to use the -n option when invoking jq.
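A sketch of invoking this program on a stream of valid entities (the catch branch only comes into play when inputs hits a parse error):

```shell
# -n stops jq from consuming the first entity before `inputs` runs;
# -c prints compact output. On a parse error, the catch branch would
# emit "Failed" and recurse into `process` to keep reading.
printf '%s\n' '[1,2,3]' '"hi"' \
  | jq -nc 'def handle: inputs | [., "length is \(length)"];
            def process: try handle catch ("Failed", process);
            process'
# prints:
# [[1,2,3],"length is 3"]
# ["hi","length is 2"]
```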
See also Processing not-quite-valid JSON.