Is it possible to split a JSON list into multiple records?
I have JSON-formatted records that contain lists like:
[{"port":"8500","alias":"TCP","address":"127.0.0.1","name":"consul"},{"port":"22","alias":"TCP","address":"0.0.0.0","name":"sshd"},..]
What I'd like to do is extract the list items (a non-predetermined number of them) and treat each one as a separate record so I can do aggregation over them. Is this possible?
For example, I'd like to be able to do a count by "port" over the entries of the list.
-
Matthew,
One approach to solving this would be to use a regular expression coupled with the multi modifier.
It might look like this for your case of counting the ports:
port alias address name | parse regex "port\":\"(?<port>\d+)" multi
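If you then want the actual count, you could append an aggregation stage; a minimal sketch, where the trailing count by is my addition rather than part of the suggestion above:
// extract every port value from the message, emitting one result row per match
port alias address name | parse regex "port\":\"(?<port>\d+)" multi
// then aggregate over the extracted rows
| count by port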
Or, you could parse out the complete JSON objects and apply json auto to them after the fact, like so:
port alias address name | parse regex "(?<json_blob>\{[^\}]+\})" multi | json auto field=json_blob
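Since json auto promotes the keys it finds into fields, the same aggregation should work here too, with the bonus that the other keys (name, alias, address) come along for free; again, the count by line is my addition:
// isolate each JSON object as json_blob, then let json auto extract its keys as fields
port alias address name | parse regex "(?<json_blob>\{[^\}]+\})" multi
| json auto field=json_blob
| count by port, name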
-
Thanks for the reference to multi; that was exactly what I was looking for. I was able to get the aggregated results I needed with:
| json "snapshot","hostIdentifier"
| parse regex field=snapshot "(?<json_blob>{[^}]+})" multi
| json field=json_blob "address","name","port","alias"
| count by address,name,port,alias,hostIdentifier
| fields -hostIdentifier
| fields -address
| count by port,alias,name
| num(port)
| sort by port asc
I am trying to add this to an Interactive Dashboard, and it complains that the time range is too long to be run without being optimized (24 hours), even though the query itself only takes 6 seconds to run. Is this a problem with using multi? It seems the heuristic for refusing to run the query is off in this case.
-
Matthew,
Unfortunately, multi isn't supported in Interactive Dashboards for time ranges longer than 6 hours at this time.
We're always working to improve that heuristic, but there are some cases that are difficult to anticipate.
For now, on this query, your best bet is to save the search and run it as needed in the Search interface.