Using a Linux collector to pull JSON log files
Hello,
I've found multiple resources in the Sumo Logic documentation about setting up a Linux collector and working with JSON files, but I feel a bit lost because there are several ways to approach this. Long story short, here's my situation:
I installed a tool that generates a new JSON log file each day, whose name includes a timestamp, and writes it to a log folder. Can I configure the Linux-based collector to push those files up to Sumo Logic? I've seen various options, such as defining a source file or folder via local configuration file management, or using a source.json to define a specific source. However, my interpretation is that these define which files are collected before the collector service starts. Once it's started, it doesn't add more? If that were the case, then the collector wouldn't register the new JSON log file that's created each day?
I think I'm clearly just misinterpreting something here and would love some guidance on how you would accomplish what I'm trying to do.
TL;DR: I have a tool creating daily JSON log files on Linux, and I'm looking to have the collector monitor that folder and push the JSON logs into my Sumo Logic service.
-
Yes, you can configure the Linux-based collector to push JSON log files up to Sumo Logic. First, install a collector on your Linux machine as explained here:
https://help.sumologic.com/03Send-Data/Installed-Collectors/04Install-a-Collector-on-Linux
A source.json is used to configure sources at collector install time, but you can also add sources later from the UI. Configure a Local File Source whose path expression points at your log folder (wildcards are supported), and the collector will continuously scan that path and pick up newly created files, so each day's new JSON log is ingested automatically and in real time.
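As a rough sketch, a source.json for this setup might look like the following. The path, source name, and category here are placeholders; substitute the actual folder your tool writes to:

```json
{
  "api.version": "v1",
  "sources": [
    {
      "sourceType": "LocalFile",
      "name": "daily-json-logs",
      "pathExpression": "/var/log/mytool/*.json",
      "category": "mytool/json",
      "timeZone": "UTC"
    }
  ]
}
```

The wildcard in `pathExpression` is what makes this work for your case: the collector re-scans the expression on an ongoing basis, so files created after the service starts (like tomorrow's timestamped log) are still picked up.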