How to set up reading files based on their timestamps
-rw-r--r-- 1 oracle oinstall 4672105 Apr 1 01:02 audit_logs_date_2019_04_01.txt
-rw-r--r-- 1 oracle oinstall 13341173 Apr 2 01:20 audit_logs_date_2019_04_02.txt
-rw-r--r-- 1 oracle oinstall 4208537 Apr 3 01:03 audit_logs_date_2019_04_03.txt
-rw-r--r-- 1 oracle oinstall 4638445 Apr 4 01:03 audit_logs_date_2019_04_04.txt
-rw-r--r-- 1 oracle oinstall 4130853 Apr 5 01:13 audit_logs_date_2019_04_05.txt
-rw-r--r-- 1 oracle oinstall 4099019 Apr 6 01:18 audit_logs_date_2019_04_06.txt
-rw-r--r-- 1 oracle oinstall 4510607 Apr 7 01:14 audit_logs_date_2019_04_07.txt
-rw-r--r-- 1 oracle oinstall 4022591 Apr 8 01:03 audit_logs_date_2019_04_08.txt
Our files are generated for auditing as shown above.
I need to configure something to read these files once a day and store them for review.
I tried the command below, but I don't see the logs in Sumo Logic:
/usr/local/bin/configure-td-agent -f none -p /u02/audit_sumologic/* -n audit
-
Hi Samir,
When you want to ingest data files based on the time they were generated, your best bet is to configure an S3 bucket to store the files, and then configure an S3 source to ingest the data. The S3 source can be based on polling the bucket, or on event notification.
Sumo's S3 integration combines scan-based discovery and event-based discovery into a unified integration, which keeps latency low for new content while providing assurance that no data is missed or dropped.
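As a minimal sketch of the upload half of that setup, a daily job could copy the previous day's audit file into the bucket that the S3 source watches. The bucket name, key prefix, and script path below are placeholders, not from this thread; it assumes GNU date and the AWS CLI are available.

```shell
#!/bin/sh
# Daily upload sketch: push yesterday's audit log to S3 so the
# Sumo Logic S3 source can discover and ingest it.
# ASSUMPTIONS: "audit-logs-bucket" and the "audit/" prefix are made-up
# placeholders; GNU date's -d "yesterday" is available (Linux).
LOG_DIR=/u02/audit_sumologic
YESTERDAY=$(date -d "yesterday" +%Y_%m_%d)
FILE="$LOG_DIR/audit_logs_date_${YESTERDAY}.txt"

# Upload the file; the S3 source then picks it up via polling or
# S3 event notification.
# aws s3 cp "$FILE" "s3://audit-logs-bucket/audit/"
```

You could run this shortly after the logs land, e.g. with a crontab entry such as `30 2 * * * /usr/local/bin/upload_audit_log.sh` (script name hypothetical).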
For further information on configuring S3 sources, please refer to the Sumo Logic documentation on Amazon S3 sources.
Let us know if this is helpful, and if you have any other questions. We also have a great public slack channel at slack.sumologic.com if you prefer that method for help.
Thanks for posting to the community!