Dealing with Base64 encoded log data
Is it possible to decode an embedded base64 encoded value in my logs? Testing GCP AppEngine log delivery to Sumo Logic, the logs are delivered as a JSON structure where the actual log data is embedded as a base64 encoded string. E.g.,
{
"message": {
"data":"base64 encoded string here that contains the actual log from AppEngine"
"attributes": {
"logging.googleapis.com/timestamp":"2018-02-23T19:49:24Z"
},
"message_id":"1234",
"messageId":"1234",
"publish_time":"2018-02-23T19:49:28.768Z",
"publishTime":"2018-02-23T19:49:28.768Z"
},
"subscription":"projects/PROJECT_NAME/subscriptions/sumologic"
}
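For reference, the embedded payload can be decoded by hand with just the standard library. A minimal sketch, assuming the Pub/Sub push message structure shown above (the log content here is a made-up sample):

```python
import base64
import json

# A sample Pub/Sub push message as delivered to the HTTP endpoint;
# "data" carries the base64-encoded AppEngine log line (hypothetical content).
raw = json.dumps({
    "message": {
        "data": base64.b64encode(b"sample AppEngine log line").decode("ascii"),
        "attributes": {"logging.googleapis.com/timestamp": "2018-02-23T19:49:24Z"},
        "messageId": "1234",
    },
    "subscription": "projects/PROJECT_NAME/subscriptions/sumologic",
})

# Parse the envelope, then base64-decode the embedded log data.
msg = json.loads(raw)
log_line = base64.b64decode(msg["message"]["data"]).decode("utf-8")
print(log_line)  # -> sample AppEngine log line
```

This is only a manual workaround; the answer below covers having the Source decode it automatically on ingest.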
-
Hi Mark,
If you are using the instructions in the GCP Collection documentation and have updated your HTTP Source with the thirdPartyRef details listed in that doc, the Source should decode the "data" value from your GCP logs on ingest.
The values in the JSON that control decoding of the "data" string are "contentType" and "serviceType", both of which need to be set to "GoogleCloudLogs".
Here is a sample portion of the JSON, with the important portions highlighted:
"contentType": "GoogleCloudLogs",
"thirdPartyRef":{
"resources":[
{
"serviceType": "GoogleCloudLogs",
"path":{
"type":"NoPathExpression"
},
"authentication": {
"type": "GoogleCloudAuthentication",
"validations": [
{
"type":"GoogleCloudValidationDoc",
"name":"<google-validation-html-filename>",
"content": "<google-validation-html-file-content>"
}
]
}
}
]
}
Within the Sumo Logic Collector Management page you can view the current JSON configuration for a Source by selecting the "i" icon to the far right of the Source name. Check whether the values above are set in this JSON; if not, you may need to post an update via the Collection API to set them.
Note: when you first retrieve the JSON via the Collector API, a number of keys are returned, one of which is named "alive". Make sure this key is removed before posting back your updates, as I have seen leaving it in cause the update to silently fail.
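As a sketch of that clean-up step before posting the update back (the field values here are illustrative, and the API may return additional read-only keys besides "alive"):

```python
import json

# JSON as returned when fetching the Source via the Collector API
# (abridged; "alive" is the read-only key that must be dropped before updating).
fetched = {
    "source": {
        "id": 123456789,
        "name": "Google Cloud Logs",
        "alive": True,
        "sourceType": "HTTP",
    }
}

# Remove the read-only "alive" key; leaving it in can make the
# subsequent update silently fail.
fetched["source"].pop("alive", None)

payload = json.dumps(fetched, indent=2)
print(payload)
```

The resulting payload is what you would post back via the Collection API.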
If you continue to have any problems please let me know and I can pull this into a support case and work with you to try and get this corrected.
-Kevin
-
I started following the process more or less from scratch. I regenerated the HTTP source URL in Sumo, added the new source URL to API creds in GCP, downloaded the verification HTML file, created a new API key pair, downloaded the 'source.json' and edited it with the updated information, including 'serviceType: GoogleCloudLogs', and pushed the updated JSON to the source URL.
In the response, serviceType has switched back to 'Other'.
$ grep serviceType source.json
"serviceType":"GoogleCloudLogs",Response after pushing the edited source.json:
"thirdPartyRef":{
"resources":[{
"serviceType":"Other", -
Here is a proper JSON payload for updating the HTTP Source for GCP. We need to strip any non-required values from the JSON in order to address the issue of the content types not being set correctly. Just update the "id", "name", "category", "validations.name", and "validations.content" values to match your HTTP Source.
{
"source":{
"id":xxxxxxxxx,
"name":"Google Cloud Logs",
"category":"google_cloud_logs",
"contentType":"GoogleCloudLogs",
"thirdPartyRef":{
"resources":[{
"serviceType":"GoogleCloudLogs",
"path":{
"type":"NoPathExpression"
},
"authentication":{
"type":"GoogleCloudAuthentication",
"validations":[{
"type":"GoogleCloudValidationDoc",
"name":"googlexxxxxxxxxxx.html",
"content":"google-site-verification: googlexxxxxxxxxx.html"
}]
}
}]
},
"messagePerRequest":false,
"sourceType":"HTTP"
}
}
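Before pushing, it can help to confirm both flags are present; a small check along these lines (using an abridged copy of the JSON above, with ids and names omitted):

```python
import json

# Minimal copy of the Source JSON above, reduced to the fields that matter.
source_json = """
{
  "source": {
    "contentType": "GoogleCloudLogs",
    "thirdPartyRef": {
      "resources": [{"serviceType": "GoogleCloudLogs"}]
    }
  }
}
"""

cfg = json.loads(source_json)["source"]
resources = cfg.get("thirdPartyRef", {}).get("resources", [])

# Both contentType and every resource's serviceType must be "GoogleCloudLogs"
# for the Source to base64-decode the "data" field on ingest.
ok = (cfg.get("contentType") == "GoogleCloudLogs"
      and bool(resources)
      and all(r.get("serviceType") == "GoogleCloudLogs" for r in resources))
print("OK" if ok else "contentType/serviceType not set to GoogleCloudLogs")
```

If this prints the warning after a push, the update did not take and should be re-posted with the stripped-down JSON above.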