Using Logstash Logback Encoder for a Java Application

Comments

6 comments

  • Official comment
    Graham

    Update: We now have an official Logback appender:

    https://github.com/SumoLogic/sumologic-logback-appender

    ____________________________________________

    Hi Chris,

    In order to use JSON parsing in Sumo Logic, you just need to ensure the message is a valid JSON object. I see in that link that the Logback encoder will output events as Logstash-compatible JSON, so assuming this is normal JSON we shouldn't have any issues parsing it.

    Also, some customers use this open source Logback appender to log straight to Sumo Logic: https://github.com/relateiq/sumo-logback-appender

    Let me know if this answers your question, I'm happy to assist further. If you use Slack, you can join our public Slack channel here slack.sumologic.com - I'm @grahamwatts-sumologic.
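    For reference, a minimal logback.xml sketch wiring up the logstash-logback-encoder might look like the following (the appender name and file path are placeholders; `LogstashEncoder` writes one JSON object per log event, newline-delimited):

```xml
<configuration>
  <appender name="JSON_FILE" class="ch.qos.logback.core.FileAppender">
    <!-- Placeholder path: point this at your application's log file -->
    <file>/var/log/app/app.json.log</file>
    <!-- Emits each log event as a single-line, Logstash-compatible JSON object -->
    <encoder class="net.logstash.logback.encoder.LogstashEncoder"/>
  </appender>
  <root level="INFO">
    <appender-ref ref="JSON_FILE"/>
  </root>
</configuration>
```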

  • chris.anatalio

    Graham, 

    Thanks for the response.


    We will be logging valid JSON objects to our log files. However, the log file as a whole will not be valid JSON.

    The log file will just be valid JSON log objects separated by newlines. Will this work, or does the entire log file have to be a valid JSON document, i.e. a JSON array of our JSON log objects?

     

    Example log file: 

    {
      "@time": "2017-08-14 17:43:44.630",
      "level": "INFO",
      "logSource": "com.core.logging.ControllerLogging",
      "msg": "Request TrackerId=tracking:1234: End request for resource requestMethod=POST path=/api/. Status statusCode=500. Total time: duration=183 ms",
      "jsonMsg": null,
      "thread": "eventloop-thread-0",
      "OG-TrackerId": "tracking:1234",
      "requestMethod": "POST",
      "path": "/api/",
      "statusCode": 500,
      "duration": 183,
      "context": "default"
    }
    {
      "@time": "2017-08-14 17:43:44.630",
      "level": "INFO",
      "logSource": "com.core.logging.ControllerLogging",
      "msg": "Request OG-TrackerId=tracking:12345: End request for resource requestMethod=POST path=/api/. Status statusCode=500. Total time: duration=183 ms",
      "jsonMsg": null,
      "thread": "eventloop-thread-0",
      "OG-TrackerId": "tracking:12345",
      "requestMethod": "POST",
      "path": "/api/",
      "statusCode": 500,
      "duration": 183,
      "context": "default"
    }
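For what it's worth, this newline-delimited layout (often called NDJSON or JSON Lines) can be validated with a few lines of Python, assuming each event is serialized on a single line, which is what the encoder actually emits; the field names below just mirror the example above:

```python
import json

def parse_ndjson(text):
    """Parse newline-delimited JSON: each non-empty line is one JSON object."""
    return [json.loads(line) for line in text.splitlines() if line.strip()]

# Two events, each on its own line, separated by newlines (no wrapping array).
log_file = (
    '{"@time": "2017-08-14 17:43:44.630", "level": "INFO", "statusCode": 500}\n'
    '{"@time": "2017-08-14 17:43:44.630", "level": "INFO", "statusCode": 500}\n'
)

events = parse_ndjson(log_file)
```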
  • chris.anatalio

    Our team considered: https://github.com/relateiq/sumo-logback-appender but decided against it.

    Factors in our decision include:

    • Last commit was over 2 years ago
    • Only 1 contributor and 30 commits
    • Lack of a README or any documentation

    It did not seem like a mature, well-supported, or well-documented tool.

  • Graham Watts

    Thanks for the info, Chris. Here is another, more recently updated Logback appender for Sumo Logic: https://github.com/vital-software/sumo-logback

    The correct way to send logs into Sumo Logic is one valid JSON object per message. If you are sending multiple messages, you might be able to use a boundary regex to have Sumo Logic split them, as long as they are not wrapped in [brackets].
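To illustrate what a boundary regex does, here is a rough sketch of how a pattern anchored on the start of each event would split a concatenated stream; the `{"@time` prefix is just an assumption borrowed from the example log above, and the real setting lives on the Sumo Logic source configuration:

```python
import re

# Split a raw log stream into messages wherever a new event begins.
# Assumes each event starts with {"@time at the beginning of a line.
boundary = re.compile(r'(?m)^(?=\{"@time)')

raw = (
    '{"@time":"2017-08-14 17:43:44.630","level":"INFO"}\n'
    '{"@time":"2017-08-14 17:43:45.001","level":"WARN"}\n'
)

# Zero-width split: each piece keeps its full JSON text intact.
messages = [m for m in boundary.split(raw) if m.strip()]
```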

  • chris.anatalio

    Thanks. We ended up using the Sumo Logic agent to ship logs to CloudWatch, then hooked up a Lambda to send our logs to Sumo Logic via an HTTP Source endpoint.

    Logs are being parsed correctly.
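For anyone wiring up the same pipeline: CloudWatch Logs delivers events to Lambda gzip-compressed and base64-encoded under `awslogs.data`, so the handler has to unpack them before forwarding. A rough sketch, where the endpoint URL is a placeholder for your actual HTTP Source:

```python
import base64
import gzip
import json
import urllib.request

# Placeholder: replace with your Sumo Logic HTTP Source upload URL.
SUMO_ENDPOINT = "https://endpoint.collection.sumologic.com/receiver/v1/http/XXXX"

def decode_cloudwatch_event(event):
    """Unpack the gzip+base64 payload CloudWatch Logs sends to Lambda."""
    payload = base64.b64decode(event["awslogs"]["data"])
    return json.loads(gzip.decompress(payload))

def handler(event, context):
    data = decode_cloudwatch_event(event)
    # Each logEvent's message is already one JSON object per line from the encoder.
    body = "\n".join(e["message"] for e in data["logEvents"])
    req = urllib.request.Request(
        SUMO_ENDPOINT, data=body.encode("utf-8"), method="POST"
    )
    urllib.request.urlopen(req)
```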

  • Graham Watts

    Glad to hear it's working! A question on that configuration: why not use our collector to send the data straight to Sumo Logic rather than into a CloudWatch log group? That would be the lowest-latency way to analyze and alert on data in Sumo.


