Structured Logging using Serilog

September 28, 2016

This tutorial demonstrates structured logging using Serilog, with Elasticsearch serving as the backend log repository. This post is a follow-up to the beginner post I wrote on Serilog. If you are just starting with Serilog, I recommend giving that post a quick read first. As usual, I will start with WHY, followed by HOW.

WHY

Earlier, when we had to log the contents of an object, we typically extracted each field, shoved its value into some debug text and finally wrote that text into the logs. When are logs used? When something goes wrong. You start attacking them with Ctrl+F, Find, grep, regexes and what not. And if you want to pull insights out of those logs, say hello to a twisted world of scripts.

Structured logging is different. We just store the object as it is, and the internals of the logging framework de-construct it on your behalf. You no longer need to extract the fields of interest from the object, turn them into strings and shove them into log statements.
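To make the idea concrete, here is a tiny sketch of the difference. The rpm and temperature values are just stand-ins, and without a configured sink the output goes nowhere, but the shape of the two calls is what matters:

using Serilog;

class StructuredVsFlat
{
    static void Main()
    {
        var rpm = 180;
        var temperature = 42.5;

        // Old school: the values are flattened into one string and the structure is lost
        Log.Information("Engine rpm=" + rpm + " temp=" + temperature);

        // Structured: named properties travel with the log event as first-class data
        Log.Information("Engine running at {Rpm} rpm and {Temperature} degrees",
            rpm, temperature);
    }
}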

Want more?
Well, there are document databases like MongoDB and Elasticsearch which provide pretty awesome search functionality on this kind of data. Got it? Search facility!

So let us take a step back.
First, we get to store the object as it is in the logs.
Then, on top of that, we get user-friendly search functionality too.
And if that was not enough, there are ways to find trends and insights in your application logs too.

HOW

This tutorial is what I would call a hairy one: to be a complete demo it touches many things, such as Elasticsearch, which I will not be covering in depth. To make it easier to follow, I am uploading the whole solution below.

SolutionFile

Before we start off, let us discuss the concept of Serilog sinks. In my earlier post I logged data to the console and to files; those were, in essence, sinks. Sinks decide where your logs go. For example, the Serilog console sink sends logs to the good old terminal, the Elasticsearch sink stores logs in an Elasticsearch index (in Elasticsearch lingo, a database is called an index), a MongoDB sink writes logs into MongoDB, and so on. You can check the list of sinks here. As you can see, the possibilities are huge. We have made a leap from plain-text logs to logs stored in a database!
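Just to make sinks concrete, here is a minimal sketch of wiring two of them up in code. This is not how the tutorial project is configured (below we read everything from app.config instead), and it assumes the Serilog.Sinks.Console and Serilog.Sinks.Elasticsearch packages are installed:

using System;
using Serilog;
using Serilog.Sinks.Elasticsearch;

class TwoSinksSketch
{
    static void Main()
    {
        // Every WriteTo.X() call attaches one sink; the same event goes to all of them
        Log.Logger = new LoggerConfiguration()
            .WriteTo.Console()                       // plain old terminal
            .WriteTo.Elasticsearch(
                new ElasticsearchSinkOptions(new Uri("http://localhost:9200"))
                {
                    IndexFormat = "batmobile-{0:yyyy.MM.dd}"   // daily index, as in this tutorial
                })
            .CreateLogger();

        Log.Information("Hello from two sinks at once");
    }
}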

Some groundwork.
You will need to install the NuGet packages for Serilog and the Elasticsearch sink.
To make it easy, here is roughly what the packages.config of the project looks like, so that you can check the versions (your exact versions may differ slightly).

<?xml version="1.0" encoding="utf-8"?>
<packages>
  <!-- Versions are approximate; the Elasticsearch sink version matches the install command below,
       and dependencies pulled in by NuGet (such as Elasticsearch.Net) are omitted. -->
  <package id="Serilog" version="2.0.0" targetFramework="net45" />
  <package id="Serilog.Settings.AppSettings" version="2.0.0" targetFramework="net45" />
  <package id="Serilog.Sinks.Elasticsearch" version="2.0.80" targetFramework="net45" />
</packages>
I am on Visual Studio 2013 and the project targets .NET 4.5.
There are newer versions of these packages on NuGet now, but I found that they do not play nice with each other. You can install these older versions of the packages via the NuGet command line:
Visual Studio -> Tools -> NuGet Package Manager -> Package Manager Console. Run the command given below; note that the SerilogTutorial at the end of the command is the project where I want to install these packages.

Install-Package Serilog.Sinks.Elasticsearch -Version 2.0.80 -ProjectName SerilogTutorial

We will be sending data to the Elasticsearch instance from the application, so a little bit of configuration goes into the app.config of your application (if that file is not there yet, add it). These are, roughly, the settings which the Serilog Elasticsearch sink needs.

  <appSettings>
    <!-- Keys follow the Serilog.Settings.AppSettings convention; adjust the values to your setup -->
    <add key="serilog:using:Elasticsearch" value="Serilog.Sinks.Elasticsearch" />
    <add key="serilog:write-to:Elasticsearch.nodeUris" value="http://localhost:9200" />
    <add key="serilog:write-to:Elasticsearch.indexFormat" value="batmobile-{0:yyyy.MM.dd}" />
  </appSettings>

Elasticsearch.nodeUris is a comma-separated list of the server instances where Elasticsearch is running. I am keeping only one there.

Elasticsearch.indexFormat decides the name of the index where the logs will be written. An index in the world of Elasticsearch is a database, so this setting tells how that database will be named. In the world of logging, a separate index is usually created each day to store that day's logs. Here a new index will be created every day, its name starting with batmobile followed by the date of creation.
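For example, with an index format like the batmobile-{0:yyyy.MM.dd} value shown in the app.config above, the logs written on 28 September 2016 land in:

batmobile-{0:yyyy.MM.dd}  ->  batmobile-2016.09.28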

There are many more options used in industrial environments, but I will omit them to keep this tutorial brief.

After running the program we can look for the logs using Elasticsearch commands. I will go into these in detail in my soon-to-come Elasticsearch tutorial series. For the time being you can take solace in the fact that the data indeed got logged into the Elasticsearch instance.
Here I am firing the command in the terminal just to demonstrate that. See the batmobile-2016.09.28? Yep! That's your log data in Elasticsearch.

> curl http://localhost:9200/_cat/indices?v
health status index                pri rep docs.count docs.deleted store.size pri.store.size
yellow open   .kibana                1   1          2            0      8.8kb          8.8kb
yellow open   batmobile-2016.09.28   5   1          5            0     24.6kb         24.6kb

You can query the data too. I have listed the whole log as it is stored in the Elasticsearch instance at the end of the post, as it is rather long and scary.

With that sorted let us take a deep dive into what happened.

This is the file Engine.cs which actually generates the data. Nothing much here; just keep an eye on the public members, especially the last one, RpmLog. The rest you can ignore.

using System;
using System.Collections.Generic;

namespace Transportation
{
    public class Engine
    {
        public double Temperature { get; set; }
        public int Rpm { get; set; }
        public double Thrust { get; set; }
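        // Public RPM history; this is the collection Serilog will unwrap when it is logged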
        public List<int> RpmLog = new List<int>();

        private Random _random;

        public Engine()
        {
            _random = new Random();
        }

        public void Start()
        {
            Rpm = _random.Next(100, 201);
            Temperature = _random.Next(10, 51);
            Thrust = _random.Next(100, 301);
        }

        public void Run(int timeInSecs)
        {
            for (var i = 0; i < timeInSecs; i++)
            {
                var change = _random.Next(0, 2);
                var rpmChange = _random.Next(0, 21);
                var temperatureChange = rpmChange * _random.Next(0, 101) / 100;
                var thrustChange = rpmChange * _random.Next(0, 101) / 100;

                if (change == 0)
                {
                    Rpm = Rpm + rpmChange;
                    RpmLog.Add(Rpm);
                    Temperature = Temperature + temperatureChange;
                    Thrust = Thrust + thrustChange;
                }
                else
                {
                    Rpm = Rpm - rpmChange;
                    RpmLog.Add(Rpm);
                    Temperature = Temperature - temperatureChange;
                    Thrust = Thrust - thrustChange;
                }
            }
        }
    }
}

This is the file BatMobile.cs where the data was logged.

using Serilog;
using System;

namespace Transportation
{
    public class BatMobile
    {
        private readonly Engine _engine;

        public BatMobile()
        {
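            // Read the sink configuration from the serilog:* keys in app.config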
            Log.Logger = new LoggerConfiguration()
                   .ReadFrom.AppSettings()
                   .CreateLogger();
            _engine = new Engine();
        }

        public void DriveFor(int timeInSecs)
        {
            _engine.Start();
            Log.Logger.Information("Engine Started at {TimeOfStart}", DateTime.Now);
            _engine.Run(timeInSecs);
            Log.Logger.Information("RPM log {RpmLog}", _engine.RpmLog);
            Log.Logger.Information("Engine stats which will not work {WillNotWork}", _engine);
            Log.Logger.Information("Engine stats {@BatmobileEngine}", _engine);
            Log.Logger.Information("Engine ran for {EngineRunDuration}", timeInSecs);
        }
    }
}

Line 21: Log.Logger.Information("Engine Started at {TimeOfStart}", DateTime.Now);
This is where we log the time at which the engine started. Notice that I passed DateTime.Now instead of a string. I did it on purpose: Serilog knows how to process DateTime.Now. In fact you can pass Booleans, numerics, strings, temporals, Guids, Uris and nullables to Serilog as they are. Serilog calls them scalars and will handle each one of them on your behalf. Say bye bye to the ancient days where you extracted values out of these and shoved them into the log statements manually. Shove. I love this word.

But what is the deal with those curly braces around "TimeOfStart"? To put it in layman's terms: normal database tables have columns; Elasticsearch indexes (a table is called an index in Elasticsearch) have fields. Simple. The "TimeOfStart" in curly braces is a request to add a field (read: a column) called "TimeOfStart" to the index (read: a table) where the log data is stored. Now, Elasticsearch has a different way of writing queries, and I will cover the details in my coming Elasticsearch tutorials. For the time being, just understand that if it were a regular database you might have written something like this:

SELECT * FROM mylogdata WHERE TimeOfStart = 'August 30th 2016, 14:16:22.997'
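And, just as a rough sketch of the Elasticsearch flavour of such a query (the proper query DSL will come in the Elasticsearch posts; note the fields. prefix, which you can see in the stored documents at the end of this post):

> curl -XGET 'http://localhost:9200/batmobile-2016.09.28/_search?pretty' -d '
{
  "query": {
    "term": { "fields.EngineRunDuration": 60 }
  }
}'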

Coming back to the topic, this is what Serilog logged.
Engine Started at 08/30/2016 15:51:42

Line 23: Log.Logger.Information("RPM log {RpmLog}", _engine.RpmLog);
Here I am sending _engine.RpmLog to be logged. Remember that this is a list! Serilog is smart enough to unwrap the list and log the individual values in it. Serilog can also handle dictionaries, as long as the keys are scalars (there is a small sketch of that just after the output below).

This is what Serilog logged.
RPM log [194, 213, 215, 196, 184, 190, 196, 195, 203, 197, 202, 199, 211, 205, 203, 212, 201, 187, 181, 178, 184, 172, 169, 186, 182, 189, 206, 220, 220, 231, 212, 230, 249, 238, 248, 240, 254, 273, 264, 272, 253, 262, 269, 260, 260, 257, 259, 260, 277, 292, 288, 276, 276, 268, 269, 256, 249, 232, 251, 262]
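As a small illustration of the dictionary case mentioned above (this snippet is not part of the tutorial solution; the names are made up):

// Requires: using Serilog; using System.Collections.Generic;
var fuelByTank = new Dictionary<string, double>
{
    { "Main", 42.5 },
    { "Reserve", 7.0 }
};
Log.Logger.Information("Fuel levels {FuelByTank}", fuelByTank);
// Rendered roughly as: Fuel levels [("Main": 42.5), ("Reserve": 7)]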

Line 24: Log.Logger.Information("Engine stats which will not work {WillNotWork}", _engine);
This is the line which, as the message says, will not work. I am trying to send the whole object and want Serilog to store it properly. However, Serilog will not do that unless you specifically ask for it. In this case it just calls ToString on the object and logs whatever comes out.

This is what Serilog logged.
Engine stats which will not work "Transportation.Engine"

Line 25: Log.Logger.Information("Engine stats {@BatmobileEngine}", _engine);
We correct the mistake from line 24 by putting the '@' symbol before the name of the property. Serilog then dives into the internals of the object and logs each individual public property.

This is what Serilog logged.
Engine stats Engine { Temperature: 112, Rpm: 293, Thrust: 289, RpmLog: [150, 170, 187, 189, 200, 203, 208, 221, 206, 216, 236, 255, 267, 286, 291, 273, 267, 259, 264, 264, 277, 277, 297, 297, 304, 313, 326, 346, 326, 309, 314, 319, 307, 289, 293, 294, 282, 275, 293, 307, 316, 316, 308, 311, 293, 284, 293, 306, 295, 314, 300, 315, 299, 288, 294, 298, 281, 293, 311, 293] }
Notice how Serilog was able to dive into all the public properties of the object and log their individual values.
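If you only care about a couple of properties, one option (not used in this tutorial) is to destructure an anonymous object instead of the whole engine:

// Hypothetical variant: log just the fields of interest
Log.Logger.Information("Engine snapshot {@EngineSnapshot}",
    new { _engine.Rpm, _engine.Temperature });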

This is a very vast topic. And when we talk of logs and Elasticsearch, the ELK stack, i.e. Elasticsearch, Logstash and Kibana, inevitably comes up. I will be writing some posts on them shortly (hopefully). I hope that this bare-bones introduction to structured logging using Serilog inspires you to give logs the love and respect they so richly deserve.

BTW, are we ready to see the whole thing that was logged into the Elasticsearch database by Serilog?

> curl -XGET http://localhost:9200/batmobile-2016.09.28/_search?pretty    
{
  "took" : 1,
  "timed_out" : false,
  "_shards" : {
    "total" : 5,
    "successful" : 5,
    "failed" : 0
  },
  "hits" : {
    "total" : 5,
    "max_score" : 1.0,
    "hits" : [ {
      "_index" : "batmobile-2016.09.28",
      "_type" : "logevent",
      "_id" : "AVdw-8f6_kHSewVOdpMK",
      "_score" : 1.0,
      "_source" : {
        "@timestamp" : "2016-09-28T23:28:51.2287451+10:00",
        "level" : "Information",
        "messageTemplate" : "Engine Started at {TimeOfStart}",
        "message" : "Engine Started at 09/28/2016 23:28:51",
        "fields" : {
          "TimeOfStart" : "2016-09-28T23:28:51.2247447+10:00"
        }
      }
    }, {
      "_index" : "batmobile-2016.09.28",
      "_type" : "logevent",
      "_id" : "AVdw-8if_kHSewVOdpMO",
      "_score" : 1.0,
      "_source" : {
        "@timestamp" : "2016-09-28T23:28:51.2667489+10:00",
        "level" : "Information",
        "messageTemplate" : "Engine ran for {EngineRunDuration}",
        "message" : "Engine ran for 60",
        "fields" : {
          "EngineRunDuration" : 60
        }
      }
    }, {
      "_index" : "batmobile-2016.09.28",
      "_type" : "logevent",
      "_id" : "AVdw-8if_kHSewVOdpMM",
      "_score" : 1.0,
      "_source" : {
        "@timestamp" : "2016-09-28T23:28:51.2637486+10:00",
        "level" : "Information",
        "messageTemplate" : "Engine stats which will not work {WillNotWork}",
        "message" : "Engine stats which will not work \"Transportation.Engine\"",
        "fields" : {
          "WillNotWork" : "Transportation.Engine"
        }
      }
    }, {
      "_index" : "batmobile-2016.09.28",
      "_type" : "logevent",
      "_id" : "AVdw-8if_kHSewVOdpMN",
      "_score" : 1.0,
      "_source" : {
        "@timestamp" : "2016-09-28T23:28:51.2637486+10:00",
        "level" : "Information",
        "messageTemplate" : "Engine stats {@BatmobileEngine}",
        "message" : "Engine stats Engine { Temperature: 112, Rpm: 293, Thrust: 289, RpmLog: [150, 170, 187, 189, 200, 203, 208, 221, 206, 216, 236, 255, 267, 286, 291, 273, 267, 259, 264, 264, 277, 277, 297, 297, 304, 313, 326, 346, 326, 309, 314, 319, 307, 289, 293, 294, 282, 275, 293, 307, 316, 316, 308, 311, 293, 284, 293, 306, 295, 314, 300, 315, 299, 288, 294, 298, 281, 293, 311, 293] }",
        "fields" : {
          "BatmobileEngine" : {
            "_typeTag" : "Engine",
            "Temperature" : 112,
            "Rpm" : 293,
            "Thrust" : 289,
            "RpmLog" : [ 150, 170, 187, 189, 200, 203, 208, 221, 206, 216, 236, 255, 267, 286, 291, 273, 267, 259, 264, 264, 277, 277, 297, 297, 304, 313, 326, 346, 326, 309, 314, 319, 307, 289, 293, 294, 282, 275, 293, 307, 316, 316, 308, 311, 293, 284, 293, 306, 295, 314, 300, 315, 299, 288, 294, 298, 281, 293, 311, 293 ]
          }
        }
      }
    }, {
      "_index" : "batmobile-2016.09.28",
      "_type" : "logevent",
      "_id" : "AVdw-8if_kHSewVOdpML",
      "_score" : 1.0,
      "_source" : {
        "@timestamp" : "2016-09-28T23:28:51.2617484+10:00",
        "level" : "Information",
        "messageTemplate" : "RPM log {RpmLog}",
        "message" : "RPM log [150, 170, 187, 189, 200, 203, 208, 221, 206, 216, 236, 255, 267, 286, 291, 273, 267, 259, 264, 264, 277, 277, 297, 297, 304, 313, 326, 346, 326, 309, 314, 319, 307, 289, 293, 294, 282, 275, 293, 307, 316, 316, 308, 311, 293, 284, 293, 306, 295, 314, 300, 315, 299, 288, 294, 298, 281, 293, 311, 293]",
        "fields" : {
          "RpmLog" : [ 150, 170, 187, 189, 200, 203, 208, 221, 206, 216, 236, 255, 267, 286, 291, 273, 267, 259, 264, 264, 277, 277, 297, 297, 304, 313, 326, 346, 326, 309, 314, 319, 307, 289, 293, 294, 282, 275, 293, 307, 316, 316, 308, 311, 293, 284, 293, 306, 295, 314, 300, 315, 299, 288, 294, 298, 281, 293, 311, 293 ]
        }
      }
    } ]
  }
}
