
How To Get Log Level in Json with Serilog - .NET Core 3.1


According to the Twelve-Factor App methodology, an application should write all of its logs to stdout/stderr.

Then you need to collect all the logs together and route them to one or more final destinations for viewing (e.g. Elasticsearch). Open-source log routers (such as Fluent Bit, Fluentd and Logplex) are available for this purpose.

So the app never concerns itself with routing or storage of its logs. In a .NET app you can easily achieve this using Serilog.

Let's say we have the following logger settings in appsettings.json

"Logging": {    "OutputFormat": "console",    "MinimumLevel": "Information"}

We can create an extension method, UseLogging, that writes logs to the console in either plain text or Elasticsearch JSON format, and plug it into the web host builder:

private static IWebHostBuilder CreateWebHostBuilder() =>
    WebHost.CreateDefaultBuilder()
        .UseStartup<Startup>()
        .UseLogging();
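For completeness, this builder is then invoked from Main in the usual way (a sketch: the original post does not show the Program class, so its exact shape here is an assumption based on the standard ASP.NET Core 3.1 template):

public class Program
{
    public static void Main(string[] args) =>
        // builds the host with UseLogging applied and starts the app
        CreateWebHostBuilder().Build().Run();
}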

Plain-text logs are useful during development because they are more human-readable. In production we enable the elasticsearch format and view all logs only in Kibana.
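To switch formats per environment, one option (a sketch, assuming the standard appsettings.{Environment}.json override convention) is to keep "console" in appsettings.Development.json and override the setting in appsettings.Production.json:

"Logging": {
    "OutputFormat": "elasticsearch"
}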

The code of the extension, with comments:

using System;
using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.Configuration;
using Serilog;
using Serilog.Events;
using Serilog.Formatting.Elasticsearch;
using Serilog.Sinks.SystemConsole.Themes;

public static IWebHostBuilder UseLogging(this IWebHostBuilder webHostBuilder, string applicationName = null) =>
    webHostBuilder
        .UseSetting("suppressStatusMessages", "True") // disable startup logs
        .UseSerilog((context, loggerConfiguration) =>
        {
            // read the level from appsettings.json
            var logLevel = context.Configuration.GetValue<string>("Logging:MinimumLevel");
            if (!Enum.TryParse<LogEventLevel>(logLevel, true, out var level))
            {
                level = LogEventLevel.Information; // fall back to the default value
            }

            // get the application name from appsettings.json unless it was passed in
            applicationName = string.IsNullOrWhiteSpace(applicationName)
                ? context.Configuration.GetValue<string>("App:Name")
                : applicationName;

            loggerConfiguration.Enrich
                .FromLogContext()
                .MinimumLevel.Is(level)
                .MinimumLevel.Override("Microsoft", LogEventLevel.Warning)
                .MinimumLevel.Override("System", LogEventLevel.Warning)
                .Enrich.WithProperty("Environment", context.HostingEnvironment.EnvironmentName)
                .Enrich.WithProperty("ApplicationName", applicationName);

            // read any other Serilog configuration
            loggerConfiguration.ReadFrom.Configuration(context.Configuration);

            // get the output format from appsettings.json
            var outputFormat = context.Configuration.GetValue<string>("Logging:OutputFormat");
            switch (outputFormat)
            {
                case "elasticsearch":
                    loggerConfiguration.WriteTo.Console(new ElasticsearchJsonFormatter());
                    break;
                default:
                    loggerConfiguration.WriteTo.Console(
                        theme: AnsiConsoleTheme.Code,
                        outputTemplate: "[{Timestamp:yy-MM-dd HH:mm:ss.sssZ} {Level:u3}] {Message:lj} <s:{Environment}{Application}/{SourceContext}>{NewLine}{Exception}");
                    break;
            }
        });
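To see how structured properties end up in the output, here is a minimal controller action matching the sample log entry below (a sketch; the original post does not include the controller code, so this is reconstructed from the names visible in the sample output):

using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Logging;

[Route("v1/customers")]
public class CustomerController : ControllerBase
{
    private readonly ILogger<CustomerController> _logger;

    public CustomerController(ILogger<CustomerController> logger) => _logger = logger;

    [HttpGet]
    public IActionResult Get(int id)
    {
        // {CustomerId} is captured as a separate field in the JSON output,
        // not just interpolated into the message text
        _logger.LogInformation("Get customer by id: {CustomerId}", id);
        return Ok();
    }
}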

When OutputFormat is elasticsearch, the log entry looks like this:

{"@timestamp":"2020-02-07T16:02:03.4329033+02:00","level":"Information","messageTemplate":"Get customer by id: {CustomerId}","message":"Get customer by id: 20","fields":{"CustomerId":20,"SourceContext":"Customers.Api.Controllers.CustomerController","ActionId":"c9d77549-bb25-4f87-8ea8-576dc6aa1c57","ActionName":"Customers.Api.Controllers.CustomerController.Get (Customers.Api)","RequestId":"0HLTBQP5CQHLM:00000004","RequestPath":"/v1/customers","CorrelationId":"daef8849b662117e","ConnectionId":"0HLTBQP5CQHLM","Environment":"Development","ApplicationName":"API","Timestamp":"2020-02-07T14:02:03.4329033Z"}}

and in the other case (use it only for local debugging):

[20-02-07 13:59:16.16Z INF] Get customer by id: 20

Then you should configure a log router to collect the logs from the container and send them to Elasticsearch.
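For example, a minimal Fluent Bit configuration that tails container logs and ships them to Elasticsearch might look like this (a sketch; the log path, host, and index name are illustrative assumptions, and any other log router such as Fluentd would work equally well):

[INPUT]
    Name   tail
    Path   /var/log/containers/*.log
    Parser docker

[OUTPUT]
    Name   es
    Match  *
    Host   elasticsearch
    Port   9200
    Index  app-logs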

If all logs are structured, searching and creating indexes in Kibana becomes much easier.