Is it a good idea to use Serilog to write logs directly to Elasticsearch?

Is it a good idea to use Serilog to write logs directly to Elasticsearch?


I had exactly the same issue. Our system worked with the "classic" ELK-stack architecture, i.e. FileBeat -> LogStash -> Elasticsearch (-> Kibana). But as we found out, in big projects with a lot of logs Serilog is the much better solution, for the following reasons:

  1. CI/CD - when you have different types of logs with different structures that you want indexed differently, Serilog's power comes in handy. In LogStash you need to create a separate filter to break down each message according to its pattern, which means there is tight coupling between the log structure and the LogStash configuration - very bug prone.
  2. Maintenance - because of the easy CI/CD and the single point of change, it is easier to maintain a large amount of logs.
  3. Scalability - FileBeat has problems handling big chunks of data because of its registry file, which tends to "explode" (references from personal experience: a Stack Overflow question and an Elastic forum question).
  4. Fewer failure points - with Serilog the logs are sent directly to Elasticsearch, whereas with FileBeat you have to pass through LogStash: one more place to fail. A minimal configuration sketch follows this list.
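
For comparison, here is a minimal sketch of writing directly from Serilog to Elasticsearch with the Serilog.Sinks.Elasticsearch package. The URL, index format and message below are placeholder values for illustration, not something prescribed by the sink:

using System;
using Serilog;
using Serilog.Sinks.Elasticsearch;

Log.Logger = new LoggerConfiguration()
    .Enrich.FromLogContext()
    .WriteTo.Elasticsearch(new ElasticsearchSinkOptions(new Uri("http://localhost:9200"))
    {
        AutoRegisterTemplate = true,               // register an index template on startup
        IndexFormat = "myapp-logs-{0:yyyy.MM.dd}"  // placeholder: one index per day per application
    })
    .CreateLogger();

// Structured properties (OrderId, Elapsed) become searchable fields in Elasticsearch,
// with no LogStash grok filter needed to parse them out of the message text.
Log.Information("Order {OrderId} processed in {Elapsed} ms", 42, 17.3);

The structure comes straight from the message template, which is what removes the LogStash filter coupling described in point 1.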

Hope it helps you with your evaluation.


There is now also a standalone logger provider that writes .NET Core logging directly to Elasticsearch, following the Elasticsearch Common Schema (ECS) field specifications: https://github.com/sgryphon/essential-logging/tree/master/src/Essential.LoggerProvider.Elasticsearch

Disclaimer: I am the author

To use this from your .NET Core application, add a reference to the Essential.LoggerProvider.Elasticsearch package:

dotnet add package Essential.LoggerProvider.Elasticsearch

Then, add the provider to the loggingBuilder during host construction, using the provided extension method.

using Essential.LoggerProvider;

// ...
    .ConfigureLogging((hostContext, loggingBuilder) =>
    {
        loggingBuilder.AddElasticsearch();
    })

The default configuration will write to a local Elasticsearch running at http://localhost:9200/.
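
Once the provider is registered, you log through the standard Microsoft.Extensions.Logging ILogger<T> abstraction; the service class and message below are only illustrative:

using Microsoft.Extensions.Logging;

public class WeatherService
{
    private readonly ILogger<WeatherService> _logger;

    public WeatherService(ILogger<WeatherService> logger)
    {
        _logger = logger;
    }

    public void Refresh(string city)
    {
        // Named placeholders such as {City} are sent as structured fields
        // in the Elasticsearch document, alongside the rendered message.
        _logger.LogInformation("Refreshing forecast for {City}", city);
    }
}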

Once you have sent some log events, open Kibana (e.g. http://localhost:5601/) and define an index pattern for "dotnet-*" with the time filter "@timestamp".

This reduces the dependencies even more: rather than pulling in the entire Serilog infrastructure (App -> Microsoft ILogger -> Serilog provider/adapter -> Elasticsearch sink -> Elasticsearch), you now only have (App -> Microsoft ILogger -> Elasticsearch provider -> Elasticsearch).

The ElasticsearchLoggerProvider also writes events following the Elasticsearch Common Schema (ECS) conventions, so it is compatible with events logged from other sources, e.g. Beats.