How to log from .NET Core Web API into Elasticsearch on own index


Solution 1

Elasticsearch is "just" a log store and search engine. Before you can browse your logs there, you have to generate those logs.

Configure your application to work with Serilog, for instance (https://stackify.com/serilog-tutorial-net-logging/). It will generate the log events.

Then, configure a sink to Elasticsearch (https://github.com/serilog/serilog-sinks-elasticsearch). It will write your logs where Elasticsearch can read them.
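The two steps above can be sketched together in one place. This is a minimal sketch, assuming the Serilog, Serilog.Sinks.File, and Serilog.Sinks.Elasticsearch packages and an Elasticsearch node at http://localhost:9200 (both the file path and the URI are placeholders):

```csharp
using System;
using Serilog;
using Serilog.Sinks.Elasticsearch;

class Program
{
    static void Main()
    {
        // Write to a rolling log file and to Elasticsearch at the same time.
        Log.Logger = new LoggerConfiguration()
            .WriteTo.File("logs/app-.txt", rollingInterval: RollingInterval.Day)
            .WriteTo.Elasticsearch(new ElasticsearchSinkOptions(new Uri("http://localhost:9200"))
            {
                AutoRegisterTemplate = true
            })
            .CreateLogger();

        Log.Information("Hello from Serilog");

        // Flush any buffered events before the process exits.
        Log.CloseAndFlush();
    }
}
```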

Solution 2

Thank you Skrface for your support. I will summarize my code for others who come across the same issue. (For CLI and solution-folder projects, see below.)

Implementing in .NET Core Web API

add NuGet packages:

  • Serilog
  • Serilog.AspNetCore
  • Serilog.Sinks.Elasticsearch

add to appsettings.json:

"Serilog": {
  "MinimumLevel": "Information",
  "WriteTo": [
    {
      "Name": "RollingFile",
      "Args": {
        "pathFormat": "C:\\Temp\\log-{Date}.txt",
        "outputTemplate": "{Timestamp:yyyy-MM-dd HH:mm:ss.fff zzz} [{Level}] {Message}{NewLine}{Exception}"
      }
    }
  ],
  "Properties": {
    "Application": "DataPicker.Api"
  }
}

modify Startup.cs:

    // requires: using System; using Serilog; using Serilog.Sinks.Elasticsearch;
    // plus Microsoft.Extensions.Configuration for the ConfigurationBuilder
    public IConfiguration Configuration { get; }

    public Startup(IHostingEnvironment hostingEnvironment)
    {
        var builder = new ConfigurationBuilder()
            .SetBasePath(hostingEnvironment.ContentRootPath)
            .AddJsonFile("appsettings.json", optional: true, reloadOnChange: true)
            .AddJsonFile($"appsettings.{hostingEnvironment.EnvironmentName}.json", reloadOnChange: true, optional: true)
            .AddEnvironmentVariables();
        Configuration = builder.Build();
        var uri = Configuration["ConnectionStrings:ElasticSearchConnection"];
        Log.Logger = new LoggerConfiguration()
            .Enrich.FromLogContext()
            .WriteTo.Elasticsearch(new ElasticsearchSinkOptions(new Uri(uri))
            {
                AutoRegisterTemplate = true,
            })
            .CreateLogger();
    }
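Note that by default the sink writes to indices named logstash-{date}. To target the "logging" index asked about in the question, the sink options can also set IndexFormat (a sketch of the same WriteTo call as above, with the extra property):

```csharp
// Variation of the sink configuration above: route events to a
// date-suffixed "logging-*" index instead of the default "logstash-*".
.WriteTo.Elasticsearch(new ElasticsearchSinkOptions(new Uri(uri))
{
    AutoRegisterTemplate = true,
    IndexFormat = "logging-{0:yyyy.MM.dd}"
})
```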

add to the Configure(..) method in Startup.cs:

loggerFactory.AddSerilog();

modify the Controller:

public class MyController : Controller
{
    private readonly ILogger<MyController> logger;

    public MyController(ILogger<MyController> logger)
    {
        this.logger = logger;
    }

and use the logger in the POST / PUT / GET / ... methods like this:

logger.LogDebug("My message");
logger.LogError("Exception: " + ex.Message);
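Put together, a hypothetical GET action using the injected logger might look like this. Structured message templates (rather than string concatenation) are worth using here, because the named values become separate, searchable fields in Elasticsearch:

```csharp
[HttpGet("{id}")]
public IActionResult Get(int id)
{
    // {ItemId} becomes a structured property on the Elasticsearch document.
    logger.LogDebug("Fetching item {ItemId}", id);
    try
    {
        // ... load and return the item ...
        return Ok();
    }
    catch (Exception ex)
    {
        // Pass the exception itself so the stack trace is captured as a field.
        logger.LogError(ex, "Failed to fetch item {ItemId}", id);
        return StatusCode(500);
    }
}
```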

Implementing in .NET Core CLI

add NuGet package:

  • Serilog.Sinks.Elasticsearch

add to Program.cs, inside Main(..):

Log.Logger = new LoggerConfiguration()
    .MinimumLevel.Debug()
    .MinimumLevel.Override("Microsoft", LogEventLevel.Information)
    .Enrich.FromLogContext()
    .WriteTo.Elasticsearch(new ElasticsearchSinkOptions(new Uri("myUri:myPort")) // e.g. "http://localhost:9200"
    {
        AutoRegisterTemplate = true,
    })
    .CreateLogger();

then use it like this:

Log.Debug("Start CLI!");
Log.Error(ex, "Can't create database entry");
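One caveat in a CLI: the process may exit before the Elasticsearch sink has flushed its batch, so it is worth wrapping Main(..) roughly like this (a sketch; Log.CloseAndFlush() is the standard Serilog shutdown call):

```csharp
static void Main(string[] args)
{
    // ... configure Log.Logger as shown above ...
    try
    {
        Log.Debug("Start CLI!");
        // ... do the actual work ...
    }
    catch (Exception ex)
    {
        Log.Error(ex, "Unhandled exception");
    }
    finally
    {
        // Flush any buffered events to Elasticsearch before exit.
        Log.CloseAndFlush();
    }
}
```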

Implementing in .NET Core Solution Folder

Works just like the CLI (see above); just use your class constructor instead of Main(..).

Solution 3

There is now also a standalone logger provider that writes .NET Core logging directly to Elasticsearch, following the Elasticsearch Common Schema (ECS) field specifications: https://github.com/sgryphon/essential-logging/tree/master/src/Essential.LoggerProvider.Elasticsearch

Disclaimer: I am the author.

Add a reference to the Essential.LoggerProvider.Elasticsearch package:

dotnet add package Essential.LoggerProvider.Elasticsearch

Then, add the provider to the loggingBuilder during host construction, using the provided extension method.

using Essential.LoggerProvider;

// ...

    .ConfigureLogging((hostContext, loggingBuilder) =>
    {
        loggingBuilder.AddElasticsearch();
    })
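In context, a minimal generic-host Program.cs could look roughly like this. Only AddElasticsearch() comes from the library; the rest is standard Microsoft.Extensions.Hosting boilerplate, shown here as a sketch:

```csharp
using Essential.LoggerProvider;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;

class Program
{
    static void Main(string[] args)
    {
        Host.CreateDefaultBuilder(args)
            .ConfigureLogging((hostContext, loggingBuilder) =>
            {
                // Sends ECS-formatted documents to http://localhost:9200 by default.
                loggingBuilder.AddElasticsearch();
            })
            .Build()
            .Run();
    }
}
```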

You can then inject the ILogger into your controllers, etc., and write to it using the usual .NET logging, including scopes and semantic values (for a general introduction to logging, see https://docs.microsoft.com/en-us/aspnet/core/fundamentals/logging/):

using (_logger.BeginScope("{CustomerId}", customerId))
{
  _logger.LogWarning("End of processing reached at {EndTime}.", end);
}

The default configuration will write to a local Elasticsearch running at http://localhost:9200/.

There is an example project that includes a docker-compose file to set up a local instance of Elasticsearch and Kibana if you need one: https://github.com/sgryphon/essential-logging/tree/master/examples/HelloElasticsearch

The example project also shows best practice for high performance logging, using the Microsoft LoggerMessage helper.

Once you have sent some log events, open Kibana (e.g. http://localhost:5601/) and define an index pattern for "dotnet-*" with the time filter "@timestamp" (this is the default index pattern for the logger provider).

Note: To use the index logging-*, as per the question, you also need to change a configuration setting; add the following to your appsettings.json file:

{
  "Logging": {
    "Elasticsearch": {
      "Index": "logging-{0:yyyy.MM.dd}"
    }
  }
}
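The Index value is a standard .NET composite format string applied to the event timestamp, so the index name rolls over per day. A small illustration:

```csharp
using System;

// "logging-{0:yyyy.MM.dd}" formatted with an event timestamp:
var index = string.Format("logging-{0:yyyy.MM.dd}", new DateTime(2022, 9, 15));
// index == "logging-2022.09.15"
```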

You can then discover the log events for the index. Some useful columns to add are log.level, log.logger, event.code, message, tags, and process.thread.id.

If you are running multiple applications or on multiple servers, you might want to include service.type, service.version, and host.hostname.

Additional fields are defined below, and all individual message and scope values are logged as labels.* custom key/value pairs, e.g. labels.CustomerId.

One benefit of the ElasticsearchLoggerProvider is that it follows the Elasticsearch Common Schema (ECS) for fields, so it is compatible with other applications that log to Elasticsearch (e.g. Beats).

[Example output: Elasticsearch log events viewed in Kibana]

Solution 4

Personally, I use Filebeat to collect logs from different sources, add a custom field for each one of them (such as app: "myapp1"), and output them to Elasticsearch. Then I create queries in Kibana based on these fields. Example:

filebeat.inputs:

- type: log
  enabled: true
  paths:
    - C:\ProgramData\Elastic\Elasticsearch\logs\*
  fields:
    app: "elasticsearch"

- type: log
  enabled: true
  paths:
    - C:\temp\log\myapp1\*
  fields:
    app: "myapp1"

Although if you really want to have multiple indexes, I recommend using Logstash, which can create an index using patterns or the name of a field. This question has good answers about using Logstash to create multiple indexes.
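As a sketch of that Logstash approach (assuming Filebeat ships its events to Logstash on port 5044, with the app field set as above), the index name can be built from the field:

```
input {
  beats { port => 5044 }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    # e.g. one index per app per day, such as "myapp1-2022.09.15"
    index => "%{[fields][app]}-%{+YYYY.MM.dd}"
  }
}
```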


Author: Frank Mehlhop

Comments

  • Frank Mehlhop
    Frank Mehlhop over 1 year

    I have a .NET Core Web API written in C# and an Elasticsearch instance. On the Elasticsearch I have an index "logging" into which I want to push my logs from the API.

    I cannot figure out how to get my logs from the C# API into the Elasticsearch index "logging". I read documentation like Logging with ElasticSearch..., but I have no Logstash available at my Elasticsearch. So I'm searching for a package that makes logging easy. I think I need to hand over the index "logging" once, so it knows where to log to.

    Can somebody recommend a documentation and / or package for that?

    Or do I need to program it myself?

  • Frank Mehlhop
    Frank Mehlhop about 5 years
    The first link is not really usable for .NET Core Web APIs; also, it logs into a file. The second link provides information, but using that code I get an exception. var logger = new LoggerConfiguration().ReadFrom.Configuration(configuration).CreateLogger(); --> System.InvalidCastException: 'Invalid cast from 'System.String' to 'System.IFormatProvider'.'
  • Skrface
    Skrface about 5 years
    The first link is a general tutorial on how to use Serilog in a .NET application. The information is relevant for .NET Framework, Standard, and Core. The article also provides a link to the Serilog.AspNetCore project (github.com/serilog/serilog-aspnetcore). About your exception, it is hard to tell what is wrong with your code without reading it. However, the serilog-sinks-elasticsearch repository's documentation is as correct as it could be. Keep trying with this sink and you will make it work.
  • Skrface
    Skrface about 5 years
    Glad to see you sorted it out! Could you kindly mark my answer as the accepted answer if it helped you find the missing information? :)
  • Erdogan Kurtur
    Erdogan Kurtur over 3 years
    While this is almost perfect for lightweight logging to Elastic, it does not support authentication, which makes it useless for Elastic Cloud instances.
  • Sly Gryphon
    Sly Gryphon about 3 years
    Yes, the initial version did not include authentication. The code is being integrated into the Elastic .NET project, including adding authentication (for Elastic Cloud, etc.). The updated version is not yet released via NuGet, but the code is merged: github.com/elastic/ecs-dotnet/tree/master/src/…
  • Erdogan Kurtur
    Erdogan Kurtur about 3 years
    Thanks. I discovered that if I include the username/password in the URI, it works, but having them set up in options would be better. Thank you for the library; it could not be any easier to log to Elastic.
  • Tobias J
    Tobias J about 3 years
    Really glad to see an Elasticsearch logging provider that doesn't require Serilog! Any idea when Elasticsearch.Extensions.Logging will be released to NuGet?
  • Roman
    Roman about 3 years
    Waiting for the NuGet package of Elasticsearch.Extensions.Logging!