What is it?
Structured logging is the practice of implementing a consistent, predetermined message format for application logs that allows them to be treated as data sets rather than text. (https://www.sumologic.com/glossary/structured-logging/)
I wanted a good, like really good, but free, system to ship my logs to.
Over the years I have used PM2, journald and some self-made log files written with Serilog / NLog. Debugging meant waiting for the issue to happen and then digging through the logs.
On the other side there is Azure Application Insights, which I have been using for customers. Unfortunately that is a paid service and I'm not (enough) in control of the data itself.
Then I found Seq, made by Datalust. It has all the features I needed and has a free license option for solo use.
In this blog post I will describe how my setup has been put together.
Technical overview
I have used the Docker version of Seq, with ports 80 and 5341 mapped only to the loopback network interface. On top of that I have configured nginx to proxy the requests and terminate SSL.
To start Seq using Docker, the following commands were used:
# hash the admin password (replace '<123>' with your own password)
PH=$(echo '<123>' | docker run --rm -i datalust/seq config hash)

# run Seq, mapping the web UI (80) and ingestion (5341) ports to loopback only
docker run \
  --name seq \
  -d \
  --restart unless-stopped \
  -e ACCEPT_EULA=Y \
  -e SEQ_FIRSTRUN_ADMINPASSWORDHASH="$PH" \
  -v /home/log/data:/data \
  -p 127.0.0.1:8090:80 \
  -p 127.0.0.1:45341:5341 \
  datalust/seq
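Before putting nginx in front of it, it is worth checking that Seq answers on the loopback ports. A quick sanity check (a sketch only; the ports match the mappings above):

# is the container up?
docker ps --filter name=seq

# the web UI should answer on the loopback-only mapping
curl -I http://127.0.0.1:8090

# any HTTP response here proves the ingestion port is reachable
curl -I http://127.0.0.1:45341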
nginx was configured to:
- Listen on port 80 and redirect all traffic to HTTPS
- Listen on port 5341 (SSL) and forward it to Docker's mapped port 45341 (the Seq ingestion endpoint)
- Listen on port 443 (SSL) and forward it to Docker's mapped port 8090 (the Seq web UI)
- Use a Let's Encrypt certificate obtained with Certbot (see the sketch after the config below)
server {
    listen 80;
    server_name log.server;
    return 301 https://$host$request_uri;
}

server {
    listen 5341 ssl;
    server_name log.server;

    location / {
        proxy_pass http://localhost:45341;
    }
}

server {
    listen 443 ssl;
    server_name log.server;

    location / {
        proxy_pass http://localhost:8090;
    }
}
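Certbot takes care of the certificates. Roughly the following, assuming the standard certbot nginx plugin and the default Let's Encrypt paths (adjust the domain to your own):

# obtain a certificate and let certbot wire it into the port 443 server block
certbot --nginx -d log.server

# the port 5341 server block needs ssl_certificate / ssl_certificate_key lines as well;
# they can point at the same certificate, typically under
#   /etc/letsencrypt/live/log.server/fullchain.pem
#   /etc/letsencrypt/live/log.server/privkey.pem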
# for directadmin
nano /usr/local/directadmin/data/users/username/nginx.conf

# add the following
server {
    listen 5341 ssl;
    server_name logs.domain.tld;

    ssl_certificate /usr/local/directadmin/data/users/username/domains/logs.domain.tld.cert.combined;
    ssl_certificate_key /usr/local/directadmin/data/users/username/domains/logs.domain.tld.key;

    location / {
        # the loopback port the Seq ingestion endpoint is mapped to on this server
        proxy_pass http://localhost:18098;
    }
}

# restart nginx
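On a typical systemd-based host that last step comes down to the following (DirectAdmin setups may use their own service scripts):

# check the configuration first, then restart
nginx -t && systemctl restart nginx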
Test the setup
We can easily test the web page by opening it in a browser, d0h.
The Seq ingestion endpoint can be tested with Postman: getting a response (even a forbidden one) indicates that the nginx and Docker configuration are set up properly.
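The same check works from the command line. A sketch with curl, assuming Seq's raw-event ingestion endpoint and a CLEF-formatted payload (replace the host and the API key placeholder with your own values):

curl -i https://log.server:5341/api/events/raw \
  -H "X-Seq-ApiKey: <123>" \
  -H "Content-Type: application/vnd.serilog.clef" \
  -d '{"@t":"2023-01-01T12:00:00Z","@mt":"Hello from curl"}'
# a 201 Created means the event was ingested; a 401/403 still proves
# that nginx forwards the request to Seq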
The next step was to have a simple console application that writes logs to the Seq ingestion endpoint. I stumbled upon https://jkdev.me/serilog-console/, and based on that I created a small app.
using Serilog;
using Serilog.Core;
using System;
using System.Threading;

namespace SeqIngestionTester
{
    class Program
    {
        static void Main(string[] args)
        {
            Serilog.Debugging.SelfLog.Enable(Console.Error);
            AppDomain.CurrentDomain.UnhandledException += AppUnhandledException;

            using (var logger = BuildSerilog())
            {
                try
                {
                    logger.Information("Hello world");

                    var demo = new Thread(() => { throw new Exception("It's a feature, I promise!"); });
                    //demo.Start();
                    //Task.Delay(10000).Wait();

                    logger.Error("Hello error");
                }
                catch (Exception e)
                {
                    UnhandledExceptions(e);
                }
            }
        }

        private static void AppUnhandledException(object sender, UnhandledExceptionEventArgs e)
        {
            if (Log.Logger != null && e.ExceptionObject is Exception exception)
            {
                UnhandledExceptions(exception);

                // It's not necessary to flush if the application isn't terminating.
                if (e.IsTerminating)
                {
                    Log.CloseAndFlush();
                }
            }
        }

        private static void UnhandledExceptions(Exception e)
        {
            Log.Logger?.Error(e, "Console application crashed");
        }

        private static Logger BuildSerilog()
        {
            var logger = new LoggerConfiguration()
                .WriteTo.Seq("https://log.server:5341", apiKey: "<123>")
                .WriteTo.Console()
                .CreateLogger();

            Log.Logger = logger;
            return logger;
        }
    }
}
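For completeness: the app needs the Serilog core package plus the Console and Seq sinks. With the .NET CLI that is something along the lines of:

dotnet new console -n SeqIngestionTester
cd SeqIngestionTester
dotnet add package Serilog
dotnet add package Serilog.Sinks.Console
dotnet add package Serilog.Sinks.Seq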
The result is immediately visible in the Seq dashboard.