Update
We now use Elmah.IO for centralised logging. It’s a great service and incredibly easy to set up. There’s even an API you can use for logging from non-web applications.
Over the past few months I’ve been developing the new version of fabrik (my portfolio and blogging startup for creatives).
The biggest change is that the entire product is now driven by a new publication API, something that I wasn’t quite sure could work but so far has proven to be quite successful.
The current version of fabrik is made up of two web applications: the dashboard (where customers manage their content) and the front-end multi-tenant site (where the customer web sites are served).
The new version is made up of the following applications:
- Fabrik Publication API (ASP.NET Web API)
- Fabrik Management API (ASP.NET Web API)
- Fabrik Web (ASP.NET MVC) - main site where customers can sign up and manage their content.
- Fabrik Hosted (ASP.NET MVC) - where we serve the customer sites
- Fabrik Media (ASP.NET) - where all media content is served
- Fabrik Management Site (ASP.NET MVC) - management/administration site.
So we’ve gone from 2 web applications to 6!
When it came to setting up our logging/monitoring, I realized that my normal practice of installing Elmah and calling it a day probably wouldn’t cut it. We needed centralized logging and we needed the logs to be easily accessible. The latter point is crucial and is often the reason why people end up exposing Elmah for all the world to see or just not checking it (because they can’t be bothered to remote onto the server - I’m in this camp).
I decided that I wanted to use Elmah for exception handling and NLog for everything else (diagnostics, debugging etc.). I chose NLog because it has the best documentation of all the .NET logging frameworks I’ve used. It also supports asynchronous logging - and I love the async!
Configuring Elmah
First of all I went through each application and installed Elmah:
PM> Install-Package elmah
Next I had to configure ASP.NET MVC and ASP.NET Web API to use it. For ASP.NET MVC:
PM> Install-Package Elmah.Contrib.Mvc
Then we just need to replace the existing global filter, HandleErrorAttribute:
public class FilterConfig
{
    public static void RegisterGlobalFilters(GlobalFilterCollection filters)
    {
        filters.Add(new Elmah.Contrib.Mvc.ElmahHandleErrorAttribute());
    }
}
For ASP.NET Web API things are just as simple:
PM> Install-Package Elmah.Contrib.WebApi
Then to register the filter:
public static class WebApiConfig
{
    public static void Register(HttpConfiguration config)
    {
        config.Filters.Add(new Elmah.Contrib.WebApi.ElmahHandleErrorApiAttribute());
    }
}
Now we have in-memory exception logging for all 6 applications.
Since the idea is to centralize logging, we of course need a central location for the logs. SQL Server is a good fit for this, especially as we will have multiple instances of each application. The easiest way to configure Elmah to use SQL Server is to install the elmah.sqlserver package:
PM> Install-Package elmah.sqlserver
This will drop a script into your App_Readme folder that you can execute against your database. It will also add the necessary <errorLog> element to your Elmah configuration in web.config. You then just need to create a connection string:
<connectionStrings>
<add name="loggingdb"
connectionString="Data Source=.\sqlexpress;User ID=logginguser;Password=loggingpwd;Initial Catalog=LoggingDb;" providerName="System.Data.SqlClient" />
</connectionStrings>
...
<elmah>
<security allowRemoteAccess="false" />
<errorLog type="Elmah.SqlErrorLog, Elmah"
connectionStringName="loggingdb"
applicationName="LoggingDemo.Api" />
</elmah>
To easily identify each application I also set the applicationName attribute. Elmah will then display this as the title on the Elmah log and in the database.
So now we have centralized logging for Elmah, and all it took was installing a few NuGet packages.
Whilst we may have centralized logging, we do not yet have centralized viewing, and this is vital if you want your developers (me) to actually check the logs. It is possible to have Elmah send emails, but in my experience these just end up being ignored.
You should never configure Elmah to be publicly accessible, at least not without some kind of authentication. One solution is to aggregate the RSS feeds for each error log and expose the aggregate feed (it would need to be protected of course). Currently this isn't that easy to do since Elmah produces RSS 0.91 feeds (not supported by the System.ServiceModel.Syndication classes in .NET). I have, however, sent a pull request to the Elmah project that adds support for RSS 2.0, so hopefully this will get added soon.
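To make the aggregation idea concrete, here is a rough sketch of what it might look like once RSS 2.0 feeds are available, using System.ServiceModel.Syndication (the class name and feed URLs are hypothetical, and the resulting endpoint would of course need to be secured):

using System.Collections.Generic;
using System.Linq;
using System.ServiceModel.Syndication;
using System.Xml;

public static class ElmahFeedAggregator
{
    // Merges the Elmah RSS feeds from each application into a single feed,
    // ordered with the most recent errors first.
    public static SyndicationFeed Aggregate(IEnumerable<string> feedUrls)
    {
        var items = new List<SyndicationItem>();

        foreach (var url in feedUrls)
        {
            using (var reader = XmlReader.Create(url))
            {
                items.AddRange(SyndicationFeed.Load(reader).Items);
            }
        }

        return new SyndicationFeed(
            "fabrik error log",
            "Aggregated Elmah errors from all applications",
            null,
            items.OrderByDescending(item => item.PublishDate));
    }
}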
An even better (and sexier) solution is to use ElmahR, which provides real time monitoring using SignalR. ElmahR enables you to monitor multiple applications running Elmah.
One thing that wasn't particularly clear when I first looked at ElmahR is that the NuGet package does not include the dashboard. To install the dashboard you need to download a zip file from the downloads page on Bitbucket. I have started a conversation with the project author Roberto Vespa about how we can package the dashboard as a NuGet package for easy inclusion into an existing application (which is what I wanted to do).
Configuring NLog
I wanted to abstract general-purpose logging/debugging behind a lightweight logging interface. This was important as some of our projects (such as the API client) will be open sourced and I wanted to give people the option to use the logging framework of their choice. The first thing I tried was the Common.Logging framework, but I was unable to get this working with NLog so I decided to just roll my own. This will be pushed into Fabrik.Common at some point:
public interface ILogger
{
    void Debug(string message);
    void Trace(string message);
    void Info(string message);
    void Warning(string message);
    void Error(string message);
    void Error(string message, Exception exception);
    void Fatal(string message);
    void Fatal(string message, Exception exception);
}
I also created extension methods for each log level to support formatted strings:
// extension methods have to live in a static class (the class name here is illustrative)
public static class LoggerExtensions
{
    public static void Debug(this ILogger logger, string message, params object[] args)
    {
        Ensure.Argument.NotNull(logger, "logger");
        Ensure.Argument.NotNullOrEmpty(message, "message");
        logger.Debug(message.FormatWith(args));
    }
}
The NLog implementation is pretty trivial:
public class NLogLogger : ILogger
{
    private readonly Logger logger;

    public NLogLogger(Type loggerType)
    {
        Ensure.Argument.NotNull(loggerType, "loggerType");
        logger = LogManager.GetLogger(loggerType.FullName);
    }

    public void Debug(string message)
    {
        logger.Debug(message);
    }

    public void Trace(string message)
    {
        logger.Trace(message);
    }

    public void Info(string message)
    {
        logger.Info(message);
    }

    public void Warning(string message)
    {
        logger.Warn(message);
    }

    public void Error(string message)
    {
        logger.Error(message);
    }

    public void Error(string message, Exception exception)
    {
        logger.ErrorException(message, exception);
    }

    public void Fatal(string message)
    {
        logger.Fatal(message);
    }

    public void Fatal(string message, Exception exception)
    {
        logger.FatalException(message, exception);
    }
}
One thing I like about NLog is that the configuration is very flexible. Even with the above implementation I know I can easily tweak log layouts and targets without having to change any code.
Notice that the NLog logger requires the logger type to be passed to it. We can have our DI tool (StructureMap) handle this automatically:
For<ILogger>()
    .AlwaysUnique()
    .Use(c => new NLogLogger(c.ParentType ?? c.BuildStack.Current.ConcreteType));
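For illustration, a consuming class then just takes a dependency on ILogger and StructureMap supplies an NLogLogger named after that class (PostService here is a hypothetical example):

public class PostService
{
    private readonly ILogger logger;

    public PostService(ILogger logger)
    {
        Ensure.Argument.NotNull(logger, "logger");
        this.logger = logger;
    }

    public void Publish(int postId)
    {
        // uses the formatted-string extension method shown earlier;
        // the underlying NLog logger is named after PostService's full type name
        logger.Info("Publishing post {0}.", postId);
    }
}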
Now we just need to configure NLog to use a central location. We’ll use SQL Server again:
<target name="database" type="Database">
  <dbProvider>mssql</dbProvider>
  <connectionStringName>loggingdb</connectionStringName>
  <commandText>
    insert into LogTable(application, time_stamp, level, logger, message)
    values(@application, @time_stamp, @level, @logger, @message);
  </commandText>
  <parameter name="@application" layout="LoggingDemo" />
  <parameter name="@time_stamp" layout="${date}" />
  <parameter name="@level" layout="${level}" />
  <parameter name="@logger" layout="${logger}" />
  <parameter name="@message" layout="${message}" />
</target>
Since NLog doesn't require a specific table schema, you have complete control over the SQL that is used to write log entries. You can also add custom parameters like I am doing for the @application name.
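For reference, a table along these lines would satisfy the insert statement above (a sketch only; the column types are assumptions, and since ${date} is rendered as text you may want to give it an explicit format in the layout):

create table LogTable
(
    id int identity(1, 1) primary key,
    application nvarchar(100) not null,
    time_stamp datetime not null,
    level nvarchar(20) not null,
    logger nvarchar(250) not null,
    message nvarchar(max) not null
);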
To view the logs we have a few options. NLog is compatible with a number of log viewers, for example Sentinel:
<target xsi:type="NLogViewer" name="viewer" address="udp://127.0.0.1:9999" />
The downside of desktop viewers is that I would still have to remote onto the server to use them. A better solution might be to just query the log table in our database and display the logs on a web page. Then I can view them without remoting onto the server. I expect this is what I will end up doing.
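A rough sketch of that last idea might look something like this (the LogEntry type and LogsController are hypothetical, and it assumes the LogTable schema shown earlier):

using System;
using System.Collections.Generic;
using System.Configuration;
using System.Data.SqlClient;
using System.Web.Mvc;

public class LogEntry
{
    public string Application { get; set; }
    public DateTime TimeStamp { get; set; }
    public string Level { get; set; }
    public string Logger { get; set; }
    public string Message { get; set; }
}

[Authorize(Roles = "Admin")]
public class LogsController : Controller
{
    public ActionResult Index()
    {
        var logs = new List<LogEntry>();
        var connectionString = ConfigurationManager
            .ConnectionStrings["loggingdb"].ConnectionString;

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(
            @"select top 100 application, time_stamp, level, logger, message
              from LogTable order by time_stamp desc", connection))
        {
            connection.Open();

            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    logs.Add(new LogEntry
                    {
                        Application = reader.GetString(0),
                        TimeStamp = reader.GetDateTime(1),
                        Level = reader.GetString(2),
                        Logger = reader.GetString(3),
                        Message = reader.GetString(4)
                    });
                }
            }
        }

        // render the entries with a simple Razor view
        return View(logs);
    }
}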
Real-time diagnostics
I definitely see my usage of Elmah and NLog as two separate concerns. Elmah is all about logging unhandled exceptions whereas I’m using NLog for diagnostics. I’m the first to admit that I don’t normally bother with the latter. I have tests/specifications that document the behaviour of my application so I don’t often find the need to “log” behaviour. However, one thing that would be invaluable is real-time diagnostics.
Imagine one of our customers contacts us regarding an error they are getting within the Fabrik dashboard. We check the Elmah logs and, sure enough, we can see the exception that was thrown. After getting more information from the customer we are able to reproduce the error. Wouldn't it be great if we could "trace" the sequence of events, in real time, that led up to the exception being thrown? With SignalR this is easily achievable.
Since ASP.NET Web API now has tracing out of the box, we can create a custom ITraceWriter to send messages to a client using SignalR. Filip already has a great post on doing this, so I'm not going to cover it again.
We can also apply the same concept to NLog by creating a custom Target:
[Target("NLogSignalR")]
public class SignalRTarget : TargetWithLayout
{
    private readonly HubConnection cn;
    private readonly IHubProxy hub;

    public SignalRTarget()
    {
        cn = new HubConnection("http://localhost:59523");
        hub = cn.CreateHubProxy("nlog");
    }

    protected override void InitializeTarget()
    {
        base.InitializeTarget();

        // open the SignalR connection when NLog initializes the target
        cn.Start().Wait();
    }

    protected override void Write(AsyncLogEventInfo logEvent)
    {
        base.Write(logEvent);

        // render the log event using the configured layout and push it to the hub
        var body = Layout.Render(logEvent.LogEvent);
        hub.Invoke("SendLog", body);
    }
}
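To wire this up, NLog needs to be told about the custom target. Something along these lines in NLog.config registers the assembly containing SignalRTarget and routes all loggers to it (the assembly name here is an assumption):

<nlog>
  <extensions>
    <!-- assembly that contains the SignalRTarget class (name is an assumption) -->
    <add assembly="LoggingDemo" />
  </extensions>
  <targets>
    <target name="signalr" type="NLogSignalR" layout="${logger}: ${message}" />
  </targets>
  <rules>
    <logger name="*" minlevel="Trace" writeTo="signalr" />
  </rules>
</nlog>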
By combining Web API's tracing and the above NLog target, we can get real-time diagnostics of exactly what happens in our application.
I now feel confident that we have the necessary logging and diagnostics in place to be able to support our product and quickly identify and resolve issues.
I hope this post helps others who may be looking to improve their logging and diagnostics infrastructure.