ServiceStack exception enricher for Serilog

I've been using Serilog recently and it's proving to be extremely useful and easy to use. One of the nice features of Serilog is enrichers: they allow you to add properties to your log events. You can find out more about them here

An application I was troubleshooting the other day was getting a WebServiceException from a ServiceStack API request. The stack trace was added to my logging events, but it wasn't easy to find the relevant exception information in it.

This is the perfect use case for an enricher, so I wrote one. It maps the exception's properties to log event properties in my structured log event data.
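
A minimal sketch of what such an enricher can look like (assuming ServiceStack v4's WebServiceException; the property names are illustrative, not necessarily the ones the gist uses):

// Sketch only: map WebServiceException details onto the log event
using Serilog.Core;
using Serilog.Events;
using ServiceStack;

public class ServiceStackExceptionEnricher : ILogEventEnricher
{
    public void Enrich(LogEvent logEvent, ILogEventPropertyFactory propertyFactory)
    {
        // Only enrich events that carry a ServiceStack WebServiceException
        var wse = logEvent.Exception as WebServiceException;
        if (wse == null) return;

        logEvent.AddPropertyIfAbsent(propertyFactory.CreateProperty("WseStatusCode", wse.StatusCode));
        logEvent.AddPropertyIfAbsent(propertyFactory.CreateProperty("WseErrorCode", wse.ErrorCode));
        logEvent.AddPropertyIfAbsent(propertyFactory.CreateProperty("WseErrorMessage", wse.ErrorMessage));
        logEvent.AddPropertyIfAbsent(propertyFactory.CreateProperty("WseServerStackTrace", wse.ServerStackTrace));
    }
}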

Adding this to your logger configuration is as easy as:

// using Serilog;
var log = new LoggerConfiguration()  
    .Enrich.With<ServiceStackExceptionEnricher>()
    .WriteTo.Console()
    .CreateLogger();
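
With that in place, any WebServiceException that reaches a log call carries its details as first-class, searchable event properties. A quick illustration (the GetOrders DTO and service URL are made up for the example):

// using ServiceStack;
try
{
    var client = new JsonServiceClient("http://orders.example.org");
    var response = client.Get(new GetOrders());
}
catch (WebServiceException ex)
{
    // The enricher surfaces the status code, error code/message and server
    // stack trace on this event, with no extra work at the call site
    log.Error(ex, "GetOrders request failed");
}

// Hypothetical request/response DTOs
public class GetOrdersResponse { }
public class GetOrders : IReturn<GetOrdersResponse> { }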

Now I can see and search for all those exception properties in my log sinks.

Perfect! :)

Showing the 💕 for xUnit.net and dotCover in TeamCity

I've been using TeamCity as my CI server for about six or seven years now. I've also been using xUnit.net since it first appeared on CodePlex.

I was also a long-time user of NCover on TeamCity for my code coverage. Following their change in direction after v3, which I didn't like, I sacrificed the trend functionality and switched to dotCover around five years ago. I still miss trends though... hint hint, JetBrains!

Atrophy

However, in all that time, as xUnit.net usage has blossomed, TeamCity's native support for it in combination with dotCover has been non-existent; only NUnit gets the shrink-wrapped integration. I'm sure there must be a pretty sizable user base for native xUnit.net functionality, so it mystifies me why tickets like this are still not a priority for development.

There has been a meta-runner for xUnit.net around for a while, a recent plugin competition produced an xUnit.net plugin entry, and there are a bunch of posts on the web offering .NET executable runner solutions to set up xUnit.net. All of these, though, have some limitations or do not include dotCover.

Ok so what now?

It's time to show some more 💕 to this super-handy tooling ménage à trois.
I wanted to tame a few of the limitations of the implementations out in the wild, so here are the highlights:

  • Support for local or remote NuGet package feeds for those corporate-proxy worlds
  • Easy support for testing .NET legacy activation libraries
  • Decent wildcard support for finding test libraries
  • Traits ...are just awesome; if you don't use them, you should start! (see the sketch after this list)
  • Test runner extensibility for all those edge-case scenarios when you need to pass an arg to the test runner
  • Full-fat filtering support
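
Traits deserve a quick illustration: they're just name/value metadata on a test, which the runner can filter on (the test class here is made up):

// using Xunit;
public class CheckoutTests
{
    [Fact]
    [Trait("Category", "Unit")]
    public void Totals_are_calculated() { /* ... */ }

    [Fact]
    [Trait("Category", "Integration")]
    public void Payment_gateway_is_called() { /* ... */ }
}

The xUnit.net console runner can then include or exclude tests by trait, e.g. -trait "Category=Unit" to run only the unit tests, which is the kind of filtering the meta-runner builds on.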

The future...

With new and awesome test runners like Fixie appearing, I expect that with a bit more work this meta-runner could be made runner-agnostic, but for now it does everything I need.

Let me know how you get on, and if you find any bugs or issues with it, post them via the GitHub gist.

Why no plugin?

Of course this would no doubt be cleaner as a plugin, and I did consider trying to write one to enter the plugin competition, but I'm not a Java native, so I didn't :(

Update

The good folks at JetBrains have now merged this runner into the power pack. You can get it here

ode to the next guy

The villain

So today I'm working on a project that I have checked out but never run on my desktop before. I am primarily interested in finding a trouble spot that is spewing out a lot of exceptions and making it hard to separate the actual problems from the noise.

I open the project and, armed with a rather vague exception stack trace, go a-wandering through the codebase in search of my foe. After a couple of minutes of squinting I am no nearer the jackpot, so I decide to run and debug the tests.

When is green not green?

Our CI server builds and runs these tests all the time; the builds are currently green, so I know they won't reveal much. The same goes for the integration tests, which we normally disable locally but run on a nightly CI build.

So I run the unit tests and of course they all "pass".

hmm, lots of tests failed, and nothing to do with my foe. They failed because I do not have the magic set up on my machine that the project expects in order to get its jollies.

more hat-swop than hot-swap :(

We sometimes keep scripts in the repo root to do various setup bits, but I forget to look there. The Partridge 'AHA!' moment comes eventually, once I see them. So, like a good citizen, the first change I make is to add a git repo readme and reference the script, so the next guy doesn't just have to know.

Now I run the script and all is well...

...wait... it did something, but I can't read at console buffer-flush speed and now the window has closed. I think I saw an error message.

So I open, read and mentally parse the batch script. It launches a PowerShell script to configure some stuff. Nothing terribly taxing, but it doesn't help that it errored and I don't know why.

My next task as a good citizen is to add a pause at the end of the script. You know the one: 'press any key to exit'. Now the next guy doesn't have to have an eidetic memory.

I run the script again and now there are no errors. I guess that the first error was expected, and proceed.

So! I'm done with the setup and I run the tests again.

Of course this time they all "pass".

damn, what now!

ok, I dig into the failing tests and rediscover that the project needs more magic installed, which isn't covered by the setup script. I know this already, as I'm familiar with the project. I know lots of things, and I forget them too.

To deploy or not to deploy...

I'm wondering how our deployment server pushes this thing out to servers; I don't recall it needing all this ceremony around setup, so I go off to check it out.

I'm opening, reading and parsing PowerShell steps in Octopus Deploy.
Of course! There it is in the project deployment steps: the additional magic.

why doesn't the code just set up all this magic for me?

High maintenance honey badger...

Back to the good citizen routine: I update the readme with my magic gems, and open and read the config files to get the values I need. You know, for the next guy.

I install and configure more magic.

Back to update the readme as I manually create the magic that it requires.

finally I run the tests again and they all "pass"

oh for f@%* sake, what now!
I'm not running Visual Studio as an administrator.

back to update the readme

I run the tests and they pass

ahhh!

except I need the integration tests, and they don't run locally. Those are the tests I really need to run, to recreate the application conditions which are generating the exceptions. The exact change I need to make to enable them is in the test output. Thank you, previous guy!

I change the config option to enable local running of the integration tests.

of course now they all "pass"

whhhhhhhhhhhhhhhhhhhyyyyyyyyyyyyy!
I read the test output and the code, and find the integration tests require a whole bunch more magic installed and configured. I am now beyond the realm of our deployment; this stuff is only ever configured, not created, during deployment.

back to update the readme...

more setup

Finally, after an hour of setting up magic, I run the integration tests and they all pass. I can now get to the actual code, to find and fix the issue I'm interested in.

epilogging

I start out with a simple task to find and fix something in the code, and end up several hours later creating a readme with a getting started guide and improving the setup, all before I actually get to do the thing I want. Hopefully the next guy will not have to deal with this.

If any of this is familiar to you, perhaps you should find a new machine, spin up a dev VM, or even ask the dev sitting next to you to set up and run one of your projects from scratch, especially if it isn't your current project. Getting up and running on a project from scratch should be easy, or have instructions for anything that isn't. Because, you know...

for the next guy!