Saturday, June 18, 2011

OpenCover First Beta Release

Okay, the first post on a blog I created many, many months ago and still hadn't got round to starting. Why the delay? Well, I've just been busy and not had a lot to say; actually, some would say I have too much to say, it's just not publishable.

But now I am happy to announce that the first release of OpenCover is now available on GitHub https://github.com/sawilde/opencover/downloads.

"So what?" I hear you say, "we have NCover, dotCover and PartCover [and probably many others with the word cover in the name,] do we need another code coverage tool?" Well, I think the answer is "Yes!" but before I say why a brief history.

About a year ago I adopted PartCover when I found it lost, abandoned and only supporting .NET2. PartCover also has a large user base, SharpDevelop and Gallio to name but two, and I felt it was a shame to just let it fall by the wayside. I had also done some work on CoverageEye (another open-source tool that was originally hosted on GotDotNet and has since vanished) whilst working for a client in the UK, so I felt I had a fighting chance of doing the upgrade to .NET4; I don't know if my changes ever got uploaded to GotDotNet as I was not in charge of that.

The adoption was far from easy for a number of reasons. One was that I was surprised by just how little C++ I could actually remember, and the language has changed a bit since I last used it in anger. Another was the lack of communication with the original developers, which meant I was on my own in working out a) how it worked and b) just what the issues were (a lot of the reported issues had long since been abandoned by their reporters).

At the beginning of the adoption I cloned the SourceForge repository to GitHub, git being the in-thing at the time, and after I was eventually granted access to SourceForge I attempted to maintain both repositories. Because I could not get the permissions I needed on SourceForge, no matter how many times I asked, I eventually abandoned it and kept all development on GitHub; I also updated the SourceForge repository with a number of ReadMe posts pointing to GitHub.

So upgrading PartCover progressed, and thankfully bloggers such as David Broman had already covered what to look out for when upgrading .NET2 profilers to .NET4. That, it would turn out, was the easy bit.

PartCover had 3 main issues (other than the lack of .NET4 support):
1) Memory usage
2) 64-bit support
3) If the target crashed you got no results.

I'll tackle each of these in turn:
1) Memory - PartCover builds a model of every assembly/method/instrumented point in memory; though I managed to cut down memory usage by moving some of the data gathering to the profiler host, it wasn't enough. PartCover also adds 10 IL instructions (23 bytes) for each sequence point identified, plus 4 bytes of allocated memory for the counter.

2) 64-bit support - PartCover used a complex COM + named-pipe RPC mechanism, which thankfully just worked, but I couldn't work out how to upgrade it to 64 bit (a few other helpers have offered and then gone incommunicado; I can only assume the pain was too much).

3) Crashing == no results - this was because when the profiler was shut down unexpectedly the runtime never called the ::Shutdown method, and so none of the gathered data was streamed to the host process. Thankfully people were quite happy to fix crashing code, so it was not a major issue, but it was still an annoyance.

All of this would take major rework of substantial portions of the code and the thought was unbearable. I took a few stabs at bits and pieces but got nowhere.

Thankfully I had received some good advice, and though I tried to apply it to PartCover I realised the only way forward was to start again, taking what I had learned from the guys who wrote PartCover and some ideas I had picked up from looking at other open-source tools such as CoverageEye and Mono.Cecil.

OpenCover was born.

This time I created a simple COM object supporting the profiler interfaces and made sure I could compile it as both 32 and 64 bit from day one.

I then decided to make the profiler as simple as possible, so that it stays maintainable, and to move as much of the model handling as possible to the profiler host; thank heavens for Mono.Cecil. The only complex part was deconstructing the IL and reassembling it after it had been instrumented. OpenCover only inserts 3 IL instructions (9/13 bytes, depending on 32/64 bit) per instrumented point; these force a call into the profiler assembly itself, and that C++ code then records the 'hit'.
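
To make the marker idea concrete, here is a minimal sketch that uses Mono.Cecil to rewrite an assembly on disk. This is not OpenCover's code: OpenCover inserts its three-instruction marker at runtime inside the C++ profiler, whereas the sketch below does a static two-instruction rewrite, and the Recorder class and its Visited method are hypothetical stand-ins for the call that is forced into the profiler assembly.

    // Sketch only: insert "ldc.i4 <id>; call Recorder::Visited" at the start of
    // every method body; a real coverage tool does this per sequence point and
    // also has to fix up branches and exception handler ranges afterwards,
    // which is the hard part mentioned above.
    using System.Linq;
    using Mono.Cecil;
    using Mono.Cecil.Cil;

    public static class Recorder
    {
        // Hypothetical stand-in for the recording call.
        public static void Visited(int pointId) { /* record the hit */ }
    }

    public static class InstrumentSketch
    {
        public static void Run(string assemblyPath)
        {
            var module = ModuleDefinition.ReadModule(assemblyPath);
            // Import is named ImportReference in newer Cecil releases.
            var visited = module.Import(typeof(Recorder).GetMethod("Visited"));
            var nextId = 0;

            foreach (var method in module.Types.SelectMany(t => t.Methods))
            {
                if (!method.HasBody) continue;
                var il = method.Body.GetILProcessor();
                var first = method.Body.Instructions[0];
                il.InsertBefore(first, il.Create(OpCodes.Ldc_I4, nextId++));
                il.InsertBefore(first, il.Create(OpCodes.Call, visited));
            }

            module.Write(assemblyPath + ".instrumented.dll");
        }
    }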

Finally, I decided I had to get the data out of the profiler and into the host as soon as possible. I toyed with WCF and WWSAPI, but that meant no XP support; at least it let me test other ideas. However, if my target/profiler crashed I would lose the last packet of data; not drastic, but not ideal. Eventually I bit the bullet and switched to using shared memory.

The switch to shared memory has brought a number of benefits, one of which is the ability to handle several processes under the same profiling session, both 64 and 32 bit, and to aggregate the results as they all use the same shared memory. I have yet to work out how to set this up via configuration files, but anyone wishing to experiment can do so by modifying the call to ProfilerManager::RunProcess in OpenCover.Host::Program.
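
As an illustration of the shared-memory idea only, here is a sketch of a host that creates a named memory-mapped block and reads an array of 32-bit hit counters out of it. The map name, size and counter-array layout are assumptions for the sketch, not OpenCover's actual protocol (which streams richer data); the point is that native profilers can open the same block by name via CreateFileMapping/MapViewOfFile, which is what lets 32- and 64-bit processes under one session share a results area.

    // Sketch only: the host owns a named shared-memory block; one or more
    // profiled processes open it by name and write hit counts into it.
    using System;
    using System.IO.MemoryMappedFiles;

    class SharedMemorySketch
    {
        const string MapName = "Local\\CoverageSketch_Results"; // illustrative name
        const int PointCount = 1024;                            // illustrative size

        static void Main()
        {
            using (var map = MemoryMappedFile.CreateNew(MapName, PointCount * sizeof(uint)))
            using (var view = map.CreateViewAccessor())
            {
                Console.WriteLine("Shared memory ready; start the profiled process(es) now.");
                Console.ReadLine();

                // Whatever has been written survives even if a target crashes,
                // because the host owns the mapping.
                for (var i = 0; i < PointCount; i++)
                {
                    var hits = view.ReadUInt32(i * sizeof(uint));
                    if (hits > 0) Console.WriteLine("point {0}: {1} hits", i, hits);
                }
            }
        }
    }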

So this is where we are now: OpenCover has been released (beta, obviously) and, as of the time of writing, some people have actually downloaded it. I am now braced for the issues to come flooding/trickling in.

Feel free to download it and comment, raise issues on GitHub, get involved; Daniel Palme, he of ReportGenerator fame, is hopefully going to upgrade his tool to include OpenCover.


9 comments:

  1. Is there a quick start to use this?

  2. Not yet... Good idea though, I'll try to get something done in the next day or two. There is a batch file in the download that shows the syntax, and running OpenCover without arguments will give you help.

  3. There are entries on the wiki which should get you started. https://github.com/sawilde/opencover/wiki/Usage

  4. Shaun,

    It is great to see the changes to this project. I have used PartCover and I am itching to download OpenCover as I write this. Prior to writing this I was about to look at adding historical trend information to the data collected by PartCover; have you given this much thought? I see it as a fairly simple addition that could add a great deal to the code analysis.

    Awesome tool!

    Louis Marceau

  5. Hmmm, by historical trend do you mean how the coverage changes with time?

    I'd assume that was more of a reporting task than something I'd put in the profiler itself.

    There is a tool called NCoverCop (http://sourceforge.net/projects/ncovercop/) which compares 2 reports and shows which lines have not been covered or have become uncovered - which may be of interest.

    If you have any ideas and want to take it further please raise them on GitHub (not everyone comes here to read my rants :) )

  6. Hi Shaun,

    I am working on a website which will be a kind of entry point for all the NUnit and code coverage work.

    I was just wondering, can we run OpenCover with NUnit from code using the API?

    I am able to use NUnit and run the tests through code, but I'm not sure how to use OpenCover from .NET code.

    It would be really great if you could guide me in that direction...

    Replies
    1. Hi Adi, OpenCover runs as a console application that in turn executes your unit test suite (whether that be NUnit, MSTest, xUnit, ...). To launch it, just execute it as a process, passing the command-line args. OpenCover comes with docs that explain how to use it, and also an MSBuild task (code also available) which shows how others have called OpenCover in a similar scenario; a rough sketch follows below.

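      A minimal sketch of that, assuming a console executable name and command-line switches that are only illustrative; check the batch file in the download, or run the console without arguments, for the exact syntax:

      // Sketch only: drive OpenCover (which in turn drives the test runner)
      // from .NET code by launching it as a child process.
      using System.Diagnostics;

      class RunCoverageSketch
      {
          static void Main()
          {
              var info = new ProcessStartInfo
              {
                  // Illustrative executable name and switches; adjust to your download.
                  FileName = "OpenCover.Console.exe",
                  Arguments = "-target:nunit-console.exe -targetargs:MyTests.dll -output:coverage.xml",
                  UseShellExecute = false
              };

              using (var coverage = Process.Start(info))
              {
                  coverage.WaitForExit();
              }
          }
      }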
  7. I'm a little confused about why you would need any C++ code at all. I mean, yes, if the .NET process crashes it might be hard to get complete results out, but... I imagine if I were making a code coverage tool I would basically have one big array:

    class Coverage {
        static uint[] HitCount;
        public static void Hit(int i) { ++HitCount[i]; }
    }

    Every line of code would simply be instrumented with

    Coverage.Hit(L);

    via Cecil, with a different L for each line of code in an assembly. Sure, Hit() _could_ transmit the hit count to some external process, but for best performance I'd keep the data in-process. In any case, why would any C++ code be involved? Hmm, I guess the instrumentation process must match source code lines with MSIL locations; is a C++ tool used for that?

    I haven't tried OpenCover yet; is it still in good shape? By the way, I'd really like a coverage tool that tells me the hit count, not just a boolean "hit" or "not hit", and I'd like to see a table of methods that contain large hit counts. That would make it a nice "poor-man's profiler" (useful for analyzing computational complexity, but not wall-clock time). Does OpenCover do that?

  8. The C++ code is actually a profiler that instruments your code as it is loaded by the runtime - the profiler and the host work together to determine what needs to be instrumented. This means that OpenCover will work against any .NET code and does not need your assemblies to be instrumented beforehand (i.e. no special instrumentation builds). OpenCover uses Mono.Cecil to determine the sequence and branch points, which are then sent to the profiler so it can insert the appropriate markers.

    You are right that your way is a possible solution, and in fact PartCover does just that: it uses a big buffer to count the results and then delivers them to the host when the target process closes.

    One OpenCover feature is to track which test was running at the time an instrumentation point is hit, which helps explain its architecture. Another is that you could use the raw stream to see what paths were taken through the code (I haven't got round to doing that bit as I have no need for it at the moment), so knowing the order in which the points were visited is important.

    OpenCover reports hits per sequence and branch point and produces quite a comprehensive report; I recommend using ReportGenerator to visualize it.
