Sunday, July 31, 2011

Developer Survey Results - Part 3

This is the third part of my walk-through of the results of a survey I recently did (part 1, part 2). This time I focus on practices around continuous integration.

The survey shows that 58% of respondents use CI, which is lower than I expected.

Those who use CI have their CI servers do different things. The answers show respondents use their servers to:

Run unit tests: 86%
Run integration tests
Run system tests
Run UI tests
Run acceptance tests
Create release or deployment package
Create tag/branch in source control on success
Deploy to non-production environment (like test or staging)
Deploy to production

So - not surprisingly - people build their code on the CI server. Then they seem to run the tests they've written - the numbers for running different kinds of tests align nicely with the numbers for writing different kinds of tests (see part 1). Furthermore, over half create release or deployment packages. That's pretty high, I think - even if it's only just over half of the half of respondents that use CI at all, i.e. about a quarter of all respondents. Only very few take the next step and actually deploy to production. That is as expected; continuous deployment is still pretty new.
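The steps respondents reported could be sketched as a simple build script like the one below. This is a hypothetical illustration, not any real CI product's configuration; the step names are taken from the survey answers and the commented-out commands are placeholders.

```shell
#!/bin/sh
# Hypothetical sketch of a CI build script mirroring the steps from the survey.
# The commands in the comments are placeholders, not a real project's tooling.
set -e                # any failing step breaks the build, as on a CI server

log=""
step() {              # announce a pipeline step and record that it ran
    echo "step: $1"
    log="$log$1;"
}

step "build"                # e.g. make build
step "unit tests"           # e.g. ./run_unit_tests.sh (the step 86% run)
step "integration tests"    # slower tests, still run on every commit
step "package"              # e.g. tar czf release.tar.gz build/
step "tag"                  # e.g. git tag on success
step "deploy to staging"    # push the package to a non-production environment
echo "build passed"
```

The point of `set -e` is that the first failing step stops the script, which is exactly how a CI server turns one failing test run into a broken build.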

The next question asked how often CI runs, and it shows that by far most people have it running on every commit.

Turning to how often things break, the distribution is as follows:

Most of the time: 11%
Several times a day: 3%
Once a day: 33%
Once a week: 42%
Once a month: 6%

So it seems something is broken fairly often. Is that a problem? -Not in my opinion. It just shows that the CI server finds stuff. What could be a problem is if this stuff isn't fixed. Asked how long it takes from something breaking until work on a fix is started, the answers were:

Within 15 minutes
Within an hour
Within half a day
Within a day
Within a week
Within a month
which I think shows that the issues found by CI are taken seriously.

All in all, the impression here is mixed: Too few do any CI, but those who do have it build, run the tests they have, and create releases. And they react quickly to issues.

Wednesday, July 6, 2011

Developer Survey Results - Part 2

This second part of my walk-through of the results of a survey I did recently focuses on practices around source control.

Again this section of the survey started by asking if respondents use source control at all. Luckily most did, 95% (although, what do those last five percent do? -They also answered that they code more than 10% of their time at work).

So looking at the distribution among source control systems, the results were:


This is not to be seen as an indication of the popularity of each of these VCSs in general, just as background on what the respondents to this survey use. And, as can be seen, this is an SVN crowd. That may very well have an effect on the rest of the answers in this section.

How often do respondents commit, then? -I think this is an important indicator of how developers work. Do they take small focused steps, or big leaps of faith? Well, here are the results:

All the time: 3%
Several times an hour: 6%
About once an hour or every other hour: 18%
About twice a day: 34%
About once a day: 23%
About once a week: 3%
About once a month: 0%

So the weight here is around the twice-a-day answer. Some commit a bit more often than that, and some a bit less often. That's OK. I've seen shops where the norm was committing once a month. Awful. But these results do - on the other hand - leave room for improvement. I don't think committing once an hour is unreasonable for the vast majority of development tasks. And more often if you're using a distributed version control system.

Last question in this section was about how the VCS is used in terms of branching and the likes. Results are:

Feature branches: 39%
Experimentation branches: 35%
Release branches: 59%
None of the above: 15%

Don't know what to say about this. People generally seem to use some amount of branching, but I suspect some of these numbers would differ if most respondents had been Git users instead of SVN users. I don't know.
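For the record, the three branching styles from the question can be sketched with plain git in a throwaway repository. The branch names below are purely illustrative, and this assumes git is installed; it is not meant to reflect any particular respondent's workflow.

```shell
#!/bin/sh
# Sketch of the three branching styles from the survey, in a throwaway repo.
# Branch names are illustrative placeholders.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email "dev@example.com"   # identity needed for the commit below
git config user.name  "Dev"
git commit -q --allow-empty -m "initial commit"

git branch feature/login      # feature branch: isolate work on one feature
git branch experiment/spike   # experimentation branch: throwaway trial work
git branch release/1.0        # release branch: stabilize and patch a release

git branch --list             # shows the three branches plus the default one
```

In SVN the same ideas exist, but branches live as directory copies (`svn copy`), which is part of why I suspect a Git crowd would answer differently.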

Next section of the survey was about continuous integration. I'll cover that in my next post.

Friday, July 1, 2011

Developer Survey Results - Part 1

I ran a survey recently asking about a variety of practices on the developer level. The survey is now closed and the results are in.
The areas covered in the survey were test automation, source control, continuous integration, and code quality assurance. Nothing too fancy, but a collection of practices that I consider important, and which are all quite well known. I was curious to see how widespread these practices are.

In this post I'll cover the results of the first part. I got 62 responses to the survey. Not enough for statistical significance. And there's probably also some bias in the group, since they come mainly from my network. So don't take the results for more than they are.

Test Automation Practices

So 71% of respondents use automated tests. That's pretty high. And then again it's sort of low. The benefits of test automation are so widely accepted that I actually expected this number to be even higher. The next few questions were only asked of the respondents that use automated tests. Here's how many respondents use each type of test automation:

Unit tests: 95%
Integration tests: 80%
System tests: 20%
UI tests: 14%
Acceptance tests: 16%
Performance tests: 20%
Load tests: 11%

So unit tests and integration tests are widely used, while the rest are only used sporadically. No big surprise, I'd say.

Next I asked how much of the code is covered. This is an important question; there's a big difference in what tests give you between having low coverage, having high coverage, and having 100% coverage. So how much do respondents cover?

I can't estimate that: 2%

Not too high. So who writes those tests?

I do: 98%
The testers do: 9%
The business people do: 0%

So nearly all respondents write their own tests. That's good. So when are the tests written?

Before the production code (test first): 3%
Alongside the production code (sometimes before, sometimes right after)
Shortly after the production code (after the production code, but in the same workflow)
After the production code for a complete feature is finished: 2%
During a separate test phase: 0%

So tests are written mainly alongside or shortly after the production code. But not before. This probably explains why the coverage percentages seem to be a bit low.

OK, I'll leave the rest of the interpretation to you.

Next post I'll go through the result for the source control questions.