On consistency in reporting

When reporting any kind of figures, consistency in the data reported is paramount.  It gives much-needed clarity to your data, makes comparisons between data sets simple and intuitive, and allows the reader to see trends and changes in the data over time and by proportion.

When reporting lacks consistency, however, things start to fall apart.  I’ve a very recent example of this, which I want to talk about now.

On the 5th of October, the BBC reported on Facebook surpassing 1 billion users per month.  Quite an achievement (if only 2toria had a billionth of that usage…).  I don’t have a big problem with the graphic used in this article (although I could find some if I tried harder, I’m sure).

The big problem I have is with the data presented in the side panel entitled ‘Evolution of a network’.  For a change, this data is presented in text rather than a graphic, but I have issues with it.  Here it is:-

At first glance the information is relatively useful and offers a comparison of Facebook usage at key stages (25 million users, 50 million, 100 million, 500 million and 1 billion).  The problem I have with the figures reported, however, is one of consistency.  If you look at the detail for 25, 50, 100 and 500 million, you can see the reports cover the following:-

  • Median user age
  • Top countries
  • Average friends for users joining the site at this stage

When we get to the 1bn point, however, things change.  Instead of being consistent with the previous reporting, the figures shift:-

  • Median user age
  • Top countries
  • Number of mobile users

What?  There are two things wrong here.  Not only is there no continuation of the comparison of average friends, but a whole new metric has been introduced to highlight the number of mobile users.

For me, this fails…  I can maybe understand that there was no data for numbers of friends at the 1 billion mark, and this is fine.  It can happen, but it should have left the reporters with one of two choices: 1) don’t report it at all, or 2) explain that the data was not available at the time of reporting.  I might have forgiven that.

Instead, the report shifts to focus on the number of mobile users, a metric that wasn’t reported for any of the other milestones.  If they were going to report on it, it would have been useful to see the numbers of mobile users from the beginning too, wouldn’t it?

Or am I being picky because it’s Sunday afternoon and I’m grumpy?

Basically, my takeaway from this is that the report would have been a lot more meaningful for anyone wanting a true comparison if consistency in the figures reported had been taken into account.  I would urge you to keep this in mind when reporting yourself.  If you are reporting a metric over time or over a key set of milestones, make sure you use the same metrics for each; otherwise, the data means very little.
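
If it helps to make the idea concrete, here is a minimal sketch (my own illustration, not from the BBC piece, and with entirely made-up figures) of one way to sanity-check milestone data before publishing it: every milestone must report exactly the same set of metrics, otherwise it gets flagged.

```python
# Hypothetical milestone data -- the metric names echo the article above,
# but the values are invented purely for illustration.
milestones = {
    "25m":  {"median_age": 26, "top_country": "US", "avg_friends": 80},
    "50m":  {"median_age": 26, "top_country": "US", "avg_friends": 95},
    "100m": {"median_age": 26, "top_country": "US", "avg_friends": 120},
    "500m": {"median_age": 23, "top_country": "US", "avg_friends": 200},
    # The 1bn entry swaps avg_friends for mobile_users -- exactly the
    # inconsistency complained about above.
    "1bn":  {"median_age": 22, "top_country": "US", "mobile_users": 600_000_000},
}

def inconsistent_milestones(data):
    """Return the milestones whose metric names differ from the first entry's."""
    expected = set(next(iter(data.values())))
    return [name for name, metrics in data.items() if set(metrics) != expected]

bad = inconsistent_milestones(milestones)
if bad:
    print("These milestones don't report the same metrics:", ", ".join(bad))
else:
    print("All milestones report the same metrics -- safe to compare.")
```

Running the check on data like this would flag the 1bn entry straight away, which is exactly the point: decide on your metrics up front and hold every milestone to them.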