28th Oct 2013
In previous posts I’ve talked about the importance, to both students and managers, of having access to good data in order to monitor and manage progress. I’d now like to consider the wider topic of management information, which delivers benefits both at line-manager level and for the organisation as a whole.
I am quite deliberately using the term management information rather than reporting, and for good reason.
When selecting and configuring an LMS, many organisations get fixated on the ability to run reports, often with little or no idea about the content of those reports or, more importantly, what they will do with the information once they have it.
A report simply tells me that something has (or hasn’t) happened. Management information is something that I can use to make decisions and direct action. Management information is meaningful data that is in some way actionable.
Often, learning data consists purely of attendance records, activity levels and course scores. That’s fine if it drives action, but driving action is rarely the intention. Frequently the data is distributed to people who will look at it and do nothing. This kind of data falls into two groups:
- Vanity metrics - Reports that say, “Ooh, look how many courses we’ve run.”
- Comfort metrics - Reports distributed to senior managers to reassure them that something is happening, even if they don’t really understand what it is or whether it has any effect on organisational goals.
Vanity metrics and comfort metrics share one common trait: neither of them leads to action.
There is nothing intrinsically wrong with reporting on activity or completion rates, as long as you intend to use that data in some meaningful way. You might use it to make decisions about future training budgets or the location of future workshops. More interestingly, you might use that data to get a high-level view of how effective your learning is.
For example, if you have issues with poor customer service you might attempt to address that, at least in part, through training. After a few months it would be interesting to compare the take-up of that training with any change in customer satisfaction metrics. The opportunities to demonstrate the effectiveness and usefulness of learning and training are almost limitless.
Of course, you may be concerned that the data will show no measurable impact, or even that customer satisfaction has dropped - and perhaps that’s secretly why we don’t see much of this kind of measurement going on. But whether the results improve or worsen, we have actionable data: either an indication that what we did worked, and so perhaps we should do more of it, or a signal that it didn’t work, and that we need to dig deeper to find out why and improve our solution.
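To make the comparison above concrete, here is a minimal sketch of what that analysis might look like. The team names, take-up rates and CSAT figures are entirely invented for illustration - in practice the take-up numbers would come from your LMS exports and the satisfaction scores from whatever customer feedback system you use.

```python
# Hypothetical example: compare per-team training take-up with the change
# in customer satisfaction (CSAT) over the same period. All figures invented.

# Fraction of staff in each team who completed the training
take_up = {"North": 0.85, "South": 0.40, "East": 0.70, "West": 0.20}

# Change in each team's CSAT score after the training period
csat_change = {"North": 4.1, "South": 0.8, "East": 2.9, "West": -0.5}

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

teams = sorted(take_up)
r = pearson_r([take_up[t] for t in teams], [csat_change[t] for t in teams])
print(f"Correlation between take-up and CSAT change: {r:.2f}")
```

A strongly positive correlation suggests the training may be having an effect; a flat or negative one is the cue to dig deeper. Correlation isn’t causation, of course, but either way you have something you can act on - which is exactly the distinction between management information and a report.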
It almost goes without saying that the LMS is an essential tool in recording and retrieving management information. Sadly, the reporting tools in many LMSs leave a lot to be desired, being difficult to use, static in their display of data and difficult to export or share. If you are considering a new LMS there are a few things that I would recommend looking for in its reporting capabilities:
- Tools that are easy for non-technical users to operate
- Interactive reports that allow you to drill down into the root data
- The ability to report on everything (which relies on the LMS recording everything), not just basic completion activity
- The option to export and share data in a range of formats such as Excel and PDF
- The ability to create custom reports without needing the services of a programmer