
Learning analytics

  • 3 November 2015
  • By Dean Machin

This guest blog post was written by Dean Machin, who works for Trilateral Research.

Learning analytics is arriving on university campuses. To universities it promises higher student retention rates and, with each student paying £9,000 a year in fees, many are keen.

Learning analytics uses a combination of data about students and their engagement with the university to trigger interventions. In the simplest case, if you haven’t logged on to the course homepage for two weeks, expect a call from your tutor.
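
To make the mechanism concrete, here is a minimal sketch of such a trigger rule. The 14-day threshold, the record fields and the flagging logic are illustrative assumptions, not a description of any particular university’s or vendor’s system.

```python
from datetime import datetime, timedelta

# Minimal sketch of an engagement-based trigger. The 14-day threshold
# and the record fields are illustrative assumptions, not a real system.
INACTIVITY_THRESHOLD = timedelta(days=14)

def students_needing_contact(students, now=None):
    """Return students whose last VLE login is older than the threshold."""
    now = now or datetime.now()
    return [s for s in students if now - s["last_login"] > INACTIVITY_THRESHOLD]

# Example: one recently active student, one silent for three weeks.
students = [
    {"name": "A. Student", "last_login": datetime.now() - timedelta(days=2)},
    {"name": "B. Student", "last_login": datetime.now() - timedelta(days=21)},
]

for s in students_needing_contact(students):
    print(f"Flag {s['name']} for a tutor call")  # the intervention trigger
```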

Most universities are only starting to think about learning analytics – and there is a lot to think about. First, what is the primary purpose? Do they want to improve student satisfaction or student outcomes? These are not the same, and universities should not assume that making students happier will make them more employable (or vice versa).

Without clarity about purpose, universities will not know which data to use. The norm seems to be for universities to combine whatever data they can and to assume that this is the right data. But much of that data will be irrelevant or positively misleading. In one project the University of Derby used 29 metrics, but it is very unlikely that most programmes need that many.
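
One way to test whether a candidate metric earns its place is to check how strongly it relates to the outcome the programme actually cares about before building it into the model. Below is a minimal sketch on synthetic data; the metric names, the simple correlation screen and the 0.2 cut-off are illustrative assumptions, not the Derby project’s method.

```python
import numpy as np

# Synthetic screen of candidate metrics against a retention outcome.
# The metric names, data and 0.2 cut-off are illustrative assumptions.
rng = np.random.default_rng(0)
n = 200
retained = rng.integers(0, 2, size=n)          # 1 = retained, 0 = withdrew

metrics = {
    "vle_logins": retained * 5 + rng.normal(0, 2, n),  # genuinely related
    "library_visits": rng.normal(10, 3, n),            # pure noise
}

for name, values in metrics.items():
    r = np.corrcoef(values, retained)[0, 1]
    verdict = "worth keeping" if abs(r) >= 0.2 else "candidate to drop"
    print(f"{name}: r = {r:.2f} ({verdict})")
```

Even a crude screen like this makes the underlying point: metrics should be justified against the stated purpose, not processed simply because they are available.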

There are also privacy and data protection implications. It is bad data practice to process unnecessary or irrelevant personal data, especially when that data might include students’ ethnicity or caring responsibilities. Universities need to know which data is relevant in their case before processing it.

Second, do universities intend to use learning analytics data for other purposes? Student performance profiles can be used to distinguish good teachers from bad ones. Will learning analytics data be used to assess academic staff? Will it be used in students’ references?

Perhaps the answer to both questions should be yes. But these purposes are clearly different from using learning analytics data to improve student satisfaction and outcomes, and the difference must be kept in view. Universities should not expect students to be as comfortable with potential employers seeing their engagement and progress data as they are with their tutors seeing it.

There are also legal rules around sharing data with third parties. If a student loses out on a job because of her learning analytics data, can she sue her university on the grounds that the data should not have been shared?

Equally, if universities use learning analytics data as a management tool, staff need to know this. Policies must be put in place so that Heads of Department cannot use it to remove difficult members of staff while ignoring the terrible teaching ratings of their star researchers.

There is also a broader question. There are over 150 universities in Britain and in the next 10-15 years we should expect competitive pressures to force institutions to differentiate themselves from each other in their student offering.

The level of student monitoring that learning analytics implies will not be compatible with all the business models that will emerge. Indeed, it is easy to imagine some business models predicated on eschewing learning analytics (‘Come to University X – we don’t monitor your every move.’)

Universities must decide whether learning analytics fits with their vision and values and, of course, their students. While missing 40% of lectures may predict failure at some institutions, at others it may indicate no more than that students are busy acting, writing for the student newspaper or running a university society.

Also, if students know that a tutor will call when there is a problem, learning analytics may simply create more dependent learners. This is not in students’ long-term interests.

There will also be unintended consequences. Might students start to measure their progress through their learning analytics scores? What happens if a student gets a 2:2 when, on all her analytics metrics – attendance, online engagement and so on – she should have done better? Is this evidence that she was taught badly? Again: can she sue? Universities need a policy on this.

None of this implies that learning analytics should be rejected. But rather than seeing it as a technical solution to a retention problem, universities need to be very clear about why they want to use learning analytics and how it fits with their vision, values and offering.

Universities also need to think about their data protection and privacy policies. There will be horror stories and reputations will be damaged. Remember inBloom? Probably not. It was a student data programme in the USA that received $100 million of investment from the Gates Foundation. No-one seemed to think about the potential privacy problems; people objected and the programme collapsed. It would be a terrible waste if university learning analytics programmes went the same way.


1 comment

  1. Liz Morrish says:

    Nice, thoughtful argument. “It would be a terrible waste if university learning analytics programmes went the same way.” Not really, I have to say. I have access to a system, but rarely use it. If I’m concerned about a student’s progress, I’m more likely to ask another lecturer who actually teaches them. Either way – learning analytics are an expensive white elephant. Good practice for the continuously surveyed workplace, though. We just need to keep moving the criteria – keep them guessing what ‘performance’ is this week.
