The recently published Horizon Report noted that learning analytics, currently a rising trend, will become standard fare in educational institutions over the next few years. A plethora of variables affects a student's chances of experiencing a rounded and successful education, and managing those variables is by no means a straightforward exercise from a provider's point of view.
Carefully and consistently monitoring students' engagement with the many different contact points in an educational institution is therefore a good starting point. Illustrative examples of analytic variables include attendance (of all types), use of the library (analogue and digital), VLE engagement, personal background (e.g. home or overseas student, age profile, level of language proficiency, commuting distance), timetable (the shape of a student's day), complaints (academic and otherwise), and social profile (e.g. participation in organised extracurricular activities), among many others.
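To make the kind of per-student profile these variables imply a little more concrete, here is a minimal sketch of how such a record might be structured; all field names and types are illustrative assumptions rather than any institution's actual schema.

```python
# Illustrative sketch only: one possible shape for a per-student engagement
# record built from the kinds of variables listed above (assumed field names).
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class StudentEngagementRecord:
    student_id: str
    attendance_rate: float                 # proportion of scheduled sessions attended
    library_visits: int                    # physical footfall via access gates
    library_loans: int                     # analogue borrowing
    vle_logins_per_week: float             # VLE engagement
    residency_status: str                  # e.g. "home" or "overseas"
    commuting_distance_km: Optional[float] = None
    language_proficiency: Optional[str] = None
    complaints: int = 0
    extracurricular_activities: list[str] = field(default_factory=list)
```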
Before introducing a specific institutional case study that applied analytics as a means of identifying at-risk students, it makes sense to clarify the term learning analytics: "the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environment in which it occurs" (LAK).
Yet, it is important to acknowledge that learning analytics by itself is only one (important) ingredient within a holistic, socio-technical analysis mix that aims to create a richer understanding of a student's educational journey. "Hard" data alone is not sufficient. Instead, the desired end state for educators is meaningful insight into at-risk students, so that proactive (rather than reactive) interventions can be made at the right point in time. The grid below illustrates this idea of generating information and insight through analytics (source: Davenport et al., 2010).
|             | Past                       | Present                      | Future                                 |
|-------------|----------------------------|------------------------------|----------------------------------------|
| Information | What happened?             | What’s happening now?        | What will happen?                      |
| Insight     | How and why did it happen? | What’s the next best action? | What’s the best/worst that can happen? |
Derby's SETL (Student Experience Traffic Lighting) project looked at students' experiences when interacting with the institution. The focus was on "engagement" analytics, explored through focus groups and interviews with students and staff, with hard data and analytics initially featuring in the background. The project set out to discover what genuinely mattered to students from a lived-experience perspective, and what data analytics staff believe should feature in a student's profile.
The initial project questions revolved around: 1) what is actually happening to students, and how can we find out? 2) what are the touch points between students and the institution? 3) what are the institutional "digital footprints" of our students? 4) what really matters to students?
Effectively, SETL established the requirements for a data dashboard that uses a traffic-light system (green, amber, red) to monitor individual students' progression and alert staff to any anomalies that might negatively affect their study journey. Meaningful, informed conversations can then be held between tutors and students, based on what Derby calls "engagement analytics".
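As an illustration only, the sketch below shows how a traffic-light status of this kind might be derived from a handful of engagement signals; the weights, thresholds and signal names are assumptions made for the sake of the example, not the rules used by SETL.

```python
# Hypothetical sketch of a traffic-light rule: the score weighting and the
# green/amber/red cut-offs are assumptions, not the SETL project's actual logic.

def engagement_score(attendance_rate: float, vle_logins_per_week: float,
                     library_visits_per_month: int) -> float:
    """Combine a few engagement signals into a single 0-1 score (weights assumed)."""
    vle_component = min(vle_logins_per_week / 5.0, 1.0)        # cap at 5 logins/week
    library_component = min(library_visits_per_month / 4.0, 1.0)
    return 0.5 * attendance_rate + 0.3 * vle_component + 0.2 * library_component

def traffic_light(score: float) -> str:
    """Map the score to green/amber/red using illustrative cut-offs."""
    if score >= 0.7:
        return "green"
    if score >= 0.4:
        return "amber"
    return "red"

# Example: patchy attendance but regular VLE use flags the student as amber,
# prompting a conversation between tutor and student.
print(traffic_light(engagement_score(attendance_rate=0.55,
                                     vle_logins_per_week=4,
                                     library_visits_per_month=1)))  # -> "amber"
```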
From a library/information-service perspective, collecting analytics is nothing new. Examples include the borrowing of books (down to student level), footfall via access control gates (down to student level), instant-messaging (IM) reference traffic, participation in organised information literacy (IL) instruction classes (down to student level), e-journal use (Athens), etc.
The challenge is to break down and pool analytic data from the different collection systems (library, registrar, VLE, student services...) into a single container that staff can access and use at a per-student level. The ethical and legal aspects of collecting personalised analytic data must also be considered.
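As a rough sketch of what such pooling could look like in practice, the example below joins exports from two hypothetical systems on a shared student identifier and then pseudonymises that identifier; the in-memory data, column names and hashing step are illustrative assumptions, not a description of any institution's actual pipeline.

```python
# Illustrative sketch: combining per-student exports from separate systems
# (library and VLE here) into one table keyed on student ID. Column names
# and the pseudonymisation step are assumptions for illustration only.
import hashlib
import pandas as pd

library = pd.DataFrame({
    "student_id": ["s001", "s002", "s003"],
    "loans": [12, 0, 5],
    "gate_entries": [34, 2, 18],
})
vle = pd.DataFrame({
    "student_id": ["s001", "s002", "s003"],
    "logins_per_week": [4.5, 0.5, 2.0],
})

# Outer join so students missing from one system remain visible to staff.
combined = library.merge(vle, on="student_id", how="outer")

# One possible nod to the ethical/legal point: pseudonymise identifiers before
# the combined data leaves the systems that hold the real IDs.
combined["student_id"] = combined["student_id"].map(
    lambda sid: hashlib.sha256(sid.encode()).hexdigest()[:10]
)
print(combined)
```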
There are out-of-the-box solutions that combine generic reporting with predictive analytics. However, they may not suit every institutional context, given local conditions and requirements. One example is the Desire2Learn system, which includes an analytics component. The screenshot on the right illustrates what personalised student engagement data can look like.
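To give a flavour of what the predictive side of such analytics might involve (a generic sketch on synthetic data, not a description of how Desire2Learn or any other product works), a simple model could relate past engagement signals to withdrawal outcomes and then score current students:

```python
# Generic sketch of predictive analytics on engagement data. The features,
# synthetic data and model choice are assumptions for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 200
attendance = rng.uniform(0, 1, n)
vle_logins = rng.uniform(0, 10, n)
# Synthetic "withdrew" labels: lower engagement makes withdrawal more likely.
p_withdraw = 1 / (1 + np.exp(4 * attendance + 0.5 * vle_logins - 3))
withdrew = rng.random(n) < p_withdraw

X = np.column_stack([attendance, vle_logins])
model = LogisticRegression().fit(X, withdrew)

# Estimated withdrawal risk for a student with 40% attendance, 1 VLE login/week.
risk = model.predict_proba([[0.4, 1.0]])[0, 1]
print(f"estimated risk: {risk:.2f}")
```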
It is a rounded, student-focused approach to analytics that creates the conditions for an environment of constructive teaching and learning.
Jean Mutton (University of Derby) offers the following advice for analytics-related projects:
- Keep the end user firmly in mind.
- Promote a sense of excitement and fun.
- Talk to people ("the best conversation can happen in corridors").
- Work at a cross-institutional level as much as possible.
- Have access points, e.g. in-house web pages/forums where people can get involved.
- Shock people with data.
- Get people to think outside their immediate area of focus.
- Keep an open mind as to what information your university/school does, could, or can't capture and how it could be used.
- Try to ensure that any data being collected can be used by as many people and systems as possible.
References and further reading:
JISC, 2013. CETIS Publications. Available at: http://publications.cetis.ac.uk/c/analytics [Accessed 1 April 2013].
Essa, A. and Ayad, H., 2012. Improving student success using predictive models and data visualisations. Research in Learning Technology, 20. Available at: http://www.researchinlearningtechnology.net/index.php/rlt/article/view/19191 [Accessed 1 April 2013].