One of the questions asked for this week about learning analytics is, “is it a fad?”. I agree with Ian Reid’s comment on an earlier post: it’s almost certainly going to be another fad. The following offers some evidence for this and some insights into why, and suggests one way it might be avoided.
First, the slideset used for the presentation, and then the “extended” abstract of the talk.
Fashions and Learning Analytics
Baskerville and Myers (2009) define a management fashion as “a relatively transitory belief that a certain management technique leads rational management progress” (p. 647). For a variety of reasons, it appears that learning analytics will become the next fashion in educational technology. Siemens and Long (2011) position learning analytics as “essential for penetrating the fog that has settled over much of higher education” (p. 40) in part because making decisions based on data seems “stunningly obvious” (p. 31). The 2012 Horizon technology outlook for Australian Tertiary Education (Johnson, Adams, & Cummins, 2012) placed learning analytics into the “one year or less” time-frame for adoption. Anecdotal reports suggest that every Australian higher education institution has at least one, if not more, learning analytics projects underway. However, just two years ago the 2010 Horizon technology outlook for Higher Education in Australia and New Zealand (Johnson, Smith, Levine, & Haywood, 2010) included no mention of learning analytics.
If institutions are going to successfully harness learning analytics to address the challenges facing the higher education sector, then it is important to move beyond the slavish adoption of the latest fashion and aim for more mindful innovation. For Swanson and Ramiller (2004), mindfulness is central to IT innovation: “the nuanced appreciation of context and ways to deal with it lies at the heart … of what it means to manage the unexpected in innovating with IT” (p. 556). Hirschheim, Murungi and Peña (2012) argue that the introduction of social considerations at an early stage in discussions “may help moderate the adoption of a new IS innovation and replace the sudden and short-lived bursts of interest with a more enduring application of the innovation” (p. 76).
The following seeks to identify a range of broader considerations that are necessary to move learning analytics beyond being just the next fashion. It proposes three likely paths Australian universities may take in their adoption of learning analytics. It will argue that one of these paths is currently dominant and that the best outcomes will be achieved when institutions combine all three paths into a contextually appropriate strategy. It will identify a range of pitfalls specific to each path and another set of pitfalls common to all three. This is informed by experience from a four-year-old project exploring learning analytics within an Australian university (Beer, Clark, & Jones, 2010; Beer, Jones, & Clark, 2009; Beer, Jones, & Clarke, 2012), broader experience in e-learning, and insights from the learning analytics, education, management and information systems literature. This work will inform the next round of design-based research, which seeks to explore how, and with what impacts, academics can be encouraged to use learning analytics to inform individual pedagogical practice.
Three likely paths
The paths do not have clear and distinct boundaries; however, at the core of each path is a distinct set of fundamental assumptions and common practices. The three paths are listed below in decreasing order of prevalence and increasing distance from the learning context:
- Do it to the academics and students.
While Siemens and Long (2011) define this path as academic analytics, it is common to hear such projects described as learning analytics. This path involves the implementation and use of a data warehouse (DW) and associated business intelligence (BI) tools. It is the current dominant path (Dawson, Bakharia, Lockyer, & Heathcote, 2011; Johnson & Cummins, 2012).
- Do it for the academics and students.
Researchers, vendors and institutional learning and teaching organizations design and implement methods, models, tools and professional development intended to be used by academics to harness learning analytics to inform their pedagogical practice.
- Do it with the academics and students.
This path focuses on working closely with academics and students to explore how learning analytics can be helpful. It recognizes the complexity and contextual nature of teaching practice and the limited knowledge around how to effectively use learning analytics to inform individual practice. It assumes that the best way to change how people think about education is to “have experiences that enable appropriation of new modes of teaching and learning that enable them to reconsider and restructure their thinking and practice” (Cavallo, 2004, p. 97).
Potential pitfalls specific to each path are listed below, with brief descriptions and references.
Do it to:

- Complex implementation requiring significant organisational change (Ramamurthy, Sen, & Sinha, 2008) and facing a range of problems (Arnold, 2010; Campbell, 2012; Campbell, DeBlois, & Oblinger, 2007; Greenland, 2011).
- Subsequent high failure rates and limited use (Macfadyen & Dawson, 2012; Ramamurthy et al., 2008; Schiller, 2012).
- Over-use of simplistic, quantitative measures leading to compliance cultures, ineffective responses, and limited engagement from staff (Jones & Saram, 2005; Knight & Trowler, 2000; Palmer, 2012).
- Few, if any, insights into how learning analytics can help academics inform the design and evaluation of their teaching practice (Dawson et al., 2011).

Do it for:

- Tendency to focus on abstract representations that are detached from practice, distorting the intricacies of practice and limiting how well it can be understood and enhanced (Seely Brown, Collins, & Duguid, 1989).
- Limited adoption arising from ignorance of the diversity of academic staff and the consequent homogeneity of technology implementation and support (Geoghegan, 1994).
- The assumption that academics design their teaching using a rational planning model; Lattuca and Stark (2009) suggest that this is not the case.

Do it with:

- Such an approach spends time and energy discovering what to do and can consequently be seen as inefficient (Introna, 1996).
- Systems that engage predominantly in exploration and too little in exploitation exhibit too many undeveloped new ideas and too little distinctive competence (March, 1991).
- A decentralised approach can lead to problems including a lack of resources and a feeling of operating either outside of, or in opposition to, the institution’s policies and processes (Hannah, Jenny, Ruth, & Jane, 2010).
Other potential pitfalls
Beyond the path-specific pitfalls, there are a number of common pitfalls. The session will use the literature to identify and describe a range of these; a sample is listed below.
We’re not rational:

- Data-driven decision making “does not guarantee effective decision making” (Marsh, Pane, & Hamilton, 2006, p. 10).
- Organisational and political conditions, and the interpretations of individuals and collectives, shape and mediate this process (Marsh, Pane, & Hamilton, 2006, p. 3).
- Given a complex environment, there are limits to the ability of human beings to adapt optimally, or even satisfactorily (Simon, 1991).
- Even the best decision support system “cannot overcome poor managerial decision making” (Hosack, Hall, Paradice, & Courtney, 2012, p. 321).

Issues from informing fields:

- Learning analytics draws on a number of established fields with active research programs that are identifying relevant issues likely to impact upon learning analytics projects. Examples include big data (Bollier & Firestone, 2010; Boyd & Crawford, 2012) and decision support systems (Arnott & Pervan, 2005; Hosack et al., 2012).

Learning is diverse:

- There is no one best way of developing instruction (Davies, 1991), and instructional design can only progress with the recognition that “learning is a human activity quite diverse in its manifestations from person to person and even from day to day” (Dede, 2008, p. 58).
- Both students (Beer, Jones, & Clark, 2012) and academics (Clark, Beer, & Jones, 2010) show diversity in how they interact with the LMS across different courses. The simple patterns of analytics hide significant complexity (Beer, Jones, & Clark, 2012).

Measuring the wrong thing:

- While LMS adoption is almost universal at the institutional level, it is limited at the course level (Jones & Muldoon, 2007), with the majority of use focused on content distribution (Malikowski, 2010).
- Analysis of existing LMS data is therefore measuring limited, often poor-quality, learning and teaching.
- The absence of high-quality data leads to data becoming misinformation and subsequently to invalid inferences (Marsh, Pane, & Hamilton, 2006).
- The trend away from institutional information systems is likely to further reduce the data available for analysis.

Forgetting about action:

- In data-driven decision making, “equal attention needs to be paid to analysing data and taking action based on data” (Marsh, Pane, & Hamilton, 2006, p. 10). In the context of learning analytics, taking action doesn’t receive the same level of attention.
- Being able to interpret the patterns provided by learning analytics and apply them to practice is difficult, time-consuming, requires additional support, and is worthy of further investigation (Dawson et al., 2011; Dawson & McWilliam, 2008).

Novelty and dynamic contexts:

- Transforming an institution as complex as the university is neither linear nor predictable (Duderstadt, Atkins, & Van Houweling, 2002).
- Operating in a dynamic context requires organisational structures that adjust and become far more responsive to change (Mintzberg, 1989).
- Instructional design is an archetypal example of an ill-structured problem (Jonassen, 1997).
References

Arnold, K. E. (2010). Signals: Applying academic analytics. EDUCAUSE Quarterly, 33(1).
Arnott, D., & Pervan, G. (2005). A critical analysis of decision support systems research. Journal of Information Technology, 20(2), 67–87.
Baskerville, R., & Myers, M. (2009). Fashion waves in Information Systems research and practice. MIS Quarterly, 33(4), 647–662.
Beer, C., Clark, K., & Jones, D. (2010). Indicators of engagement. Curriculum, technology and transformation for an unknown future. Proceedings of ASCILITE Sydney 2010 (pp. 75–86). Sydney.
Beer, C., Jones, D., & Clark, D. (2012). Analytics and complexity: Learning and leading for the future. The hidden complexity behind simple patterns. ASCILITE’2012. Wellington, NZ.
Beer, C., Jones, D., & Clark, K. (2009). The indicators project identifying effective learning, adoption, activity, grades and external factors. Same places, different spaces. Proceedings ascilite Auckland 2009 (pp. 60–70). Auckland, New Zealand.
Beer, C., Jones, D., & Clarke, D. (2012). Analytics and complexity: Learning and leading for the future. ASCILITE’2012. Wellington, NZ.
Bollier, D., & Firestone, C. (2010). The promise and peril of big data. Washington DC: The Aspen Institute.
Boyd, D., & Crawford, K. (2012). Critical questions for big data. Information, Communication & Society, 15(5), 662–679.
Campbell, G. (2012). Here I Stand. Retrieved April 2, 2012, from https://sas.elluminate.com/site/external/jwsdetect/playback.jnlp?psid=2012-03-01.1231.M.0728C08DFE8BF0EB7323E19A1BC114.vcr&sid=2008104
Campbell, J., DeBlois, P., & Oblinger, D. (2007). Academic analytics: A new tool for a new era. EDUCAUSE Review, 42(4), 40–42.
Cavallo, D. (2004). Models of growth – Towards fundamental change in learning environments. BT Technology Journal, 22(4), 96–112.
Clark, K., Beer, C., & Jones, D. (2010). Academic involvement with the LMS: An exploratory study. In C. Steel, M. Keppell, P. Gerbic, & S. Housego (Eds.), Curriculum, technology & transformation for an unknown future. Proceedings ascilite Sydney 2010 (pp. 487–496).
Clegg, S., & Smith, K. (2008). Learning, teaching and assessment strategies in higher education: contradictions of genre and desiring. Research Papers in Education, 25(1), 115–132.
Davies, I. (1991). Instructional development as an art: One of the three faces of ID. Performance and Instruction, 20(7), 4–7.
Dawson, S., Bakharia, A., Lockyer, L., & Heathcote, E. (2011). “Seeing” networks: Visualising and evaluating student learning networks. Final Report 2011. Canberra.
Dawson, S., & McWilliam, E. (2008). Investigating the application of IT generated data as an indicator of learning and teaching performance. Canberra: Australian Learning and Teaching Council.
Dede, C. (2008). Theoretical perspectives influencing the use of information technology in teaching and learning. In J. Voogt & G. Knezek (Eds.), (pp. 43–59). New York: Springer.
Duderstadt, J., Atkins, D., & Van Houweling, D. (2002). Higher education in the digital age: Technology issues and strategies for American colleges and universities. Westport, Conn: Praeger Publishers.
Geoghegan, W. (1994). Whatever happened to instructional technology? In S. Bapna, A. Emdad, & J. Zaveri (Eds.), (pp. 438–447). Baltimore, MD: IBM.
Greenland, S. (2011). Using log data to investigate the impact of (a) synchronous learning tools on LMS interaction. In G. Williams, P. Statham, N. Brown, & B. Cleland (Eds.), ASCILITE 2011 (pp. 469–474). Hobart, Australia.
Hannah, F., Jenny, P., Ruth, L., & Jane, M. (2010). Distance education in an era of eLearning: challenges and opportunities for a campus-focused institution. Higher Education Research & Development, 29(1), 15–28.
Hirschheim, R., Murungi, D. M., & Peña, S. (2012). Witty invention or dubious fad? Using argument mapping to examine the contours of management fashion. Information and Organization, 22(1), 60–84. doi:10.1016/j.infoandorg.2011.11.001
Hosack, B., Hall, D., Paradice, D., & Courtney, J. F. (2012). A look toward the future: Decision support systems research is alive and well. Journal of the Association for Information Systems, 13(Special Issue), 315–340.
Introna, L. (1996). Notes on ateleological information systems development. Information Technology & People, 9(4), 20–39.
Johnson, L., Adams, S., & Cummins, M. (2012). Technology Outlook for Australian Tertiary Education 2012-2017: An NMC Horizon Report Regional Analysis. Austin, Texas: The New Media Consortium.
Johnson, L., & Cummins, M. (2012). The NMC Horizon Report: 2012 Higher Education Edition. Austin, Texas: The New Media Consortium.
Johnson, L., Smith, R., Levine, A., & Haywood, K. (2010). The Horizon Report: 2010 Australia-New Zealand Edition. Austin, Texas: The New Media Consortium.
Jonassen, D. (1997). Instructional design models for well-structured and ill-structured problem-solving learning outcomes. Educational Technology Research and Development, 45(1), 65–94.
Jones, D., & Muldoon, N. (2007). The teleological reason why ICTs limit choice for university learners and learning. In R. J. Atkinson, C. McBeath, S. K. A. Soong, & C. Cheers (Eds.), ICT: Providing choices for learners and learning. Proceedings ASCILITE Singapore 2007 (pp. 450–459). Singapore.
Jones, J., & Saram, D. D. D. (2005). Academic staff views of quality systems for teaching and learning: a Hong Kong case study. Quality in Higher Education, 11(1), 47–58. doi:10.1080/13538320500074899
Knight, P., & Trowler, P. (2000). Department-level Cultures and the Improvement of Learning and Teaching. Studies in Higher Education, 25(1), 69–83.
Lattuca, L., & Stark, J. (2009). Shaping the college curriculum: Academic plans in context. San Francisco: John Wiley & Sons.
Macfadyen, L., & Dawson, S. (2012). Numbers Are Not Enough. Why e-Learning Analytics Failed to Inform an Institutional Strategic Plan. Educational Technology & Society, 15(3), 149–163.
Malikowski, S. (2010). A Three Year Analysis of CMS Use in Resident University Courses. Journal of Educational Technology Systems, 39(1), 65–85.
March, J. (1991). Exploration and exploitation in organizational learning. Organization Science, 2(1), 71–87.
Marsh, J., Pane, J., & Hamilton, L. (2006). Making sense of data-driven decision making in education: Evidence from recent RAND research. Santa Monica, CA: RAND Corporation.
Mintzberg, H. (1989). Mintzberg on Management, Inside our Strange World of Organisations. New York: Free Press.
Palmer, S. (2012). Student evaluation of teaching: keeping in touch with reality. Quality in Higher Education, 18(3), 297–311. doi:10.1080/13538322.2012.730336
Ramamurthy, K., Sen, A., & Sinha, A. P. (2008). Data warehousing infusion and organizational effectiveness. IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans, 38(4), 976–994. doi:10.1109/TSMCA.2008.923032
Schiller, M. J. (2012). Big data fail: Five principles to save your BI. CIO Insight. Retrieved from http://www.cioinsight.com/c/a/Expert-Voices/Big-Data-Fail-Five-Principles-to-Save-Your-BI-Butt-759074/
Seely Brown, J., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of learning. Educational Researcher, 18(1), 32–42.
Siemens, G., & Long, P. (2011). Penetrating the Fog: Analytics in Learning and Education. EDUCAUSE Review, 46(5).
Simon, H. (1991). Bounded rationality and organizational learning. Organization Science, 2(1), 125–134.
Swanson, E. B., & Ramiller, N. C. (2004). Innovating mindfully with information technology. MIS Quarterly, 28(4), 553–583.