Annotated Bibliography

**Indicates readings that were assigned as prep-work for one of the sessions.

General Learning Analytics

**U.S. Department of Education, Office of Educational Technology, Enhancing Teaching and Learning Through Educational Data Mining and Learning Analytics: An Issue Brief, Washington, D.C., 2012. Retrieved from http://www.ed.gov/edblogs/technology/files/2012/03/edm-la-brief.pdf

  • (77 pages) This brief was written to help policymakers and administrators understand how learning analytics and data mining have been -- and can be -- applied for educational improvement. The substantive chapters are: Data Mining and Analytics: The Research Base (describes the differences between data mining, learning analytics, and visual data analytics); Data Use in Adaptive Learning Systems (describes the major features of a prototypical online learning system); Educational Data Mining and Learning Analytics Applications (broadly describes areas of application, e.g., modeling user knowledge, behaviors, and experiences; user profiling; trend analysis); Implementation Challenges and Considerations (describes the challenges of implementing data mining and learning analytics in K-20 settings, including a discussion of FERPA); and Recommendations (includes a short list of areas that R&D folks should consider).

**van Barneveld, A., Arnold, K. E., & Campbell, J. P. (2012, January). Analytics in higher education: Establishing a common language. Educause Learning Initiative. Retrieved from http://www.educause.edu/library/resources/analytics-higher-education-establishing-common-language

  • (11 pages) This paper discusses the fundamentals of learning analytics: What is it? How is it defined? The breadth of analytics definitions in the literature is presented, and a converged set of definitions is proposed. Usefully, the paper also describes the intersection of the Scholarship of Teaching and Learning and learning analytics.

**Gere, A., Aull, L., Green, T., & Porter, A. (2010). Assessing the validity of directed self-placement at a large university. Assessing Writing, 15, 154-176. Retrieved from http://dx.doi.org/10.1016/j.asw.2010.08.003

  • (23 pages) Abstract: Following Messick’s definition of validity as a multi-faceted construct that includes contextual, substantive, structural, generalizable, external, and consequential dimensions, this study examined an established directed self-placement (DSP) system that had been functioning for ten years at a large university. The goal was to determine the extent to which this manifestation of DSP could be described as a valid assessment system for students choosing between a developmental and a first-year writing course. Analysis of data, including details of students’ academic records, course materials, DSP questions, surveys, and interviews, led to the conclusion that DSP at this university does not have strong validity. Because validity is always embedded in a local context, the profession needs further investigations of the validity of DSP in a variety of other college and university settings, and this study includes an analytical framework that can be used in such work.

**Long, P., & Siemens, G. (2011, September/October). Penetrating the Fog: Analytics in Learning and Education. Educause Review, 31-40. (Frame this article with the editorial for this issue: Oblinger, D. G. (2011, September/October). Doing Better with Less. Educause Review, 4-6.)

  • (6 pages) This article is part of a whole Educause Review issue devoted to learning analytics. The article claims that big data and analytics are having the greatest impact on higher education today. The data explosion is described (including its three defining elements -- speed, scale, and sensors), big data is defined, and learning analytics is contrasted with academic analytics.

**Parry, M. (2012, July 18). Big data on campus. The New York Times. Retrieved from http://www.nytimes.com/2012/07/22/education/edlife/colleges-awakening-to-the-opportunities-of-data-mining.html

  • (10 pages) This article discusses the idea of mining the digital data that students “leave behind”. A number of different examples are given, and some ethical concerns are raised.

**Tanes, Z., Arnold, K.E., King, A.S., & Remnet, M.A. (2011). Using Signals for appropriate feedback: Perceptions and practices. Computers & Education, 57 (4), 2414-2422. Retrieved from http://ac.els-cdn.com/S0360131511001229/1-s2.0-S0360131511001229-main.pdf

  • (9 pages) Abstract: Feedback is a crucial form of information for learners. With the availability of new educational technologies, the manner in which feedback is delivered has changed tremendously. Existing research on the learning outcomes of the content and nature of computer mediated feedback is limited and contradictory. Signals is an educational data-mining technology for student success where instructors can send students elaborate feedback. Hence, this paper examines the content and nature dimensions of feedback in Signals with two studies. Study one identifies how the instructors who used this technology define its feedback function. Study two identifies the types of feedback that were included in the messages sent to students. Both studies employ content analysis method; study one examines the transcripts of interviews with instructors, and study two examines the feedback messages composed by instructors. Results indicate that while instructors perceive Signals as a tool to primarily provide motivational and summative feedback, student success was related to the type, and performance or outcome orientation of both summative and formative feedback students received. The results and implications of both studies are further discussed and future directions are proposed.

Dringus, L. P. (2012). Learning analytics considered harmful. Journal of Asynchronous Learning Networks, 16, 87-100.

  • (14 pages) Abstract: This essay is written to present a prospective stance on how learning analytics, as a core evaluative approach, must help instructors uncover the important trends and evidence of quality learner data in the online course. A critique is presented of strategic and tactical issues of learning analytics. The approach to the critique is taken through the lens of questioning the current status of applying learning analytics to online courses. The goal of the discussion is twofold: (1) to inform online learning practitioners (e.g., instructors and administrators) of the potential of learning analytics in online courses and (2) to broaden discussion in the research community about the advancement of learning analytics in online learning. In recognizing the full potential of formalizing big data in online courses, the community must address this issue also in the context of the potentially “harmful” application of learning analytics.

Ferguson, R. (2012). The State Of Learning Analytics in 2012: A Review and Future Challenges. Technical Report KMI-12-01, Knowledge Media Institute, The Open University, UK. Retrieved from http://kmi.open.ac.uk/publications/pdf/kmi-12-01.pdf

  • (18 pages) Abstract: Learning analytics is a significant area of technology-enhanced learning that has emerged during the last decade. This review of the field begins with an examination of the technological, educational and political factors that have driven the development of analytics in educational settings. It goes on to chart the emergence of learning analytics, the rise of learning-focused perspectives and the influence of national economic concerns. It next focuses on the relationships between learning analytics, educational data mining and academic analytics. Finally, it sets out the current state of learning analytics research, and identifies a series of future challenges.

West, D.M. (2012, September). Big Data for Education: Data Mining, Data Analytics, and Web Dashboards. Governance Studies at Brookings. Retrieved from http://www.brookings.edu/~/media/research/files/papers/2012/9/04%20education%20technology%20west/04%20education%20technology%20west.pdf

  • (11 pages) From the Executive Summary: In this report, I examine the potential for improved research, evaluation, and accountability through data mining, data analytics, and web dashboards. So-called “big data” make it possible to mine learning information for insights regarding student performance and learning approaches. Rather than rely on periodic test performance, instructors can analyze what students know and what techniques are most effective for each pupil. By focusing on data analytics, teachers can study learning in far more nuanced ways. Online tools enable evaluation of a much wider range of student actions, such as how long they devote to readings, where they get electronic resources, and how quickly they master key concepts.

Non-Cognitive Factors

**Farrington, C.A., Roderick, M., Allensworth, E., Nagaoka, J., Keyes, T.S., Johnson, D.W., & Beechum, N.O. (2012). Teaching adolescents to become learners. The role of noncognitive factors in shaping school performance: A critical literature review. Chicago: University of Chicago Consortium on Chicago School Research.

  • (complete report - 108 pages) This report makes no mention of learning analytics. Rather, it addresses noncognitive factors that are related to students’ academic performance, grouped into five general categories:

- Academic behaviors (going to class, doing homework, organizing materials, etc.)

- Academic perseverance (grit, tenacity, delayed gratification, self-discipline, etc.)

- Academic mindsets (I belong in this community, I can succeed at this, etc.)

- Learning strategies (study skills, metacognitive strategies, goal-setting, etc.)

- Social skills (interpersonal skills, empathy, cooperation, assertion, etc.)

**Morisano, D., Hirsh, J.B., Peterson, J.B., Pihl, R.O., & Shore, B.M. (2010). Setting, Elaborating, and Reflecting on Personal Goals Improves Academic Performance. Journal of Applied Psychology, 95(2), 255-264.

  • (10 pages) Abstract: Of students who enroll in 4-year universities, 25% never finish. Precipitating causes of early departure include poor academic progress and lack of clear goals and motivation. In the present study, we investigated whether an intensive, online, written, goal-setting program for struggling students would have positive effects on academic achievement. Students (N = 85) experiencing academic difficulty were recruited to participate in a randomized, controlled intervention. Participants were randomly assigned to 1 of 2 intervention groups: Half completed the goal-setting program, and half completed a control task with intervention-quality face validity. After a 4-month period, students who completed the goal-setting intervention displayed significant improvements in academic performance compared with the control group. The goal-setting program thus appears to be a quick, effective, and inexpensive intervention for struggling undergraduate students.

Ethics

**Parry, M. (2011, July 10). Harvard researchers accused of breaching students' privacy. The Chronicle of Higher Education. Retrieved from http://chronicle.com/article/Harvards-Privacy-Meltdown/128166/

  • (7 pages) The story of these Harvard researchers being accused of breaching student privacy sheds light on the ethical challenges that those using big data may encounter.

**U.S. Department of Education, Office of Educational Technology, Enhancing Teaching and Learning Through Educational Data Mining and Learning Analytics: An Issue Brief, Washington, D.C., 2012. Retrieved from http://www.ed.gov/edblogs/technology/files/2012/03/edm-la-brief.pdf

  • (7 pages) The Implementation Challenges and Considerations chapter from this Department of Education Brief describes the challenges of implementing data mining and learning analytics in K-20 settings, including a discussion of FERPA.

Duhigg, C. (2012, February 16). How Companies Learn Your Secrets. The New York Times. Retrieved from http://www.nytimes.com/2012/02/19/magazine/shopping-habits.html?pagewanted=all

  • (15 pages) This article describes how Target can tell that a customer is pregnant, and predict her due date quite accurately, based on buying and viewing history. It is not directly relevant to teaching and learning, but it is definitely relevant to the ethical concerns raised when companies (and universities) use big data.

van Wel, L., & Royakkers, L. (2004). Ethical issues in web data mining. Ethics and Information Technology, 6, 129-140.

  • (12 pages) Abstract: Web mining refers to the whole of data mining and related techniques that are used to automatically discover and extract information from web documents and services. When used in a business context and applied to some type of personal data, it helps companies to build detailed customer profiles, and gain marketing intelligence. Web mining does, however, pose a threat to some important ethical values like privacy and individuality. Web mining makes it difficult for an individual to autonomously control the unveiling and dissemination of data about his/her private life. To study these threats, we distinguish between ‘content and structure mining’ and ‘usage mining.’ Web content and structure mining is a cause for concern when data published on the web in a certain context is mined and combined with other data for use in a totally different context. Web usage mining raises privacy concerns when web users are traced, and their actions are analysed without their knowledge. Furthermore, both types of web mining are often used to create customer files with a strong tendency of judging and treating people on the basis of group characteristics instead of on their own individual characteristics and merits (referred to as de-individualisation). Although there are a variety of solutions to privacy-problems, none of these solutions offers sufficient protection. Only a combined solution package consisting of solutions at an individual as well as a collective level can contribute to release some of the tension between the advantages and the disadvantages of web mining. The values of privacy and individuality should be respected and protected to make sure that people are judged and treated fairly. People should be aware of the seriousness of the dangers and continuously discuss these ethical issues. This should be a joint responsibility shared by web miners (both adopters and developers), web users, and governments.

Interventions/Actionable Initiatives

**Brown, M. (2012, July). Learning Analytics: Moving from Concept to Practice. Educause Learning Initiative. Retrieved from http://net.educause.edu/ir/library/pdf/ELIB1203.pdf.

  • (5 pages) Main Points: (1) Amid a long list of measurable factors, some have been shown to correlate strongly with academic outcomes, while others are not strong indicators of student success. (2) The representations—often graphical—of the patterns and insights gleaned from analytics are a central component of how that information is understood and used. (3) The most effective learning analytics programs will be institution-wide efforts, taking advantage of a wide range of resources and possible interventions. This article would be a nice reading for the session about taking analytics projects to actionable initiatives.

Lonn, S., & Teasley, S. D. (2009). Saving time or innovating practice: Investigating perceptions and uses of learning management systems. Computers & Education, 53 (3), 686-694. Retrieved from http://dx.doi.org/10.1016/j.compedu.2009.04.008

  • (9 pages) Abstract: Learning Management Systems (LMS) are web-based systems that allow instructors and/or students to share materials, submit and return assignments, and communicate online. In this study, we explored the uses and perceived benefits of using a LMS to support traditional classroom teaching as reported by instructors and students at a large American Midwestern university. We examined two years of survey data focusing on specific uses of the LMS that emphasized either efficient communication or interactive teaching and learning practices. We matched aggregate user log data with corresponding survey items to see if system use was consistent with patterns seen in the survey results. Findings suggest that instructors and students value tools and activities for efficient communication more than interactive tools for innovating existing practices. However, survey item analysis reveals that instructors and students also highly value the teaching and learning tools within the LMS. This is an example of a learning analytics project that we could discuss in the Fellows program.

Pinder-Grover, T., Green, K. R., & Millunchick, J. M. (2011, Winter). The efficacy of screencasts to address the diverse academic needs of students in a large lecture course. Advances in Engineering Education. American Society for Engineering Education. Retrieved from http://advances.asee.org/vol02/issue03/papers/aee-vol02-issue03-p09.pdf

  • (28 pages) Abstract: In large lecture courses, it can be challenging for instructors to address student misconceptions, supplement background knowledge, and identify ways to motivate the various interests of all students during the allotted class time. Instructors can harness instructional technology such as screencasts, recordings that capture audio narration along with computer screen images, to supplement the lecture with content that addresses the diversity in student academic backgrounds, motivations, and interests, to extend the classroom experience, and reach the individualized needs of students. This study documents the strategic use of screencasts in a large introductory Materials Science and Engineering (MSE) course, and examines their impact on student usage and course performance. To assess the efficacy of screencasts, students were surveyed to determine how they used screencasts and whether they perceived these resources to be helpful. In addition, we correlated student usage based on website hits with student performance (e.g. final grade) to determine statistical significance. Since the course is comprised of students from different academic and social backgrounds, we also analyzed usage and performance patterns for particular student subgroups. The results indicate that students perceive the screencasts to be helpful and tend to use the resources as a study supplement. Overall, usage of screencasting in its various forms is positively and significantly correlated with course performance as indicated by the final grade. The most substantial gains were found for students with the least amount of prior exposure to concepts in the course material. These results indicate a potential for screencasts to address the various academic needs of students in a large lecture environment.

Value-Added Measures

**Harris, D.N. (2012). How do value-added indicators compare to other measures of teacher effectiveness? Carnegie Knowledge Network. What We Know Series: Value-Added Methods and Applications. Carnegie Foundation for the Advancement of Teaching. Retrieved from http://www.carnegieknowledgenetwork.org/briefs/value-added/value-added-other-measures/

  • (11 pages) Highlights:

    • Value-added measures are positively related to almost all other commonly accepted measures of teacher performance such as principal evaluations and classroom observations.

    • While policymakers should consider the validity and reliability of all their measures, we know more about value-added measures than about the others.

    • The correlations appear fairly weak, but this is due primarily to lack of reliability in essentially all measures.

    • The measures should yield different performance results because they are trying to measure different aspects of teaching, but they differ also because all have problems with validity and reliability.

    • Using multiple measures can increase reliability; validity is also improved so long as the additional measures capture aspects of teaching we value.

    • Once we have two or three performance measures, the costs of more measures for accountability may not be justified. But additional formative assessments of teachers may still be worthwhile to help these teachers improve.

**McCaffrey, D.F. (2012). Do value-added methods level the playing field for teachers? Carnegie Knowledge Network. What We Know Series: Value-Added Methods and Applications. Carnegie Foundation for the Advancement of Teaching. Retrieved from http://www.carnegieknowledgenetwork.org/briefs/value-added/level-playing-field/

  • (13 pages) Highlights:

    • Value-added measures partially level the playing field by controlling for many student characteristics. But if they don’t fully adjust for all the factors that influence achievement and that consistently differ among classrooms, they may be distorted, or “confounded.”

    • Simple value-added models that control for just a few test scores (or only one score) and no other variables produce measures that underestimate teachers with low-achieving students and overestimate teachers with high-achieving students.

    • The evidence, while inconclusive, generally suggests that confounding is weak. But it would not be prudent to conclude that confounding is not a problem for all teachers. In particular, the evidence on comparing teachers across schools is limited.

    • Studies assess general patterns of confounding. They do not examine confounding for individual teachers, and they can’t rule out the possibility that some teachers consistently teach students who are distinct enough to cause confounding.

    • Value-added models often control for variables such as average prior achievement for a classroom or school, but this practice could introduce errors into value-added estimates.

    • Confounding might lead school systems to draw erroneous conclusions about their teachers – conclusions that carry heavy costs to both teachers and society.

Office of Science and Technology Policy Press Release. (2012, March 29). Obama Administration Unveils “Big Data” Initiative: Announces $200 Million in New R&D Investments. Executive Office of the President. Retrieved from http://www.whitehouse.gov/sites/default/files/microsites/ostp/big_data_press_release_final_2.pdf

  • (4 pages) This press release describes a recent federal initiative to use and learn from “big data” sources in a variety of contexts. One of its goals, pursued specifically via the NSF, is to improve teaching and learning, although teaching and learning are mentioned only briefly in a long list of applications and projects using “big data”.

Soares, L. (2012, May/June). The Rise of Big Data. Educause Review. Retrieved from http://net.educause.edu/ir/library/pdf/ERM1237.pdf

  • (2 pages) Excerpt: Higher education institutions gather [big] data from these processes largely for the purposes of reporting to public policymakers. Evidence suggests that very little of it is used to create the data-driven enrollment, instruction, or student-support practices that could promote college completion and success. Meanwhile, emerging technologies not only are providing institutions with data that could facilitate the development of these practices but also are giving students the opportunity to see the data about their journeys, successes, and failures. When provided to students in useful ways, this data can allow students to become better managers of their own educational experiences and can also, perhaps, improve collective outcomes across all of higher education. In short, the era of big data has arrived in higher education.