Author: Josh Baron
The Open Academic Analytics Initiative’s (OAAI) mission can be summarized by one number: 38%. Of students starting bachelor’s degrees at four-year institutions in 2004, only 38% completed their degree in four years (see http://nces.ed.gov/programs/digest/d11/tables/dt11_345.asp). Yes, some of these students transferred to other institutions and may have completed their degrees there, but even when this is taken into account, an average completion rate of 38% signals a major crisis in higher education.
To help combat this growing “completion epidemic,” the OAAI, which was supported through a Wave I EDUCAUSE Next Generation Learning Challenges grant, developed and deployed an open-source academic early alert system for higher education that leverages the power of big data and learning analytics. Although not a panacea, this system allows us to predict which students are “at risk” of failing a specific course and then, more importantly, to deploy interventions designed to help at-risk students complete the course successfully.
The project was multi-dimensional involving three major related efforts: (1) technical work to develop the actual early alert system, (2) analytics needed to create our predictive model, and (3) research into the effectiveness of different intervention strategies. The technical work that was done, using all open-source software, as well as the predictive model that was created, which has also been released under an open license, is fully documented on our project’s wiki (https://confluence.sakaiproject.org/x/8aWCB). And we are now able to share research findings related to the third effort: the interventions we tested, their impact on student success, and some lessons learned.
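To give a flavor of how an early alert system like this works, here is a minimal sketch of a logistic risk score driven by learning-management-system activity. This is not the OAAI predictive model (which is documented on the project wiki); the feature names, weights, and alert threshold below are invented purely for illustration.

```python
import math

def risk_score(features, weights, bias=0.0):
    """Logistic score in [0, 1]; higher means greater predicted risk.

    features and weights are dicts keyed by the same feature names.
    """
    z = bias + sum(weights[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

# Invented standardized LMS features: negative values mean "below the
# course average" for that activity. Negative weights mean less activity
# pushes the risk score up.
weights = {"logins": -0.8, "assignments_submitted": -1.2, "forum_posts": -0.4}
student = {"logins": -1.5, "assignments_submitted": -2.0, "forum_posts": 0.1}

score = risk_score(student, weights)
at_risk = score > 0.5  # invented alert threshold
print(f"risk score: {score:.2f}, flag student: {at_risk}")
```

In a real deployment the weights would come from fitting the model to historical course outcomes; the point here is only that a handful of activity signals can be combined into a single score that triggers an intervention.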
As part of our larger goal of researching factors in the scaling of learning analytics across all of higher education, we investigated the effectiveness of two different intervention strategies: “awareness messaging” and the Online Academic Support Environment (OASE). In our “awareness messaging” intervention, students identified by our predictive model as being “at risk” of not completing a course received a standardized message from their instructor, making them aware of his or her concern and suggesting how they might improve (e.g., meet with a tutor). With the OASE intervention, students received a similar message but were invited to join an online support community designed to aid academic success. The OASE uses a Sakai Project Site to provide students with resources for skill remediation, including Open Educational Resources such as Khan Academy, services such as openstudy.com, support from a professional academic support specialist, and interactions with student mentors who serve as peer coaches.
To research the effects of these intervention strategies, we deployed the OAAI early alert system to more than 2,200 students at four partner institutions: two community colleges and two historically Black colleges and universities (HBCUs). We designed the study so that the instructor we worked with, in most cases, taught three sections of the same course. This design allowed us to use one section as a control group while assigning each of the other two sections, the treatment groups, to one of the two interventions (either “awareness” or OASE).
After extensive analysis of the results, we determined that our interventions had a statistically significant positive effect on student success as measured by final course grades and “content mastery” (a grade of C or better). For example, students in our treatment groups who received interventions earned final course grades that averaged four percentage points higher than those of students in our control groups (see graph and table). We found similar trends when we looked only at “low income” students (those receiving Pell Grants), indicating that these interventions were effective among this subset of the student population. Interestingly, we did not find any significant differences between the two treatment groups (awareness and OASE): they had nearly identical mean course grades. Both intervention strategies improved student performance, but neither was more effective than the other. Since students in both interventions received the same message from their instructor expressing concern over their performance, it appears to us that the messaging itself, rather than the specific actions recommended, is the driving factor in improving student performance.
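For readers curious about how a treatment-versus-control grade comparison like this is tested for significance, here is a hedged sketch of a two-sample comparison using Welch’s t statistic. This is not the project’s actual analysis code, and the sample grades below are invented so that the mean difference happens to be four points, mirroring the effect size reported above.

```python
import math
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Welch's t statistic for two independent samples with unequal variances."""
    va, vb = variance(sample_a), variance(sample_b)
    na, nb = len(sample_a), len(sample_b)
    se = math.sqrt(va / na + vb / nb)
    return (mean(sample_a) - mean(sample_b)) / se

# Invented final course grades (percentages) for one treatment section
# and one control section; real sections would be much larger.
treatment = [76, 79, 74, 87, 90, 69, 83, 78, 75, 82]
control   = [72, 75, 70, 83, 86, 65, 79, 74, 71, 78]

diff = mean(treatment) - mean(control)
t = welch_t(treatment, control)
print(f"mean difference: {diff:.1f} points, Welch t = {t:.2f}")
```

With real data, the t statistic would be converted to a p-value (e.g., via `scipy.stats.ttest_ind` with `equal_var=False`); small samples like this toy example generally would not reach significance, which is exactly why the pooled sample of 2,200 students across four institutions mattered.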
Of the many lessons learned during our grant period, one of the most significant was the importance of a robust research design and methodology. Although research in education is never perfect, our study design, using one instructor teaching three sections of the same course, allowed us to control a number of potentially confounding variables such as teaching style, course content, and assessment strategies. If we had instead had different instructors teaching different treatment groups, it would have been impossible to show that our interventions, rather than one of the other variables, were the primary cause of the improvements in student success. In addition to this design issue, the fact that we ran pilots across four different institutions with a relatively large number of students helped ensure a sample size large enough for our findings to achieve statistical significance.
Given the success of our OAAI work, we are now in the process of engaging with institutions that are members of the open-source Apereo Foundation, as well as others in the learning analytics community, to plan a larger-scale open learning analytics ecosystem. Such a system will allow us to bring data in from many different sources, including a range of learning management systems, and provide institutions with a library of openly licensed predictive models.
More details on our research methodology and lessons learned can be found in the recently published EDUCAUSE Learning Initiative (ELI) Seeking Evidence of Impact case study: https://library.educause.edu/resources/2013/5/scaling-learning-analytics-across-institutions-of-higher-education
Josh Baron is the Senior Academic Technology Officer at Marist College where he oversees the office of Academic Technology and eLearning which is responsible for a wide range of instructional technology initiatives, including distance learning, faculty professional development, and learner support. He has been heavily involved in the Sakai open-source community over the past eight years and is currently serving as chair of the Apereo Foundation (which resulted from the merger of Sakai and Jasig) Board of Directors. Josh served as principal investigator for the OAAI project and can be reached at email@example.com.