WASHINGTON — Colleges and universities must make better use of the vast amount of student data that is available in order to bring about more timely graduations for all students, a data mining expert exhorted his colleagues Monday at an annual gathering of education researchers.
“Equal access should also focus not only on enrollment but completion,” said Huzefa Rangwala, an associate professor in the Department of Computer Science and Engineering at George Mason University.
Rangwala is principal investigator for a $766,000 “big data” project funded by the National Science Foundation to use data analytics to find “actionable insights” to boost college success.
Rangwala shared lessons from the project Monday during a panel about using “big data on campus” at the annual meeting of the American Educational Research Association. The conference drew a record 16,000 attendees, organizers said.
Among other things, Rangwala said institutions of higher learning should be looking for ways to develop course recommendation programs that mimic how Netflix — a movie and TV series streaming service — recommends titles based on a user’s profile.
“Even if you disagree it will still try to force that movie on you,” Rangwala said, arguing that course selection platforms could be designed the same way.
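The Netflix analogy maps onto standard collaborative filtering: find students with similar grade histories and suggest courses they took that you have not. A minimal sketch follows; the students, courses, and grades are hypothetical illustrations, not data from Rangwala’s project.

```python
from collections import defaultdict
from math import sqrt

# Hypothetical data: student -> {course: grade on a 4.0 scale}.
grades = {
    "s1": {"CALC1": 3.7, "CS101": 3.3, "CS201": 3.0},
    "s2": {"CALC1": 3.5, "CS101": 3.6, "STAT1": 3.8},
    "s3": {"CALC1": 2.0, "HIST1": 3.9, "STAT1": 2.5},
}

def cosine(a, b):
    """Cosine similarity over the courses two students share."""
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    dot = sum(a[c] * b[c] for c in shared)
    norm_a = sqrt(sum(a[c] ** 2 for c in shared))
    norm_b = sqrt(sum(b[c] ** 2 for c in shared))
    return dot / (norm_a * norm_b)

def recommend(student, k=2):
    """Rank courses the student hasn't taken, weighted by how
    similar students performed in them."""
    mine = grades[student]
    scores = defaultdict(float)
    for other, theirs in grades.items():
        if other == student:
            continue
        sim = cosine(mine, theirs)
        for course, grade in theirs.items():
            if course not in mine:
                scores[course] += sim * grade
    return sorted(scores, key=scores.get, reverse=True)[:k]
```

Here `recommend("s1")` would surface STAT1 ahead of HIST1, because the student most similar to s1 did well in it — the same logic a streaming service applies to viewing histories.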
Rangwala suggested that course selection is important because students may try to take courses that they are not ready for based on their prior grades and the courses they’ve taken thus far.
But as advances are made in the field of “learning analytics,” questions arise about how much information should be shared with a student because it could potentially have a negative impact.
David Knight, an assistant professor in the Department of Engineering Education at Virginia Tech and an affiliate faculty member with the Center for Human-Computer Interaction, said it’s important to take a “human-centered” approach to data analytics and to design systems with input from students and faculty.
In designing such a system for engineering students at Virginia Tech, Knight said students and faculty alike raised concerns about how using a student’s prior grades to predict future course success could “pigeonhole” the student as a poor performer.
“Instructors are saying it’s hard to eliminate preconceptions,” Knight said. “If you know one is a ‘C’ student in high school, it will pigeonhole the student in the instructor’s perception. So they’re really careful about this idea of creating stereotypes of students.”
He also said there’s a risk of “demotivating” students by telling them they have a 60 percent chance of passing a given course based on their prior grades — even if that prediction is grounded in data.
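Predictions of that sort typically come from a model such as logistic regression fit to historical course records. The sketch below shows the shape of such a prediction; the coefficients are hypothetical stand-ins, not values from any institution’s system.

```python
from math import exp

# Hypothetical coefficients; a real system would fit these to
# historical records of prior GPA versus course outcomes.
INTERCEPT = -4.0
GPA_WEIGHT = 1.6

def pass_probability(prior_gpa):
    """Logistic model: estimated probability of passing a course,
    given a prior GPA on a 0.0-4.0 scale."""
    z = INTERCEPT + GPA_WEIGHT * prior_gpa
    return 1.0 / (1.0 + exp(-z))
```

With these illustrative weights, a student with a 2.75 prior GPA gets roughly the 60 percent estimate Knight described — a single number that, however well grounded in data, can read to the student as a verdict rather than an odds.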
Jennifer DeBoer, an assistant professor of engineering education at Purdue, spoke of lessons she is learning as principal investigator of a $500,000 NSF grant project that focuses on how to evaluate and improve online courses for engineering undergraduates from diverse backgrounds.
Among other things, Deboer and her research team analyzed how frequently students used a “checkable answer” feature to see if they had gotten the correct answer.
Initially, she said she thought a few different types of groups would emerge, but students ended up falling into one of two categories: high performing and low performing.
Surprisingly, she said, higher-performing students had gotten a smaller portion of problems correct the first time than lower-performing students.
“That was surprising to us and particularly to the instructors,” DeBoer said.
After interviewing a small sample of the nearly 500 students in the program, she discovered that students who had gotten the answer wrong the first time were “forced to reflect on their problem-solving process,” which she said might help explain why they ultimately ended up doing better than those who had gotten the answer right the first time.
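The underlying metric — the fraction of problems each student got right on the first attempt — is straightforward to compute from an attempt log. The log format and records below are hypothetical, illustrating the kind of analysis described rather than the team’s actual pipeline.

```python
from collections import defaultdict

# Hypothetical attempt log: (student, problem, attempt_number, correct).
attempts = [
    ("a", "p1", 1, False), ("a", "p1", 2, True),
    ("a", "p2", 1, True),
    ("b", "p1", 1, True),
    ("b", "p2", 1, True),
]

def first_try_rate(log):
    """Fraction of problems each student answered correctly
    on the first attempt."""
    totals = defaultdict(int)
    firsts = defaultdict(int)
    for student, problem, attempt, correct in log:
        if attempt == 1:
            totals[student] += 1
            if correct:
                firsts[student] += 1
    return {s: firsts[s] / totals[s] for s in totals}
```

In this toy log, student “a” has a first-try rate of 0.5 and student “b” a rate of 1.0 — the counterintuitive finding was that, in the real course, the students who looked like “a” here often ended up outperforming the students who looked like “b.”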
The annual AERA meeting, which ends today, featured talks and presentations on a wide variety of topics that ranged from guns and violence on campus to different interpretations of civil unrest on campus. It also gave emerging scholars a chance to showcase their work.
For instance, Bradley Quarles and Alisha Butler, two second-year Ph.D. students in education policy at the University of Maryland, College Park, presented a highly nuanced paper — “Critique of Gentrification Scholarship” — that led them to conclude that the voices of minorities were being “muted” in the literature on gentrification.
“If policymakers are serious about improving the circumstances of all residents through neighborhood change, then their policy frames ought to be attentive to the cultural constructions, as filtered through space, of all residents,” Quarles and Butler wrote in their paper.
“In particular, understanding the effects of gentrification on schools requires a multi-vocal body of research that incorporates the perspectives of all who experience gentrification.”
Jamaal Abdul-Alim can be reached at email@example.com or you can follow him on Twitter @dcwriter360.