
Under IPEDS Measures, Some Graduates and Transfers Still Considered Dropouts

When Chantel Hampton earned her bachelor’s degree in history from Towson University in Maryland, she was surprised to learn she would be forever considered a college dropout.

“It’s disconcerting,” says Hampton, now 31 and a designer in Maryland. “They’re going by archaic standards.”

Hampton is one of millions of college graduates who are considered college dropouts based on the rigid — many now say unrealistic — criteria used by the federal government’s principal postsecondary education data collection program.

Under the Integrated Postsecondary Education Data System, known in academia as IPEDS and established in 1992 as part of the federal Higher Education Act, a four-year institution is not allowed to count a student in its graduation rate if the student transferred in from another four-year institution or a community college, as Hampton did. She originally enrolled at Drexel University in Philadelphia but transferred to Towson to complete her education.

First-time students who enroll part time also can never be counted as graduates under the IPEDS measures, even if they graduate. Additionally, community college students who transfer without a degree to a four-year institution and eventually earn a degree cannot be counted among that four-year institution’s graduates.

“IPEDS is telling only one small part of the story” of the state of higher education, says Dr. Doug Shapiro, executive research director of the National Student Clearinghouse Research Center, the nation’s principal data exchange for college registrars. “They need to get the rest of the story.”

Time for change

Colleges and universities across the nation are calling for changes to the IPEDS program, having realized that what initially seemed like a practical way to measure higher education from several perspectives has become a largely meaningless government survey.

Still, IPEDS data is used regularly by state and federal agencies, college rating surveys, parents, teachers and administrators. Public and private funders use IPEDS data to make key decisions, from what to fund to which schools are “best,” as well as to criticize institutions for high dropout and low graduation rates.

“IPEDS wasn’t designed to be a measure of quality, but it’s being asked to,” says Melanie Corrigan, director of national initiatives for the American Council on Education, the major umbrella organization of higher education organizations.

“It’s a flawed measure,” Corrigan adds. “But other folks would argue that, in the absence of anything better, it’s what we have now.”

Under IPEDS, any public or private institution that receives federal funds must file a report each year with the Department of Education’s National Center for Education Statistics on a variety of topics, providing the nation with a picture of the status of higher education.

When IPEDS was first established, the decision to count an institution’s first-time, first-year enrollment and then count that same class, or cohort, again at the end of a six-year period rested on the historical assumption that high school graduates pick a school, stay there for four years and graduate.

“One of the strengths of IPEDS is the metrics are consistent — same definitions, same time frames and same metrics,” says Shapiro. “The way to measure these other things is not so well defined.”

The list of those “other things” is getting longer and longer, say higher education administrators, data gatherers and researchers like Shapiro.

The IPEDS enrollment measuring rules did not anticipate higher education becoming a different animal in less than a quarter of a century, they say. IPEDS made no provision for the surge in student transfers, “stop-outs” (temporary withdrawals), community college transfers and first-time, part-time student enrollment.

A different picture

The changing characteristics of the American college population have helped fuel debates over the validity of retention and graduation rates at institutions of all kinds and sizes. Just how far out of touch IPEDS has fallen depends on whose research is raising the questions.

A 2011 Harvard University Graduate School of Education report, Pathways to Prosperity, used IPEDS data to assert that only 56 percent of college students complete four-year degrees within six years. Shapiro says his data, supplied by 95 percent of the nation’s institutions, indicates that more than 20 percent of college graduates earned their degrees at institutions other than the one where they initially enrolled. Corrigan says that, under the IPEDS criteria, some institutions can count fewer than 10 percent of their students toward their graduation rates.

“If you’re not in the denominator, you’re not going to be in the numerator,” says Shapiro, echoing colleagues who say today’s IPEDS program gives a far-from-accurate picture of enrollment, advancement and graduation at many institutions. A transfer student like Hampton is excluded from the cohort denominator at the outset, so her eventual degree can never appear in the numerator that determines an institution’s graduation rate.

In 2005, when the Department of Education proposed tracking all students through a unit record system, the idea was widely criticized by politicians and many higher education leaders as an invasion of privacy. They believed the plan would track students by their Social Security numbers. The proposal was quickly shelved, leaving the flawed IPEDS intact.

Student focus

Florida A&M University illustrates the on-the-ground shortcomings of IPEDS.

About 20 percent of its students are from out of state, which helps sustain FAMU’s annual enrollment of some 10,000 students. At the same time, FAMU has “a lot of stop-outs,” says Dr. William Hudson Jr., FAMU vice president for student affairs. “We don’t know if they transferred or dropped out,” he says.

“If they come for one year and leave, it counts against us,” continues Hudson, sharing experiences similar to those voiced by officials at other institutions. “If they transfer in and graduate, it counts against us,” even if the numbers lost from that cohort are made up by graduation time by transfer students from other four-year institutions and area community colleges.

The withdrawal of thousands of students who have had difficulty with college costs since the 2008 recession has only made matters worse, says Hudson, adding that the accelerated loss of students in recent years will impact graduation rates for years to come.

There are some signs that the shortcomings of IPEDS are starting to be addressed. Congress has awarded Maryland and several other states federal funding to begin setting up state-level longitudinal data systems that can track students from kindergarten to the workforce. The new approach focuses on students, not institutions as IPEDS does, explain backers of the longitudinal campaign. It also does not rely on Social Security numbers.

Meanwhile, a coalition of higher education groups, backed with two years of funding from the Bill & Melinda Gates Foundation and the Carnegie Corporation, is set to launch the Student Achievement Measure (SAM) in December. Coordinated by the American Council on Education, the SAM project provides “an improved” look at the status of higher education by tracking the progress of students regardless of state or institution, the groups say.

At the same time, the government is considering a major expansion of IPEDS. This year, it said it was seeking comment on a proposal to begin counting part-time students in IPEDS reports.
