Arnold Mitchem, president of the Council for Opportunity in Education

I am writing in response to the March 13, 2013 article on the testimony of Dr. Cheryl Dozier, president of Savannah State University, on behalf of the Federal TRIO Programs at the Public Witness Hearing of the House Subcommittee on Labor, Health and Human Services, and Education Appropriations. Your article referenced, but did not cite, an evaluation of one of the TRIO programs and suggested that the programs are ineffective.
In fact, there have been four major evaluations of the Federal TRIO Programs, three of which produced very positive results. I believe the evaluation to which your reporter was referring is the 2009 assessment report, The Impacts of Regular Upward Bound on Postsecondary Outcomes 7-9 Years After Scheduled High School Graduation. Produced by Mathematica Policy Research, Inc., this evaluation failed to demonstrate positive results for the Upward Bound program because of a number of problems in the study's research methods. The flaws uncovered in that evaluation prompted numerous scholars to ask the Department of Education to re-examine its conclusions. To date, the Department has refused to do so.
It is absolutely true that we live in an era of heightened accountability and that evaluation plays a major role in ensuring that accountability. The Council for Opportunity in Education (COE) — and educators working within the Federal TRIO Programs — has a long history of supporting evaluations of college access and success efforts, including TRIO. Although COE has successfully argued for the inclusion of language to fund ongoing evaluations of the TRIO programs in the Higher Education Act since the mid-1980s, few within the higher education policy community have followed that example. COE sincerely believes that adequately resourced evaluations, conducted in an open and forthright fashion, can be an important component of program improvement.
However, the limits of evaluations conducted by contract researchers for the federal government must also be recognized, particularly by progressives and those organizations with a special responsibility to minority and low-income communities. In parallel situations, such as academic research or the release of financial data to the market, some safeguards are generally in place to assure a thorough review and debate of the data. In most such instances, a critical audience tests the premises and underlying assumptions of these reports in an open and ongoing fashion. Consumers of the information have the competency, the time, and the motivation to push back, seek additional information, discuss, and debate the ideas presented. Complexity is recognized and valued. Named individuals — whether business leaders or academics — are personally associated with the information presented and individual reputations are affected by the conclusions drawn from the research.
In most instances, this is not the case within federal departments, particularly in programs focused on low-income and minority individuals. While the Office of Management and Budget and the federal departments have established a system that purports to be able to determine the worth of a program or approach by commissioned evaluations, individuals within these agencies most often have neither the time, nor the information, nor the expertise to thoroughly examine evaluation results submitted to them and conveyed by them to Congress. Nor has anyone been able to prevent those in the executive branch, whether Democrats or Republicans, from selecting evaluation results that favor their priorities and ignoring those that do not.
Finally, by and large, program advocates have not been effective in using positive evaluations to secure the expansion of programs that have demonstrated their effectiveness. Contrast this with the record of those who argue against particular programs, especially programs geared toward reducing inequity or widening opportunity. They are relentless in ignoring important nuances and in deploying evaluations, through any vehicle available to them, to eliminate social programming.
Diverse: Issues In Higher Education stands in the unique position of not only reporting on the actions of government institutions as they affect minority students and institutions serving these students, but also in presenting and encouraging thoughtful discussion of the impact of government actions on these students. Contract research and its use by the government currently play an important role in defining the availability of resources to provide educational opportunity to minority students at every level from preschool through graduate education. I would urge the editors to examine ways in which your publication can promote a more thorough and open discussion of contract evaluations and their use by various interests in expanding or reducing college opportunity.
Arnold L. Mitchem is the president of the nonprofit Council for Opportunity in Education, which through its membership services works in conjunction with colleges, universities and agencies that host TRIO programs.
Thank you, Dr. Mitchem, for your article. Unfortunately, once a report is out, it becomes fact regardless of how incorrect it is. Wouldn't it be great if we could use the same argument for de-funding prisons? If sending people to prison is the deterrent for so-called "crimes," then all of them should be shut down. If imprisoning people is a deterrent, it has obviously failed, especially for those individuals who are black, brown, and poor.
I am not familiar with the particular studies you mention, but I completely agree that we need to be skeptical of findings from any one study. In order to know whether a type of social program will work, we need evidence from multiple sources. We need to look at the nature of the problem that the program is addressing, how the program compares with alternative approaches, all the impacts that the program has, and how those impacts are achieved. Thanks.