Few People Seem to Be Aware of This Well-Intentioned but Questionably Executed Tool
The idea of a platform where consumers can easily compare colleges is a good one, but far more subtleties differentiate one institution of higher education from another than the President's College Scorecard reflects.
As has repeatedly been the case with so many regulations aimed at education, the idea seems generally good, but insufficient attention to detail produces a result that leaves a lot to be desired, as many have attested:
- “A Blunt Instrument” (Inside Higher Ed): “the scorecard does not include information about learning outcomes, long-term student success or student satisfaction, factors that many in higher education say are equally valuable and are areas where institutions that value general education would likely perform well.”
- “Scorecard for Colleges Needs Work, Experts Say” (New York Times): “… the information is presented as averages and medians that might have little relevance to individual families.”
- “How to Use Obama's College Scorecard” (US News & World Report): “The new tool is a good starting point to weigh college options, but lacks context …”
Observations like these abound across the Web. One commentary that I found particularly refreshing, because it reflects a vital part of what colleges like The College of Westchester excel at, comes from the Forbes article “Rating President Obama's College Ratings”:
“We also need to consider the populations who enroll in different kinds of colleges. Some colleges make it their mission to educate students from disadvantaged backgrounds who are less well prepared than their peers. Helping 50% of these students to earn a bachelor’s degree within six years may represent a greater achievement than a 90% rate at a selective college that enrolls mostly students with high test scores and transcripts full of A’s, honors and AP credits.”
Below is an example of the content available for all U.S. colleges and universities. The information is certainly useful, but little context is offered. For example, we see the rate of increase in net price, but it is based on the period 2009 to 2011, now more than four years old. A lot can change in four years. And what is the time frame for the Graduation Rate provided? Why is there nothing here to give a better sense of the kind of student this school typically serves, when that is vital to the context of the information?
As many of those quoted in the articles above note, much of the information presented here is already available elsewhere on the web; this tool isn't really bringing new information to the discussion. Moreover, few people, even those working in higher education, are aware of it, and those who have heard of it are not really familiar with it.
The Scorecard may be well-intentioned, but much like many other efforts to ‘oversee' and regulate education, a significant expenditure of taxpayer dollars has produced a result that is only a fraction of what was originally intended. Wouldn't those dollars be better spent supporting education, or keeping student loan interest rates down?
So what do you think about the Scorecard?
Hey, it’s a start. It’s also maybe less biased than some corporate-supported sites and directories, and *definitely* more reliable than the phishing sites that covertly push for-profits. IHE and USNWR have their own rankings, so of course they’ll be critical. Costs are cited as a negative, but there’s no mention of what the costs are or how cost relates to benefit – in fact, no mention of benefit at all. Not used much yet? That’s not a reason to ditch it; give it time. Overall, I think it’s a good service for students and families.