Last month, higher education headlines blared about George Washington University’s inflation of class-rank data for entering freshmen, one factor in U.S. News & World Report’s calculation of college rankings. GWU, which had been ranked 51st nationally and was broadly perceived to be on a multi-year ascent, found itself pulled from this year’s U.S. News rankings altogether.
Students and parents who place stock in national rankings should be aware that these rankings can be manipulated, and many times they are. Colleges and universities fixated on national rankings should view GWU’s misstep as part of a series of cautionary tales. Emory University announced in August that the institution had been misreporting data for more than a decade. Last January, news broke that a senior admission official at Claremont McKenna College in California had been falsifying data since 2005. All of these institutions are highly competitive for admission and well regarded. Yet senior officials massaged data in hopes of achieving still higher status.
Why take the risk?
You may ask yourself the same question when news surfaces about insider trading investigations. Why would hedge-fund billionaires risk enormous fortunes for still more? Pressure to perform can lead individuals to lose their moral compass. It’s easy to rationalize that moving the line a bit is for a purpose that’s ultimately good and productive.
At GWU, application reviewers were instructed to estimate class ranks when high schools did not provide them. Many high schools have abandoned class rank, both because grade inflation can clump grade point averages tightly together and because ranking intensifies competition among students. Many of GWU’s estimated ranks were optimistic, yet they were reported in the U.S. News annual survey despite instructions that only actual ranks be included.
It’s easy to imagine junior admission officers estimating class ranks exactly as senior staff instructed. They may not have known that their estimates would dishonestly affect the institution’s national ranking. Responsibility lies with senior administrators to train staff thoroughly and with integrity, and to report data to publications requesting statistics with that same integrity.
Imagine the pressure felt by senior members of a college’s administration to produce an incoming-class profile that could result in a stronger national ranking. We’ve heard much discussion about the mandate for the University of Kentucky to achieve top-20 status. How can it all be kept above board?
Local data reporting efforts
At UK, according to Jay Blanton, executive director for public relations/marketing, “Staff in both enrollment management and institutional research regularly review student demographic, admissions, registration and graduation information for accuracy. Student data also goes through an external edit process with the Council on Postsecondary Education before official data submission.”
At UK, the Office of Institutional Research handles data reporting for all surveys. Class ranks are never estimated for applicants whose applications do not include them.
Transylvania University follows a similar practice, according to Brad Goan, vice president for enrollment. Goan works with Transy’s director of institutional research and assessment, who reviews data for inconsistencies and irregularities. Goan said that survey instructions tend to leave little room for interpretation, except in one area: SAT and ACT reporting.
“Our middle 50 percent ranges include all first-year students who submit a test score. If a student submits an ACT score and an SAT score, his/her best ACT is counted in the ACT middle 50 percent and his/her best SAT is counted in the SAT middle 50 percent, even though we only use the student’s best score for admission and scholarship purposes.”
Goan continued, “While I find that most surveys are very specific in how these data should be calculated, this is one of the gray areas in reporting admissions data. I have talked with colleagues at other colleges who drop scores not used for admission, or they ‘superscore’ the ACT composite.”
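For readers curious what that reporting rule looks like in practice, here is a minimal sketch of the approach Goan describes: each student’s best score per test is kept, and the middle 50 percent is the 25th-to-75th-percentile range of those best scores. The sample data and the middle_fifty helper are hypothetical illustrations, not any institution’s actual system.

```python
from statistics import quantiles

# Hypothetical sample data: each student may submit multiple ACT and/or SAT scores.
students = [
    {"act": [27, 29], "sat": [1280]},
    {"act": [31],     "sat": []},
    {"act": [],       "sat": [1350, 1400]},
    {"act": [25],     "sat": [1190, 1230]},
    {"act": [30, 32], "sat": [1450]},
]

def middle_fifty(scores):
    """Return the (25th, 75th) percentile bounds of a list of scores."""
    q1, _, q3 = quantiles(scores, n=4)  # quartile cut points
    return q1, q3

# Best ACT per student who submitted at least one ACT; likewise for SAT.
# Students who submitted both tests are counted in both ranges.
best_act = [max(s["act"]) for s in students if s["act"]]
best_sat = [max(s["sat"]) for s in students if s["sat"]]

print("ACT middle 50%:", middle_fifty(best_act))
print("SAT middle 50%:", middle_fifty(best_sat))
```

The gray area Goan mentions is visible in the sketch: dropping scores not used for admission, or “superscoring” the ACT composite, would change which numbers enter those lists, and therefore the ranges a college reports.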
Centre College’s Bob Nesmith, dean of admission and student financial planning, said that “to quote [President] John Roush, there is no single ‘keeper of the keys.’ Centre has a half-time director of institutional research who coordinates much of what we send in response to surveys.”
Nesmith views class-rank reporting as problematic, because so many students’ high schools do not provide class rank. Regarding test scores, “All enrolling first-year students are included, as long as they have an ACT or SAT-I score. [There are sometimes a small number of international students for whom we accept the TOEFL in lieu of the SAT-I or ACT.]”
Many sets of eyes review data at Centre, particularly for surveys going to important publications, before the data are released from campus.
The takeaway
Without an external audit procedure imposed on all colleges, we cannot know each institution’s approach. That makes it hard to place complete trust in published rankings. We hope that all of our local institutions will continue to report data with integrity, regardless of the intensity of their national-rankings aspirations.
Students and parents: Do not value rankings unduly as you choose which colleges to consider. Rankings do have some utility, but only when coupled with your own careful research.
Jane S. Shropshire guides students and families through the college search process and is Business Lexington’s Higher Ed. Matters columnist. Contact her at Jshrop@att.net.