UB must monitor rankings info, Senate told
By SUE WUETCHER
External rankings such as those published by the National Research Council (NRC) shape perceptions of UB and must be monitored, a Faculty Senate subcommittee has concluded. But UB also must develop its own system to monitor and measure program performance in order to make "hard decisions on resource allocation" in times of limited university resources, Susan Hamlen, associate professor of accounting and law and chair of the subcommittee, told the Faculty Senate at its Oct. 8 meeting.
The subcommittee of the senate's Budget Priorities Committee was charged with examining and explaining the methodology used to generate the ratings "numbers" in Provost Thomas E. Headrick's academic planning document, Hamlen said.
In the document, Headrick stated that one of UB's goals was to have at least one-quarter of its doctoral/research programs rated in the top quartile of the NRC rankings, and almost all of them rated in the top half.
Given the importance of the NRC data in Headrick's report, Hamlen said, the subcommittee decided to analyze "the appropriateness" of the NRC rankings, as well as that of the Stony Brook Productivity Ranking, another system that Hamlen said is used to rate programs.
The NRC rankings, the most recent of which are based on 1992-93 academic-year data, are reputational rankings that look at two dimensions of program quality: the scholarly quality of program faculty and the effectiveness of the program in educating research scholars and scientists, she said.
Faculty raters are provided with lists of faculty involved in each program and are asked to rate each program on the quality and effectiveness dimensions, she said.
The NRC report on the rankings includes some observations on their attributes: reputation is correlated with the size of the university, and while raters agree on the best and worst programs, "rankings in the middle quartiles can be unreliable," she said.
The Stony Brook rankings were developed by Lawrence Martin, dean of the Graduate School at Stony Brook, as a productivity-based counterpart to the NRC's reputation-based system, Hamlen said. The Stony Brook system takes NRC data on four factors: the percentage of faculty publishing, publications per faculty member, the percentage of faculty with research support and citations per faculty member. It ranks each program on each factor, then averages the four rankings.
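In outline, the Stony Brook method as Hamlen described it is a rank-then-average computation. The sketch below illustrates that procedure; the program names and raw numbers are invented for illustration and are not NRC data.

```python
# Sketch of the Stony Brook productivity index as described above:
# rank every program on each of four NRC-derived factors (higher raw
# value = better rank), then average the four per-program ranks.
# Program names and figures below are made up for illustration.

FACTORS = ["pct_publishing", "pubs_per_faculty",
           "pct_with_support", "citations_per_faculty"]

programs = {
    "Program A": {"pct_publishing": 0.90, "pubs_per_faculty": 2.1,
                  "pct_with_support": 0.60, "citations_per_faculty": 14.0},
    "Program B": {"pct_publishing": 0.75, "pubs_per_faculty": 3.4,
                  "pct_with_support": 0.80, "citations_per_faculty": 9.5},
    "Program C": {"pct_publishing": 0.55, "pubs_per_faculty": 1.2,
                  "pct_with_support": 0.40, "citations_per_faculty": 4.0},
}

def stony_brook_index(programs):
    ranks = {name: [] for name in programs}
    for factor in FACTORS:
        # Rank 1 = best (highest raw value) on this factor.
        ordered = sorted(programs, key=lambda p: programs[p][factor],
                         reverse=True)
        for rank, name in enumerate(ordered, start=1):
            ranks[name].append(rank)
    # A program's index is the mean of its four factor ranks.
    return {name: sum(r) / len(r) for name, r in ranks.items()}

for name, score in sorted(stony_brook_index(programs).items(),
                          key=lambda item: item[1]):
    print(f"{name}: average rank {score:.2f}")
```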
Don Schack, professor of mathematics and a member of the Budget Priorities Committee, told senators he had analyzed the Stony Brook rankings for mathematics programs and found "huge deficiencies with the Stony Brook index as a measure of almost anything."
Independent of the subcommittee, he conducted a direct comparison of the Stony Brook index with reputational indices in the field of mathematics. He called it "absurd" that the Stony Brook rankings placed Clarkson and Southern Methodist University, both at the bottom of the NRC rankings, among first-quartile institutions, and "worse than absurd" that Yale placed in the third quartile and the University of Chicago in the fourth. Yale and Chicago are in the top 10 percent of programs, according to the NRC.
Schack also questioned the reliability of the data behind the Stony Brook rankings, noting serious inaccuracies in the program sizes that significantly affect the raw scores. "If you want to go from 50 percent of your program publishing to 100 percent publishing, just omit a judiciously chosen selection of your faculty," he advised. "It's a very, very suspicious index, indeed."
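Schack's gaming example comes down to the arithmetic of the percentage-publishing factor: because the denominator is the number of faculty a program reports, omitting non-publishing faculty inflates the score. A hypothetical illustration:

```python
# Hypothetical illustration of Schack's point: the factor is
# (publishing faculty) / (faculty reported), so dropping
# non-publishers from the reported roster inflates the score.
faculty = [("a", True), ("b", True), ("c", False), ("d", False)]

def pct_publishing(roster):
    return 100 * sum(publishes for _, publishes in roster) / len(roster)

print(pct_publishing(faculty))              # 50.0 with the full roster
trimmed = [f for f in faculty if f[1]]      # omit the non-publishers
print(pct_publishing(trimmed))              # 100.0 after trimming
```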
Hamlen presented the subcommittee's key conclusions about using rankings such as NRC and Stony Brook in academic planning at UB:
- These rankings apply only to doctoral education and may not be appropriate for evaluating undergraduate and master's-level programs.
- There are flaws in these rankings that may limit their effectiveness.
"However, external rankings such as NRC are influential in shaping the perceptions of UB and they cannot be ignored," Hamlen said.