Thursday, October 13, 2005

Rank incompetence

Most academics loathe the U.S. News & World Report rankings of colleges and universities. This annual exercise purports to take a look at nearly every undergraduate and graduate school program in the country, and to rank them all, from the "best" to the "worst." In the lower echelons, the rankings are just large groups, listed alphabetically, but at the top, there's a precise-looking Top 40 countdown that would make Casey Kasem proud. The publications that include the rankings make U.S. News a ton of money every year.

The reasons why many in the higher education business detest this process (and I'm speaking here for myself and other academics individually, not for my employer or any other institution) are many and varied. Many academic leaders have spoken out against the practice. Occasionally a school is brave enough to refuse to submit information to U.S. News for use in its survey. For example, Colin Diver, the president of Reed College here in Portland, has already gotten a fair amount of play out of this new commentary in the Atlantic Monthly, blasting U.S. News and explaining why Reed won't send them the data they request from all the colleges. I'll let you read what Diver has to say, and if you're interested in reading other grounds for distrusting the U.S. News ranking "system," you can find a rich literature of criticism just by using some basic Google smarts.

Like it or hate it, though, U.S. News is the 6,000-pound gorilla of college and university recruiting. Schools may turn up their noses at the rankings, but you can bet that they are well aware of where they stand on the charts, and are ever happy to make moves that they think will jump them up. Unfortunately, a lot of game-playing comes out of this, and I doubt that it ever brings about much improvement in the participants' academic programs.

The most notable byproduct of the annual beauty contest is the flood of full-color glossy brochures that the schools send out to all those who are likely voters in the U.S. News surveys. During U.S. News polling season, I get one or two such publications in the mail from different law schools every day. I never look at them any more. Hundreds of thousands of dollars in production, printing, and snail-mail costs, and each brochure goes directly from my mailbox to the recycling bin in the split second it takes me to recognize it as the junk mail it is.

My personal involvement in the U.S. News process has proven to me that it is ludicrous. From time to time, they get my name on one of their lists, and they send me a survey form to rank all the law schools in America. Sometimes the assigned task has been to rank the schools' overall programs, but lately, since I'm a tax professor, they ask me to rate all the law schools based on the educational opportunities they provide in the tax area.

I get the form, and I stare at it in disbelief. There are nearly 200 law schools listed there. How many of them could I possibly know anything meaningful about? O.k., I teach at one of them. I attended another one, 30 years ago. I have friends who teach at maybe a dozen more. I have read recent books and articles by professors at maybe a dozen more beyond that. That totals up to around one eighth of the sample. How does that qualify me to say anything at all about who's the "best" and the "worst" in the much larger group?

And how many law schools beyond my own have I actually set foot in during the past five years? Five at most. How many have I visited recently to teach regular courses? None. How many other schools' faculty meetings have I attended? None. What do I know about the true atmosphere for learning at other schools? Nothing.

Plus, am I going to say anything good about my school's competitors? Our admissions officers fight tooth and nail for good applicants sometimes, and for better or worse, U.S. News can be a deciding factor in the prospects' decision-making. Doesn't that make me just a little biased? It's like sending a survey out to the auto makers and asking them who makes the best car.

The same silliness applies to the other major constituencies that U.S. News polls about the law schools: practicing lawyers and judges. What do they know about the vast majority of the 191 listed schools? Indeed, what does anybody know about the current educational programs of more than a few schools?

This week, though, the U.S. News game reached a new depth in my eyes. In my mailbox was another annual peer survey package from them, and when I opened it, I found this:

[Image: the survey form, asking for rankings of the schools' trial advocacy programs.]

Note what they're asking me to rank the schools on: trial advocacy. That's a subject I have never taught in my 20 years in academe, and about which I know precious little. I coached a moot court team for a while, but that's appellate advocacy, not trial advocacy. And so to send me a trial advocacy survey is the height of incompetence.

Hmmm, what do I do with this form? I guess I'll throw it away. But if I marked it up and sent it in, it would count just as much as every other form being submitted by other academics, including those who had a clue.

My votes would be utterly meaningless. And theirs wouldn't be much better.

Comments (14)

Well said.

I don't mind so much that US News ranks schools; however, their criteria for the rankings are ridiculous.

It is amusing to watch schools invent little tricks to improve their rankings in certain areas. For example, I went to a well-regarded liberal arts college, and our ranking was lower than it should have been because we were pretty low in "alumni satisfaction," which USN&WR determined by the % of alumni giving to the annual fund (an absurd way to determine alumni satisfaction, by the way).

Ok, so what the school did was to mail each of its alumni a $1 bill with the annual donation form, telling them that if they could not give, they should please just return the dollar. Then the school counted that returned dollar as an alumni gift. So while the total $$$ remained the same, alumni giving went way up.
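To make the arithmetic of that trick concrete, here is a rough Python sketch; all of the numbers below are invented for illustration, since the comment gives no actual figures:

    # The $1-bill trick, with made-up numbers.
    alumni = 10_000            # hypothetical alumni on the mailing list
    organic_donors = 1_500     # hypothetical alumni who give on their own
    returned_dollars = 4_500   # hypothetical alumni who just mail the $1 back

    before = organic_donors / alumni
    after = (organic_donors + returned_dollars) / alumni
    print(f"participation before: {before:.0%}")  # 15%
    print(f"participation after:  {after:.0%}")   # 60%
    # Net dollars raised barely move (each returned $1 was the school's own
    # money), but the "alumni satisfaction" metric quadruples.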

I would expect that many other categories in the magazine are just as easy for savvy administrators to game.

Thanks for an honest assessment of the validity of these surveys. It seems, in reality, they are no more useful to a prospective student than lawyer advertising is to a prospective client. One important caveat, however: to what extent do employers value these rankings in making hiring decisions?

Ridiculous. In my (limited) experience, rara is the pre-1L avis that even knows what "trial ad" is, let alone factors the strength of a school's trial advocacy program into his or her decisionmaking process.

And, seriously, how many law schools have more than 1-3 substantive courses in their trial ad programs? A few at most?

Frankly, you can't blame USN&WR for this. The blame lies squarely on the shoulders of the leaders of your institution, and of most of the others around the country, who aid and abet and publicize this nonsense. How many university web sites brag about their rankings? A whole lot of them. Yet those same universities would no doubt fire any academic who did such shoddy work in any other area of scientific research.

The only thing that amazes me about this is that so few universities have the integrity to opt out of this nonsense. At least my alma mater (Reed) doesn't go along.

P.S. My wife is a doctor, and her colleagues consider the USN&WR annual hospital rankings to be about as valid as the school rankings, or even worse, since the hospital rankings don't have as many objective criteria to draw on.

Scientific-looking rankings sell magazines. See the recent Portland Monthly on K-12 schools in Portland, and the one on the "best doctors" in town.

That "best doctors" issue is kind of funny. My father-in-law showed up in that a couple of years ago and he had no idea how. He'd never received a survey, didn't hear anything about it until one of his colleagues showed him the issue.

"How many university web sites brag about their rankings? A whole lot of them."

When I arrived at my alma mater, we were ranked fourth in the nation. YAY! Much cheering and whatnot. The next year, we went down to seventh. BOO! Much dark talk about how ratings don't really give you an accurate picture of the school, etc. The next year, up to third. Full press release!

Very interesting; thanks for putting it together. I tend to like self-educated, blue-collar intellectual types myself. Having been stabbed in the back by Ivy Leaguers (who couldn't win on the strength of their arguments) more than once, I am wondering if some people don't use the rank of the institutions with which they are associated as an excuse to abandon critical thinking altogether.

I went to Lewis & Clark because my most treasured college professor had a daughter there and told me I would love it. And he was right; I could not have made a better choice, certainly not by going off some absurd ranking involving decimal points, of all the ridiculous BS.

Exactly as you point out, Jack, people are passing judgment on professors and schools they know nothing about. There's no such thing as a single meaningful ranking of schools. It depends entirely on what you want.

I, personally, am enormously happy that I went through a school that had a night program, so that I hung out with and took classes with people who had day jobs. But that's partly because with the exception of a year of temping, I was K-through-JD, so who's to say what that's worth to somebody else?

My mother's been writing college recommendations for about 30 years, and she always tells kids that the point is not to pick the best school, but the best school for YOU. It's a huge cliche, but everybody knows it's true, which is why this kind of ranking nonsense strikes me as so absurd.

Justin,
What specifically do you object to in the rankings? Have you looked at their methodology?

I have, and while I don't like the way schools game the system, I don't find the criteria that they use all that unreasonable. Average SAT, four-year graduation rate, number of library books, average per-student expenditure, faculty resources, acceptance rates ... which of these SPECIFICALLY do you object to?

Jack points to peer review, but his own example actually shows how the system DOES work well. Assuming Jack is not lying (though he implies he would be biased in favor of his own school) and that he does not rank schools he knows nothing about, US News gets 1/8th of the sample from Jack. Then we overlap this with hundreds of other peer reviewers nationwide, and voilà! You have a valid peer review scale.
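For what it's worth, the coverage half of that argument checks out numerically. Here is a toy simulation in Python, with all numbers invented: a few hundred reviewers, each rating only the two dozen or so schools they know, still give every school dozens of ratings.

    # Toy simulation of sparse peer-review coverage, with invented numbers:
    # 191 schools, 500 reviewers, each rating a random ~1/8 of the schools.
    import random

    random.seed(1)
    schools, reviewers, per_reviewer = 191, 500, 24

    counts = [0] * schools
    for _ in range(reviewers):
        for s in random.sample(range(schools), per_reviewer):
            counts[s] += 1

    print(f"ratings per school: min={min(counts)}, max={max(counts)}, "
          f"mean={sum(counts) / schools:.0f}")
    # Each school averages roughly 60 ratings, so sparse individual coverage
    # can still produce a dense aggregate. Whether those ratings are informed
    # or unbiased -- Jack's actual objection -- is another matter.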

Look, no one forces parents to rely solely on US News. Parents who do are foolish. But I think US News does perform a valuable service. Of course it's better to visit every school personally or review each school's materials. But with more than 4,000 educational institutions in the country, this is very hard to do. US News, Princeton Review, etc. help make the process a lot more comprehensible.

This farce should be the impetus for a national bar exam. The results, sorted by law school and subject, would provide the meaningful objectivity that the US News criteria lack.

Many of the "objective" factors used to determine rank are meaningless. For example, the number of volumes that the library possesses. Three examples show the invalidity of this measure. First, the University of South Carolina law library is very small because it is two blocks from the main university library. Many law-related treatises are contained in that library but are not included in USC Law's book count. Second, Lewis & Clark Law has a majority of its books in off-site storage. They are included in the school's book count, yet many are old editions, outdated volumes, and other useless material. Third, why should a university devote money to purchasing a variety of journals and reporters when a majority of students use WestLaw, LexisNexis, LoisLaw, FindLaw, HeinOnline, and JSTOR (among others) to access these materials?

Every factor has these problems. Per-student expenditures? It's more expensive in New York City, NY, than in South Royalton, VT. Time-to-graduation rates? Schools with a high number of "traditional" full-time three-year students will do better than schools that focus on "non-traditional" night students. Average LSAT? This only works if law students go to the "best" school they can get into ... not to the school that fits them best (money, location, program, et al.).

The numbers are worse than meaningless. They are misleading and deceptive.

Chris:

(By the way, the things I mentioned above relate to undergraduate rankings; I don't know specifically about the law school rankings.)

Adjust per-student expenditures by cost of living.
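As a sketch of what that adjustment might look like (the schools and numbers below are hypothetical):

    # Hypothetical cost-of-living adjustment for per-student spending.
    schools = {
        # name: (per-student spending in $, local cost-of-living index)
        "NYC School": (40_000, 1.50),
        "Rural VT School": (30_000, 0.95),
    }
    for name, (spend, col_index) in schools.items():
        print(f"{name}: nominal ${spend:,}, adjusted ${spend / col_index:,.0f}")
    # Nominal: the NYC school looks richer ($40,000 vs. $30,000).
    # Adjusted: $26,667 vs. $31,579 -- the ordering flips.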

LSAT: are you actually arguing that average LSAT is not related to quality of the school? Even accounting for fit, don't you expect a strongly positive correlation between average LSAT and school quality?

Time to graduate: you are right that schools with non-traditional students do more poorly. But by most measures of educational assessment, getting students through the program in a reasonable amount of time is a measure of success. A program with a large number of non-traditional students may be successful at what it does, but do you think it deserves a higher ranking as a result?



