Session Summary – US News Participation Panel

Submitted by Rishi Batra

On Thursday, July 27th, a number of law school deans and faculty members met for a free-flowing and thoughtful discussion organized by Lucas Osborn (Campbell University). Given the recent discussion around the US News and World Report (USNWR) rankings of law schools, spurred by a number of schools publicly or privately “withdrawing” from participating in the rankings, the panelists and participants considered multiple questions regarding the purposes, past, and future of these and other rankings.

Panelists first discussed how their schools decided to stay in or out of the rankings, what the process was for that decision, and their general thoughts. Schools varied considerably in how they made their decisions: some surveyed the faculty, others involved the admissions deans, the provost, and other senior administrators, and in some cases it was a dean-only decision. This led to the question of whether “withdrawal” from participation is really a permanent decision, since a new dean with a different set of priorities could reverse it.

It was also pointed out that withdrawing from participation can mean a number of different things at each school, from not submitting any data to USNWR, to submitting some data but withholding the rest, to deciding how many USNWR products to purchase. Withholding only some data can be seen as a “soft” withdrawal. When the rankings come out, schools can decide to purchase advertising on the rankings page, purchase data analysis from the Insights program, or pay for a badge on their website, among other products, and some schools may register their objections by declining to purchase these products or services. It was noted that even schools that fully withdrew by not submitting their data still worked with USNWR when, on previewing the rankings, they felt inaccurate data had affected their rank.

Several criticisms of the rankings themselves arose during the session, including moral, financial, and practical ones. Among the moral objections were the incentives the rankings create for schools to “buy” good classes, moving resources away from perhaps the most deserving students, or those with the most financial need, toward those with the highest incoming indicators. Financial objections included the observation that the cost in time and staff to submit this data is significant for schools and could be better used for other purposes.

However, the largest criticisms came on the practical side, frequently regarding the validity of the data and the rankings. One school pointed out that it was able to impact its rankings significantly by focusing on the “bench and bar” score: submitting the names of judges and attorneys likely to rank the school highly, then following up with those rankers to make sure they submitted their scores. That this was even possible came as a surprise to many in the room. Others questioned whether the much larger weight given to bar passage and employment will create a high amount of variability in the rankings, and so may further erode trust in them, since a small variation in bar passage from year to year can vastly raise or lower a school’s ranking. One participant brought up the fact that USNWR has admitted to running a number of different models, to make sure the top schools don’t vary too much from year to year, before deciding on the official rankings for that year, reminding us that this is not some objective, exogenous ranking but a manipulated one. Other criticisms noted that because the criteria and weightings change from year to year, rankings are difficult to compare across years, creating a game without any rules at all.

An underlying focus of the discussion was what purpose the rankings actually serve. The panelists noted that, ostensibly, these rankings exist to help students identify the best schools for them, and ideally to avoid those that will not serve them well in terms of passing the bar and getting a job as a lawyer without incurring unsupportable amounts of debt. This is particularly important for students who do not come from backgrounds with access to family or friends who can coach them on the differences among law schools, such as first-generation law students or first-generation undergraduates. However, participants noted that the rankings may not actually help those students make the distinctions they need to make. First, the rankings don’t capture things such as mentorship, quality of teaching, or potential financial aid. Second, most students are deciding among a regional group of schools, and the rankings do not capture regional strength versus national reputation. (This may also be a reason for so-called regional schools, or other specialty schools such as religious schools, to drop out of the rankings.) Some alternative rankings were suggested, including the FLARE rankings from Matthew Sag, as well as a tool by AccessLex that allows students to rank schools based on their own criteria.

Despite the fact that students are the supposed targets of the rankings, a number of panelists noted that they are used for many other purposes. One dean observed that deans have a love-hate relationship with the rankings: they disavow and downplay them, yet have an incentive to rise in them regardless, especially since deans have been fired over perceived responsibility for a drop in the rankings, even when the drop is not statistically significant. This creates an incentive to “play the game” if they want to survive. Similarly, another panelist pointed out that professors themselves use the rankings in hiring decisions, in decisions about where to publish, and as a shortcut for evaluating the quality of published papers in lieu of reading them.

The impact of the rankings on bar passage was a recurring theme as well. The increased weight on bar passage in the rankings (25% for first-time pass rate, combined with 7% for ultimate pass rate, for nearly a third of the ranking) may also increase the number of students who are excluded after the first year in order to improve pass rates. This is tempered by the fact that the ABA will monitor a school for too high an exclusion rate, since, for accreditation and for moral reasons, schools should not be admitting students they do not believe will pass the bar. “Bar passage” as a criterion is itself in question, since different states set different cut scores on the Uniform Bar Exam: the same score may count as a failure in the school’s home state yet allow the graduate to practice in another state, and so count as a pass for the ultimate bar passage rate. The NextGen bar exam will also have an impact on this criterion, especially while some states have moved to the new exam and others have not, making “apples to apples” comparisons even more difficult.

A number of participants had thoughts on the future of the rankings and of schools’ participation in them. There was an idea that the large number of schools pulling out might grind the rankings to a halt, but their continuation does not seem to bear that out. The question then arose whether withdrawal was meant to send a message to USNWR, and the fact that USNWR has significantly changed some of the weightings may be evidence of some reaction to the feedback from schools. If this continues, there may be an impact on the diversity of classes that schools are able to admit without affecting their rankings, although that may not have been the goal of the initial schools that withdrew. Also, since many schools have dropped out without a significant impact on their overall rank, more schools may drop out to avoid the time and expense of gathering the data. However, this raises the question of whether withdrawal may affect the specialty rankings, especially for schools with a high concentration of faculty in a particular discipline. It also remains to be seen whether an individual faculty member, if asked, can submit rankings despite the school’s choice to withdraw. Schools may choose to jump back in after a few years depending on how the rankings change. Most in the room agreed that it would be better to have alternative ranking systems, but since those that do exist do not seem to have had the impact of USNWR, it does appear that USNWR is here to stay for many years to come.