Faculty Scholarship Study
This study was conducted during the fall of 2012 and the spring of 2013. The objective was to update our dataset, which we have done each year since 2007, when this study debuted. The dataset consists of an inventory of the scholarly output in top law journals of the faculties at “non-elite” law schools. It thus provides some objective information to assess the relative strength of the “non-elite” schools in one form of scholarly research. It is the basis for the ranking of “Per Capita Productivity of Articles in Top Journals, 1993-2012: Law Schools Outside the U.S. News Top 50.” In that ranking we compare the scholarly output in selected journals of all law schools that are ABA-accredited, members of the Association of American Law Schools, and were ranked below the 50th spot in the U.S. NEWS & WORLD REPORT 2013 Rankings.
To build the original and updated dataset we employed the methodology used by Professor Brian Leiter in his study of per capita faculty productivity based on articles in top journals. Professor Leiter focused exclusively on schools he determined were likely to rank in the top 50 nationally, see Brian Leiter, Measuring the Academic Distinction of Law Faculties, 29 J. LEGAL STUD. 451, 461-68 (2000) (describing the methodology and results); http://www.leiterrankings.com/faculty/2000faculty_product_journals.shtml (same), which created the void we hope our studies fill.
For each school we studied, we updated the faculty lists we created last year. The resulting faculty lists, like Professor Leiter’s, were intended to include all full-time tenured and tenure-track academic faculty in 2011-2012 who were expected to produce scholarship as a major part of their duties. The names on each list were then searched in the Westlaw JLR database as AU (“Law Professor Name”). In Professor Leiter’s study, qualifying articles were those that appeared in what he determined were the 20 leading law journals. For our study, in light of the reality of where faculty who are not at “elite” law schools publish their work, we modified his methodology. As we have since the first iteration of the study, we included the general law reviews published by the 54 schools receiving the highest peer assessment scores in the 2008 U.S. NEWS RANKINGS (47 schools had a peer assessment score of 2.9 or higher; 7 had a score of 2.8) and an additional 13 journals that appeared in the top 50 of the Washington & Lee Law Journal Combined Rankings in June 2007. An alphabetical listing of those 67 journals can be found on this website, as can the U.S. NEWS & WORLD REPORT RANKINGS and Washington & Lee Law Journal Combined Rankings on which that list is based.
Qualifying articles were those published since 1993. (Forthcoming articles could not be included: an article that did not appear in Westlaw before August 1, 2012, was excluded from the study.) For each qualifying article, we used Professor Leiter’s system: 0 points for articles under 6 pages; 1 point for articles 6-20 pages in length; 2 points for articles 21-50 pages in length; and 3 points for articles exceeding 50 pages. For articles appearing in a journal published by the faculty member’s home institution at the time of publication, the points assigned were reduced by one-half. No credit was given for articles published while the faculty member was a student. The total number of points for all members of a faculty was divided by the number of faculty, yielding the institution’s per capita score.
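The scoring rules described above can be sketched as follows. This is an illustrative reconstruction only, not the study's actual tooling; the function names and the representation of an article as a (pages, home-journal) pair are our own assumptions.

```python
def article_points(pages, home_journal=False):
    """Score one qualifying article under the point system described above.
    `home_journal` marks an article published in a journal of the author's
    home institution at the time of publication (half credit)."""
    if pages < 6:
        points = 0
    elif pages <= 20:
        points = 1
    elif pages <= 50:
        points = 2
    else:
        points = 3
    # Home-institution journal articles receive half credit.
    return points / 2 if home_journal else points

def per_capita_score(faculty_articles):
    """Compute a school's per capita score.

    `faculty_articles` holds one list per faculty member (empty if that
    member has no qualifying articles), each article a
    (pages, home_journal) tuple. The total points for the whole faculty
    are divided by the number of faculty members."""
    total = sum(article_points(pages, home)
                for member in faculty_articles
                for (pages, home) in member)
    return total / len(faculty_articles)
```

For example, a two-member faculty where one member published a 30-page article (2 points) and a 55-page article in a home-institution journal (3 points halved to 1.5) and the other published nothing would score (2 + 1.5) / 2 = 1.75 per capita.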
Thereafter, we sent an e-mail to each dean and associate dean at the schools covered by the study, informing them of the study and inviting them to review the preliminary results for their faculty. We attached to the e-mail a spreadsheet with results for each faculty member at that school, total points, and the school's per capita score. Schools were given at least 30 days to inform us of any errors in the preliminary data. Over one-third of the schools we studied responded, and we adjusted our preliminary findings accordingly. The final rankings were derived from the resulting dataset.
- Lucinda Harrison-Cox
- Raquel M. Ortiz
- Michael J. Yelnosky
March 19, 2013
Bristol, Rhode Island