On September 28, 2010, the National Research Council (NRC) published its report on the Assessment of Research-Doctorate Programs, describing survey data and analysis from across the country. The report, the NRC’s first since 1995, charts data from 212 universities with programs in 62 fields of academic study. Research-based academic departments and programs of all kinds will likely find useful peer-comparison data in this report. While past reviews have focused on ranking programs by reputation, a major conclusion of this report was that precise rank numbers cannot be assigned, due to the subjective nature of peer-based ranking.
The report presented five illustrative ranking criteria and expressed the uncertainty in each program’s rank by reporting the 5th and 95th percentile values of its rank distribution rather than a single number. The five example ways to rank programs were based upon peer reputation, survey data, faculty research activity, student support and outcomes, and diversity. The rank bounds shown for each of these categories are the plausible highest and lowest ranks of each program. For example, a program could have 5th and 95th percentile rank values of 10 and 22 in diversity, indicating a high probability of ranking between 10th and 22nd in diversity among peer doctoral departments in that particular field.
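To make these percentile intervals concrete, here is a minimal sketch of how such bounds can arise (in Python, with invented scores; the random-weight resampling below is a simplified stand-in for the NRC’s actual methodology, not a reproduction of it). Each draw of criterion weights yields one possible ranking, and the 5th and 95th percentiles of a program’s ranks across draws form the reported interval.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical inputs: 30 programs scored on 5 criteria.
scores = rng.random((30, 5))

# Each draw of random criterion weights produces one possible ranking.
n_draws = 500
ranks = np.empty((n_draws, scores.shape[0]), dtype=int)
for i in range(n_draws):
    weights = rng.dirichlet(np.ones(scores.shape[1]))  # weights sum to 1
    composite = scores @ weights
    # Rank 1 = highest composite score.
    ranks[i] = composite.argsort()[::-1].argsort() + 1

# The interval reported for each program is the 5th-to-95th percentile
# span of its rank distribution, not a single point rank.
lo = np.percentile(ranks, 5, axis=0).astype(int)
hi = np.percentile(ranks, 95, axis=0).astype(int)
print(f"Program 0: likely rank between {lo[0]} and {hi[0]}")
```

The width of each interval reflects how sensitive a program’s standing is to how the criteria are weighted.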
The complexity and multi-parametric nature of the data make them a challenge to interpret with precise values. However, several websites display the data graphically for comprehensive analysis. These range from The Chronicle of Higher Education, which offers subscribers a full range of tools to examine the data, to www.Phds.org, which provides this service free of charge (a website founded by former Dartmouth assistant professor Geoff Davis, now a senior researcher at Google Inc.). On these sites, users can select the criteria that interest them and obtain ranked lists of doctoral-granting programs at the institutions that match their goals. The sites serve prospective PhD students and faculty who may assign varying levels of importance to factors such as research activity, student support, or diversity. This freedom of analysis was an explicit goal of the report.
The NRC report goes into detail about “reputation-based” ranking approaches, which have dominated program assessments in the past, and contrasts them with its data-based analysis using the 20 metrics obtained from the participating programs. The reputation-based ranking approach (R-analysis) resulted from a selected survey of faculty opinions in each field of study. The results of that survey correlated with the size of the program, so the NRC concluded that previous rankings were distorted in favor of larger programs. The new data-based ranking system (S-analysis) was proposed as a more objective approach using 20 different quantitative measures of each program. Ultimately, however, the NRC did not endorse the data-based approach as the “best” method, but rather as just one method of comparing across programs.
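As a toy illustration of this size effect (invented numbers, not drawn from the report): a measure that scales with total program size can order programs in the reverse of the corresponding per-capita measure, which is one reason per-faculty measures such as publications per faculty member per year appear among the quantitative metrics.

```python
import numpy as np

def rank_desc(x):
    """Rank 1 = largest value."""
    return x.argsort()[::-1].argsort() + 1

# Invented numbers: four programs of very different size, where the
# smallest program is actually the most productive per capita.
faculty = np.array([10, 20, 40, 80])
pubs_per_faculty = np.array([3.5, 3.2, 3.0, 2.8])
total_pubs = faculty * pubs_per_faculty

print(rank_desc(total_pubs))        # [4 3 2 1]: raw totals reward size alone
print(rank_desc(pubs_per_faculty))  # [1 2 3 4]: per-capita reverses the order
```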
One of the challenges of any ranking approach is that it works well only when programs can be defined as part of a single discipline. One anomaly in the current system is that interdisciplinary programs are increasingly valued, and yet interdisciplinary programs without obvious peer programs cannot be ranked. Such was the case with Thayer School of Engineering’s doctoral program, which has always focused on integrating all branches of engineering into one division of engineering science.
Overall, the NRC report concluded that faculty ranked “research activity” as the most important factor in the strength of a program. The four measures included in this category were the percentage of faculty with research grants, the number of publications per faculty member per year, the number of citations per paper, and the percentage of faculty receiving prestigious awards. These factors cut across all academic departments and are not exclusive to doctoral-granting programs.
Again, although this research-activity metric emerged from the data, the report emphasized that individual students or programs may be at least as interested in other comparative measures (e.g., measures that assess student support or diversity). For example, student support and outcomes (with outcomes measured as the percentage of graduates who secured academic positions) was one method of comparison that faculty considered important. Faculty and student diversity was another factor highlighted in the report, with the recognition that faculty diversity is very low in doctoral-granting programs, around 5 percent, but that student diversity has grown and is nearly 10 percent overall. Participation of women in doctoral programs is up but varies considerably among fields, ranging from 11 percent in engineering at the low end to 39 percent in the humanities.
Concerns about the validity of the data in the report remain, as the data were largely self-reported, with publication and citation information gathered from online indices. Some individual programs, and even entire fields of study, feel their data are fundamentally flawed, which makes it hard to trust even the most rigorously gathered data sets. As with all data surveys, the results must be interpreted thoughtfully to avoid overstating conclusions.
The report also examines what has changed since the 1995 report, which was based on 1993 data, and discusses overall trends, noting modest percentage growth in doctoral programs nationwide. It is challenging to relate these small national changes directly to Dartmouth, which has experienced significant growth in research. From 1993 to 2005-06, sponsored research increased from $67 million to $187 million. Dartmouth’s STEM graduate programs have changed in response to this infusion of funded research, with increases in size and overall scholarly productivity.
This NRC report is rich with information that helps us reflect on and potentially identify ways to improve the Dartmouth research experience for all departments. I encourage you to explore this database and use it to think about ways we can keep striving for innovative improvements to our academic programs.
Brian W. Pogue, Ph.D.,
Dean of Graduate Studies