Tag: Peer Review Process

Impact Score Paragraph in Summary Statements, Plain Language in Public Sections of Grant Applications

0 comments

The August issue of NIH’s Extramural Nexus includes two announcements that might interest you.

Impact Score Paragraph in Summary Statements

Starting with September grant application reviews, reviewers will include a summary paragraph to explain what factors they considered in assigning the overall impact score. This should help investigators better understand the reasons for the score.

Plain Language in Public Sections of Grant Applications

The director’s column talks about the importance of communicating research value in your grant application.

Your grant title, abstract and statement of public health relevance are very important. Once a grant is funded, these items are available to the public through NIH’s RePORTER database. Many people are interested in learning about research supported with taxpayer dollars, so I encourage you to be clear and accurate in writing these parts of your application. Reviewers are being told to expect plain language in these sections.

The Nexus column also includes links to several helpful resources.

Scoring Analysis: 1-Year Comparison

12 comments

I recently posted several analyses (on July 15, July 19 and July 21) of the relationships between the overall impact scores on R01 applications determined by study sections and the criterion scores assigned by individual reviewers. These analyses were based on a sample of NIGMS applications reviewed during the October 2009 Council round. This was the first batch of applications for which criterion scores were used.

NIGMS applications for the October 2010 Council round have now been reviewed. Here I present my initial analyses of this data set, which consists of 654 R01 applications that were discussed, scored and percentiled.

The first analysis, shown below, relates to the correlation coefficients between the overall impact score and the averaged individual criterion scores.

Correlation coefficients between the overall impact score and averaged individual criterion scores for 654 NIGMS R01 applications reviewed during the October 2010 Council round. The corresponding scores for a sample of 360 NIGMS R01 applications reviewed during the October 2009 Council round are shown in parentheses.

Overall, the trend in correlation coefficients is similar to that observed for the sample from 1 year ago, although the correlation coefficients for the current sample are slightly higher for four out of the five criterion scores.
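
For readers who would like to run a similar comparison on their own data, here is a minimal sketch in Python. The file and column names are hypothetical stand-ins, not the actual NIGMS data set.

```python
import pandas as pd

# Hypothetical input: one row per application, with the averaged criterion
# scores and the overall impact score assigned by the study section.
df = pd.read_csv("r01_scores.csv")  # illustrative file and column names

criteria = ["significance", "investigator", "innovation", "approach", "environment"]

# Pearson correlation of each averaged criterion score with the overall impact score
print(df[criteria].corrwith(df["overall_impact"]).sort_values(ascending=False))
```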

Here are results from a principal component analysis:

Principal component analysis of overall impact score based on the five criterion scores for 654 NIGMS R01 applications reviewed during the October 2010 Council round. The corresponding scores for a sample of 360 NIGMS R01 applications reviewed during the October 2009 Council round are shown in parentheses.

There is remarkable agreement between the results of the principal component analysis for the October 2010 data set and those for the October 2009 data set. The first principal component accounts for 72% of the variance, with the largest contribution coming from approach, followed by innovation, significance, investigator and finally environment. This agreement between the data sets extends through all five principal components, although there is somewhat more variation for principal components 2 and 3 than for the others.
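
As an illustration only, and not the actual analysis code behind these numbers, a principal component analysis of the five averaged criterion scores can be sketched as follows, using the same hypothetical data file as above.

```python
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("r01_scores.csv")  # hypothetical file, as in the sketch above
criteria = ["significance", "investigator", "innovation", "approach", "environment"]

# Standardize the five averaged criterion scores, then extract principal components.
X = StandardScaler().fit_transform(df[criteria])
pca = PCA(n_components=5).fit(X)

# Fraction of the variance accounted for by each principal component
print(pca.explained_variance_ratio_)

# Loadings (weights) of each criterion on each component
print(pd.DataFrame(pca.components_, columns=criteria))
```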

Another important factor in making funding decisions is the percentile assigned to a given application. The percentile is a ranking that shows the relative position of each application’s score among all scores assigned by a study section at its last three meetings. Percentiles provide a way to compare applications reviewed by different study sections that may have different scoring behaviors. They also correct for “grade inflation” or “score creep” in the event that study sections assign better scores over time.
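
As a rough illustration of this idea (a simplified sketch, not NIH’s official percentiling algorithm), the relative position of a score within a base of recent study section scores could be computed like this:

```python
def percentile(score, base_scores):
    """Relative position of `score` within a base of scores, expressed 0-100.

    `base_scores` stands in for all overall impact scores assigned by a
    study section at its last three meetings (lower scores are better).
    """
    base = sorted(base_scores)
    # Count base scores that are better than (lower) or tied with this score.
    rank = sum(1 for s in base if s <= score)
    return round(100 * (rank - 0.5) / len(base))

# Example: an impact score of 30 against a hypothetical base of scores
base = [20, 22, 25, 30, 31, 35, 40, 41, 47, 55]
print(percentile(30, base))  # -> 35; a smaller percentile means a better relative standing
```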

Here is a plot of percentiles and overall impact scores:

A plot of the overall impact score versus the percentile for 654 NIGMS R01 applications reviewed during the October 2010 Council round.

This plot reveals that a substantial range of overall impact scores can be assigned to a given percentile score. This phenomenon is not new; a comparable level of variation among study sections was seen in the previous scoring system, as well.

The correlation coefficient between the percentile and overall impact score in this data set is 0.93. The correlation coefficients between the percentile and the averaged individual criterion scores are given below:

Correlation coefficients between the percentile and the averaged individual criterion scores for 654 NIGMS R01 applications reviewed during the October 2010 Council round.

As one would anticipate, these correlation coefficients are somewhat lower than those for the overall impact score, since the percentile also reflects the scoring behavior of each study section across its last three meetings.

Here are the results of a principal component analysis applied to the percentile data:

Principal component analysis of percentile data based on the five criterion scores for 654 NIGMS R01 applications reviewed during the October 2010 Council round.

The results of this analysis are very similar to those for the overall impact scores, with the first principal component accounting for 72% of the variance and similar weights for the individual averaged criterion scores.

Our posting of these scoring analyses has led the NIH Office of Extramural Activities and individual institutes to launch their own analyses. I will share their results as they become available.

The New Scoring System

2 comments

At the recent meeting of the National Advisory General Medical Sciences Council, our Council members had their first opportunity to examine summary statements using the new peer review scoring system.

Many aspects of the new scoring system are unfamiliar, including the reporting of overall impact scores as integers from 10 (best) to 90 (worst). The new system is described in a scoring system and procedure document, and an earlier version of this document was shared widely with reviewers.
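
For those wondering where the 10 to 90 range comes from: each eligible reviewer assigns an impact score from 1 (best) to 9 (worst), and the overall impact score is the mean of those scores multiplied by 10 and rounded to a whole number. Here is a minimal sketch of that arithmetic (simplified; the scoring procedure document is the authoritative description):

```python
def overall_impact_score(reviewer_scores):
    """Mean of the individual reviewer scores (1-9), scaled to the 10-90 range."""
    mean = sum(reviewer_scores) / len(reviewer_scores)
    return round(mean * 10)

# Example: a panel whose eligible members score an application 2, 3, 2, 4 and 3
print(overall_impact_score([2, 3, 2, 4, 3]))  # -> 28
```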

As background, I compiled some data for approximately 300 NIGMS R01 applications reviewed under the new system.

This plot shows the distribution of overall impact scores along with the corresponding percentiles. Note the relative spread of percentile scores at a given impact score. This spread arises because percentiles are determined independently for each study section that considered 25 or more R01 applications; for sections below that threshold, percentiles are determined across the overall pool of R01 applications reviewed by the Center for Scientific Review.
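
In pseudocode terms, the base-selection rule described above looks roughly like the following. This is an illustrative sketch that follows the description in this post; the percentile calculation itself is omitted.

```python
def percentile_base(study_section_scores, csr_wide_scores, minimum=25):
    """Pick the base of scores against which an application is percentiled."""
    if len(study_section_scores) >= minimum:
        return study_section_scores   # section-specific percentiling
    return csr_wide_scores            # fallback: the CSR-wide pool of R01 scores

# Example: a section that reviewed only 12 R01 applications falls back to the CSR-wide pool
section = [20, 25, 28, 30, 31, 33, 36, 40, 45, 50, 52, 60]
csr_pool = list(range(10, 91))  # stand-in for the overall CSR R01 score pool
base = percentile_base(section, csr_pool)
print(len(base))  # -> 81
```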

For comparison, here is a plot of a similar number of NIGMS R01 applications reviewed using the old scoring system.

A plot of a similar number of NIGMS R01 applications reviewed using the old scoring system.

Note the similar spread of percentiles at a given score due to study section-specific percentiling.

I would like to mention another major change as a result of the NIH Enhancing Peer Review effort. You must use restructured application forms and instructions, including a 12-page length limit for R01s, for applications due on or after January 25, 2010. For details, see the recent NIH Guide notice. We plan to post updates about these changes as key dates approach.