Year: 2011

Productivity Metrics and Peer Review Scores

18 comments

A key question regarding the NIH peer review system is how well peer review scores predict subsequent scientific output. Answering this question is a challenge, of course, since meaningful scientific output is difficult to measure and evolves over time, in some cases a long time. However, by linking application peer review scores to publications citing support from the funded grants, it is possible to perform some relevant analyses.

The analysis I discuss below reveals that peer review scores do predict trends in productivity in a manner that is statistically different from random ordering. That said, there is a substantial level of variation in productivity metrics among grants with similar peer review scores and, indeed, across the full distribution of funded grants.

I analyzed 789 R01 grants that NIGMS competitively funded during Fiscal Year 2006. This pool represents all funded R01 applications that received both a priority score and a percentile score during peer review. There were 357 new (Type 1) grants and 432 competing renewal (Type 2) grants, with a median direct cost of $195,000. The percentile scores for these applications ranged from 0.1 through 43.4, with 93% of the applications having scores lower than 20. Figure 1 shows the percentile score distribution.

Figure 1. Cumulative number of NIGMS R01 grants in Fiscal Year 2006 as a function of percentile score.

These grants were linked (primarily by citation in publications) to a total of 6,554 publications that appeared between October 2006 and September 2010 (Fiscal Years 2007-2010). Those publications had been cited 79,295 times as of April 2011. The median number of publications per grant was 7, with an interquartile range of 4-11. The median number of citations per grant was 73, with an interquartile range of 26-156.

The numbers of publications and citations represent the simplest available metrics of productivity. More refined metrics include the number of research (as opposed to review) publications, the number of citations that are not self-citations, the number of citations corrected for typical time dependence (since more recent publications have not had as much time to be cited as older publications), and the number of highly cited publications (which I defined as the top 10% of all publications in a given set). Of course, the metrics are not independent of one another. Table 1 shows these metrics and the correlation coefficients between them.
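To make the kind of correlation matrix shown in Table 1 concrete, here is a minimal sketch using NumPy. The per-grant metrics below are synthetic values invented for illustration, not the actual NIGMS data; the point is simply that related metrics produce large off-diagonal correlations.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical per-grant metrics (illustrative only, not the NIGMS data):
# total publications, research (non-review) publications, and citations.
pubs = rng.poisson(7, 500)                   # median around 7 publications
research_pubs = rng.binomial(pubs, 0.85)     # most, but not all, are research papers
citations = rng.poisson(10 * (pubs + 1))     # citations scale with output

# The metrics are not independent of one another, so the pairwise
# Pearson correlation matrix shows substantial off-diagonal structure.
metrics = np.vstack([pubs, research_pubs, citations])
corr = np.corrcoef(metrics)
print(np.round(corr, 2))
```

With real data, each additional metric simply adds a row and column to the matrix.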

Table 1. Correlation coefficients between nine metrics of productivity.

How do these metrics relate to percentile scores? Figures 2-4 show three distributions.

Figure 2. Distribution of the number of publications as a function of percentile score. The inset shows a histogram of the number of grants as a function of the number of publications.

Figure 3. Distribution of the number of citations as a function of percentile score. The inset shows a histogram of the number of grants as a function of the number of citations.

Figure 4. Distribution of the number of highly cited publications as a function of percentile score. Highly cited publications are defined as those in the top 10% of all research publications in terms of the total number of citations corrected for the observed average time dependence of citations.

As might be expected, there is substantial scatter across each distribution. Nonetheless, each of these metrics is negatively correlated with the percentile score, with higher productivity metrics corresponding to lower (better) percentile scores, as shown in Table 2.

Table 2. Correlation coefficients between the grant percentile score and nine metrics of productivity.

Do these distributions reflect statistically significant relationships? One way to address this question is with a Lorenz curve, which plots the cumulative fraction of a given metric as a function of the cumulative fraction of grants, ordered by their percentile scores. Figure 5 shows the Lorenz curve for citations.

Figure 5. Cumulative fraction of citations as a function of the cumulative fraction of grants, ordered by percentile score. The shaded area is related to the excess fraction of citations associated with more highly rated grants.

The degree to which the Lorenz curve departs from a uniform distribution can be measured by the Gini coefficient, which corresponds to twice the shaded area in Figure 5. For citations, the Gini coefficient has a value of 0.096. Based on simulations, this coefficient is 3.5 standard deviations above that for a random distribution of citations as a function of percentile score. Thus, the relationship between citations and percentile score is highly statistically significant, even though the grant-to-grant variation within a narrow range of percentile scores is quite substantial. Table 3 shows the Gini coefficients for all nine productivity metrics.
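The Gini calculation and the accompanying permutation simulation can be sketched as follows. The data here are synthetic (percentile scores and citation counts invented for illustration, with better-scoring grants drawn to receive more citations), so the specific numbers will not match those reported above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data for illustration only -- not the actual NIGMS grants.
n = 789
percentiles = rng.uniform(0.1, 43.4, n)
citations = rng.poisson(np.maximum(5.0, 150.0 - 3.0 * percentiles))

def gini_vs_score(scores, metric):
    """Twice the mean gap between the Lorenz curve (the metric accumulated
    in order of increasing percentile score) and the diagonal of equal
    shares -- i.e., twice the shaded area in a plot like Figure 5."""
    order = np.argsort(scores)                       # best-scoring grants first
    cum_metric = np.cumsum(metric[order]) / metric.sum()
    cum_grants = np.arange(1, len(scores) + 1) / len(scores)
    return 2.0 * np.mean(cum_metric - cum_grants)

observed = gini_vs_score(percentiles, citations)

# Significance by simulation: shuffle the citation counts relative to the
# scores and ask how many standard deviations the observed coefficient
# lies above the mean of the randomized distribution.
sims = np.array([gini_vs_score(percentiles, rng.permutation(citations))
                 for _ in range(1000)])
z_score = (observed - sims.mean()) / sims.std()
print(f"Gini = {observed:.3f} ({z_score:.1f} s.d. above random)")
```

The averaging step approximates the area between the discrete Lorenz curve and the diagonal; a metric distributed evenly across scores yields a coefficient of zero.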

Table 3. Gini coefficients for nine metrics of productivity. The number of standard deviations above the mean, as determined by simulations, is shown in parentheses below each coefficient.

Of these metrics, overall citations show the most statistically significant Gini coefficient, whereas highly cited publications show one of the least significant Gini coefficients. As shown in Figure 4, the distribution of highly cited publications is relatively even across the entire percentile score range.

Resubmissions: More Time for New Investigators, General Clarification

1 comment

There is now a shortened review cycle for applications from new investigators. To give these applicants more time to prepare resubmissions (A1) between review cycles, NIH will accelerate the release of summary statements for the initial applications (A0). It has also pushed back the special submission date for new investigators to give them at least 30 days to prepare the revised application. Please refer to NOT-OD-11-057 for more details.

While we’re on this topic, I’d like to clear up confusion about when to submit a new application versus a resubmission. New applications and resubmissions typically differ in several important aspects (due date, introduction, etc.). For most funding opportunity announcements (FOAs), deciding whether to prepare a new submission (A0) or a resubmission (A1) is straightforward. But sometimes it’s not!

Here’s a little clarification. All applications in response to a request for applications (RFA) are considered new, unless the RFA says that applications to previous versions of the RFA may be submitted as resubmissions. For example, if NIGMS has issued an RFA and then decides to continue it via a non-RFA FOA, all applications to the FOA must be new the first time they are submitted. Similarly, if a PI continues an awarded U series application as an R series application (e.g., U01 to R01 after the original U01 FOA expired), then the R series application must be new.

Finally, applications to continue work started in a special Recovery Act activity code (RC1, RC2, RC3, RC4, etc.) should be submitted as new applications. Recovery Act competitive revision (S1) applications that were submitted to an existing FOA are an exception. They must be submitted as resubmissions (S1A1) and are subject to the NIH resubmission limit policy.

For those of you interested in a broader discussion of resubmissions, see the “Early Data on the A2 Sunset” blog post from OER director Sally Rockey.

Moving Cell Migration Forward: Meeting Highlights Progress

0 comments

Last week, NIGMS hosted the Frontiers in Cell Migration and Mechanotransduction meeting. It brought together an impressive group of scientists working at many levels, from molecules to cells, tissues and organs.

The overall sense from the meeting is that a wide variety of tools, approaches and even fields have converged on this topic of how and why cells move and that this convergence has become a source of collaboration between communities that historically have not interacted.

Several important themes emerged, including:

  • Events, such as cell signaling, are highly localized and closely coordinated. In a fascinating talk, Klaus Hahn (University of North Carolina, Chapel Hill) presented new experimental data using a photoactivatable and completely reversible probe that he developed for RhoG. Important in wound healing, RhoG turns on immediately at the leading edge when the cell moves and seems to regulate the direction of cell migration and whether a cell can turn.
  • It’s all about forces. Once invisible to most techniques that biologists have used, forces are now being deduced and measured inside cells. Chris Chen (University of Pennsylvania) showed us that a stem cell’s response to the dish surface generates cellular forces and that these forces affect whether the cell rounds up or spreads. Chen’s data suggest that cell spreading is essential for triggering distinct differentiation pathways, and that whether a stem cell becomes a brain cell or a bone cell is driven by this contractility.
  • Cells communicate and influence each other. In a talk that linked basic biology to clinical research, Anna Huttenlocher (University of Wisconsin-Madison) showed that leukocyte migration can be an immune response to cell wounding. Using photo-caged biosensors developed by Klaus Hahn, she found that neutrophils are more active and are recruited more quickly after subsequent wounding events. They also recruit other cells to the wound sites.

The talks, as well as the poster session, pointed to additional conclusions: Cell migration is a collective behavior, and feedback mechanisms control many cell migration events.

This was the third and final meeting organized by the NIGMS-funded Cell Migration Consortium (CMC), whose project will sunset in August 2011. The consortium developed a cell migration gateway that will continue to exist as a resource for updates in the field; you can subscribe at http://cellmigration.org/cmg_update/ealert/signup.shtml (no longer available).

Jim Deatherage contributed to this post.

Fiscal Year 2012 Budget Update

0 comments

The Senate hearing on the Fiscal Year 2012 President’s budget request for NIH happened on May 11, and you can watch the archived Webcast.

My written testimony and NIH Director Francis Collins’ written testimony on next year’s budget are also now available. The ultimate outcome will be a bill appropriating funds for NIH and its components (view history of NIH appropriations).

For more on the proposed NIGMS budget, see my update from February 15.

Fiscal Year 2011 Funding Policy

3 comments

As you may be aware, the Full-Year Continuing Appropriations Act of 2011 (Public Law 112-10), enacted on April 15, provides Fiscal Year 2011 funds for NIH and NIGMS at levels approximately 1% lower than those for Fiscal Year 2010.

NIH has released a notice outlining its fiscal policy for grant awards. This notice includes reductions in noncompeting awards to allow the funding of additional new and competing awards.

For noncompeting awards:

  • Modular grants will be awarded at a level 1% lower than the current Fiscal Year 2011 committed level.
  • Nonmodular grants will not receive any inflationary increase and will be reduced by an additional 1%.
  • Awards that have already been issued (typically at 90% of the previously committed level) will be adjusted upward to be consistent with the above policies.
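To make the adjustment described in the last bullet concrete, here is a worked example with a hypothetical committed level; the dollar amounts are invented for illustration, not drawn from any actual award.

```python
# Hypothetical modular grant with a $200,000 committed FY2011 level.
committed = 200_000
final_modular = committed - committed // 100   # final award: 1% below committed level
already_issued = committed * 90 // 100         # interim award issued at 90%
upward_adjustment = final_modular - already_issued

print(final_modular, upward_adjustment)        # 198000 18000
```

So a grant issued at the interim 90% level would later receive the difference between that interim amount and the final 99% level.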

For competing awards:

  • Based on the appropriation level, we expect to make approximately 866 new and competing research project grant awards at NIGMS, compared to 891 in Fiscal Year 2010.
  • It is worth noting that we received more new and competing grant applications this fiscal year—3,875 versus 3,312 in Fiscal Year 2010.
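Taken together, the two bullets above imply a lower award rate this year. A quick back-of-the-envelope check, simply dividing awards by applications (which approximates, but does not exactly match, NIH's official success-rate definition):

```python
# Awards divided by applications, using the figures quoted above.
rate_fy2010 = 891 / 3312
rate_fy2011 = 866 / 3875
print(f"FY2010: {rate_fy2010:.1%}, FY2011: {rate_fy2011:.1%}")
# prints "FY2010: 26.9%, FY2011: 22.3%"
```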

The NIGMS Fiscal Year 2011 Financial Management Plan (link no longer available) has additional information, including about funding of new investigators and National Research Service Award stipends.

NIGMS Glue Grant Assessment Report

0 comments

Last November, I announced that NIGMS was conducting an assessment of its Large-Scale Collaborative Project Awards (glue grant) program and solicited your input.

We have now posted the report of this assessment, which is based on an analysis of input from six different sources, including comments we received from the scientific community.

The assessment’s conclusion is that the glue grant program has had mixed results. All of the projects accomplished some of their goals, and some of the projects had a substantial impact in their fields. However, the assessment also found that the program as a whole had not achieved outcomes commensurate with the scope of the awards and the overall investment in them.

The panel members felt that “the successes and challenges of the Glue Grant Awards Program provide a useful guide for the development of future programs.” While they recommended discontinuing the program as it currently exists, they did not recommend abandoning all support for collaborative research, even in the face of tighter budgets. Rather, they suggested a number of ways to improve support for larger-scale projects and indicated that these projects cannot be accomplished with R01 grant support alone.

Last week, I presented the outcomes of the assessment to our Advisory Council, which embraced the recommendations of the assessment panel and encouraged NIGMS to develop alternative mechanisms to sustain the kinds of work the glue grant program supported. We will take the report and Council’s advice into consideration as we develop future plans for funding collaborative research.

Feedback Loop Feedback: Tell Us What You Want

0 comments

The Feedback Loop blog, with its 165 posts and 418 comments, has become an important tool for communicating with you.

As the blog enters its third year, we will continue to use it to share news of NIGMS funding opportunities, meetings and activities, job openings and grant-related changes. But, as with any blog, we really want to generate posts that spark an open dialogue.

Tell us what posts you want to read by e-mailing me, adding a comment here or using the “Suggest a Post” option near the top of the site. Is there a policy or process we can demystify, a trend we can explain or an area of funding we should highlight? You can propose any topic that might interest our broader NIGMS grantee and applicant audience. While you’re at it, you can also tell us what you don’t want to read about!

Council Tribute to Director Berg

1 comment

At today’s National Advisory General Medical Sciences Council meeting, member Howard Garrison offered the following statement to Jeremy Berg on behalf of the entire Council:

“In appreciation of your 7 years of leadership at NIGMS, the members of the Council express their profound gratitude to you for your distinguished service to science and the nation. We recognize your outstanding work in the pursuit of excellence in research and education, mentoring and advocacy for basic research. Your willingness to deal directly with challenging issues has earned you our respect and admiration. It has been a pleasure and an honor to work with you, and we will miss you. We wish you continued success in your new endeavors.”

Implementation Under Way for Training Strategic Plan

2 comments

We have just posted the final version of Investing in the Future: National Institute of General Medical Sciences Strategic Plan for Biomedical and Behavioral Research Training. Fifteen months in the making, the plan reflects our long-standing commitment to research training and the development of a highly capable, diverse scientific workforce.

In an earlier post, I shared the plan’s four key themes.

As you’ll see, the plan considers training in the broadest sense, including not just activities supported on training grants and fellowships, but also those supported through research grants. It strongly encourages the development of training plans in all R01 and other research grant applications that request support for graduate students or postdoctoral trainees. And it endorses the use of individual development plans for these trainees as well as the overall importance of mentoring.

Finally, the plan acknowledges that trainees may pursue many different career outcomes that can contribute to the NIH mission.

My thanks to the dedicated committee of NIGMS staff who developed the plan and to the hundreds of investigators, postdocs, students, university administrators and others who took the time to give us their views throughout the planning process.

We’ve already started implementing the plan’s action items. While there are some that we can address on our own, others will require collaboration. We will again be reaching out to our stakeholders, and we look forward to continued input from and interactions with them to achieve our mutual goals.

Preview Our New Web Site Design

0 comments

A new look and feel for the NIGMS Web site will launch in the next few weeks. The design incorporates feedback we received during usability testing with individuals representing key target audiences—scientists, administrators, educators, news media and the public—as well as from our recent user satisfaction survey and other data.

The new design features a fresh color palette based on the NIGMS logo, fewer top-level tabs and a reorganization of how some content is presented. It also makes it easier for users to connect with us through social media like Twitter and Facebook.

Even with these latest enhancements, the site is always a work in progress. We welcome your input and suggestions at any time.

UPDATE: The redesigned NIGMS Web site has launched. While redirects are in place, please be sure to check your bookmarks as some page URLs may have changed.