More on Assessing the Glue Grant Program

NIGMS is committed to thoughtful analysis before it initiates new programs and to careful, transparent assessment of ongoing programs at appropriate stages to help determine future directions. In this spirit, we have been assessing our large-scale programs, starting with the Protein Structure Initiative and more recently, the glue grant program.

For the glue grants (which are formally known as Large-Scale Collaborative Project Awards), we engaged in both a process evaluation and an outcomes assessment. These assessments, conducted by independent, well-qualified groups, articulated the strengths and accomplishments, as well as the weaknesses and challenges, of the glue grant program. The main conclusion of the outcomes assessment was that NIGMS should discontinue the current program and in its place create “a suite of modified program(s) to support innovative, interdisciplinary, large scale research” with the recommendation “that awards should be considerably smaller, but larger in number.”

The glue grant program was developed beginning in 1998 in anticipation of substantial increases in the NIH budget. At that time, members of the scientific community were expressing concern about how to support a trend toward interdisciplinary and collaborative science. This was discussed extensively by our Advisory Council. Here’s an excerpt from the minutes of a January 1999 meeting:

“Council discussion centered on why current mechanisms do not satisfy the need; the fact that these will be experiments in the organization of integrative science; the usefulness of consortia in collecting reference data and developing new technology that are difficult to justify on individual projects; the need to include international collaborations; and the need for rapid and open data sharing. Council voted concurrence with NIGMS plans to further develop and issue these initiatives.”

This summary captures many of the key challenges that emerged over the next 12 years in the glue grant program. I’d like to give my perspective on three key points.

First, these grants were always intended to be experiments in the organization of integrative science. One of the most striking features of the glue grants is how different they are from one another based on the scientific challenge being addressed, the nature of the scientific community in which each glue grant is embedded and the approach of the principal investigator and other members of the leadership team. The process evaluation expressed major concern about such differences between the glue grants, but in fact this diversity reflects a core principle of NIGMS: a deep appreciation for the role of the scientific community in defining scientific problems and identifying approaches for addressing them.

Second, as highlighted in both reports, the need for rapid and open data sharing remains a great challenge. All of the glue grants have included substantial investments in information technology and have developed open policies toward data release. However, making data available and successfully sharing data in forms that the scientific community will embrace are not equivalent. And of course, effective data and knowledge sharing is a challenge throughout the scientific community, not just in the glue grants.

Third, the timing of these assessments is challenging. On one hand, it is desirable to perform assessments as early as possible after the initiation of a new program to inform program management and to indicate the need for potential adjustments. On the other hand, the impact of scientific advances takes time to unfold, and this can be particularly true for ambitious, larger-scale programs. It may be worthwhile to revisit these programs in the future to gauge their impact over time.

During my time at NIGMS, I have been impressed by the considerable efforts of the Institute staff involved with the glue grants. They have approached the stewardship of this novel program with a critical eye, working to find oversight mechanisms that would facilitate the impact of the grants on the relevant, broad components of the scientific community. As the current program ends, we will continue to think creatively about how best to support collaborative research, bearing in mind the glue grant assessment panel’s recommendation that we explore appropriate mechanisms to support large-scale, collaborative approaches.

Early Notice: Reissue of RFA for Centers for AIDS-Related Structural Biology

For 25 years, NIGMS has supported AIDS-related structural biology research that has provided fundamental insights into the replication of HIV and contributed toward the development of essential therapeutics.

As Joe Gindhart discussed in an earlier Feedback Loop post, NIGMS marked the anniversary of this program with a special meeting in March. Many participants expressed excitement and offered overwhelmingly positive feedback about the program, in particular the progress and achievements reported from the three P50 Centers for the Determination of Structures of HIV/Host Complexes: Center for the Structural Biology of Cellular Host Elements in Egress, Trafficking and Assembly of HIV, HIV Accessory and Regulatory Complexes and Center for HIV Protein Interactions.

Following endorsement by our Advisory Council at its meeting last month, we plan to reissue the centers RFA this summer. The new RFA will encompass the goals of the previous one, but will be broadened to include RNA/protein and membrane/protein interactions and a push to move beyond static structures to characterize the dynamics of complexes as a way to improve future drug design. This reissued RFA will also include a new requirement for a collaborative development program intended to recruit skilled investigators, especially early stage investigators, into AIDS-related structural biology research. We expect to fund four or five centers for a 5-year period.

I will post more information about this funding opportunity once it has been published in the NIH Guide.

Productivity Metrics and Peer Review Scores

A key question regarding the NIH peer review system is how well peer review scores predict subsequent scientific output. Answering this question is a challenge, of course, since meaningful scientific output is difficult to measure and evolves over time, in some cases a long time. However, by linking application peer review scores to publications citing support from the funded grants, it is possible to perform some relevant analyses.

The analysis I discuss below reveals that peer review scores do predict trends in productivity in a manner that is statistically different from random ordering. That said, there is a substantial level of variation in productivity metrics among grants with similar peer review scores and, indeed, across the full distribution of funded grants.

I analyzed 789 R01 grants that NIGMS competitively funded during Fiscal Year 2006. This pool represents all funded R01 applications that received both a priority score and a percentile score during peer review. There were 357 new (Type 1) grants and 432 competing renewal (Type 2) grants, with a median direct cost of $195,000. The percentile scores for these applications ranged from 0.1 through 43.4, with 93% of the applications having scores lower than 20. Figure 1 shows the percentile score distribution.

Figure 1. Cumulative number of NIGMS R01 grants in Fiscal Year 2006 as a function of percentile score.

These grants were linked (primarily by citation in publications) to a total of 6,554 publications that appeared between October 2006 and September 2010 (Fiscal Years 2007-2010). Those publications had been cited 79,295 times as of April 2011. The median number of publications per grant was 7, with an interquartile range of 4-11. The median number of citations per grant was 73, with an interquartile range of 26-156.

The numbers of publications and citations represent the simplest available metrics of productivity. More refined metrics include the number of research (as opposed to review) publications, the number of citations that are not self-citations, the number of citations corrected for typical time dependence (since more recent publications have not had as much time to be cited as older publications), and the number of highly cited publications (which I defined as the top 10% of all publications in a given set). Of course, the metrics are not independent of one another. Table 1 shows these metrics and the correlation coefficients between them.
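As an illustration of how a correlation table like Table 1 can be assembled, here is a minimal Python sketch. The grant-level data below are simulated placeholders, not the actual FY2006 values, and all variable names are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(2006)
n_grants = 789  # size of the FY2006 R01 pool described above

# Simulated stand-ins for grant-level productivity metrics; the real
# values came from publications linked to each funded grant.
publications = rng.poisson(8, n_grants)  # median in the post was 7
citations = np.array([rng.poisson(12, p).sum() for p in publications])
highly_cited = rng.binomial(publications, 0.1)  # ~top 10% of papers

# Pairwise Pearson correlation coefficients between the metrics.
metrics = np.vstack([publications, citations, highly_cited])
corr = np.corrcoef(metrics)
```

As the post notes, such metrics are not independent of one another, so off-diagonal correlations in a table built this way are expected to be substantial.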

Table 1. Correlation coefficients between nine metrics of productivity.

How do these metrics relate to percentile scores? Figures 2-4 show three distributions.

Figure 2. Distribution of the number of publications as a function of percentile score. The inset shows a histogram of the number of grants as a function of the number of publications.

Figure 3. Distribution of the number of citations as a function of percentile score. The inset shows a histogram of the number of grants as a function of the number of citations.

Figure 4. Distribution of the number of highly cited publications as a function of percentile score. Highly cited publications are defined as those in the top 10% of all research publications in terms of the total number of citations corrected for the observed average time dependence of citations.

As could be anticipated, there is substantial scatter across each distribution. However, as could also be anticipated, each of these metrics has a negative correlation coefficient with the percentile score, with higher productivity metrics corresponding to lower percentile scores, as shown in Table 2.

Table 2. Correlation coefficients between the grant percentile score and nine metrics of productivity.

Do these distributions reflect statistically significant relationships? This can be addressed through the use of a Lorenz curve to plot the cumulative fraction of a given metric as a function of the cumulative fraction of grants, ordered by their percentile scores. Figure 5 shows the Lorenz curve for citations.

Figure 5. Cumulative fraction of citations as a function of the cumulative fraction of grants, ordered by percentile score. The shaded area is related to the excess fraction of citations associated with more highly rated grants.

The tendency of the Lorenz curve to reflect a non-uniform distribution can be measured by the Gini coefficient. This corresponds to twice the shaded area in Figure 5. For citations, the Gini coefficient has a value of 0.096. Based on simulations, this coefficient is 3.5 standard deviations above that for a random distribution of citations as a function of percentile score. Thus, the relationship between citations and percentile score is highly statistically significant, even though the grant-to-grant variation within a narrow range of percentile scores is quite substantial. Table 3 shows the Gini coefficients for all of the productivity metrics.
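To make the construction concrete, here is a minimal Python sketch of a Lorenz-curve-based coefficient and a shuffle-based significance test of the kind described above. The data are simulated and all names are hypothetical; this is not the analysis code used for the post:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins: one percentile score and one citation count
# per funded grant (the real pool had 789 grants).
scores = rng.uniform(0.1, 43.4, size=789)
citations = rng.poisson(100, size=789).astype(float)

def gini_vs_ordering(values, order_key):
    """Twice the area between the Lorenz curve (cumulative fraction of
    `values`, with grants ordered best-percentile-first by `order_key`)
    and the diagonal of a perfectly uniform distribution."""
    v = values[np.argsort(order_key)]            # lowest percentile first
    cum = np.cumsum(v) / v.sum()                 # cumulative fraction of metric
    frac = np.arange(1, len(v) + 1) / len(v)     # cumulative fraction of grants
    return 2 * np.mean(cum - frac)

observed = gini_vs_ordering(citations, scores)

# Simulation: Gini coefficients when citations are randomly shuffled
# relative to percentile scores, yielding a null distribution.
null = np.array([gini_vs_ordering(rng.permutation(citations), scores)
                 for _ in range(1000)])
z = (observed - null.mean()) / null.std()        # standard deviations above random
```

A coefficient near zero means citations are spread evenly across the percentile range; a positive value means they are concentrated among the better-scoring grants, and the z-score measures how far the observed concentration sits above what random ordering produces.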

Table 3. Gini coefficients for nine metrics of productivity. The number of standard deviations above the mean, as determined by simulations, is shown in parentheses below each coefficient.

Of these metrics, overall citations show the most statistically significant Gini coefficient, whereas highly cited publications show one of the least significant Gini coefficients. As shown in Figure 4, the distribution of highly cited publications is relatively even across the entire percentile score range.

Resubmissions: More Time for New Investigators, General Clarification

There is now a shortened review cycle for applications from new investigators. To give these applicants more time to prepare resubmissions (A1) between review cycles, NIH will accelerate the release of summary statements for the initial applications (A0). It has also pushed back the special submission date for new investigators to give them at least 30 days to prepare the revised application. Please refer to NOT-OD-11-057 for more details.

While we’re on this topic, I’d like to clear up confusion about when to submit a new application versus a resubmission. New applications and resubmissions typically differ in several important aspects (due date, introduction, etc.). For most funding opportunity announcements (FOAs), deciding whether to prepare a new submission (A0) or a resubmission (A1) is straightforward. But sometimes it’s not!

Here’s a little clarification. All applications in response to a request for applications (RFA) are considered new, unless the RFA says that applications to previous versions of the RFA may be submitted as resubmissions. For example, if NIGMS has issued an RFA and then decides to continue it via a non-RFA FOA, all applications to the FOA must be new the first time they are submitted. Similarly, if a PI continues an awarded U series application as an R series application (e.g., U01 to R01 after the original U01 FOA expired), then the R series application must be new.

Finally, applications to continue work started in a special Recovery Act activity code (RC1, RC2, RC3, RC4, etc.) should be submitted as new applications. Recovery Act competitive revision (S1) applications that were submitted to an existing FOA are an exception. They must be submitted as resubmissions (S1A1) and are subject to the NIH resubmission limit policy.

For those of you interested in a broader discussion of resubmissions, see the “Early Data on the A2 Sunset” blog post from OER director Sally Rockey.

Moving Cell Migration Forward: Meeting Highlights Progress

Last week, NIGMS hosted the Frontiers in Cell Migration and Mechanotransduction meeting. It brought together an impressive group of scientists working at many levels, from molecules to cells, tissues and organs.

The overall sense from the meeting is that a wide variety of tools, approaches and even fields have converged on this topic of how and why cells move and that this convergence has become a source of collaboration between communities that historically have not interacted.

Several important themes emerged, including:

  • Events, such as cell signaling, are highly localized and closely coordinated. In a fascinating talk, Klaus Hahn (University of North Carolina, Chapel Hill) presented new experimental data using a photoactivatable and completely reversible probe that he developed for RhoG. Important in wound healing, RhoG turns on immediately at the leading edge when the cell moves and seems to regulate the direction of cell migration and whether a cell can turn.
  • It’s all about forces: once invisible to most techniques that biologists have used, forces are now being deduced and measured inside cells. Chris Chen (University of Pennsylvania) showed that a stem cell’s response to the dish surface generates cellular forces and that these forces affect whether the cell rounds up or spreads. Chen’s data suggest that cell spreading is essential for triggering distinct differentiation pathways, and that whether a stem cell becomes a brain or a bone cell is driven by this contractility.
  • Cells communicate and influence each other. In a talk that linked basic biology to clinical research, Anna Huttenlocher (University of Wisconsin-Madison) showed that leukocyte migration can be an immune response to cell wounding. By using photo-caged biosensors developed by Klaus Hahn, she found that neutrophil cells are more active and recruited more quickly after subsequent wounding events. They also recruit other cells to the wound sites.

The talks, as well as the poster session, pointed to additional conclusions: Cell migration is a collective behavior, and feedback mechanisms control many cell migration events.

This was the third and final meeting organized by the NIGMS-funded Cell Migration Consortium (CMC), whose project will sunset in August 2011. The consortium developed a cell migration gateway that will continue to exist as a resource for updates in the field; you can subscribe at http://cellmigration.org/cmg_update/ealert/signup.shtml (no longer available).

Jim Deatherage contributed to this post.

Fiscal Year 2012 Budget Update

The Senate hearing on the Fiscal Year 2012 President’s budget request for NIH happened on May 11, and you can watch the archived Webcast.

My written testimony and NIH Director Francis Collins’ written testimony on next year’s budget are also now available. The ultimate outcome will be a bill appropriating funds for NIH and its components (view history of NIH appropriations).

For more on the proposed NIGMS budget, see my update from February 15.

Fiscal Year 2011 Funding Policy

As you may be aware, the Full-Year Continuing Appropriations Act of 2011 (Public Law 112-10), enacted on April 15, provides Fiscal Year 2011 funds for NIH and NIGMS at levels approximately 1% lower than those for Fiscal Year 2010.

NIH has released a notice outlining its fiscal policy for grant awards. This notice includes reductions in noncompeting awards to allow the funding of additional new and competing awards.

For noncompeting awards:

  • Modular grants will be awarded at a level 1% lower than the current Fiscal Year 2011 committed level.
  • Nonmodular grants will not receive any inflationary increase and will be reduced by an additional 1%.
  • Awards that have already been issued (typically at 90% of the previously committed level) will be adjusted upward to be consistent with the above policies.

For competing awards:

  • Based on the appropriation level, we expect to make approximately 866 new and competing research project grant awards at NIGMS, compared to 891 in Fiscal Year 2010.
  • It is worth noting that we received more new and competing grant applications this fiscal year—3,875 versus 3,312 in Fiscal Year 2010.

The NIGMS Fiscal Year 2011 Financial Management Plan (link no longer available) has additional information, including about funding of new investigators and National Research Service Award stipends.

NIGMS Glue Grant Assessment Report

Last November, I announced that NIGMS was conducting an assessment of its Large-Scale Collaborative Project Awards (glue grant) program and solicited your input.

We have now posted the report of this assessment, which is based on an analysis of input from six different sources, including comments we received from the scientific community.

The assessment’s conclusion is that the glue grant program has had mixed results. All of the projects accomplished some of their goals, and some of the projects had a substantial impact in their fields. However, the assessment also found that the program as a whole had not achieved outcomes commensurate with the scope of the awards and the overall investment in them.

The panel members felt that “the successes and challenges of the Glue Grant Awards Program provide a useful guide for the development of future programs.” While they recommended discontinuing the program as it currently exists, they did not recommend abandoning all support for collaborative research, even in the face of tighter budgets. Rather, they suggested a number of ways to improve support for larger-scale projects and indicated that these projects cannot be accomplished with R01 grant support alone.

Last week, I presented the outcomes of the assessment to our Advisory Council, which embraced the recommendations of the assessment panel and encouraged NIGMS to develop alternative mechanisms to support the varied accomplishments that were supported through the glue grant program. We will take the report and Council’s advice into consideration as we develop future plans for funding collaborative research.

Feedback Loop Feedback: Tell Us What You Want

The Feedback Loop blog, with its 165 posts and 418 comments, has become an important tool for communicating with you.

As the blog enters its third year, we will continue to use it to share news of NIGMS funding opportunities, meetings and activities, job openings and grant-related changes. But, as with any blog, we really want to generate posts that spark an open dialogue.

Tell us what posts you want to read by e-mailing me, adding a comment here or using the “Suggest a Post” option near the top of the site. Is there a policy or process we can demystify, a trend we can explain or an area of funding we should highlight? You can propose any topic that might interest our broader NIGMS grantee and applicant audience. While you’re at it, you can also tell us what you don’t want to read about!

Council Tribute to Director Berg

At today’s National Advisory General Medical Sciences Council meeting, member Howard Garrison offered the following statement to Jeremy Berg on behalf of the entire Council:

“In appreciation of your 7 years of leadership at NIGMS, the members of the Council express their profound gratitude to you for your distinguished service to science and the nation. We recognize your outstanding work in the pursuit of excellence in research and education, mentoring and advocacy for basic research. Your willingness to deal directly with challenging issues has earned you our respect and admiration. It has been a pleasure and an honor to work with you, and we will miss you. We wish you continued success in your new endeavors.”