Category: Director’s Messages

More on Assessing the Glue Grant Program


NIGMS is committed to thoughtful analysis before it initiates new programs and to careful, transparent assessment of ongoing programs at appropriate stages to help determine future directions. In this spirit, we have been assessing our large-scale programs, starting with the Protein Structure Initiative and more recently, the glue grant program.

For the glue grants (which are formally known as Large-Scale Collaborative Project Awards), we engaged in both a process evaluation and an outcomes assessment. These assessments, conducted by independent, well-qualified groups, articulated the strengths and accomplishments, as well as the weaknesses and challenges, of the glue grant program. The main conclusion of the outcomes assessment was that NIGMS should discontinue the current program and in its place create “a suite of modified program(s) to support innovative, interdisciplinary, large scale research” with the recommendation “that awards should be considerably smaller, but larger in number.”

The glue grant program was developed beginning in 1998 in anticipation of substantial increases in the NIH budget. At that time, members of the scientific community were expressing concern about how to support a trend toward interdisciplinary and collaborative science. This was discussed extensively by our Advisory Council. Here’s an excerpt from the minutes of a January 1999 meeting:

“Council discussion centered on why current mechanisms do not satisfy the need; the fact that these will be experiments in the organization of integrative science; the usefulness of consortia in collecting reference data and developing new technology that are difficult to justify on individual projects; the need to include international collaborations; and the need for rapid and open data sharing. Council voted concurrence with NIGMS plans to further develop and issue these initiatives.”

This summary captures many of the key challenges that emerged over the next 12 years in the glue grant program. I’d like to give my perspective on three key points.

First, these grants were always intended to be experiments in the organization of integrative science. One of the most striking features of the glue grants is how different they are from one another based on the scientific challenge being addressed, the nature of the scientific community in which each glue grant is embedded and the approach of the principal investigator and other members of the leadership team. The process evaluation expressed major concern about such differences between the glue grants, but in fact this diversity reflects a core principle of NIGMS: a deep appreciation for the role of the scientific community in defining scientific problems and identifying approaches for addressing them.

Second, as highlighted in both reports, the need for rapid and open data sharing remains a great challenge. All of the glue grants have included substantial investments in information technology and have developed open policies toward data release. However, making data available and successfully sharing data in forms that the scientific community will embrace are not equivalent. And of course, effective data and knowledge sharing is a challenge throughout the scientific community, not just in the glue grants.

Third, the timing of these assessments is challenging. On one hand, it is desirable to perform assessments as early as possible after the initiation of a new program to inform program management and to indicate the need for potential adjustments. On the other hand, the impact of scientific advances takes time to unfold, and this can be particularly true for ambitious, larger-scale programs. It may be interesting to revisit these programs in the future to gauge their impact over time.

During my time at NIGMS, I have been impressed by the considerable efforts of the Institute staff involved with the glue grants. They have approached the stewardship of this novel program with a critical eye, working to find oversight mechanisms that would facilitate the impact of the grants on the relevant, broad components of the scientific community. As the current program ends, we will continue to think creatively about how best to support collaborative research, bearing in mind the glue grant assessment panel’s recommendation that we explore appropriate mechanisms to support large-scale, collaborative approaches.

Productivity Metrics and Peer Review Scores


A key question regarding the NIH peer review system is how well peer review scores predict subsequent scientific output. Answering this question is a challenge, of course, since meaningful scientific output is difficult to measure and evolves over time, in some cases a long time. However, by linking application peer review scores to publications citing support from the funded grants, it is possible to perform some relevant analyses.

The analysis I discuss below reveals that peer review scores do predict trends in productivity in a manner that is statistically different from random ordering. That said, there is a substantial level of variation in productivity metrics among grants with similar peer review scores and, indeed, across the full distribution of funded grants.

I analyzed 789 R01 grants that NIGMS competitively funded during Fiscal Year 2006. This pool represents all funded R01 applications that received both a priority score and a percentile score during peer review. There were 357 new (Type 1) grants and 432 competing renewal (Type 2) grants, with a median direct cost of $195,000. The percentile scores for these applications ranged from 0.1 through 43.4, with 93% of the applications having scores lower than 20. Figure 1 shows the percentile score distribution.

Figure 1. Cumulative number of NIGMS R01 grants in Fiscal Year 2006 as a function of percentile score.


These grants were linked (primarily by citation in publications) to a total of 6,554 publications that appeared between October 2006 and September 2010 (Fiscal Years 2007-2010). Those publications had been cited 79,295 times as of April 2011. The median number of publications per grant was 7, with an interquartile range of 4-11. The median number of citations per grant was 73, with an interquartile range of 26-156.

The numbers of publications and citations represent the simplest available metrics of productivity. More refined metrics include the number of research (as opposed to review) publications, the number of citations that are not self-citations, the number of citations corrected for typical time dependence (since more recent publications have not had as much time to be cited as older publications), and the number of highly cited publications (which I defined as the top 10% of all publications in a given set). Of course, the metrics are not independent of one another. Table 1 shows these metrics and the correlation coefficients between them.
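To make the pairwise comparisons concrete, here is a minimal sketch of how such correlation coefficients are computed. The per-grant counts below are invented for illustration (the actual dataset is not reproduced here), and `pearson_r` is a hypothetical helper, not part of the analysis described in the post:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented per-grant counts: total publications and total citations.
publications = [7, 4, 11, 9, 3, 15, 6, 8]
citations = [73, 26, 156, 98, 12, 210, 55, 90]

r = pearson_r(publications, citations)
```

As the post notes, metrics like these are not independent of one another, so correlations between them (as in Table 1) tend to be strongly positive.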

Table 1. Correlation coefficients between nine metrics of productivity.


How do these metrics relate to percentile scores? Figures 2-4 show three distributions.

Figure 2. Distribution of the number of publications as a function of percentile score. The inset shows a histogram of the number of grants as a function of the number of publications.


Figure 3. Distribution of the number of citations as a function of percentile score. The inset shows a histogram of the number of grants as a function of the number of citations.


Figure 4. Distribution of the number of highly cited publications as a function of percentile score. Highly cited publications are defined as those in the top 10% of all research publications in terms of the total number of citations corrected for the observed average time dependence of citations.


As might be expected, there is substantial scatter across each distribution. Nonetheless, each of these metrics has a negative correlation coefficient with the percentile score, with higher productivity metrics corresponding to lower (better) percentile scores, as shown in Table 2.

Table 2. Correlation coefficients between the grant percentile score and nine metrics of productivity.


Do these distributions reflect statistically significant relationships? This can be addressed through the use of a Lorenz curve to plot the cumulative fraction of a given metric as a function of the cumulative fraction of grants, ordered by their percentile scores. Figure 5 shows the Lorenz curve for citations.

Figure 5. Cumulative fraction of citations as a function of the cumulative fraction of grants, ordered by percentile score. The shaded area is related to the excess fraction of citations associated with more highly rated grants.


The tendency of the Lorenz curve to reflect a non-uniform distribution can be measured by the Gini coefficient, which corresponds to twice the shaded area in Figure 5. For citations, the Gini coefficient has a value of 0.096. Based on simulations, this coefficient is 3.5 standard deviations above the value expected for a random distribution of citations as a function of percentile score. Thus, the relationship between citations and percentile score is highly statistically significant, even though the grant-to-grant variation within a narrow range of percentile scores is quite substantial. Table 3 shows the Gini coefficients for all of the productivity metrics.

Table 3. Gini coefficients for nine metrics of productivity. The number of standard deviations above the mean, as determined by simulations, is shown in parentheses below each coefficient.


Of these metrics, overall citations show the most statistically significant Gini coefficient, whereas highly cited publications show one of the least significant Gini coefficients. As shown in Figure 4, the distribution of highly cited publications is relatively even across the entire percentile score range.
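The Lorenz curve and Gini coefficient machinery described above can be sketched in a few lines of Python. The citation counts here are invented for illustration, and the permutation loop stands in for the simulations mentioned in the text; this is a sketch of the general technique, not the actual NIGMS analysis:

```python
import random

def gini_vs_order(values):
    """Gini-style coefficient for values taken in their given order
    (here: grants sorted from best to worst percentile score).
    Equals twice the area between the Lorenz curve and the diagonal."""
    total = sum(values)
    n = len(values)
    cum = 0.0
    area = 0.0
    for v in values:
        prev = cum / total
        cum += v
        # trapezoid under the Lorenz curve for this grant's step
        area += (prev + cum / total) / (2 * n)
    return 2 * (area - 0.5)

# Invented citation counts for grants already ordered best-first by score.
citations = [210, 156, 98, 90, 73, 55, 26, 12]
observed = gini_vs_order(citations)

# Permutation test: compare the observed coefficient with random orderings
# of the same citation counts, mimicking the simulations in the text.
random.seed(0)
sims = []
for _ in range(2000):
    shuffled = citations[:]
    random.shuffle(shuffled)
    sims.append(gini_vs_order(shuffled))
mean = sum(sims) / len(sims)
sd = (sum((s - mean) ** 2 for s in sims) / len(sims)) ** 0.5
z = (observed - mean) / sd  # standard deviations above random
```

When the best-scoring grants also have the most citations, the coefficient is positive; shuffling the order gives a null distribution centered near zero, so `z` expresses how far the observed ordering departs from chance.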

Fiscal Year 2012 Budget Update


The Senate hearing on the Fiscal Year 2012 President’s budget request for NIH happened on May 11, and you can watch the archived Webcast.

My written testimony and NIH Director Francis Collins’ written testimony on next year’s budget are also now available. The ultimate outcome will be a bill appropriating funds for NIH and its components (view history of NIH appropriations).

For more on the proposed NIGMS budget, see my update from February 15.

Fiscal Year 2011 Funding Policy


As you may be aware, the Full-Year Continuing Appropriations Act of 2011 (Public Law 112-10), enacted on April 15, provides Fiscal Year 2011 funds for NIH and NIGMS at a level approximately 1% lower than that for Fiscal Year 2010.

NIH has released a notice outlining its fiscal policy for grant awards. This notice includes reductions in noncompeting awards to allow the funding of additional new and competing awards.

For noncompeting awards:

  • Modular grants will be awarded at a level 1% lower than the current Fiscal Year 2011 committed level.
  • Nonmodular grants will not receive any inflationary increase and will be reduced by an additional 1%.
  • Awards that have already been issued (typically at 90% of the previously committed level) will be adjusted upward to be consistent with the above policies.

For competing awards:

  • Based on the appropriation level, we expect to make approximately 866 new and competing research project grant awards at NIGMS, compared to 891 in Fiscal Year 2010.
  • It is worth noting that we received more new and competing grant applications this fiscal year—3,875 versus 3,312 in Fiscal Year 2010.

The NIGMS Fiscal Year 2011 Financial Management Plan (link no longer available) has additional information, including about funding of new investigators and National Research Service Award stipends.

Council Tribute to Director Berg


At today’s National Advisory General Medical Sciences Council meeting, member Howard Garrison offered the following statement to Jeremy Berg on behalf of the entire Council:

“In appreciation of your 7 years of leadership at NIGMS, the members of the Council express their profound gratitude to you for your distinguished service to science and the nation. We recognize your outstanding work in the pursuit of excellence in research and education, mentoring and advocacy for basic research. Your willingness to deal directly with challenging issues has earned you our respect and admiration. It has been a pleasure and an honor to work with you, and we will miss you. We wish you continued success in your new endeavors.”

Implementation Under Way for Training Strategic Plan


We have just posted the final version of Investing in the Future: National Institute of General Medical Sciences Strategic Plan for Biomedical and Behavioral Research Training. Fifteen months in the making, the plan reflects our long-standing commitment to research training and the development of a highly capable, diverse scientific workforce.

In an earlier post, I shared the plan’s four key themes.

As you’ll see, the plan considers training in the broadest sense, including not just activities supported on training grants and fellowships, but also those supported through research grants. It strongly encourages the development of training plans in all R01 and other research grant applications that request support for graduate students or postdoctoral trainees. And it endorses the use of individual development plans for these trainees as well as the overall importance of mentoring.

Finally, the plan acknowledges that trainees may pursue many different career outcomes that can contribute to the NIH mission.

My thanks to the dedicated committee of NIGMS staff who developed the plan and to the hundreds of investigators, postdocs, students, university administrators and others who took the time to give us their views throughout the planning process.

We’ve already started implementing the plan’s action items. While there are some that we can address on our own, others will require collaboration. We will again be reaching out to our stakeholders, and we look forward to continued input from and interactions with them to achieve our mutual goals.

Inspired by Stellar Student Scientists


I recently had the pleasure of attending the awards ceremony for this year’s Intel Science Talent Search. Before the award announcements, I enjoyed visiting the posters of many of the 40 finalists. These high school students displayed impressive knowledge and passion for their projects, many of which were quite sophisticated. A number of the finalists were even savvy enough to test my understanding so they could pitch their presentation to me at an appropriate level.

A highlight of the evening was definitely the award presentations themselves. The winners were clearly thrilled to be recognized among an incredibly accomplished group of young scientists. The speakers—who included Elizabeth Marincola, president of the Society for Science & the Public, which runs the Science Talent Search program; Paul Otellini, president and CEO of Intel; Miles O’Brien, who covers science on the PBS NewsHour; and one of the students, selected by his peers—uniformly stressed the importance of communicating the excitement and value of science to the public.

This is a good reminder to all of us in the scientific community about our responsibility to reach out broadly to explain what we do and why we do it in understandable terms that can inform the public and potentially inspire new members of future generations of scientists.

Proposed NIH Reorganization and NIGMS


I have previously noted that NIH has proposed creating a new entity, the National Center for Advancing Translational Sciences (NCATS), to house a number of existing programs relating to the discipline of translational science and the development of novel therapeutics. Plans for NCATS have been coupled to a proposal to dismantle the National Center for Research Resources (NCRR), in part because the largest program within NCRR, the Clinical and Translational Science Awards, would be transferred to NCATS and in part because of a statutory limitation on the number of institutes and centers at NIH.

NIH leadership established a task force to determine the placement of NCRR programs within NIH. This group initially developed a “straw model” for discussion and more recently submitted its recommendations to the NIH Director. The recommendations include transferring the Institutional Development Award (IDeA) program and some Biomedical Technology Research Centers and other technology research and development grants to NIGMS at the beginning of Fiscal Year 2012.

As you may be aware, I have expressed concerns about the processes associated with the proposal to abolish NCRR. I hope it is clear that my concerns relate to the processes and not to the NCRR programs, which I hold in very high regard. This opinion is also clearly shared by many others in the scientific community, based on comments on the Feedback NIH site and in other venues.

While there are several additional steps that would need to occur before organizational changes could take place, we at NIGMS are already deepening our understanding of the NCRR programs through meetings with NCRR staff and others directly familiar with the programs. We welcome your input, as well, particularly if you have experience with these NCRR programs. Please comment here or contact me directly.

Enhancing Peer Review Survey Results


One of the key principles of the NIH Enhancing Peer Review efforts was a commitment to a continuous review of peer review. In that spirit, NIH conducted a broad survey of grant applicants, reviewers, advisory council members and NIH program and review officers to examine the perceived value of many of the changes that were made. The results of this survey are now available. The report analyzes responses from these groups on topics including the nine-point scoring scale, criterion scores, consistency, bulleted critiques, enhanced review criteria and the clustering of new investigator and clinical research applications.

NIH Enhancing Peer Review Survey criterion score responses from applicants.


Please feel free to comment here or on Rock Talk, where NIH Deputy Director for Extramural Research Sally Rockey recently mentioned the release of this survey.

Milestone Grant Application


We just reached a milestone—our 100,000th grant application. Interestingly, it’s for a K99/R00 (Pathway to Independence) award. This program enables promising postdoctoral scientists to receive mentored—and later independent—research support. Given our strong commitment to research training, mentoring and workforce development, it’s somehow fitting that this application is for a program that addresses these needs.

I remember well when I submitted my first independent grant application, R29GM038230. It was for a FIRST award, an earlier program directed toward helping early stage investigators develop their independent careers. I submitted the application before there was a Grants.gov—and even before there were overnight delivery services. Since I was located in Baltimore, not far from NIH, I personally drove the application down to the old Westwood Building, where the predecessor to the NIH Center for Scientific Review (and NIGMS) was housed at the time. After I presented my carefully wrapped box, I watched as it was thrown on top of a pile of other applications that reminded me of the warehouse scene at the end of “Raiders of the Lost Ark”. Needless to say, I was relieved when I got the self-addressed card indicating that my application had been received and assigned a grant number.

While this is just one small example of how much things can change over time, it leads me to think about other changes and other milestones. One major milestone is coming in 2012, when NIGMS will mark its 50th anniversary. It will be a time for reflecting on the great scientific progress that has been made with NIGMS grant support and also on the opportunities and challenges that lie ahead. You can expect to hear more about the events associated with this anniversary as our plans develop.