Month: February 2015

Improved Success Rate and Other Funding Trends in Fiscal Year 2014

10 comments

The Consolidated and Further Continuing Appropriations Act, 2015, provides funding for the Federal Government through September 30, 2015. NIGMS has a Fiscal Year 2015 appropriation of $2.372 billion, which is $13 million, or 0.5%, higher than it was in Fiscal Year 2014.

As I explained in an earlier post, we made a number of adjustments to our portfolio and funding policies last fiscal year in order to bolster our support for investigator-initiated research. Partly because of these changes, the success rate for research project grants (RPGs)—which are primarily R01s—was 25 percent in Fiscal Year 2014. This is 5 percentage points higher than it was in Fiscal Year 2013. Had we not made the funding policy changes, we predicted that the success rate would have remained flat at 20 percent.

Figure 1 shows the number of RPG applications we received and funded, as well as the corresponding success rates, for Fiscal Years 2002-2014.

Figure 1. Number of competing RPG applications assigned to NIGMS (blue line with diamonds, left axis) and number funded (red line with squares, left axis) for Fiscal Years 2002-2014. The success rate (number of applications funded divided by the total number of applications) is shown in the green line with triangles, right axis. Data: Tony Moore.
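
For readers who want to check the arithmetic, below is a minimal Python sketch of the success-rate definition given in the Figure 1 caption. It is an illustration only; the counts in the example are hypothetical placeholders, not NIGMS data.

```python
# Minimal sketch (not NIGMS's actual reporting code) of the success-rate
# definition used in Figure 1: number of competing applications funded
# divided by the total number of competing applications received.

def success_rate(funded, received):
    """Return the success rate as a percentage."""
    if received == 0:
        raise ValueError("no applications received")
    return 100.0 * funded / received

# Hypothetical example: 900 awards from 3,600 competing applications.
print(f"{success_rate(900, 3600):.1f}%")  # prints 25.0%
```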

Moving forward, it will be important to employ strategies that will enable us to at least maintain this success rate. In keeping with this goal, we recently released a financial management plan (no longer available) that continues many of the funding policies we instituted last year. As funds from the retirement of the Protein Structure Initiative come back into the investigator-initiated RPG pool, we’ll be working to ensure that they support a sustained improvement in success rate rather than create a 1-year spike followed by a return to lower rates.

Figures 2 and 3 show data for funding versus the percentile scores of the R01 applications we received. People frequently ask me what NIGMS’ percentile cutoff or “payline” is, but it should be clear from these figures that we do not use a strict percentile score criterion for making funding decisions. Rather, we take a variety of factors into account in addition to the score, including the amount of other support already available to the researcher; the priority of the research area for the Institute’s mission; and the importance of maintaining a broad and diverse portfolio of research topics, approaches and investigators.

Figure 2. Percentage of competing R01 applications funded by NIGMS as a function of percentile scores for Fiscal Years 2010-2014. For Fiscal Year 2014, the success rate for R01 applications was 25.7 percent, and the midpoint of the funding curve was at approximately the 22nd percentile. See more details about the data analysis for Figure 2. Data: Jim Deatherage.
Figure 3. Number of competing R01 applications (solid black bars) assigned to NIGMS and number funded (striped red bars) in Fiscal Year 2014 as a function of percentile scores. See more details about the data analysis for Figure 3. Data: Jim Deatherage.
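
As a rough illustration of the kind of tabulation behind Figures 2 and 3 (a sketch under assumed inputs, not our actual analysis code), the Python snippet below bins applications by percentile score and computes the fraction funded in each bin. The records and bin width are invented for illustration.

```python
# Hedged sketch of a percentile-bin tabulation like that behind Figures 2
# and 3: group competing R01 applications by percentile score and compute
# the fraction funded in each bin. The records are invented, not NIGMS data.

from collections import defaultdict

# Each record: (percentile score, whether the application was funded)
applications = [
    (8.0, True), (12.0, True), (18.0, True), (23.0, True),
    (24.0, False), (31.0, False), (38.0, False), (47.0, False),
]

BIN_WIDTH = 10  # group scores into 10-percentile-wide bins
bins = defaultdict(lambda: {"received": 0, "funded": 0})
for score, funded in applications:
    start = int(score // BIN_WIDTH) * BIN_WIDTH
    bins[start]["received"] += 1
    bins[start]["funded"] += int(funded)

for start in sorted(bins):
    b = bins[start]
    pct = 100.0 * b["funded"] / b["received"]
    print(f"{start}-{start + BIN_WIDTH - 1}th percentile: "
          f"{b['funded']}/{b['received']} funded ({pct:.0f}%)")
```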

It’s too early to say what the success rate will be for Fiscal Year 2015 because it can be influenced by a number of factors, as I described last year. However, we’re hopeful that by continuing to adjust our priorities and policies to focus on supporting a broad and diverse portfolio of investigators, we can reverse the trend of falling success rates seen in recent years.

Give Input on NIGMS Undergraduate Student Development Programs to Enhance Diversity in the Biomedical Research Workforce

2 comments

As part of our longstanding commitment to fostering a highly trained and diverse biomedical research workforce, we have launched a review process to ensure that our programs contribute most effectively to this goal. An important part of this effort is to seek your input.

To this end, we just issued a request for information seeking feedback and novel ideas that might bolster the effectiveness of our undergraduate student development programs. We’re particularly interested in:

  • The advantages (or disadvantages) of supporting a single program per institution that begins after matriculation and provides student development experiences through graduation.
  • Approaches to leveraging successful institutional models for preparing baccalaureates for subsequent Ph.D. completion.
  • Strategies to build institutional capabilities and effective institutional networks that promote undergraduate student training programs that lead to successful Ph.D. completion.
  • If applicable, your specific experiences with any of our student development programs and their outcomes in preparing participants for biomedical research careers.

More broadly, we welcome your suggestions regarding the most important issues we can address in this arena.

I encourage you to share your views (no longer available) on these and associated topics by the response deadline of April 15, 2015.

NIH Data Science Leader’s Vision of a Digital Enterprise for Biomedical Research

0 comments

I recently had the opportunity to talk to Phil Bourne, NIH’s associate director for data science, about some of the current Big Data to Knowledge (BD2K) initiative activities. I asked him how these activities tie in with his vision of a digital enterprise for biomedical research and how they might benefit NIGMS grantees.

Phil explained that the goal of his office, commonly referred to as ADDS, is to achieve efficiencies in biomedical research, such as by making it easier for researchers to locate and manipulate data and software. “If we could just achieve a 5 percent improvement in efficiency in research that would be, in NIH budget dollars, more than $150 million a year that could be spent on funding more people and doing more research,” he said.

An active area that we at NIGMS are engaged in with ADDS is sustaining biomedical data resources, of which we support a fair number. As someone who previously set up databases and who now oversees them, I’m very passionate about this topic. A key question is how to sustain support of data resources in the current research budget environment. Led by Phil’s team, NIH has issued a request for information on sustaining biomedical data repositories that seeks input on every aspect of maintaining these resources. I encourage you to share your ideas by the March 18 response date.

Training is important in Phil’s vision for a digital enterprise, too. He told me of a number of recent training activities at NIH, including a “software carpentry” workshop for experimental researchers to learn how to use a wide variety of analysis tools. In a blog post about this and another event, the ADDS office asks for suggestions on other types of data science courses to offer. They want to provide workshops that train more experimentally versed scientists to work with big data and take those skills back to their labs. In addition, the ADDS office is planning to stand up a workforce development center to catalog classroom and online courses in the data sciences.

Another effort that’s in the works is creating a virtual space called the Commons where researchers can share, locate, utilize and cite datasets, software, standards definitions and documentation. Phil anticipates that the first components of the Commons will be available in 2016.

I’m really excited about Phil’s efforts and believe that they will help drive the “data quantum leap” I described in my first Feedback Loop blog post.

Change in Receipt Dates for Noncompeting Continuation Institutional Training Grant (T32 and T34) Progress Reports

0 comments

To increase the efficiency of issuing noncompeting grant awards, we’ve changed the submission date for noncompeting continuation institutional training grant (T32 and T34) progress reports. Beginning with applications for noncompeting awards that will be made in Fiscal Year 2016:

  • Progress reports for all T32 grants will be due on November 15 (rather than on December 1).
  • Progress reports for all T34 grants will be due on October 15 (rather than on November 1).

There is no change in the receipt dates for competing T32 or T34 applications.

Examining the First Competing Renewal Rates of New NIGMS Investigators

9 comments

The successful entry of new investigators into biomedical research, and their retention in it, is a priority for us, and the renewal rate of this group’s first R01 research grants is an important indicator of progress toward this goal. Here are the results of an analysis I did of the first competing renewal rates for new and established investigators.

Figure 1 shows that the first competing renewal rate of new investigators’ first NIGMS R01 or R29 grants has declined over the past 10 years. This trend is similar to the one for overall NIGMS R01 application success rates.

Figure 1. Percentage of new investigators’ R01 and R29 grants that were successfully renewed. The horizontal axis is the fiscal year in which the first project period ended. The vertical axis is the percentage of these projects that were successfully renewed at least once (regardless of whether the new or amended competing renewal application was funded) by the end of Fiscal Year 2014. Approximate values: 2002, 53%; 2003, 53%; 2004, 52%; 2005, 45%; 2006, 44%; 2007, 43%; 2008, 43%; 2009, 38%; 2010, 39%; 2011, 35%; 2012, 32%.

Figure 2 gives a more complete picture of the renewal history of new investigators’ NIGMS R01 and R29 projects. In addition to the renewal rate (also shown in Figure 1), it shows the percentage of projects for which at least one renewal application was submitted but was not successfully renewed as well as the percentage of projects for which no renewal application was submitted.

Figure 2. Renewal history of new investigators’ R01 and R29 grants between Fiscal Years 2002-2012. The bottom section (green) shows successful renewals (paid), which are also shown in Figure 1; the middle section (red) shows grants for which renewal was attempted but was not successful (not paid); and the top section (blue) shows grants for which no renewal application was submitted (no app). Approximate values (paid/not paid/no application): 2002, 53%/20%/27%; 2003, 53%/23%/24%; 2004, 52%/27%/21%; 2005, 45%/32%/23%; 2006, 43%/34%/23%; 2007, 43%/30%/27%; 2008, 43%/33%/24%; 2009, 38%/33%/29%; 2010, 39%/36%/25%; 2011, 35%/38%/27%; 2012, 32%/35%/33%.
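
To make the three renewal-history categories concrete, here is a minimal Python sketch that sorts projects into the paid, not paid and no-application groups and computes their shares. The records and field names are hypothetical, not our actual grants database or analysis code.

```python
# Minimal sketch, using invented records, of sorting each project into one
# of three renewal-history categories: "paid" (a competing renewal was
# funded), "not paid" (a renewal application was submitted but none was
# funded), and "no application" (no renewal application was submitted).
# The field names are hypothetical, not actual NIGMS database fields.

projects = [
    {"renewal_submitted": True,  "renewal_funded": True},   # paid
    {"renewal_submitted": True,  "renewal_funded": False},  # not paid
    {"renewal_submitted": False, "renewal_funded": False},  # no application
    {"renewal_submitted": True,  "renewal_funded": True},   # paid
]

counts = {"paid": 0, "not paid": 0, "no application": 0}
for project in projects:
    if project["renewal_funded"]:
        counts["paid"] += 1
    elif project["renewal_submitted"]:
        counts["not paid"] += 1
    else:
        counts["no application"] += 1

total = len(projects)
for category, n in counts.items():
    print(f"{category}: {n} of {total} ({100.0 * n / total:.0f}%)")
```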

Figure 3 shows that success in renewing an NIGMS-funded R01 grant correlates positively with how long the grant has been active.

Figure 3. Competing renewal rates for the first, second and third or more renewals of NIGMS R01 grants awarded between Fiscal Years 2004-2007 to new and established investigators. Approximate values: first renewals (grant years 4-6), 31%; second renewals (grant years 7-12), 49%; third and later renewals (grant years 12 or greater), 54%.

Since first renewals have lower success rates than subsequent renewals, Figure 4 addresses whether new investigators seeking to renew their first R01 grants are competitive with established investigators who are renewing long-term and/or new projects. The figure shows that the renewal rate for all projects from established investigators, including new as well as long-term projects, is higher than the renewal rate of projects from new investigators (46 percent in the left column versus 36 percent in the right column). However, when focusing only on the first renewals of new projects (in the middle and right columns), new investigators are renewing at a higher rate than are established investigators (36 percent versus 30 percent).

Figure 4. Renewal history of NIGMS R01 projects from new and established investigators that were initially funded by NIGMS between Fiscal Years 2004-2007. Approximate values (paid/not paid/no application): all renewals by established investigators, 46%/24%/30%; renewals of new projects by established investigators, 30%/25%/45%; renewals of new projects by new investigators, 36%/35%/29%.

Figure 5 shows the relative success of new and established investigators in renewing new projects as a function of the percentile score obtained on the initial award. As the “Paid” sections of the bars indicate, for each of the percentile groups, the overall renewal rate for new investigators’ new R01s was higher than that for established investigators’ new R01s.

Figure 5. Renewal history of NIGMS R01 projects from new and established investigators that were initially funded between Fiscal Years 2004-2007 in relation to the percentile ranking (0-9th, 10-19th, and 20th and higher percentiles) of the original award. The numbers of projects in the six categories analyzed, from left (new investigators, 0-9th percentile) to right (established investigators, ≥20th percentile), are 239, 328, 299, 347, 172 and 102. Approximate values (paid/not paid/no application): 0-9th percentile, new investigators, 41%/28%/31%; 0-9th percentile, established investigators, 34%/22%/44%; 10-19th percentile, new investigators, 32%/41%/27%; 10-19th percentile, established investigators, 28%/29%/43%; 20th percentile and greater, new investigators, 41%/36%/23%; 20th percentile and greater, established investigators, 26%/29%/45%.

Recognizing the importance of new investigators in sustaining the vitality of biomedical research, we give special consideration to applications from them, and in some cases, we fund these applications at percentiles beyond those for most established investigators. The data in Figure 5 supports this practice by showing that the renewal rates of new investigators whose original applications scored at or above the 20th percentile are about the same as, or higher than, those for new and established investigators whose original applications scored in the 0-9th percentile range.

More About This Analysis

This analysis includes Recovery Act projects and excludes withdrawn applications and multi-principal investigator grants.

Definitions
R01 projects: Research project grants.
R29 projects: First Independent Research Support and Transition (FIRST) awards, R01-type research project grants for new investigators that were available from 1987 to 1998.
Renewal rate: Percentage of grants that were successfully renewed by the date of this analysis (end of Fiscal Year 2014), regardless of whether a new or amended competing renewal application was funded.
Grant year: The year of the grant’s project period in which the renewal R01 application was submitted.
New investigator: An individual who has not previously competed successfully as a program director/principal investigator for a substantial NIH independent research award (see http://grants.nih.gov/grants/glossary.htm#NewInvestigator).

MIRA Webinar and Other Resources

0 comments

NIGMS Director Jon Lorsch and I will field your questions about the recently announced Maximizing Investigators’ Research Award (MIRA) (R35) during a webinar on Thursday, Feb. 19, from 2-3 p.m. EST. You’ll be able to access the webinar at https://webmeeting.nih.gov/nigmsmira/. During the event, you can submit questions by calling 301-451-4301 or e-mailing me. You also can send questions to me ahead of time.

Since announcing this new funding opportunity, we’ve received many inquiries. The most common questions have concerned eligibility, so we created a flowchart to help you determine this. Our MIRA Answers to Frequently Asked Questions offer additional details about eligibility, the submission and review processes, award administration and other aspects of the program. The earliest start date is April 2016, not December 2016 as originally indicated in the funding opportunity announcement.

As stated in earlier posts, this first MIRA competition is an experiment and is intentionally limited to a small group of eligible applicants. If this pilot is successful, we plan to issue future funding opportunity announcements covering additional groups of investigators.

UPDATE: The Maximizing Investigators’ Research Award (MIRA) (R35) Web page also includes links to the MIRA webinar and slides, MIRA-specific instructions for listing current and pending support, and a sample NIH biosketch.

Give Input on Proposed Center for Research Capacity Building

27 comments

At our recent Advisory Council meeting, I announced that we are proposing the establishment of a new organizational unit in NIGMS: the Center for Research Capacity Building (CRCB). As the name implies, it would serve as the hub for our capacity-building programs, which include the Institutional Development Award (IDeA), Support of Competitive Research (SCORE) and Native American Research Centers for Health (NARCH). These programs are now housed in a branch of our Division of Training, Workforce Development, and Diversity.

Among the factors that contributed to this plan are the complexity of our capacity-building programs and the broad range of scientific areas and grant mechanisms they support. We believe that the new organizational structure would allow for more efficient planning, coordination and execution among these programs’ research, research training and research resource access activities.

The head of the new center would report directly to me and be part of the Institute’s senior leadership. Beyond that change, we do not plan to alter the missions, goals, staff or budgets of the IDeA, SCORE and NARCH programs as a result of the reorganization. Also remaining the same would be the review of applications and most grants management and review staff assignments.

We invite your input to inform our planning. Please post your comments by February 18, 2015.

Update: Thank you for your valuable input on this organizational change. A post announcing the establishment of the center is at http://loop.nigms.nih.gov/2015/05/establishment-of-our-center-for-research-capacity-building/