Author: Jeremy Berg


As former NIGMS director, Jeremy oversaw the Institute’s programs to fund biomedical research and to train the next generation of scientists. He was a leader in many NIH-wide activities and also found time to study a variety of molecular recognition processes in his NIH lab.

Posts by Jeremy Berg

Another Look at Measuring the Scientific Output and Impact of NIGMS Grants

33 comments

In a recent post, I described initial steps toward analyzing the research output of NIGMS R01 and P01 grants. The post stimulated considerable discussion in the scientific community and, most recently, a Nature news article.

In my earlier post, I noted two major observations. First, the output (measured by the number of publications from 2007 through mid-2010 that could be linked to all NIH Fiscal Year 2006 grants from a given investigator) did not increase linearly with total annual direct cost support but instead appeared to reach a plateau. Second, output varied considerably at all levels of funding.

These observations are even more apparent in the new plot below, which removes the binning in displaying the points corresponding to individual investigators.

A plot of the number of grant-linked publications from 2007 to mid-2010 for 2,938 investigators who held at least one NIGMS R01 or P01 grant in Fiscal Year 2006 as a function of the total annual direct cost for those grants. For this data set, the overall correlation coefficient between the number of publications and the total annual direct cost is 0.14.

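For readers curious how a summary statistic like this is computed, here is a minimal Python sketch of calculating a Pearson correlation coefficient between funding levels and publication counts. The data below are invented for illustration (loosely mimicking a plateau with wide scatter); they are not the actual NIGMS data.

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xm, ym = x - x.mean(), y - y.mean()
    return float((xm @ ym) / np.sqrt((xm @ xm) * (ym @ ym)))

# Illustrative data only: total annual direct costs (in $K) and grant-linked
# publication counts that plateau with heavy scatter, as in the plot above.
rng = np.random.default_rng(0)
costs = rng.uniform(50, 1000, 500)
pubs = np.round(6 * (1 - np.exp(-costs / 300)) + rng.gamma(2.0, 2.0, 500))

r = pearson_r(costs, pubs)
print(f"r = {r:.2f}")
```

With this kind of plateauing, widely scattered data, the linear correlation coefficient comes out low, consistent with the 0.14 reported above.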

The 10th Anniversary of ABRCMS: Preparing Underrepresented Minority Students for Scientific Careers

1 comment

Last week, I had the privilege of giving a keynote address at the 10th Annual Biomedical Research Conference for Minority Students (ABRCMS) in Charlotte, North Carolina. The conference, sponsored by NIGMS and organized by NIGMS Council member Cliff Houston, had a record attendance of 3,100, including more than 2,000 students and about 20 NIGMS staff members.

The meeting contributes in two major ways to the goal of a scientific workforce that reflects the diversity of the U.S. population. It provides a forum for promising scientists from underrepresented groups to showcase their talent and knowledge and make important training and career connections. It also gives faculty mentors valuable resources for facilitating their students’ success.

My address was organized around the themes from Randy Pausch’s lecture “Really Achieving Your Childhood Dreams,” and it described key events and strategies that facilitated my own path to a career in science. I greatly enjoyed discussing science and career opportunities with many of the students at the poster session and after my talk.

Other keynote speakers at this impressive conference included Neil deGrasse Tyson, Maya Angelou, NIH Director Francis Collins and NIGMS grantee Carolyn Bertozzi.

Jilliene Mitchell, who staffed the NIGMS exhibit booth and talked to a lot of attendees, writes:

The energy level among the meeting attendees soared through the roof of the Charlotte Convention Center. The undergraduate and graduate students were tremendously enthusiastic about networking, presenting their research, listening to scientific talks and getting advice about their career paths from accomplished scientists. The NIGMS exhibit booth received a lot of traffic, with students lined up to talk about training opportunities and faculty members lined up to discuss their grants.

Throughout the conference, I encountered many students who thanked NIGMS for sponsoring ABRCMS. One postdoc summed it up best when she said, “This is the best career development workshop I’ve been to—it’s huge!”

These video clips, which I took at the conference, capture the mood and excitement.

The announcement for next year’s ABRCMS meeting is expected soon, and we will post information here when it is available.

Nation’s Top Science Honor to Benkovic, Lindquist, Others

0 comments

Last week, President Obama announced the 2010 recipients of the National Medal of Science and the National Medal of Technology and Innovation. The 10 winners of the National Medal of Science include long-time NIGMS grantees Steve Benkovic from Pennsylvania State University and Susan Lindquist from the Whitehead Institute, MIT. As always, I am pleased when our grantees are among the outstanding scientists and innovators recognized by the President in this significant way.

Nobel News

0 comments
Video: Purdue University Nobel Prize for Chemistry News Conference

The Royal Swedish Academy of Sciences announced today that long-time NIGMS grantee Ei-ichi Negishi from Purdue University will share the Nobel Prize in chemistry with Richard Heck from the University of Delaware and Akira Suzuki from Hokkaido University in Japan for “palladium-catalyzed cross couplings in organic synthesis.” All of us at NIGMS congratulate them on this outstanding recognition of their accomplishments.

Carbon-carbon bond-forming reactions are the cornerstone of organic synthesis, and the reactions developed by these Nobelists are widely used to produce a range of substances, from medicines and other biologically active compounds to plastics and electronic components. NIGMS supports a substantial portfolio of grants directed toward the development of new synthetic methods precisely because of the large impact these methods can have.

I have personal experience with similar methods. I am a synthetic inorganic chemist by training, and a key step during my Ph.D. training was getting a carbon-carbon bond-forming reaction to work (using a reaction not directly related to today’s Nobel Prize announcement). I spent many months trying various reaction schemes, and my eventual success was really the “transition state” for my Ph.D. thesis: Within a month of getting this reaction to work, it was clear that I would be Dr. Berg sooner rather than later!

I’d also like to note that this year’s Nobel Prize in physiology or medicine to Robert Edwards “for the development of in vitro fertilization” also appears to have an NIGMS connection. Roger Donahue sent me a paper he coauthored with Edwards, Theodore Baramki and Howard Jones titled “Preliminary attempts to fertilize human oocytes matured in vitro.” This paper stemmed from a short fellowship that Edwards did at Johns Hopkins in 1964. Referencing the paper in an account of the development of IVF, Jones notes that “No fertilization was claimed but, in retrospect looking at some of the photographs published in that journal (referring to the paper above), it is indeed likely that human fertilization was achieved at Johns Hopkins Hospital in the summer of 1964.” The paper cites NIGMS support for this work through grants to Victor McKusick.

In all, NIGMS has supported the prizewinning work of 74 grantees, 36 of whom are Nobel laureates in chemistry.

NIH-Wide Correlations Between Overall Impact Scores and Criterion Scores

5 comments

In a recent post, I presented correlations between the overall impact scores and the five individual criterion scores for sample sets of NIGMS applications. I also noted that the NIH Office of Extramural Research (OER) was performing similar analyses for applications across NIH.

OER’s Division of Information Services has now analyzed 32,608 applications (including research project grant, research center and SBIR/STTR applications) that were discussed and received overall impact scores during the October, January and May Council rounds in Fiscal Year 2010. Here are the results by institute and center:

Correlation coefficients between the overall impact score and the five criterion scores for 32,608 NIH applications from the Fiscal Year 2010 October, January and May Council rounds.


This analysis reveals the same trends in correlation coefficients observed in smaller data sets of NIGMS R01 grant applications. Furthermore, no significant differences were observed in the correlation coefficients among the 24 NIH institutes and centers with funding authority.

Measuring the Scientific Output and Impact of NIGMS Grants

29 comments

A frequent topic of discussion at our Advisory Council meetings—and across NIH—is how to measure scientific output in ways that effectively capture scientific impact. We have been working on such issues with staff of the Division of Information Services in the NIH Office of Extramural Research. As a result of their efforts, as well as those of several individual institutes, we now have tools that link publications to the grants that funded them.

Using these tools, we have compiled three types of data on the pool of investigators who held at least one NIGMS grant in Fiscal Year 2006. We determined each investigator’s total NIH R01 or P01 funding for that year. We also calculated the total number of publications linked to these grants from 2007 to mid-2010 and the average impact factor for the journals in which these papers appeared. We used impact factors in place of citations because the time dependence of citations makes them significantly more complicated to use.

I presented some of the results of our analysis of this data at last week’s Advisory Council meeting. Here are the distributions for the three parameters for the 2,938 investigators in the sample set:

Histograms showing the distributions of total annual direct costs, number of publications linked to those grants from 2007 to mid-2010 and average impact factor for the publication journals for 2,938 investigators who held at least one NIGMS R01 or P01 grant in Fiscal Year 2006.


For this population, the median annual total direct cost was $220,000, the median number of grant-linked publications was six and the median journal average impact factor was 5.5.

A plot of the median number of grant-linked publications and median journal average impact factors versus grant total annual direct costs is shown below.

A plot of the median number of grant-linked publications from 2007 to mid-2010 (red circles) and median average impact factor for journals in which these papers were published (blue squares) for 2,938 investigators who held at least one NIGMS R01 or P01 grant in Fiscal Year 2006. The shaded bars show the interquartile ranges for the number of grant-linked publications (longer red bars) and journal average impact factors (shorter blue bars). The medians are for bins, with the number of investigators in each bin shown below the bars.


This plot reveals several important points. The ranges in the number of publications and average impact factors within each total annual direct cost bin are quite large. This partly reflects variations in investigator productivity as measured by these parameters, but it also reflects variations in publication patterns among fields and other factors.

Nonetheless, clear trends are evident in the averages for the binned groups, with both parameters increasing with total annual direct costs until they peak at around $700,000. These observations provide support for our previously developed policy on the support of research in well-funded laboratories. This policy helps us use Institute resources as optimally as possible in supporting the overall biomedical research enterprise.

This is a preliminary analysis, and the results should be viewed with some skepticism given the metrics used, the challenges of capturing publications associated with particular grants, the lack of inclusion of funding from non-NIH sources and other considerations. Even with these caveats, the analysis does provide some insight into the NIGMS grant portfolio and indicates some of the questions that can be addressed with the new tools that NIH is developing.
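The binned medians and interquartile ranges in the plot above are straightforward to compute. Here is a short Python sketch of the idea using made-up numbers; the bin edges, sample size and data are hypothetical, not the actual portfolio data.

```python
import numpy as np

def binned_summary(costs, pubs, edges):
    """Median and interquartile range of publication counts per funding bin."""
    out = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (costs >= lo) & (costs < hi)
        vals = pubs[mask]
        if vals.size:
            q1, med, q3 = np.percentile(vals, [25, 50, 75])
            out.append((lo, hi, vals.size, med, q1, q3))
    return out

# Hypothetical data for illustration; the real analysis covered 2,938 investigators.
rng = np.random.default_rng(1)
costs = rng.uniform(0, 1000, 300)          # total annual direct costs ($K)
pubs = rng.poisson(6, 300).astype(float)   # grant-linked publication counts

for lo, hi, n, med, q1, q3 in binned_summary(costs, pubs, [0, 250, 500, 750, 1000]):
    print(f"${lo:>4.0f}K-${hi:>4.0f}K  n={n:<3d}  median={med:.1f}  IQR=[{q1:.1f}, {q3:.1f}]")
```

Reporting medians with interquartile ranges, rather than means, keeps the summary robust to the handful of very highly productive investigators in each bin.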

Scoring Analysis with Funding and Investigator Status

17 comments

My previous post generated interest in seeing the results coded to identify new investigators and early stage investigators. Recall that new investigators are defined as individuals who have not previously competed successfully as program director/principal investigator for a substantial NIH independent research award. Early stage investigators are defined as new investigators who are within 10 years of completing the terminal research degree or medical residency (or the equivalent).
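The two definitions can be captured in a few lines of code. The sketch below is deliberately simplified (the actual early stage investigator determination involves additional eligibility rules and allowable extensions); the function name and parameters are mine, for illustration only.

```python
def investigator_class(prior_major_award: bool, degree_year: int,
                       review_year: int = 2010) -> str:
    """Classify an applicant using the definitions described above (simplified).

    prior_major_award: whether the PI has previously competed successfully
    for a substantial NIH independent research award.
    degree_year: year the terminal research degree or residency was completed.
    """
    if prior_major_award:
        return "established investigator"
    if review_year - degree_year <= 10:
        return "early stage investigator"
    return "new investigator"

print(investigator_class(False, 2005))  # → early stage investigator
print(investigator_class(False, 1995))  # → new investigator
print(investigator_class(True, 1995))   # → established investigator
```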

Below is a plot for 655 NIGMS R01 applications reviewed during the January 2010 Council round.

A plot of the overall impact score versus the percentile for 655 NIGMS R01 applications reviewed during the January 2010 Council round. Solid symbols show applications for which awards have been made and open symbols show applications for which awards have not been made. Red circles indicate early stage investigators, blue squares indicate new investigators who are not early stage investigators and black diamonds indicate established investigators.


This plot reveals that many of the awards made for applications with less favorable percentile scores go to early stage and new investigators. This is consistent with recent NIH policies.

The plot also partially reveals the distribution of applications from different classes of applicants. This distribution is more readily seen in the plot below.

A plot of the cumulative fraction of applications for four classes of applications with a pool of 655 NIGMS R01 applications reviewed during the January 2010 Council round. The classes are applications from early stage investigators (red squares), applications from new investigators (blue circles), new (Type 1) applications from established investigators (black diamonds) and competing renewal (Type 2) applications from established investigators (black triangles). N indicates the number in each class of applications within the pool.


This plot shows that competing renewal (Type 2) applications from established investigators represent the largest class in the pool and receive more favorable percentile scores than do applications from other classes of investigators. The plot also shows that applications from early stage investigators have a score distribution that is quite similar to that for established investigators submitting new applications. The curve for new investigators who are not early stage investigators is similar as well, although the new investigator curve is shifted somewhat toward less favorable percentile scores.
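A cumulative-fraction curve of this kind is just an empirical distribution function of the percentile scores within each class. Here is a minimal Python sketch with invented percentile scores for two of the classes; the numbers are illustrative only.

```python
import numpy as np

def cumulative_fraction(percentiles):
    """Empirical cumulative fraction of applications at or below each percentile."""
    x = np.sort(np.asarray(percentiles, float))
    y = np.arange(1, x.size + 1) / x.size
    return x, y

# Hypothetical percentile scores, for illustration only.
renewals = [2, 5, 8, 11, 15, 22, 30]    # competing renewal (Type 2) applications
new_apps = [6, 12, 18, 25, 33, 41, 55]  # new (Type 1) applications

for label, scores in [("Type 2", renewals), ("Type 1", new_apps)]:
    x, y = cumulative_fraction(scores)
    median_pct = x[np.searchsorted(y, 0.5)]
    print(f"{label}: N={len(scores)}, median percentile={median_pct:.0f}")
```

Plotting x against y for each class produces curves like those above; a curve shifted to the left indicates more favorable percentile scores for that class.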

Scoring Analysis with Funding Status

15 comments

In response to a previous post, a reader requested a plot showing impact score versus percentile for applications for which funding decisions have been made. Below is a plot for 655 NIGMS R01 applications reviewed during the January 2010 Council round.

A plot of the overall impact score versus the percentile for 655 NIGMS R01 applications reviewed during the January 2010 Council round. Green circles show applications for which awards have been made. Black squares show applications for which awards have not been made.


This plot confirms that the percentile representing the halfway point of the funding curve is slightly above the 20th percentile, as expected from previously posted data.

Notice that there is a small number of applications with percentile scores better than the 20th percentile for which awards have not been made. Most of these correspond to new (Type 1, not competing renewal) applications that are subject to the NIGMS Council’s funding decision guidelines for well-funded laboratories.

New NIH Principal Deputy Director

1 comment

NIH Director Francis Collins recently named Larry Tabak as the NIH principal deputy director. Raynard Kington previously held this key position.

Over the years, I have worked closely with Dr. Tabak in many settings, including the Enhancing Peer Review initiative. A biochemist who continues to do research in the field of glycobiology, he is a firm supporter of investigator-initiated research and basic science. He is also a good listener and a creative problem solver.

Dr. Tabak, who has both D.D.S. and Ph.D. degrees, has directed NIH’s National Institute of Dental and Craniofacial Research for the past decade. In 2009, Dr. Kington—who had stepped in as acting director of NIH following the departure of Elias Zerhouni—tapped Dr. Tabak to be his acting deputy. Dr. Tabak’s achievements included playing an integral role in NIH Recovery Act activities.

Given the challenging issues that the principal deputy director often works on, Dr. Tabak’s experience—from dentist and bench scientist to scientific administrator—clearly provides him with valuable tools for the job. His experience as an endodontist may be particularly useful in some situations, allowing him to identify and “treat” potentially serious issues.

Scoring Analysis: 1-Year Comparison

12 comments

I recently posted several analyses (on July 15, July 19 and July 21) of the relationships between the overall impact scores on R01 applications determined by study sections and the criterion scores assigned by individual reviewers. These analyses were based on a sample of NIGMS applications reviewed during the October 2009 Council round. This was the first batch of applications for which criterion scores were used.

NIGMS applications for the October 2010 Council round have now been reviewed. Here I present my initial analyses of this data set, which consists of 654 R01 applications that were discussed, scored and percentiled.

The first analysis, shown below, relates to the correlation coefficients between the overall impact score and the averaged individual criterion scores.

Correlation coefficients between the overall impact score and averaged individual criterion scores for 654 NIGMS R01 applications reviewed during the October 2010 Council round. The corresponding scores for a sample of 360 NIGMS R01 applications reviewed during the October 2009 Council round are shown in parentheses.


Overall, the trend in correlation coefficients is similar to that observed for the sample from 1 year ago, although the correlation coefficients for the current sample are slightly higher for four out of the five criterion scores.
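For readers who want to see how per-criterion correlations like these are computed, here is a Python sketch. The criterion scores below are simulated (with the overall score constructed to depend most strongly on approach, as observed); they are not the actual review data.

```python
import numpy as np

# Hypothetical averaged criterion scores (rows: applications, columns: the
# five NIH review criteria) and overall impact scores, for illustration only.
criteria = ["significance", "investigator", "innovation", "approach", "environment"]
rng = np.random.default_rng(2)
crit_scores = rng.normal(4.0, 1.0, (200, 5))

# Build overall scores that weight approach most heavily, mimicking the trend.
weights = np.array([0.5, 0.3, 0.4, 1.0, 0.2])
overall = crit_scores @ weights + rng.normal(0, 0.8, 200)

for name, col in zip(criteria, crit_scores.T):
    r = np.corrcoef(col, overall)[0, 1]
    print(f"{name:>12s}: r = {r:.2f}")
```

In this toy construction, approach shows the highest correlation with the overall score, echoing the pattern in the table above.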

Here are results from a principal component analysis:

Principal component analysis of overall impact score based on the five criterion scores for 654 NIGMS R01 applications reviewed during the October 2010 Council round. The corresponding scores for a sample of 360 NIGMS R01 applications reviewed during the October 2009 Council round are shown in parentheses.


There is remarkable agreement between the results of the principal component analysis for the October 2010 data set and those for the October 2009 data set. The first principal component accounts for 72% of the variance, with the largest contribution coming from approach, followed by innovation, significance, investigator and finally environment. This agreement between the data sets extends through all five principal components, although there is somewhat more variation for principal components 2 and 3 than for the others.
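The variance-explained figures from a principal component analysis come from an eigendecomposition of the covariance matrix of the five criterion scores. Here is a compact Python sketch with simulated, correlated scores; in the real analysis the first component explained 72% of the variance.

```python
import numpy as np

def pca_variance_explained(scores):
    """Fraction of total variance captured by each principal component."""
    X = scores - scores.mean(axis=0)
    cov = np.cov(X, rowvar=False)            # 5x5 covariance of criterion scores
    eigvals = np.linalg.eigvalsh(cov)[::-1]  # eigenvalues in descending order
    return eigvals / eigvals.sum()

# Hypothetical criterion scores: a shared component per application (e.g., a
# reviewer's overall enthusiasm) plus criterion-specific noise.
rng = np.random.default_rng(3)
common = rng.normal(0, 1, (300, 1))
scores = common + rng.normal(0, 0.5, (300, 5))

frac = pca_variance_explained(scores)
print("PC1 explains {:.0%} of the variance".format(frac[0]))
```

Because all five simulated criteria share a common component, the first principal component dominates, which is the same qualitative behavior seen in the review data.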

Another important factor in making funding decisions is the percentile assigned to a given application. The percentile is a ranking that shows the relative position of each application’s score among all scores assigned by a study section at its last three meetings. Percentiles provide a way to compare applications reviewed by different study sections that may have different scoring behaviors. They also correct for “grade inflation” or “score creep” in the event that study sections assign better scores over time.
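The ranking idea behind percentiling can be sketched in a few lines. This is a simplified illustration, not NIH's exact percentiling algorithm; the reference scores below are hypothetical.

```python
import numpy as np

def study_section_percentile(score, recent_scores):
    """Percentile of a score relative to a study section's recent scores.

    Ranks an application's overall impact score against all scores the section
    assigned at, say, its last three meetings. Lower impact scores are better,
    so a lower percentile is more favorable. Ties count half (midpoint rule).
    """
    recent = np.asarray(recent_scores, float)
    better = np.sum(recent < score)
    tied = np.sum(recent == score)
    return 100.0 * (better + 0.5 * tied) / recent.size

pool = [20, 25, 30, 30, 35, 40, 45, 50, 55, 60]  # hypothetical reference scores
print(study_section_percentile(30, pool))  # → 30.0
```

Because each study section is ranked against its own recent scores, a section that drifts toward more generous scoring over time sees its percentiles adjust accordingly, which is how this scheme counteracts score creep.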

Here is a plot of percentiles and overall impact scores:

A plot of the overall impact score versus the percentile for 654 NIGMS R01 applications reviewed during the October 2010 Council round.


This plot reveals that a substantial range of overall impact scores can be assigned to a given percentile score. This phenomenon is not new; a comparable level of variation among study sections was seen in the previous scoring system, as well.

The correlation coefficient between the percentile and overall impact score in this data set is 0.93. The correlation coefficients between the percentile and the averaged individual criterion scores are given below:

Correlation coefficients between the percentile and the averaged individual criterion scores for 654 NIGMS R01 applications reviewed during the October 2010 Council round.


As one would anticipate, these correlation coefficients are somewhat lower than those for the overall impact score since the percentile takes other factors into account.

The results of a principal component analysis applied to the percentile data show:

Principal component analysis of percentile data based on the five criterion scores for 654 NIGMS R01 applications reviewed during the October 2010 Council round.


The results of this analysis are very similar to those for the overall impact scores, with the first principal component accounting for 72% of the variance and similar weights for the individual averaged criterion scores.

Our posting of these scoring analyses has led the NIH Office of Extramural Research and individual institutes to launch their own analyses. I will share their results as they become available.