Tag: Rigor and Reproducibility

NIH Workshop on Reproducibility in Cell Culture Studies

NIGMS is actively involved in NIH-wide efforts to enhance rigor and reproducibility in research. As part of our work on this issue, we will co-host a trans-NIH workshop on September 28-29, 2015, to examine current quality-control challenges in cell culture research and identify opportunities for expanding the capabilities and applications of cell culture. The meeting will be videocast and archived on the NIH Videocasting site.

The workshop agenda includes panel discussions led by researchers from academia and industry on cell line identification, genetic and phenotypic characterization of cells, heterogeneity in populations of cells, reagents, and research and reporting standards. The meeting will also cover new approaches to understanding the characteristics and behaviors of cultured cells and technologies for enhancing their usefulness in research.

Reproducibility Update: New Resources and Expected Changes to the SF424 Application Guide

I previously told you about the development of an NIGMS clearinghouse site where members of the research community will be able to find grantee-produced training materials designed to teach rigorous experimental design and enhance data reproducibility. Since then, NIH has established two new related sites. The first is a Rigor and Reproducibility web portal that provides general information about NIH efforts and offers resources that include guidelines for how research results should be reported and links to publications written by NIH authors on rigor, reproducibility and transparency.

The second site focuses on grants and funding and includes a summary of NIH’s proposal to clarify its instructions to applicants, emphasizing the expectation that rigorous experimental design and reproducibility of results be considered during the application and review process. You may have read about the changes in a recent Rock Talk blog post that announced the publication of two new NIH Guide notices: Enhancing Reproducibility through Rigor and Transparency and Consideration of Sex as a Biological Variable in NIH-funded Research. We anticipate that the new instructions will be released in the fall of 2015 and will take effect for all research grant applications submitted on or after January 25, 2016.

As always, if you have questions or concerns, contact your program director. We’re also interested in hearing how your lab validates key biological and chemical reagents, so tell us about your procedures!

Division Director Mike Rogers Retires

Mike Rogers, who has directed the NIGMS Division of Pharmacology, Physiology, and Biological Chemistry for the past 22 years, retired today. Throughout his NIH career, Mike has been a champion for chemistry and its important role in biomedical research.

Before joining NIGMS 26 years ago, Mike worked for more than a decade in what is now the Center for Scientific Review, where he oversaw the Bioorganic and Natural Products study section.

Between these two positions, Mike completed a detail assignment on Capitol Hill working for Senator Ted Kennedy’s Health, Education, Labor and Pensions Committee, an experience that he says allowed him to see NIH from a different perspective.

Throughout his time at NIGMS, Mike has sought to build scientific bridges. He created the chemistry-biology interface predoctoral training program, which aims to cross-train students in both disciplines. He was instrumental in developing the large-scale collaborative project awards program that “glued” together scientists with diverse expertise to tackle big, unanswered questions in biology. More recently, he forged a link between two fields to help form the new field of quantitative and systems pharmacology. Along the way, he mentored and encouraged others to develop major NIGMS and trans-NIH initiatives, such as those in glycoscience, pharmacogenomics and synthetic organic chemistry.

Clearinghouse for Training Modules to Enhance Data Reproducibility

As part of a series of NIH-wide initiatives to enhance rigor and reproducibility in research, we recently launched a Web page that will serve as a clearinghouse for NIH and NIH-funded training modules to enhance data reproducibility. Among other things, the site will house the products of grants we’ll be making over the next few months for training module development, piloting and dissemination.

Currently, the page hosts a series of four training modules developed by the NIH Office of the Director. These modules, which are being incorporated into NIH intramural program training activities, cover some of the important factors that contribute to rigor and reproducibility in the research endeavor, including blinding, selection of exclusion criteria and awareness of bias. The videos and accompanying discussion materials are not meant to provide specific instructions on how to conduct reproducible research, but rather to stimulate conversations among trainees as well as between trainees and their mentors. Graduate students, postdoctoral fellows and early stage investigators are the primary audiences for the training modules.

Also included on the page are links to previously recorded reproducibility workshops held here at NIH that detail the potentials and pitfalls of cutting-edge technologies in cell and structural biology.

Training is an important element of the NIGMS mission and a major focus of NIH’s overall efforts to enhance data reproducibility. In addition to the training modules we’ll be funding, we recently announced the availability of administrative supplements to our T32 training grants to support the development and implementation of curricular activities in this arena.

I hope you find the resources on this site useful, both now and as we add more in the future.

Wanted: Input on Consideration of Sex as a Biological Variable in Biomedical Research

Although researchers have made major progress in achieving a balance between male and female subjects in human studies—women now account for roughly half of the participants in NIH-funded clinical trials—a similar pattern has not been seen in pre-clinical research involving animals and cells. To inform the development of policies that address this issue, NIH has issued a request for information (RFI) on the consideration of sex as a biological variable in biomedical research.

As NIH Deputy Director for Extramural Research Sally Rockey wrote in a recent blog post announcing the RFI, “Sex is a critical variable when trying to understand the biological and behavioral systems that fundamentally shape human health.” Appropriate representation of animals and cells is also relevant to NIGMS and NIH efforts to enhance scientific rigor and data reproducibility in research.

And in her own blog post about the RFI, NIH Associate Director for Research on Women’s Health Janine Clayton said that while many scientists have already expressed support for the policy change, she has also heard from many sources that it needs “to be carefully implemented, as a true benefit to science—and not become a trivial, bureaucratic box to check.” She noted that comments in response to the RFI will guide NIH in creating “meaningful change in a deliberate and thoughtful way.”

Since NIGMS supports a significant amount of basic biomedical science that utilizes animal models and cells, we encourage our grantees to submit their input on this topic by the October 13 deadline.

UPDATE: The deadline for submitting input has been extended to October 24.

Funding Opportunity to Create Training Modules to Enhance Data Reproducibility

In February, we asked for input on training activities relevant to enhancing data reproducibility, which has become a serious concern for both basic and clinical research. The responses revealed substantial variation in the training occurring at institutions. One reason is that “best practices” training in skills that influence data reproducibility appears to be largely passed down from one generation of laboratory scientists to the next.

To increase the likelihood that researchers generate reproducible, unbiased and properly validated results, NIGMS and nine additional NIH components have issued a funding opportunity announcement to develop, pilot and disseminate training modules to enhance data reproducibility. Appropriate areas for the modules include experimental design, laboratory practices, analysis and reporting of results, and/or the influence of cultural factors such as confirmation bias in hypothesis testing or the scientific rewards system. The modules should be creative, engaging, readily accessible online at no cost and easily incorporated into research training programs for the intended audience, which includes graduate students, postdoctoral fellows and beginning faculty.

The application deadline is November 20, 2014, with letters of intent due by October 20, 2014. Applicants may request up to $150,000 in total costs to cover the entire award period. For more details, read the FAQs.

Hypothesis Overdrive?

Historically, this blog has focused on “news you can use,” but in the spirit of two-way communication, for this post I thought I would try something that might generate more discussion. I’m sharing my thoughts on an issue I’ve been contemplating a lot: the hazards of overly hypothesis-driven science.

When I was a member of one study section, I often saw grant applications that began, “The overarching hypothesis of this application is….” Frequently, these applications were from junior investigators who, I suspect, had been counseled that what study sections want is hypothesis-driven science. In fact, one can even find this advice in articles about grantsmanship.

Despite these beliefs about “what study sections want,” such applications often received unfavorable reviews because the panel felt that if the “overarching hypothesis” turned out to be wrong, the only thing that would be learned is that the hypothesis was wrong. Knowing how a biological system doesn’t work is certainly useful, but most basic research study sections expect that a grant will tell us more about how biological systems do work, regardless of the outcomes of the proposed experiments. Rather than praising these applications for being hypothesis-driven, the study section often criticized them for being overly hypothesis-driven.

Many people besides me have worried about an almost dogmatic emphasis on hypothesis-driven science as the gold standard for biomedical research (e.g., see Jewett, 2005; Beard and Kushmerick, 2009; Glass, 2014). But the issue here is even deeper than just grantsmanship, and I think it is also relevant to recent concerns over the reproducibility of scientific data and the correctness of conclusions drawn from those data. It is too easy for us to become enamored with our hypotheses, a phenomenon that has been called confirmation bias. Data that support an exciting, novel hypothesis will likely appear in a “high-impact” journal and lead to recognition in the field. This creates an incentive to show that the hypothesis is correct and a disincentive to prove it wrong. Focusing on a single hypothesis also produces tunnel vision, making it harder to see other possible explanations for the data and sometimes leading us to ignore anomalies that might actually be the key to a genuine breakthrough.

In a 1964 paper, John Platt codified an alternative to the standard conception of the scientific method, which he named strong inference. In strong inference, scientists always produce multiple hypotheses that could explain their data and then design experiments to distinguish among these alternatives. The advantage, at least in principle, is that it forces us to consider different explanations for our results at every stage, minimizing confirmation bias and tunnel vision.

Another way of addressing the hazards of hypothesis-driven science is to shift toward a paradigm of question-driven science. In question-driven science, the focus is on answering questions: How does this system work? What does this protein do? Why does this mutation produce this phenotype? By putting questions ahead of hypotheses, getting the answer becomes the goal rather than “proving” a particular idea. A scientific approach that puts questions first and includes multiple models to explain our observations offers significant benefits for fundamental biomedical research.

In order to make progress, it may sometimes be necessary to start with experiments designed to give us information and leads—Who are the players? or What happens when we change this?—before we can develop any models or hypotheses at all. This kind of work is often maligned as “fishing expeditions” and criticized for not being hypothesis-driven, but history has shown us just how important it can be for producing clues that eventually lead to breakthroughs. For example, genetic screens for mutations affecting development in C. elegans set the stage for the discovery of microRNA-mediated regulation of gene expression.

Is it time to stop talking about hypothesis-driven science and to focus instead on question-driven science? Hypotheses and models are important intermediates in the scientific process, but should they be in the driver’s seat? Let me know what you think.

Give Input on Training Activities Relevant to Data Reproducibility

Data reproducibility is getting a lot of attention in the scientific community, and NIH is among those seeking to address the issue. At NIGMS, one area we’re focusing on is the needs and opportunities for training in areas relevant to improving data reproducibility in biomedical research. We just issued a request for information to gather input on activities that already exist or are planned as well as on crucial needs that an NIGMS program should address.

I strongly encourage you and your colleagues to submit comments by the February 28 deadline. The information will assist us in shaping a program of small grants to support the design and development of exportable training modules tailored for graduate students, postdoctoral fellows and beginning investigators.

UPDATE: NIGMS and additional NIH components have issued the Training Modules to Enhance Data Reproducibility (R25) funding opportunity announcement. The application deadline is November 20.