Eight Steps to Rigorous and Reproducible Experiments in Biomolecular Research at UNC:

  1. If using a core facility, consult with the core staff during the planning stage. Consult with a statistician if you need help developing a power analysis to ensure that your study is adequately powered.
  2. Design your experiment with sufficient controls (rigor) and replicates (reproducibility).
  3. Ensure that ALL of your reagents (antibodies, cell lines, mice) are fully validated (see below).
  4. Have a clear, detailed protocol (SOP) and data analysis plan. Ensure that the protocol is strictly followed and that any deviation is well documented.
  5. Ensure that the staff or students performing the experiment are well trained and understand each step and the importance of performing it precisely.
  6. Use only well-maintained instrumentation, preferably maintained and operated in a core facility with expert staff.
  7. Document all steps, reagents, equipment and data analysis methods used in the experiment. Ensure that both the documentation and the data itself are properly stored in a secure data management repository.
  8. Acknowledge the Cancer Center Support Grant (P30 CA016086), the Human Pluripotent Stem Cell Core, and core staff in publications.
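The power analysis mentioned in step 1 can be sketched with the standard normal-approximation formula for a two-sided, two-sample comparison. This is a minimal illustration only; the effect size, significance level and power below are hypothetical placeholders, and a statistician should be consulted for the actual study design.

```python
# Minimal sketch of a two-sample power analysis using the normal
# approximation: n per group = 2 * ((z_alpha + z_beta) / d)^2.
# All parameter values here are illustrative placeholders.
from math import ceil
from statistics import NormalDist

def sample_size_per_group(effect_size: float, alpha: float = 0.05,
                          power: float = 0.80) -> int:
    """Approximate n per group for a two-sided, two-sample comparison."""
    z = NormalDist().inv_cdf
    z_alpha = z(1 - alpha / 2)   # critical value for a two-sided test
    z_beta = z(power)            # quantile corresponding to desired power
    n = 2 * ((z_alpha + z_beta) / effect_size) ** 2
    return ceil(n)               # round up to a whole subject per group

# Example: detect a medium effect (Cohen's d = 0.5) with 80% power
print(sample_size_per_group(0.5))  # 63 subjects per group
```

Tools such as G*Power or a statistics package give exact t-distribution results; the normal approximation above slightly underestimates n for small samples, which is one more reason to involve a statistician early.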


Guide to Rigor and Reproducibility for the Human Pluripotent Stem Cell Core

  1. Consult the core staff in the planning stage to best design your experiment and ensure reproducibility. Email Dr. Adriana S. Beltran at beltran@med.unc.edu or call 919.537.3996.
  2. Consult HPSCC personnel with questions about obtaining IRB approval to collect and/or access archived samples. Work involving patient samples must have IRB approval.
  3. Routinely test patient tissues, primary cells, hES, iPSCs and established cell lines for mycoplasma contamination, and authenticate them by Short Tandem Repeat (STR) profiling.
  4. Regularly test hES and iPSCs for pluripotency and karyotype to ensure genome stability in culture.
  5. Purchase reagents and culture medium from the highest quality commercial vendors, and handle them according to the manufacturers’ recommendations.
  6. Follow a detailed timeline for differentiation of hES and iPSC specific to the desired cell type. The timeline is discussed in detail during the planning stage, and differentiated cells are characterized by immunofluorescence staining of cell type-specific markers.
  7. Perform daily cleaning and decontamination of the laboratory, instruments and equipment. At the HPSCC, all instruments and equipment are maintained and certified on a regular basis to ensure optimal performance.
    • Incubators are inspected and cleaned monthly. Water, temperature, CO2 and oxygen levels are checked daily.
    • Safety cabinets are professionally tested and certified annually. The stage, walls and the glass sash (in and out) are wiped with 70% ethanol before and after each use.
    • QuantStudio™ 7 Flex qRT-PCR System is regularly calibrated as indicated by the manufacturer to ensure proper operation. Calibrations include: ROI, Background, Uniformity, Dye and Normalization, and RNase P instrument verification test.
    • Stereoscopes and microscopes are checked weekly to ensure optimal performance. The objectives, eyepieces and stages are cleaned and decontaminated daily. For automated equipment, the computer is scanned for viruses and checked for software updates.


Additional resources:

Learn about the NIH Initiative to Enhance Reproducibility through Rigor and Transparency. (Video)

Resource Authentication Plan: https://grants.nih.gov/reproducibility/faqs.htm#V

What Kind of Information Should I Include in My Application’s Resource Authentication Plan? Check out instructions on NIH Nexus Blog.

What are ‘Key Biological and/or Chemical Resources’ that should be addressed in your application’s authentication plan? Key biological and/or chemical resources include, but are not limited to, cell lines, specialty chemicals, antibodies and other biologics. More on the NIH website.

The FASEB report on enhancing research reproducibility identifies three main gaps:

  • Lack of uniform definitions to describe the problem
  • Insufficient reporting of key experimental details
  • Gaps in scientific training