Issues with Reproducibility in Open-Source Biotech Experiments

Hello

I have been working on an open-source biotech project, but I am running into issues with reproducibility. When I follow a protocol, my results often differ slightly (or sometimes significantly) from what others in the community report. :upside_down_face:

This is frustrating because it makes troubleshooting & improving the method difficult. :thinking:

I have tried checking variables like reagent quality, incubation times & equipment differences, but there’s still some inconsistency. Could it be due to variations in lab conditions, or is there something else I might be missing? How do you ensure reproducibility when working with open-source protocols, especially across different labs? :thinking:
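One low-tech habit that can help pin down variables like these is logging a structured record of every run, so results can be compared across operators and labs later. The field names and values below are purely illustrative assumptions, not any community standard; a minimal sketch:

```python
import csv
import os
from datetime import date

# Hypothetical per-run metadata fields; adapt to whatever
# variables actually matter for your protocol.
FIELDS = ["run_date", "operator", "reagent_lot", "incubation_min",
          "incubation_temp_c", "instrument", "ambient_temp_c", "result"]

def log_run(path, record):
    """Append one run's metadata to a shared CSV, writing the header once."""
    new_file = not os.path.exists(path)
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow(record)

# Example entry (all values made up):
log_run("runs.csv", {
    "run_date": date.today().isoformat(), "operator": "marcello",
    "reagent_lot": "LOT-1234", "incubation_min": 30,
    "incubation_temp_c": 37.0, "instrument": "incubator-A",
    "ambient_temp_c": 22.5, "result": 0.82,
})
```

Even a simple shared spreadsheet with columns like these makes it much easier to spot, say, that inconsistent runs all used the same reagent lot.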

Would love to hear if others have faced similar issues & what strategies worked for you. Are there any tools or standardization methods that help improve consistency in community-driven biotech projects? :thinking:

Thank you!! :slightly_smiling_face:


Hi @marcellosalas

Welcome to the community! :wave:

:question: This is a very good question, and you’re not the first to ask it! It is indeed frustrating, and sadly, the short answer is that we haven’t found a solution for this yet.
A much longer, and hopefully somewhat helpful, answer is below. (Apologies in advance!)

:page_facing_up: Often we rely on users to share their protocols (e.g. on the forum, Reclone.org (The Reagent Collaboration Network) - research workspace on protocols.io, etc.), and the community is usually very open to providing follow-on support to help troubleshoot specific questions. :people_holding_hands:

But as we all know, there are a lot of different variables, as you’ve listed, and sometimes steps are missing because they were assumed or just picked up along the way.
One strategy that has worked for some people is to make a :camera: video (or photo) recording of themselves as they carry out the experiment and post that alongside their protocol. That way you may spot the little things that differ when the experiments are being carried out, and hopefully those differences then get added to the protocols. :page_facing_up: Unfortunately, not everyone has the time/resources to do this…

:bulb: If useful, and with some willing participants, perhaps this could be something that we could try to do with the protocols that we share?


:open_book: Some additional reading:

  • I know the iGEM community carried out interlab studies in 2014 and 2018 to try to understand some reproducibility issues using fluorescence and OD measurements.
  • The Turing Way Community always has great resources for skills, tools and best practices for research reproducibility, albeit more for code and data analysis.
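On the iGEM-style measurements above: a common way to make fluorescence readings comparable across labs and instruments is to blank-correct both channels and then normalise fluorescence by optical density, giving a roughly per-cell value. A minimal sketch (all readings and blank values below are made up for illustration):

```python
def normalized_fluorescence(fluor, od, fluor_blank, od_blank):
    """Blank-correct fluorescence and OD, then express fluorescence per unit OD."""
    corrected_od = od - od_blank
    if corrected_od <= 0:
        raise ValueError("Corrected OD must be positive; check your blanks.")
    return (fluor - fluor_blank) / corrected_od

# Hypothetical plate-reader readings of the same construct in two labs:
lab_a = normalized_fluorescence(fluor=5200, od=0.48, fluor_blank=200, od_blank=0.04)
lab_b = normalized_fluorescence(fluor=2800, od=0.26, fluor_blank=200, od_blank=0.04)
# After normalisation, the per-OD values can be compared directly,
# even though the raw fluorescence readings differ a lot.
```

The interlab studies went further (e.g. calibrating against reference standards), but even this simple normalisation removes a large source of lab-to-lab variation.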

:loudspeaker: Like you, I’d be really interested in what others in the community have to say about strategies for improving reproducibility and consistency in the research that we carry out!

Please do post more below! :slight_smile:

Hi @marcellosalas ,

I’ve been working off and on with an organization called Reproducibility 4 Everyone (R4E) for several years. The group gives workshops all over the world and has a variety of free resources about reproducibility on its website. It’s a nice place to start as you think about reproducibility. And if this is an area that interests you, they’re always looking for volunteers to help update resources and give workshops!
–Eric

Hi there Marcello,
This is completely normal. There are several parameters, devices, and reagents that differ between labs.
For example the ddH2O: if the distiller is old, the water can carry a lot of calcium ions and inhibit the PCR reaction… and many other similar things happen.

If you need tips for running a very low-budget biotech lab, I would be happy to help with what I know.