
Standardized Datasheets

Creating a Better Way to Communicate Data

Overview

Synthetic biology is a rapidly growing field that promises to transform almost every area of our technology. However, one of its biggest drawbacks compared to more established engineering disciplines is a lack of standardization. Some of this can be attributed to the nature of the field itself: biological systems are much harder to control than electrical or mechanical ones.

A good portion of the problem, though, comes from how genetic parts are characterized and presented. The iGEM competition has sought for many years to create a “Registry of Standardized Parts” so that iGEM teams and other researchers can submit and use genetic parts that have been proven to function. Creating and characterizing new parts and devices for the registry is important, but if the information needed to use a part is not communicated effectively, the part itself is useless. This year, the Purdue iGEM team set out to solve this problem by creating a definitive characterization standard for the registry. By talking and collaborating with over fifty other iGEM teams around the world, we have developed a way to standardize how characterization data is submitted and presented in the registry.

Our solution is an easy, template-based system for entering a part's characterization data into the iGEM registry. Once implemented, it will bring some much-needed standardization to the Registry of Standardized Parts and to the field of synthetic biology as a whole.
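To make the template idea concrete, below is a minimal sketch of how a single datasheet entry could be represented and checked for completeness. The field names (part_id, chassis, assay, and so on) are illustrative assumptions, not the final set of fields in our design or in the Registry; the point is simply that a fixed template makes every submission carry the same information in the same place.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class DatasheetEntry:
    """One standardized characterization record for a registry part (hypothetical fields)."""
    part_id: str         # registry identifier, e.g. "BBa_XXXXXXX" (placeholder)
    part_type: str       # promoter, RBS, coding sequence, terminator, ...
    chassis: str         # host organism the part was characterized in
    assay: str           # measurement technique used
    conditions: str      # growth medium, temperature, and other conditions
    result_value: float  # headline quantitative result
    result_units: str    # units for the headline result
    notes: str = ""      # anything that does not fit the fields above

def missing_fields(entry: DatasheetEntry) -> list:
    """Return the names of required fields that were left empty."""
    return [name for name, value in asdict(entry).items()
            if value == "" and name != "notes"]

entry = DatasheetEntry(
    part_id="BBa_XXXXXXX",
    part_type="promoter",
    chassis="E. coli DH5alpha",
    assay="fluorescence time course",
    conditions="LB medium, 37 C",
    result_value=1.0,
    result_units="relative promoter units",
)

print(missing_fields(entry))                # [] -> the template is completely filled in
print(json.dumps(asdict(entry), indent=2))  # machine-readable form a registry page could render
```

Representing an entry as structured data like this also means it can be exported in a machine-readable form, which is one way a registry page could display every part's data in the same layout.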

Background

As everyone in iGEM already knows, using the Registry of Standardized Parts can be a frustrating endeavor. Searching the registry efficiently takes practice, and even then it can be difficult to find good parts to use. Many parts in the registry have little or no characterization data on their pages, and the parts that have been well characterized were characterized with different assays and techniques. As a result, it can be very difficult to compare even well-documented parts, because their data are collected and reported so differently.


After weeks of traversing the registry and igem.org, we concluded that the main, underlying issue is that there is currently no standard for characterization data in the registry. Teams have no definitive guideline to follow when characterizing their parts.


To solve this problem, we first had to figure out where to start. As is standard with most engineering problems, we began by looking at prior art. We quickly realized how difficult it would be to create a single characterization protocol that could be applied across different projects and types of parts, so instead we started researching datasheets.


One of the first things we looked at was Drew Endy's datasheet from 2008, shown below on the left. While this datasheet is functional and provides an abundance of information, it also seemed too complex and difficult to understand for the iGEM competition. After a few days of research, we found that the Boston University iGEM team had attempted a similar project in the 2012 competition; once we started contacting teams, we especially sought their advice and collaboration. The datasheet they designed in 2012 is shown below on the right. We started brainstorming what our datasheet would look like very early, and many of the early designs incorporated elements from both examples below.

[Image: Drew Endy's 2008 datasheet (left) and the BostonU iGEM 2012 datasheet (right)]

However, as a group we discussed some of the protocols iGEM teams had tried to write over the years, and how most of them never spread beyond the teams that created them; this was evident in the fact that we had never heard of most of the protocols and datasheets we found while researching. We were also having trouble deciding how to design a datasheet that every iGEM team would actually use. We realized that the best way to design a system everyone would use, and to keep the idea alive, was to talk to as many iGEM teams as possible and get their help and input.

Acquiring Information and Generating Interest


To start this process, we gathered as many personal email addresses as possible from team roster pages on iGEM.org and from Google searches, and in late May we emailed at least one person on every team in the 2013 competition. We sent out a simple survey to gauge how various iGEM teams felt about the current state of the registry, what data is most important to present on a part page, how often teams encounter parts with no characterization, and whether they would be in favor of a characterization standard for the registry. We asked individuals to fill out the survey rather than entire teams, and we received over 170 responses. Some of the results are below:

[Survey results]

The results of this survey clearly validated the need to improve the registry. The answers we received also shaped how we designed the datasheet and told us what iGEM teams were looking for in a solution. At the end of the survey, we asked participants to talk to the rest of their team and see whether they would like to collaborate with us by giving critical feedback on our designs. The teams that agreed to take on this challenge with us are listed below:

Arizona State, Baskent Meds, Brigham Young, Boston University, Carnegie Mellon, Clemson, Copenhagen,
Cornell, Duke, Dundee iGEM, Exeter iGEM, Georgia Tech,
Goethe University, HIT, Heidelberg, Hokkaido,
Evry Genopole, IIT Delhi, Imperial, ITU,
KU Leuven, Kyoto, Linkoping, Macquarie iGEM,
Manaus UFAM, METU, Norwich, NTU Taiwan,
Paris Bettencourt, Penn State, Queen's iGEM, Rutgers,
Ivy Tech South Bend, TU Eindhoven, Toronto, Tsinghua University A,
TU Delft, TU Munich, Tufts, UCL,
UCSF, UFMG Brazil, Univ. of Chicago, Univ. of Edinburgh,
Univ. of Freiburg, Univ. of Goettingen, Univ. of Groningen, Univ. of Leeds,
Univ. of Manchester, Univ. of Nevada, Univ. of Oklahoma, Univ. of Ottawa,
Univ. of Pennsylvania, Univ. of Salento, Univ. of Tokyo, Univ. of Virginia,
Univ. of York, USP Brazil, UT Panama, Valencia,
Valencia CIPF, Wageningen, Wellesley, Xiamen University

First Draft and Gathering Feedback

Incorporating Ideas and Finalizing Design

Implementation and BostonU iGEM

Future Plans and Collaboration