


Standardized Datasheets

Creating a Better Way to Communicate Data


Synthetic biology is a rapidly growing field of science that promises to revolutionize almost every part of our technology. However, one of the biggest drawbacks of synthetic biology compared to other fields is the lack of standardization within the field. Some of this can be attributed to the nature of the field itself; biological systems are much harder to control than electrical or mechanical ones.

A good portion of the problem, though, comes from how genetic parts are characterized and presented. For many years, the iGEM competition has sought to create a "Registry of Standardized Parts" so that iGEM teams and other researchers can submit and use genetic parts that have been proven to function. Creating and characterizing new parts and devices for the registry is important, but if the information needed to use a part is not communicated effectively, the part itself is useless. This year, the Purdue iGEM team set out to solve this problem by creating a definitive characterization standard for the registry. By collaborating with over fifty other iGEM teams around the world, we have developed a way to standardize how characterization data is submitted and presented in the registry.

This system encompasses an easy, template-based system to enter data of a part into the iGEM registry. Once implemented, our solution will revolutionize the Registry of Standardized Parts and add some much-needed standardization to the field of synthetic biology.


As everyone in iGEM already knows, using the Registry of Standardized Parts can be a very frustrating endeavor. Searching the registry with ease takes practice, and even then it can be difficult to find good parts to use. Many parts in the registry have little to no characterization data on their registry pages. Other parts have been very well characterized, but each with different assays and techniques, so comparing multiple well-documented parts is difficult because their data is presented so differently.

After weeks of traversing the registry, we finally came to the conclusion that the main, underlying issue is that there is currently no standard for characterization data in the registry. Teams have no definitive guideline to follow to characterize their parts.

To solve this problem, we had to figure out where to start. As with most engineering problems, we began by looking at prior art. In terms of standardizing characterization, we quickly realized how difficult it would be to create a single protocol that could be applied across projects and different types of parts. So instead, we started researching datasheets.

One of the first things we looked at was Drew Endy's datasheet from 2008, shown below on the left. While this datasheet is functional and provides an abundance of information, it also seemed too complex and difficult to understand for the iGEM competition. After a few days of researching, we found that the Boston University iGEM team had tried a similar project in the 2012 competition. Once we started contacting teams, we especially sought their advice and collaboration. The datasheet they designed in 2012 is shown below on the right. We started brainstorming what our datasheet would look like very early, and many of the early designs incorporated elements from both examples below.

However, as a group we discussed some of the protocols iGEM teams had tried to write over the years, and how most of them never spread beyond the teams that created them. This was evident in that we had never heard of most of the protocols and datasheets we found while researching. We were also having trouble deciding how to design a datasheet that every iGEM team would actually use. We realized that the best way to design a system everyone would adopt, and to keep the idea alive, was to talk to as many iGEM teams as possible and get their help and input.

Acquiring Information and Generating Interest

To start this process, we gathered as many personal emails as possible from team roster pages and Google searches. In late May, we emailed at least one person on every single team in the 2013 competition. We sent out a simple survey to gauge how various iGEM teams felt about the current state of the registry, what data is most important to present on a part page, how often teams find parts with no characterization, and whether they would be in favor of a characterization standard for the registry. We asked individuals to fill out the survey instead of entire teams, and we received over 170 responses. Some of the results are below:

The results of this survey clearly validated the need to improve the registry. The answers we received also directed how we were designing the datasheet, and let us know what iGEM teams were looking for in a solution. At the end of this survey, we asked the participants to talk to the rest of their team and see if they would like to collaborate with us by giving critical feedback to our designs. The teams that agreed to take on this challenge with us are listed below:

Arizona State Baskent Meds Brigham Young Boston University
Carnegie Mellon Clemson Copenhagen Cornell
Duke Dundee iGEM Exeter iGEM Georgia Tech
Goethe University HIT Heidelberg Hokkaido
EVRY Genopole IIT Delhi Imperial ITU
KU Leuven Kyoto Linkoping Macquarie iGEM
Manaus UFAM METU Norwich NTU Taiwan
Paris Bettencourt Penn State Queen's iGEM Rutgers
Ivy Tech South Bend TU Eindhoven Toronto Tsinghua University A
TU Delft TU Munich Tufts UCL
UCSF UFMG Brazil Univ. of Chicago Univ. of Edinburgh
Univ. of Freiburg Univ. of Goettingen Univ. of Groningen Univ. of Leeds
Univ. of Manchester Univ. of Nevada Univ. of Oklahoma Univ. of Ottawa
Univ. of Pennsylvania Univ. of Salento Univ. of Tokyo Univ. of Virginia
Univ. of York USP Brazil UT Panama Valencia
Valencia CIPF Wageningen Wellesley Xiamen University

First Draft and Gathering Feedback

Once we had gotten all of the results from the first survey, we started designing our datasheet. From the survey results, we determined that teams wanted a modular datasheet that could be adapted by teams to suit their own needs. So we created a design with two components: the datasheet and the protocol form. The protocol form would serve as a table of contents for the datasheet, so that users could easily see what had and had not been done without having to sort through data. The datasheet would be where all of the data was actually presented. The first draft of our standardized system is shown below (datasheet on left and protocol form on right):
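To make the two-component idea concrete, here is a minimal sketch of how the relationship between the datasheet and the protocol form could be modeled. All class and field names here are hypothetical illustrations, not the actual schema used by the datasheet program:

```python
from dataclasses import dataclass, field

@dataclass
class ProtocolEntry:
    """One characterization assay listed on the datasheet (hypothetical)."""
    name: str                # e.g. "Sequence verification"
    completed: bool          # was this characterization performed?
    data_section: str = ""   # where the data appears in the datasheet, if done

@dataclass
class Datasheet:
    """A part's datasheet; the protocol form is derived from its entries."""
    part_id: str                               # registry ID, e.g. "BBa_K1225000"
    entries: list = field(default_factory=list)

    def protocol_form(self):
        """Table-of-contents view: which assays were and were not done."""
        return [(e.name, e.completed) for e in self.entries]

sheet = Datasheet("BBa_K1225000")
sheet.entries.append(ProtocolEntry("Sequence verification", True, "Section 1"))
sheet.entries.append(ProtocolEntry("Growth curve", False))
print(sheet.protocol_form())
```

The key design point this captures is that the protocol form is not a separate document a team fills out twice: it is generated from the datasheet itself, so a reader can see at a glance what has and has not been characterized.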

After we finished this initial design, we immediately sought feedback from the collaborating teams. This time, though, we wanted the interactions between our teams to be more personal than just a survey. We had the idea of hosting video conferences with the other teams so we could walk through and explain the datasheets in person, and also get to know our fellow iGEMers! So we sent out a scheduling poll to find times that worked for the teams, and over the course of four weeks we had video conferences with 28 other iGEM teams!

Our idea to get feedback from other iGEM teams proved to be one of the best we've ever had! We received so many new ideas, perspectives, and opinions on our datasheet, which made the design and concept that much better. We came up with an implementation plan that would be integrated into the iGEM registry, so teams could generate and post their datasheets at the same time that they add their part pages. The other 28 teams contributed so many new ideas that we decided to completely redesign our datasheet with BostonU iGEM.

Incorporating Ideas and Finalizing Design

The number of ideas that came from the feedback of other teams was enormous: entirely new sections, headers, pieces of data, and ways to implement the system. Once we had talked with all of the teams and had some important conversations with the BostonU iGEM team, we created the final design of the datasheet system. This new system got rid of the protocol form and created a more stylized, aesthetically pleasing datasheet. The final design is below:

Implementation and BostonU iGEM

While we were finalizing the design of the datasheet, Thomas Lozanoski from BostonU iGEM came to Purdue to meet with us and discuss the implementation plan for the datasheet. We decided on the final design seen above, and we agreed that it would be best if the more experienced programmers at BostonU took on creating the actual datasheet program. This way, Purdue iGEM would handle the outreach and design, and the Boston team would handle the implementation and functionality.

As the competition drew near, both of our teams realized that creating this program and system was too difficult and complex to accomplish in a single competition year. So both teams agreed to use the 2013 competition to build support and interest in the project, and then continue the collaboration next year.

However, we were able to style our part page for BBa_K1225000 in the manner of our datasheets, which provides a good look at how the datasheets would appear on the registry: BBa_K1225000
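Registry part pages are rendered with MediaWiki markup, so styling a part page "in the manner of" a datasheet amounts to emitting wiki tables from the datasheet's data. As a rough, hypothetical sketch (the function name and row format are illustrative, not part of the actual program):

```python
def render_wiki_table(part_id, rows):
    """Render characterization rows as a MediaWiki table for a part page.

    rows is a list of (assay, result) string pairs; result values here
    would point at the relevant datasheet section or figure.
    """
    lines = [
        '{| class="wikitable"',
        f"|+ Characterization of {part_id}",
        "! Assay !! Result",
    ]
    for assay, result in rows:
        lines.append("|-")
        lines.append(f"| {assay} || {result}")
    lines.append("|}")
    return "\n".join(lines)

markup = render_wiki_table(
    "BBa_K1225000",
    [("Sequence verification", "see Section 1"),
     ("Growth curve", "not performed")],
)
print(markup)
```

Generating the markup programmatically is what would let the registry produce a consistently formatted datasheet page at the same time a team submits its part data.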

Future Plans and Collaboration

The Purdue Biomakers and the BostonU iGEM team are planning to continue this collaborative effort to create a standardized datasheet design and program to improve the iGEM registry. Although we could not fully implement the datasheet system this year, we plan to work hard at the beginning of next year's competition to finalize the program and the design, so that as many teams as possible can use it for the 2014 competition. Then, once we have had a test year, we can present our system to iGEM HQ, which we hope will officially integrate it into the iGEM competition and registry.

For more information, you can visit BostonU iGEM's wiki page here.


  1. Canton, B., Labno, A., & Endy, D. (2008). Refinement and standardization of synthetic biological parts and devices. Nature Biotechnology. doi:10.1038/nbt1413