Feedback
Dear iGEMers,
Last year, we designed a feedback system based on our ballot and voting data and delivered feedback to teams after the World Championship Jamboree, showing them how the judges voted. We have improved this system in 2013 and can now get feedback to you before the WCJ.
iGEM teams were assessed by judges studying the wikis, examining parts in the Registry, watching presentations, and speaking to teams at their posters. You can get feedback on your team’s performance directly from the judges’ votes, which you can find on the Judging Feedback page (https://igem.org/Judging_Feedback).
First, our rubric-assisted judging system reflects the same values that iGEM judges have embraced in previous years: originality, hard work, scientific rigor, usefulness, societal impact, and creativity to name a few. Second, scores are recorded in the newly redesigned judges’ ballot system.
The new Rubric includes standard grading language that enables judges to easily express what they think about the quality of each aspect of a project. For example, a judge might be asked ‘Did you find the presentation engaging?’ and can choose one of seven responses, ranging from ‘Kept me on the edge of my seat’ to ‘Put me to sleep’. These options correspond to scores from 6 (best) to 1 (worst). We created a Rubric for the Regional Competitions and will have a new version for the World Championship Jamboree.
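To make the scale concrete, here is a minimal sketch (in Python) of how a rubric response might map to a numeric score. Only the two endpoint phrases and the 6-to-1 scale come from the description above; the intermediate response wording and the non-scoring "Did not evaluate" option are invented for illustration and are not the official ballot text.

# Hypothetical sketch: mapping a judge's chosen response to a numeric score.
# Only the endpoint phrases and the 6 (best) to 1 (worst) scale come from the
# text above; the other wording and the unscored option are assumptions.
PRESENTATION_ENGAGEMENT = {
    "Kept me on the edge of my seat": 6,
    "Very engaging": 5,
    "Engaging": 4,
    "Neutral": 3,
    "Not engaging": 2,
    "Put me to sleep": 1,
    "Did not evaluate": None,  # assumed non-scoring seventh option
}

def score_for(response: str):
    """Return the numeric score for a judge's response (None if unscored)."""
    return PRESENTATION_ENGAGEMENT[response]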
The rubric organizes key aspects of iGEM projects under the traditional categories, including the Presentation, Wiki, Poster, and Special Prizes. Judges evaluated each aspect by selecting one response (from strongly positive to negative or neutral) from a simple list.
After every aspect was voted on, all votes were tallied and presented in the form of team rankings for each award. Therefore, every judge who evaluated any aspect of a team’s project contributed directly to that team’s score and ranking. This new system and the theory behind it are based on Balinski and Laraki’s “Majority Judgment” thesis [http://www.amazon.com/Majority-Judgment-Measuring-Ranking-Electing/dp/0262015137/ref=sr_1_1?ie=UTF8&qid=1352999536&sr=8-1&keywords=majority+judgement]. While we provide you with numerical scores for most categories, we will not release team ranking lists for all of iGEM.
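As a rough illustration of the Majority Judgment idea, the sketch below ranks teams by the median of the grades they received. This is a simplification over made-up data; the actual iGEM tallying and tie-breaking rules are not described on this page.

from statistics import median

# Hypothetical sketch of a Majority Judgment-style tally: each team is ranked
# by the median grade it received from all judges who evaluated it.
def rank_teams(grades_by_team: dict[str, list[int]]) -> list[tuple[str, float]]:
    """Return (team, median grade) pairs sorted from highest to lowest median."""
    ranked = [(team, median(grades)) for team, grades in grades_by_team.items()]
    return sorted(ranked, key=lambda pair: pair[1], reverse=True)

# Example with invented scores on the 6 (best) to 1 (worst) scale:
example = {
    "Team A": [6, 5, 5, 4],
    "Team B": [6, 6, 3, 2],
    "Team C": [4, 4, 4, 4],
}
print(rank_teams(example))
# [('Team A', 5.0), ('Team B', 4.5), ('Team C', 4.0)]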
In the regional competitions, the medal criteria were included at the beginning of the rubric, as an introduction to the team and as a way to see how each team self-designated their project. The rubric enabled judges to evaluate each iGEM project with the same metric. Therefore, scores, rankings, and the various awards are now more consistent across all regions. This system also helps new judges learn what we consider important in evaluating an iGEM project.
Because every judge votes on some to most aspects, we have the ability to provide you with these scores. This gives you direct feedback from all the judges on every aspect of your project.
This system may not be perfect, but it represents a great stride forward and contributes to a comprehensive and fair evaluation of each team. We will continue to work on it in the coming years so that we can better evaluate all the hard work you, the teams, put into your projects.
Feedback presentation
Your feedback is presented in the form of a table with two columns. The first is the "Average Score", which is the average of the judges' votes. The second is the category (in the example below, "Project"), with its aspects listed beneath it. If you have any questions about your feedback, please contact the judging committee at judging AT igem DOT org and put "iGEM 2013 FEEDBACK QUESTIONS" in the subject line.
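A minimal sketch of how the "Average Score" column can be computed, assuming votes are collected per aspect; the aspect names below are placeholders, not the official rubric wording.

# Minimal sketch: the "Average Score" is the arithmetic mean of all judges'
# votes for an aspect. Aspect names and vote values below are placeholders.
def average_scores(votes_by_aspect: dict[str, list[int]]) -> dict[str, float]:
    """Average each aspect's judge votes, rounded for display."""
    return {aspect: round(sum(v) / len(v), 2) for aspect, v in votes_by_aspect.items()}

print(average_scores({
    "How impressive is this project?": [5, 4, 6, 4],
    "How much did the team accomplish?": [4, 4, 5, 3],
}))
# {'How impressive is this project?': 4.75, 'How much did the team accomplish?': 4.0}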
Access Feedback: https://igem.org/Judging_Feedback?year=2013

Example feedback:

https://static.igem.org/mediawiki/2012/f/f7/Screen_Shot_2012-11-16_at_10.01.23_AM.png