Competitor Final Results Certification Period

Challenge 2 Trial 3 and the Final Event will award cash/grant prizes to eligible teams that meet the competition's ranking criteria. To give every team an opportunity to be evaluated fairly, both events will feature a Results Certification Period, during which a team may view its own results and raise feedback or other concerns that could lead to a re-run of the algorithm originally submitted for that event (Trial 3 or the Final Event).

Final Event Results Certification Period Schedule

September 7th, 2021: Competitors will individually receive their scores from the GO Competition Administrator. Results will not be posted on the website at this time. Competitors may self-test their Final Event codebase on their own systems to confirm that it was evaluated correctly on the GO platform. With ARPA-E approval, competitors may request time to self-test a limited set of datasets on the GO Competition platform.

Note: We do not expect that every competitor will be able to re-run every scenario on the platform during this self-testing window; competitors should use logs or error files to prioritize any scenarios that appear to have issues and may require re-testing.

September 20th, 2021: If a competitor finds that any of their scores were adversely affected by an issue unrelated to their approach or codebase, that competitor may submit up to five scenarios from the Final Event for the GO Competition Administrator to re-evaluate on the GO Competition platform. Competitors must include a description of the perceived issue to receive this official re-evaluation, and may refer to their error logs or any other helpful information when describing it.

Valid issues may include:

  1. Platform, configuration, or hardware settings that are inconsistent with the specifications provided on the competition website.
  2. Clerical or numerical issues in the evaluation of the results, or in the presentation of the results to the competitor.
  3. Evidence of a transient hardware issue that could have led to lower performance or the failure of a particular scenario.

We will not consider justifications similar in substance to the following:

  1. Claims that the codebase works better on a different or faster machine than the hardware described on the competition website.
  2. Attempts to submit an altered or corrected codebase for re-evaluation; we will only use codebases provided in the original Final Event submission window during this re-evaluation period.
  3. Requests for re-evaluation intended only to achieve marginal improvements on existing results.

September 20th – September 30th, 2021: If a competitor raises an issue that is found to be a valid point of widespread concern, ARPA-E will work with that competitor directly to remediate any other affected scenarios, including scenarios outside of the original five submitted for review.

October 4th, 2021: Final Event scores posted for all teams on the GO Competition website. For purposes of awarding prizes for Final Event performance, these team scores and rankings are final.

Trial 3 Final Results Certification Period Schedule

July 16th, 2021: Competitors will individually receive their scores from the GO Competition Administrator. Results will not be posted on the website at this time. Competitors may self-test their Trial 3 event codebase on their own systems to confirm that it was evaluated correctly on the GO platform. With ARPA-E approval, competitors may request time to self-test a limited set of datasets on the GO Competition platform.

Note: We do not expect that every competitor will be able to re-run every scenario on the platform in one week; competitors should use logs or error files to prioritize any scenarios that appear to have issues and may require re-testing.

July 23rd, 2021: If a competitor finds that any of their scores were adversely affected by an issue unrelated to their approach or codebase, that competitor may submit up to five scenarios from the Trial 3 event for the GO Competition Administrator to re-evaluate on the GO Competition platform. Competitors must include a description of the perceived issue to receive this official re-evaluation, and may refer to their error logs or any other helpful information when describing it.

Valid issues may include:

  1. Platform, configuration, or hardware settings that are inconsistent with the specifications provided on the competition website.
  2. Clerical or numerical issues in the evaluation of the results, or in the presentation of the results to the competitor.
  3. Evidence of a transient hardware issue that could have led to lower performance or the failure of a particular scenario.

We will not consider justifications similar in substance to the following:

  1. Claims that the codebase works better on a different or faster machine than the hardware described on the competition website.
  2. Attempts to submit an altered or corrected codebase for re-evaluation; we will only use codebases provided in the original Trial 3 event submission window during this re-evaluation period.
  3. Requests for re-evaluation intended only to achieve marginal improvements on existing results.

July 23rd – July 30th, 2021: If a competitor raises an issue that is found to be a valid point of widespread concern, ARPA-E will work with that competitor directly to remediate any other affected scenarios, including scenarios outside of the original five submitted for review.

August 1st, 2021: Trial 3 scores posted for all teams on the GO Competition website. For purposes of awarding prizes for Trial 3 performance, these team scores and rankings are final.