The ODNI-OUSD(I) Xtend Challenge: Machine Evaluation of Analytic Products | Deadline 1.15.2018

The evaluation of analytic products is an area ripe for exploring new technological capabilities and approaches.  Currently, intelligence products are reviewed prior to publication by numerous levels of management and edited against an Intelligence Community (IC) agency's signature style, using essentially the same methods publishers have traditionally used.  This human-based approach can be highly subjective, often introduces latency that constrains the IC's ability to produce effective and timely intelligence products, and may inhibit potential gains offered by advanced analytics and computational methods.

Complementing these pre-publication reviews, the Office of the Director of National Intelligence (ODNI) Analytic Integrity and Standards (AIS) staff evaluate IC-wide intelligence products after they are disseminated to policymakers and warfighters using the AIS Rating Scale for Evaluating Analytic Tradecraft Standards (RSEATS, see Appendix).  Using the RSEATS, AIS staff evaluate the IC’s analytic products based on a number of established criteria, such as sourcing and accuracy.  Analytic offices across the IC use feedback from these reviews to inform their analysts and analytic managers of where they can improve subsequent products. 

Missing from the IC’s analytic toolset is an objective means of quality control while products are still in development.  Such a capability will be essential should efforts like ODNI’s and the Office of the Under Secretary of Defense for Intelligence’s (OUSD[I]’s) Xpress Challenge[1] to craft machine-generated intelligence products bear fruit.  It would also improve the quality and consistency of traditional, human-generated analytic products before they are delivered to customers.  Moreover, such a feedback loop would provide a means for measurably improving the timeliness of analytic production.  To build it, however, the IC will require new tools.

The ODNI and OUSD(I) are seeking ideas and descriptions of a viable technical approach for enabling the automated evaluation of finished intelligence products. For this Ideation Challenge, Solvers are asked to submit their ideas along with a well-supported, technology-based justification for how the proposed approach could rapidly and objectively evaluate analytic intelligence products. An additional award pool of $50,000 is available for Solvers who are able to provide, upon request from the Seekers, more detailed information such as a pseudo-code implementation of their proposed solution.
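To make the idea concrete, an automated evaluator might score a draft product against simple proxies for tradecraft criteria such as sourcing and expressions of analytic confidence. The sketch below is purely illustrative and is not part of the Challenge text: the cue phrases and the per-100-words density metric are assumptions standing in for a real model of the RSEATS standards.

```python
import re

# Illustrative cue phrases -- assumptions, not actual RSEATS criteria.
SOURCING_CUES = ["according to", "reported by", "sources indicate", "based on"]
UNCERTAINTY_CUES = ["likely", "unlikely", "possibly", "we assess", "probably"]

def score_draft(text):
    """Return toy per-criterion scores for a draft analytic product.

    Each score is the count of cue phrases per 100 words -- a crude
    stand-in for evaluating sourcing and analytic-confidence standards.
    """
    words = len(re.findall(r"\w+", text)) or 1
    lower = text.lower()

    def density(cues):
        return 100.0 * sum(lower.count(c) for c in cues) / words

    return {
        "sourcing": density(SOURCING_CUES),
        "uncertainty_language": density(UNCERTAINTY_CUES),
    }

draft = ("According to multiple sources, the facility is likely operational. "
         "We assess production will probably resume within a year.")
scores = score_draft(draft)
```

A real solution would of course need far richer features than keyword densities, but even a toy scorer like this illustrates the kind of rapid, repeatable feedback loop the Seekers envision.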

This is an Ideation Challenge, which has the following unique features:

  • There is a guaranteed award.  The award(s) will be paid to the best submission(s) as solely determined by the Seekers. The total guaranteed payout will be $25,000, with at least one award being no smaller than $5,000 and no award being smaller than $1,000.
  • The Solvers are not required to transfer exclusive intellectual property rights to the Seekers.  Rather, by submitting a proposal, the Solver grants to the Seekers a royalty-free, perpetual, and non-exclusive license to use any information included in this proposal, including for promotional purposes.
  • After initial review of submissions, Solvers with highly rated submissions may be asked to provide additional detailed information including, but not limited to, a pseudo-code (or more complete) implementation of their proposed solution. An additional award pool of $50,000 is available for submissions of this information that meet the criteria specified in the Seekers’ request. Such additional requested information is not subject to the standard Ideation licensing provision; instead, Solvers chosen for an additional award will be asked to grant to the Seekers a non-exclusive license for US government use purposes only.

Submissions to this Challenge must be received by 11:59 PM (US Eastern Time) on January 15, 2018. 

Late submissions will not be considered.

After the Challenge deadline, the Seekers will complete the review process and make a decision with regard to the Winning Solution(s). All Solvers who submit a proposal will be notified of the status of their submissions; however, no detailed evaluation of individual submissions will be provided.


Federal entities or Federal employees acting within the scope of their employment are eligible to compete but are NOT eligible to receive a monetary award for this Challenge.

Please note that winners must submit an Academic Institution Acknowledgement Letter acknowledging the role of ODNI in this Challenge if they are:  (i) a U.S. Academic Institution at the college or university level, (ii) an employee of such an institution who is participating on behalf of that institution, or (iii) an employee of such an institution who is participating in a personal capacity while using the resources of that institution to respond to this Challenge. A template for this letter is included as an attachment to this Challenge and will be available after accepting the Challenge-Specific Agreement (CSA). Click the “View Challenge Details” button to access the CSA for details.


This Challenge is sponsored by ODNI’s Office of the Director of Science and Technology (DS&T), in partnership with the Office of the Under Secretary of Defense for Intelligence (OUSD[I]) and in collaboration with the Air Force Research Laboratory (AFRL). DS&T leads the Intelligence Community’s (IC’s) efforts to enhance the returns on investments in technology—its mission is to deliver innovative, technology-based capabilities which solve intelligence challenges today and in the future.  OUSD(I) serves as advisor to the Secretary and Deputy Secretary of Defense for intelligence, counterintelligence, security, sensitive activities and other intelligence-related matters. AFRL is the Air Force's only organization wholly dedicated to leading the discovery, development, and integration of warfighting technologies for our air, space and cyberspace forces.