
Artificial Intelligence For Small Unit Maneuver


Responses Due By

2020-07-19 23:59:59 US/Eastern Time

Solution Overview: The Department of Defense (DoD) seeks commercial solutions to develop Artificial Intelligence (AI)-enabled unmanned systems for the purpose of multi-agent cooperative autonomy. This effort seeks to develop multi-agent systems capable of autonomous operation, without the use of RF control or Global Navigation Satellite System (GNSS) reference, in highly dynamic, unstructured, and unknown environments. The solution should be capable of swarming in complex, contested, and congested environments.

Program Strategy: The program will follow a structured development approach whereby solutions with initial desired capability will be rapidly prototyped, tested, and fielded concurrently with continued prototyping of the next iteration of capability.

The Government understands that companies will likely not be able to meet all of the requirements listed below, but encourages companies with demonstrable capability applicable to one or more of the requirements to apply. This includes companies focused on demonstrating the operation of neural networks in unstructured and unknown simulated and natural environments and/or companies developing autonomous sUAS solutions.

If appropriate, the Government may request vendors to create integrated teams to combine solutions. 

The Government has a strong preference for platforms, models, and software that follow a modular, open-systems architecture. Where possible, use open standards for data ingest, storage, and exchange, as well as for the training, deployment, and transfer of neural networks. Software components should be swappable for other, comparable components via a modular architecture.
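As one illustration of the modular, open-systems preference stated above (a minimal sketch only; every class and function name here is hypothetical and not part of any required standard), a component such as a path planner can be written against a small abstract interface so that comparable implementations can be swapped without touching the rest of the software:

    from abc import ABC, abstractmethod
    from typing import List, Tuple

    Waypoint = Tuple[float, float, float]  # x, y, z in meters


    class PathPlanner(ABC):
        """Abstract interface the rest of the autonomy stack depends on."""

        @abstractmethod
        def plan(self, start: Waypoint, goal: Waypoint) -> List[Waypoint]:
            """Return a collision-free path from start to goal."""


    class StraightLinePlanner(PathPlanner):
        """Trivial implementation; a vendor's planner would replace this class."""

        def plan(self, start: Waypoint, goal: Waypoint) -> List[Waypoint]:
            return [start, goal]


    def build_planner(name: str) -> PathPlanner:
        """Swap planner implementations by configuration rather than code change."""
        planners = {"straight_line": StraightLinePlanner}
        return planners[name]()

Under this pattern, data passed between components would likewise use open, documented formats so that third-party modules can be integrated without custom adapters.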

Capability Development:

Purpose: Development of the individual autonomous sUAS platform, with the understanding that the platform will need to network with no fewer than three (3) similar platforms; training and development of AI neural networks for mission scenarios; and increased standoff range for the delivery of sUAS into operational environments.

Autonomous sUAS Development:

  • Accept third-party models that can detect, count, and track objects of interest (e.g., weapons, vehicles, people) in low light or blackout conditions. The use of passive sensors is preferred (see the model-ingestion sketch after this list).

  • Able to sense, navigate, and explore interior, exterior, and/or subterranean environments while avoiding obstacles and executing autonomous path planning in low light or blackout conditions. The use of passive sensors is preferred. 

 

  • Development and improvement of sUAS autonomous flight capabilities in the aforementioned environments to allow precision flight in closer proximity to obstacles than current industry standards while performing the tasks listed above.

 

  • Conduct and transmit 3-D mapping in near real-time. 

 

  • Endurance of at least 25 minutes at the minimum vehicle take-off weight needed to achieve the objectives above.

 

  • Development of alternative methods of sUAS delivery from increasingly long standoff distances leveraging existing DoD platform(s).

 

  • sUAS solutions shall be National Defense Authorization Act (NDAA) Section 848 compliant.
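For the third-party model ingestion item above, one hedged sketch of what acceptance could look like is loading a model delivered in the open ONNX interchange format with the onnxruntime package; the file name, input shape, and output layout below are assumptions for illustration, not requirements:

    import numpy as np
    import onnxruntime as ort

    # Load a third-party detection model delivered in the open ONNX format.
    # "detector.onnx" is a hypothetical file supplied by another vendor.
    session = ort.InferenceSession("detector.onnx", providers=["CPUExecutionProvider"])
    input_name = session.get_inputs()[0].name

    # One low-light frame, already preprocessed to the model's expected layout.
    frame = np.zeros((1, 3, 640, 640), dtype=np.float32)

    # Run inference; how boxes, scores, and labels are packed depends on the model.
    detections = session.run(None, {input_name: frame})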

Synthetic Tactical AI Development:

  • Development of synthetic training environments modeled after government mission scenarios incorporating unmanned systems and unknown variables.

 

  • Demonstration of the ability to train neural networks based on user-defined and expected behaviors of unmanned systems in this synthetic environment, and to optimize them to run on constrained compute hardware on sUAS platforms (a deployment-optimization sketch follows this list). Examples of these behaviors include detection and identification of items in the environment, as well as understanding the context of an unstructured and previously unknown scene.

 

  • Modeling of various on-edge sensor capabilities and study of the impacts of these sensors on the neural networks.

 

  • Translate expected behaviors provided by an end-user into autonomous behavior from an unmanned system with little to no involvement or interaction from the human operator.
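For the on-board optimization item above, one common approach is post-training quantization of the trained network; the sketch below assumes a PyTorch model and uses dynamic INT8 quantization, with SmallNet standing in as a hypothetical network trained in the synthetic environment:

    import torch
    import torch.nn as nn

    # Hypothetical stand-in for a network trained in the synthetic environment.
    class SmallNet(nn.Module):
        def __init__(self):
            super().__init__()
            self.fc1 = nn.Linear(128, 64)
            self.fc2 = nn.Linear(64, 8)  # e.g., 8 discrete maneuver actions

        def forward(self, x):
            return self.fc2(torch.relu(self.fc1(x)))

    model = SmallNet().eval()

    # Post-training dynamic quantization: weights stored as INT8 to cut memory
    # and improve CPU inference latency on the sUAS flight computer.
    quantized = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)

    torch.save(quantized.state_dict(), "smallnet_int8.pt")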

 

Capability Integration and Demonstration of small-team cooperative autonomy:

Purpose: Integration of the autonomous sUAS platform, edge-enabled AI neural networks, and increased standoff range capability. Demonstration of cooperative autonomy among a minimum of three (3) sUAS.

  • Demonstrate the ability to rapidly deploy to an area of interest in quantities of 3 or more, leveraging existing DoD platform(s), and exfiltrate at the end of the mission.

 

  • Demonstrate the ability of 3 or more sUAS to display autonomous navigation and cooperation (coordinated tactical task planning, sharing and synthesis of accurate 3D maps, and shared scene understanding); a minimal map-sharing sketch follows this list.

 

  • Demonstrate cooperative autonomy functionality in varied terrain, climates, and temperatures corresponding to potential areas of operation such as desert (arid), mountainous (moderate elevation/mid-high winds), dense urban clutter, and subterranean environments.

 

  • Demonstration of sUAS networking abilities to communicate across a homogeneous platform set in groups of 3 or more.
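For the map sharing and synthesis called out in this list, a minimal, open message format for exchanging local 3D map fragments between teammates might resemble the sketch below; the field names and JSON encoding are illustrative assumptions, not a mandated schema:

    import json
    from dataclasses import dataclass, field, asdict
    from typing import List


    @dataclass
    class MapFragment:
        """One sUAS's local 3D occupancy observations, shared with teammates."""
        vehicle_id: str
        timestamp_s: float
        frame: str                     # reference frame, e.g. a shared local origin
        occupied_voxels: List[list] = field(default_factory=list)  # [x, y, z] in meters


    def encode(fragment: MapFragment) -> bytes:
        """Serialize a fragment for transmission over the inter-vehicle link."""
        return json.dumps(asdict(fragment)).encode("utf-8")


    def merge(fragments: List[MapFragment]) -> set:
        """Naively fuse voxels from several teammates into one shared map."""
        return {tuple(v) for frag in fragments for v in frag.occupied_voxels}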

 

Demonstration of ability to port maneuver autonomy to different UAS:

Purpose: Demonstrate multi-agent cooperative autonomy between vehicles of similar type but different size by porting the maneuver autonomy stack to another robot type (see the vehicle-abstraction sketch at the end of this list).

  • Demonstrate the ability of the maneuver autonomy to operate on proprietary (in-house, vendor-provided) hardware and on other COTS variants of similar type and size, and to conduct fully autonomous flight.

 

  • Demonstrate the ability of the maneuver autonomy to operate on vehicles of similar type but varied physical size.

 

  • Demonstrate multiple behaviors/skills such as coordinated surveillance, clearance, follow-me, isolation/containment, and coordinated communications relay.
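One way vendors commonly keep a maneuver autonomy stack portable across airframes of the same type but different size, as this section asks for, is to isolate vehicle-specific control behind a thin hardware-abstraction interface; the sketch below is illustrative only and every name in it is hypothetical:

    from abc import ABC, abstractmethod


    class VehicleInterface(ABC):
        """Thin abstraction the maneuver autonomy targets instead of a specific airframe."""

        @abstractmethod
        def send_velocity_setpoint(self, vx: float, vy: float, vz: float, yaw_rate: float) -> None:
            """Command a body-frame velocity; units are m/s and rad/s."""

        @abstractmethod
        def max_speed(self) -> float:
            """Airframe-specific speed limit the planner must respect."""


    class SmallQuadAdapter(VehicleInterface):
        """Adapter for a hypothetical small COTS quadrotor."""

        def send_velocity_setpoint(self, vx, vy, vz, yaw_rate):
            pass  # translate to the vendor's flight-controller protocol here

        def max_speed(self):
            return 8.0


    class LargeQuadAdapter(VehicleInterface):
        """Adapter for a larger vehicle of the same type; only limits and plumbing change."""

        def send_velocity_setpoint(self, vx, vy, vz, yaw_rate):
            pass

        def max_speed(self):
            return 15.0

With this separation, the behaviors listed above run unchanged while only the adapter layer is rewritten for each new airframe.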