Because U.S. Army battlefield medical experts need a soldier-worn computer that enables medics to document patients autonomously through passive sensors and artificial intelligence (AI) algorithms, they are working with Tomahawk Robotics to develop KxM edge compute devices with peripheral items.
Officials of the Army Medical Research Acquisition Activity and Tomahawk Robotics aim to make autonomous patient documentation work on the battlefield with no network connection at all. The developers explain that this requires the AI algorithms to run on the wearer's person in real time.
As such, the developers are building wearable edge computers with general-purpose graphics processing units (GPGPUs) capable of running AI algorithms that take video as input.
Running AI algorithms typically requires a graphics processing unit (GPU) to accelerate the computation. Edge compute devices also typically have peripherals such as custom housings, input and output ports, and custom software for integration with other systems.
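To illustrate the kind of workload this implies, the sketch below shows on-device video inference running entirely on a local GPU, with no network connection. It is not Tomahawk Robotics' software; it assumes a PyTorch model exported as TorchScript and a body-worn camera exposed as a standard video device, and the model file name, input size, and documentation-mapping step are hypothetical.

```python
import cv2
import torch

# Run inference on the local GPU if one is present (e.g., an embedded NVIDIA module).
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Hypothetical TorchScript model trained to recognize casualty-care events in video.
model = torch.jit.load("casualty_doc_model.pt", map_location=device)
model.eval()

cap = cv2.VideoCapture(0)  # body-worn camera exposed as a standard video device
with torch.no_grad():
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # Preprocess: BGR -> RGB, resize, scale to [0, 1], reorder to NCHW.
        rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
        rgb = cv2.resize(rgb, (640, 640))  # assumed model input size
        tensor = torch.from_numpy(rgb).permute(2, 0, 1).float().div(255.0)
        tensor = tensor.unsqueeze(0).to(device)

        # Inference happens on the device itself; nothing leaves the wearer's kit.
        detections = model(tensor)
        # Mapping detections to medical documentation entries would happen here
        # (application-specific and not shown).
cap.release()
```

The point of the sketch is the constraint the developers describe: every step, from frame capture to inference, runs on the body-worn computer, which is why a GPGPU-class processor is needed at the edge.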
The Tomahawk Robotics KxM edge computers are built with NVIDIA GPUs and Kinesis Ecosystem software; in addition to battlefield medical applications, they are suited to 2D and 3D mapping, electronic warfare (EW), and signals intelligence (SIGINT).
Further, the Kinesis Ecosystem can reportedly support applications such as coordinating surveillance unmanned aircraft as part of a heterogeneous swarm that autonomously searches for targets of interest and sends coordinates to connected users.
The KxM computers help users consume large amounts of data through high-speed, body-worn computation at the tactical edge, reduce cognitive load, and fuse raw intelligence data for real-time decision-making.
For more information, visit the Army Medical Research Acquisition Activity website.