Systems must first be able to “see”; only then can autonomy be implemented

Defense companies are working toward eventually fielding fully autonomous military systems. However, there is still a long road ahead, and developers emphasize that the platforms must first be able to “see.” That will be achieved through multiple sensor systems and Artificial Intelligence (AI), efficiently processing and managing the enormous volume of data and information gathered on the battlefield. Doing so will avoid unwanted errors which, in the case of lethal platforms, can have serious consequences. Full autonomy will come later.


ALBUQUERQUE: Industry is ready to help the Army work towards autonomous weapons, but first, it needs lots and lots of better data.

Before machines drive themselves on the battlefield, AI-powered sensors will offer suggestions to human operators. “And today we’re still at the stage where we’re trying to build machines that can perceive what’s on the battlefield,” said Vern Boyle, VP for advanced capabilities at Northrop Grumman. That means starting with sensors that can understand their field of view, identify features, and share and combine that information with other machines, all without requiring “a lot of command and control back into physical systems.”

The Ripsaw robotic tank package, a technology demonstrator by Textron Systems, FLIR Systems and Howe & Howe, features both a ground robot transported marsupial-style in a pocket in the tank, and a Skyraider quadcopter drone that can fly independently or tethered to the back of the tank. Ripsaw debuted at AUSA 2019, and is competing for the Army’s Robotic Combat Vehicle – Medium.

Skyraider includes an on-board neural network, which “unburdens the operator’s workload through autonomous target selection,” said David Viens, a VP at FLIR.
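FLIR has not published how that selection works, so the following is only a hypothetical sketch of the general idea: the network’s raw detections are filtered and ranked so the operator sees a short list of candidate targets rather than every box the detector draws. The class names, confidence threshold, and detection format are all assumptions made for illustration.

```python
# Hypothetical sketch only: rank raw detector output and surface a short
# list of candidate targets to the operator, who still makes the decision.
# Labels, threshold, and the detection format are illustrative assumptions.

TARGET_CLASSES = {"armed_person", "military_vehicle"}  # assumed label set
MIN_CONFIDENCE = 0.6                                   # assumed cutoff

def suggest_targets(detections, max_suggestions=3):
    """Each detection is assumed to look like:
    {"label": "military_vehicle", "confidence": 0.87, "box": (x1, y1, x2, y2)}
    """
    candidates = [
        d for d in detections
        if d["label"] in TARGET_CLASSES and d["confidence"] >= MIN_CONFIDENCE
    ]
    candidates.sort(key=lambda d: d["confidence"], reverse=True)
    return candidates[:max_suggestions]

# A confidently detected vehicle is suggested; a low-confidence detection
# of some other class never reaches the operator at all.
frame = [
    {"label": "military_vehicle", "confidence": 0.91, "box": (120, 80, 260, 190)},
    {"label": "tree", "confidence": 0.55, "box": (300, 60, 340, 200)},
]
print(suggest_targets(frame))
```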

The quality of that processing depends a great deal on the quality of the data fed into it. While Viens spoke of the fidelity of a target-identification algorithm that could distinguish between armed and unarmed people, and between military and civilian vehicles, the demonstration video briefly tagged a tree with the same recognition box it used to highlight a human walking in a parking lot a few feet away.

Preventing that kind of error means companies will have to train algorithms on imperfect images, taken at weird angles or against unusual backdrops, to ensure that the algorithms are actually identifying the right targets.
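The article does not describe any particular training pipeline. As one hedged illustration, a common way to expose a model to “weird angles and unusual backdrops” is to augment the training images with random rotations, perspective warps, and lighting changes; the sketch below assumes a PyTorch/torchvision classification-style pipeline (detection training would also need box-aware transforms).

```python
# Illustrative only: torchvision augmentations that deliberately feed the
# model rotated, skewed, and oddly lit views instead of clean imagery.
from torchvision import transforms

augment = transforms.Compose([
    transforms.RandomRotation(degrees=45),                  # odd viewing angles
    transforms.RandomPerspective(distortion_scale=0.5, p=0.7),
    transforms.ColorJitter(brightness=0.6, contrast=0.6),   # unusual lighting
    transforms.RandomResizedCrop(224, scale=(0.5, 1.0)),    # partial, off-center views
    transforms.ToTensor(),
])

# A training dataset would apply `augment` to each image before it reaches
# the model, e.g. (hypothetical directory name):
# dataset = torchvision.datasets.ImageFolder("field_imagery/", transform=augment)
```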

“Algorithms are biased to learn what things look like only under optimal conditions,” said Patrick Biltgen of Perspecta. At present, many algorithms are specifically designed to remove outlier data.

“Should we bias training data towards the weird stuff?” asked Biltgen. “If there’s a war, we’re almost certain to see weird things we’ve never seen before.”

The sensors already recording battlefield data will need to keep even the weird data, so that there are examples to help train the image-processing algorithms of the future.
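The article names no specific cleaning step; the toy sketch below shows the kind of z-score outlier filter many data pipelines apply, and how the same split can be used to archive the “weird” samples for later labeling and training instead of silently discarding them. The feature statistics and threshold are invented for illustration.

```python
# Toy sketch of a data-cleaning step. A conventional pipeline drops samples
# whose feature statistics sit far from the mean; keeping that split around
# preserves the rare, "weird" examples that future models will need.
import numpy as np

def split_outliers(features, z_threshold=3.0):
    """Return (typical, weird) sample indices using a per-feature z-score cutoff."""
    mean = features.mean(axis=0)
    std = features.std(axis=0) + 1e-8
    z = np.abs((features - mean) / std).max(axis=1)
    weird = np.where(z > z_threshold)[0]
    typical = np.where(z <= z_threshold)[0]
    return typical, weird

rng = np.random.default_rng(0)
features = rng.normal(size=(1000, 8))
features[:5] += 10.0                    # a handful of anomalous captures

typical, weird = split_outliers(features)
# A "clean data only" pipeline would train on features[typical] and discard
# features[weird]; archiving both lets the rare cases be labeled later.
print(len(typical), "typical samples,", len(weird), "weird samples kept")
```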

Many of the challenges of machine object recognition for the military have parallels in the commercial self-driving vehicle space.

“Driverless vehicles seem very robust in a commercial environment until they are not,” said Boyle.

In the commercial world, there have been two major approaches to managing this autonomy. The first is to collect more and more data, whether from the always-on cameras of the existing Tesla fleet or through the extensive road trials run by other companies. The second is to take existing, well-understood autonomous components, such as speed and object detection triggering automatic braking, and roll them out not as full autonomy but as features on the road to it.
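Neither approach is spelled out in implementation terms in the article. As a hedged, simplified sketch, a partial-autonomy feature such as automatic braking can be framed as a single gated rule sitting on top of the perception stack, while every other driving decision stays with the human; the speeds, distances, and thresholds below are invented for illustration.

```python
# Simplified, hypothetical sketch of a single partial-autonomy feature:
# automatic braking triggered by speed and object detection, with every
# other driving decision still left to the human.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str
    confidence: float
    distance_m: float      # estimated range to the object

def should_auto_brake(speed_mps: float, detections: list[Detection],
                      confidence_floor: float = 0.8,
                      reaction_time_s: float = 1.0,
                      max_decel_mps2: float = 6.0) -> bool:
    """Brake only if a confident detection sits inside the stopping distance."""
    stopping_distance = speed_mps * reaction_time_s + speed_mps**2 / (2 * max_decel_mps2)
    return any(
        d.confidence >= confidence_floor and d.distance_m <= stopping_distance
        for d in detections
    )

# Example: at 15 m/s (54 km/h) a confidently detected pedestrian 25 m ahead
# falls inside the computed stopping distance, so the feature intervenes.
print(should_auto_brake(15.0, [Detection("pedestrian", 0.93, 25.0)]))
```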

In this way partial autonomy, seen in image processing and target identification, will be adopted first. The greater autonomy challenges, such as machines maneuvering on their own at the behest of human commanders, will come later. Most importantly, that later work will build on the experience and data collected from deployed, partially autonomous machines.

Source: https://breakingdefense.com