{"id":6781,"date":"2020-10-27T08:15:11","date_gmt":"2020-10-27T11:15:11","guid":{"rendered":"http:\/\/www.fie.undef.edu.ar\/ceptm\/?p=6781"},"modified":"2020-11-03T08:18:14","modified_gmt":"2020-11-03T11:18:14","slug":"los-sistemas-deben-poder-ver-luego-se-podra-implementar-la-autonomia","status":"publish","type":"post","link":"https:\/\/www.fie.undef.edu.ar\/ceptm\/?p=6781","title":{"rendered":"Los sistemas deben &#8220;poder ver&#8221;, luego se podr\u00e1 implementar la autonom\u00eda"},"content":{"rendered":"<p>Las empresas de defensa se encuentran trabajando para llegar en alg\u00fan momento a disponer de sistemas militares completamente aut\u00f3nomos. Sin embargo, existe todav\u00eda un largo camino por delante y los desarrolladores enfatizan que las plataformas primero deben \u201cpoder ver\u201d. Esto se lograr\u00e1 a trav\u00e9s de m\u00faltiples sistemas sensores\u00a0 e inteligencia Artificial (AI), procesando y gestionando de manera eficiente, la enorme cantidad de datos e informaci\u00f3n obtenida en el campo de combate. De esta forma, se evitar\u00e1n errores indeseados que en el caso de plataformas letales, pueden provocar serias consecuencias. La autonom\u00eda total llegar\u00e1 luego.<\/p>\n<hr \/>\n<p>ALBUQUERQUE: Industry is ready to help the Army work towards autonomous weapons, but first, it needs lots and lots of better data.<\/p>\n<p>Before machines drive themselves on the battlefield, AI-powered sensors will offer suggestions to human operators. And today we\u2019re still at the stage where we\u2019re trying to build machines that can perceive what\u2019s on the battlefield, said Vern Boyle, VP for advanced capabilities at Northrop Grumman. This means starting with sensors that can understand field of view, identify features, and can share and combine that with other machines, all without requiring \u201ca lot of command and control back into physical systems.\u201d<\/p>\n<p>The <a href=\"https:\/\/breakingdefense.com\/2019\/10\/textron-rolls-out-ripsaw-robot-for-rcv-light-and-rcv-medium\/\" target=\"_blank\" rel=\"noopener noreferrer\">Ripsaw robotic tank package<\/a>, a technology demonstrator by Textron Systems, FLIR Systems and Howe &amp; Howe, features both a ground robot transported marsupial-style in a pocket in the tank, and a Skyraider quadcopter drone that can fly independently or tethered to the back of the tank. Ripsaw debuted at AUSA 2019, and is competing for the Army\u2019s Robotic Combat Vehicle \u2013 Medium.<\/p>\n<p>Skyraider includes an on-board neural network, which \u201cunburdens the operators workload through autonomous target selection,\u201d said David Viens, a VP at FLIR.<\/p>\n<p>The quality of that processing depends a great deal on the <a href=\"https:\/\/breakingdefense.com\/2020\/10\/new-pentagon-strategy-to-share-data-like-ammunition\/\" target=\"_blank\" rel=\"noopener noreferrer\">quality of data fed into it<\/a>. 
Preventing that kind of error means companies will have to train algorithms on imperfect images, taken at weird angles or against unusual backdrops, to ensure that the algorithms are actually identifying the right targets.

"Algorithms are biased to learn what things look like only under optimal conditions," said Patrick Biltgen of Perspecta. At present, many algorithms are specifically designed to remove outlier data.

"Should we bias training data towards the weird stuff?" asked Biltgen. "If there's a war, we're almost certain to see weird things we've never seen before."

The sensors already recording battlefield data will need to keep even the weird data, in order to provide the examples that can train the image processing of the future.

Many of the challenges of machine object recognition for the military have parallels in the commercial self-driving vehicle space.

"Driverless vehicles seem very robust in a commercial environment until they are not," said Boyle.

In the commercial world, there have been two major approaches to managing this autonomy. The first is to collect more and more data, whether from the always-on cameras of the existing Tesla fleet or through the extensive road trials of other companies. The other is to take autonomous components that are already well understood, like speed and object detection triggering automatic braking, and roll them out not as full autonomy but as features on the road to full autonomy.

In this way partial autonomy, seen in image processing and target identification, will be adopted first. The [greater autonomous challenges](https://breakingdefense.com/2020/10/armys-team-ignite-sets-futuristic-rd-targets-ai-robotics-autonomy/), machines maneuvering on their own at the behest of human commanders, will come later. Most importantly, that later autonomy will build on the experience and data collected from deployed, partially autonomous machines.
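For a sense of what such an already well-understood component looks like, here is a minimal sketch of the decision rule behind automatic emergency braking: brake when a detected object falls inside the stopping distance implied by the current speed. The reaction time, deceleration, and margin figures are illustrative assumptions, not any manufacturer's parameters.

```python
# A minimal sketch of an automatic-braking trigger. All constants are
# illustrative assumptions, not any manufacturer's calibration.
REACTION_TIME_S = 1.0    # assumed latency before the brakes engage
DECELERATION_MS2 = 6.0   # assumed braking deceleration on dry road
SAFETY_MARGIN_M = 5.0    # extra buffer added to the stopping envelope

def stopping_distance(speed_ms: float) -> float:
    """Reaction-time travel plus braking distance (v^2 / 2a, from
    constant-deceleration kinematics)."""
    return speed_ms * REACTION_TIME_S + speed_ms ** 2 / (2 * DECELERATION_MS2)

def should_brake(speed_ms: float, obstacle_range_m: float) -> bool:
    """Trigger braking when a detected obstacle is inside the envelope."""
    return obstacle_range_m <= stopping_distance(speed_ms) + SAFETY_MARGIN_M

# At 20 m/s (72 km/h) the envelope is roughly 53 m plus the margin,
# so an object detected 40 m ahead triggers the brakes.
print(should_brake(20.0, 40.0))  # True
```

The appeal of features like this is that the decision is narrow and verifiable: a simple rule wrapped around a perception output, deployable long before the perception itself is trusted to drive.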
**Source:** [https://breakingdefense.com](https://breakingdefense.com/2020/10/industry-starts-work-on-weapons-that-can-see-autonomy-comes-next/)