{"id":6630,"date":"2020-10-08T16:36:11","date_gmt":"2020-10-08T19:36:11","guid":{"rendered":"http:\/\/www.fie.undef.edu.ar\/ceptm\/?p=6630"},"modified":"2020-10-08T16:36:11","modified_gmt":"2020-10-08T19:36:11","slug":"origin-un-ugv-con-capacidad-antitanque","status":"publish","type":"post","link":"https:\/\/www.fie.undef.edu.ar\/ceptm\/?p=6630","title":{"rendered":"Origin, a UGV with anti-tank capability"},"content":{"rendered":"<p>As part of the so-called PROJECT CONVERGENCE, in which the US ARMY develops and tests a multitude of autonomous ground and air systems, as well as sensors and information networks, the use of Artificial Intelligence (AI) was put to the test in different situations. In particular, the UGV \u201cORIGIN\u201d, on advanced reconnaissance missions, detected, identified and engaged armored vehicles effectively, with the notable feature that the order to fire remained in the hands of human operators.<\/p>\n<hr \/>\n<p>WASHINGTON: A pair of unprepossessing robots, looking more like militarized golf carts than Terminators, trundled across the Yuma Desert, part of <a href=\"https:\/\/breakingdefense.com\/tag\/project-convergence\/\" target=\"_blank\" rel=\"noopener noreferrer\">the Army\u2019s Project Convergence exercise<\/a> on future warfare.<\/p>\n<p>Like human troops, the machines took turns covering each other as they advanced. 
One robot would find a safe spot, stop, and launch the <a href=\"https:\/\/breakingdefense.com\/2020\/08\/marines-explore-robots-5g-networks-for-future-wars\/\" target=\"_blank\" rel=\"noopener noreferrer\">tethered mini-drone<\/a> it carried to look over the next ridgeline while the other bot moved forward; then they\u2019d switch off.<\/p>\n<figure id=\"attachment_6632\" aria-describedby=\"caption-attachment-6632\" style=\"width: 768px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" class=\"size-full wp-image-6632\" src=\"https:\/\/www.fie.undef.edu.ar\/ceptm\/wp-content\/uploads\/2020\/10\/Mini-UAV-on-back-200820-A-WL997-1002-768x512-1.jpg\" alt=\"\" width=\"768\" height=\"512\" srcset=\"https:\/\/www.fie.undef.edu.ar\/ceptm\/wp-content\/uploads\/2020\/10\/Mini-UAV-on-back-200820-A-WL997-1002-768x512-1.jpg 768w, https:\/\/www.fie.undef.edu.ar\/ceptm\/wp-content\/uploads\/2020\/10\/Mini-UAV-on-back-200820-A-WL997-1002-768x512-1-300x200.jpg 300w\" sizes=\"(max-width: 768px) 100vw, 768px\" \/><figcaption id=\"caption-attachment-6632\" class=\"wp-caption-text\">Hoverfly mini-drone carried on the Origin robot<\/figcaption><\/figure>\n<p>Their objective: a group of buildings on the Army\u2019s Yuma Proving Ground, a simulated town for urban combat training. As one robot held back to relay communications to its distant human overseers, the other moved into the town \u2013 and spotted \u201cenemy\u201d forces. <a href=\"https:\/\/breakingdefense.com\/2020\/09\/a-slew-to-a-kill-project-convergence\/\" target=\"_blank\" rel=\"noopener noreferrer\">With human approval<\/a>, the robot <a href=\"https:\/\/breakingdefense.com\/2019\/09\/titan-robot-test-fires-javelin-anti-tank-missile\/\" target=\"_blank\" rel=\"noopener noreferrer\">opened fire<\/a>.<\/p>\n<p>Then the robot\u2019s onboard Aided Target Recognition (ATR) algorithms identified another enemy, a T-72 tank. But this target was too far away for the robot\u2019s built-in weapons to reach. 
So the bot uploaded the targeting data to <a href=\"https:\/\/breakingdefense.com\/2020\/09\/improvised-mode-the-army-network-evolves-in-project-convergence\/\" target=\"_blank\" rel=\"noopener noreferrer\">the tactical network<\/a> and \u2013 again, with human approval \u2013 called in artillery support.<\/p>\n<p>\u201cThat\u2019s a huge step, Sydney,\u201d said <a href=\"https:\/\/breakingdefense.com\/tag\/gen.-richard-ross-coffman\/\" target=\"_blank\" rel=\"noopener noreferrer\">Brig. Gen. Richard Ross Coffman<\/a>, the Project Convergence exercise director. \u201cThat computer vision\u2026 is nascent, but it is working.\u201d<\/p>\n<p>Algorithmic target recognition and computer vision are critical advances over most current military robots, which aren\u2019t truly autonomous but merely remote-controlled: The machine can\u2019t think for itself, it just relays camera feeds back to a human operator, who tells it exactly where to go and what to do.<\/p>\n<p>That approach, called teleoperation, does let you keep the human out of harm\u2019s way, making it good for bomb squads and small-scale scouting. But it\u2019s too slow and labor-intensive to employ on a large scale. If you want to use lots of robots without tying down a lot of people micromanaging them, you need the robots to make some decisions for themselves \u2013 although the Army emphasizes that the decision to use lethal force will always be made by a human.<\/p>\n<figure id=\"attachment_6633\" aria-describedby=\"caption-attachment-6633\" style=\"width: 150px\" class=\"wp-caption alignright\"><img loading=\"lazy\" class=\"size-full wp-image-6633\" src=\"https:\/\/www.fie.undef.edu.ar\/ceptm\/wp-content\/uploads\/2020\/10\/Coffman-NGCV-CFT-@-AUSA-150x150-1.jpg\" alt=\"\" width=\"150\" height=\"150\" \/><figcaption id=\"caption-attachment-6633\" class=\"wp-caption-text\">Brig. Gen. 
Richard Ross Coffman<\/figcaption><\/figure>\n<p>So Coffman, who oversees the <a href=\"https:\/\/breakingdefense.com\/tag\/rcv\/\" target=\"_blank\" rel=\"noopener noreferrer\">Robotic Combat Vehicle<\/a> and <a href=\"https:\/\/breakingdefense.com\/tag\/omfv\/\" target=\"_blank\" rel=\"noopener noreferrer\">Optionally Manned Fighting Vehicle<\/a> programs, turned to the Army\u2019s <a href=\"https:\/\/breakingdefense.com\/2020\/08\/army-tests-new-all-domain-kill-chain-from-space-to-ai\/\" target=\"_blank\" rel=\"noopener noreferrer\">Artificial Intelligence Task Force<\/a>\u00a0at <a href=\"https:\/\/breakingdefense.com\/2018\/11\/army-ai-task-force-comes-to-pittsburgh-c-o-cmu\/\" target=\"_blank\" rel=\"noopener noreferrer\">Carnegie Mellon University<\/a>. \u201cEight months ago,\u201d he told me, \u201cI gave them the challenge: I want you to go out and sense targets with a robot \u2014 and you have to move without using LIDAR.\u201d<\/p>\n<p>LIDAR, which uses low-powered laser beams to detect obstacles, is a common sensor on experimental self-driving cars. But, Coffman noted, because it\u2019s actively emitting laser energy, enemies can easily detect it.<\/p>\n<p>So the robots in the Project Convergence experiment, called \u201cOrigin,\u201d relied on passive sensors: cameras. That meant their machine vision algorithms had to be good enough to interpret the visual imagery and deduce the relative locations of potential obstacles, <em>without<\/em> being able to rely on LIDAR or radar to measure distance and direction precisely. 
That may seem simple enough to humans, whose eyes and brain benefit from a few hundred million years of evolution, but it\u2019s a radical feat for robots, <a href=\"https:\/\/breakingdefense.com\/2020\/08\/robots-vs-puddles-surprises-from-army-rcv-test\/\" target=\"_blank\" rel=\"noopener noreferrer\">which still struggle to distinguish, say, a shallow puddle from a dangerously deep pit<\/a>.<\/p>\n<p>\u201cJust with machine vision, they were able to move from Point A to Point B,\u201d Coffman said. But the Army doesn\u2019t just want <a href=\"https:\/\/breakingdefense.com\/2019\/11\/the-armys-universal-robot-driver\/\" target=\"_blank\" rel=\"noopener noreferrer\">robots that can find their way around<\/a>: It wants them to scout for threats and targets \u2013 without a human having to constantly stare at the sensor feed.<\/p>\n<p>That\u2019s where Aided Target Recognition comes in. (ATR also stands for <em>Automated<\/em> Target Recognition, but the Army doesn\u2019t like the implication that the software would replace human judgment, so it consistently uses <em>Aided <\/em>instead).<\/p>\n<figure id=\"attachment_6634\" aria-describedby=\"caption-attachment-6634\" style=\"width: 1024px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" class=\"size-full wp-image-6634\" src=\"https:\/\/www.fie.undef.edu.ar\/ceptm\/wp-content\/uploads\/2020\/10\/ATAK-feed-from-robot-200824-A-WL997-1061-1024x683-1.jpg\" alt=\"\" width=\"1024\" height=\"683\" srcset=\"https:\/\/www.fie.undef.edu.ar\/ceptm\/wp-content\/uploads\/2020\/10\/ATAK-feed-from-robot-200824-A-WL997-1061-1024x683-1.jpg 1024w, https:\/\/www.fie.undef.edu.ar\/ceptm\/wp-content\/uploads\/2020\/10\/ATAK-feed-from-robot-200824-A-WL997-1061-1024x683-1-300x200.jpg 300w, https:\/\/www.fie.undef.edu.ar\/ceptm\/wp-content\/uploads\/2020\/10\/ATAK-feed-from-robot-200824-A-WL997-1061-1024x683-1-768x512.jpg 768w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><figcaption id=\"caption-attachment-6634\" 
class=\"wp-caption-text\">Data from the Origin robot and drone updates handheld tactical maps.<\/figcaption><\/figure>\n<p>Recognizing targets is another big challenge. Sure, artificial intelligence has gotten scarily good at identifying individual faces in photos posted on social media. But the private sector hasn\u2019t invested nearly as much in, say, telling the difference between an American M1 Abrams tank and a Russian-made T-72, or between an innocent Toyota pickup and the same truck upgunned as a guerrilla \u201ctechnical\u201d with a heavy machine gun in the back. And the Army needs to be able to tell enemy from friendly from civilian in messy real-world combat zones \u2013 and not only from clear overhead surveillance shots, but from the ground, against troops trained to use camouflage and cover to break up easily recognizable silhouettes.<\/p>\n<p>\u201cTraining algorithms to identify vehicles by type, it\u2019s a <a href=\"https:\/\/breakingdefense.com\/2020\/09\/ais-data-hunger-will-drive-intelligence-collection\/\" target=\"_blank\" rel=\"noopener noreferrer\">huge undertaking<\/a>,\u201d Coffman told me. \u201cWe\u2019ve collected and labeled over 3.5 million images\u201d so far to use for training machine-learning algorithms, he said \u2013 and that labeling requires trained human analysts to look at each picture and tell the computer what it was: \u201cThat\u2019s someone sitting there and going, \u2018that\u2019s a T-72; that\u2019s a BMP,\u2019\u201d etcetera ad nauseam, he said.<\/p>\n<p>But each individual robot or drone doesn\u2019t need to carry those millions of images in its own onboard memory: It just needs the \u201cclassifier\u201d algorithms that result from running those images through machine-learning systems. 
Because those algorithms themselves don\u2019t take up a ton of memory, it\u2019s possible to run them on a computer that fits easily on the individual bot.<\/p>\n<p>\u201cWe\u2019ve proven we can do that with a tethered or untethered UAV. We\u2019ve proven we can do that with a robot. We\u2019ve proven we can do that on a vehicle,\u201d Coffman said. \u201cWe can identify the enemy by type and location.\u201d<\/p>\n<p>\u201cThat\u2019s all happening <a href=\"https:\/\/breakingdefense.com\/tag\/edge-computing\/\" target=\"_blank\" rel=\"noopener noreferrer\">on the edge<\/a>,\u201d he emphasized. \u201cThis isn\u2019t having to go back to some mainframe [to] get processed.\u201d<\/p>\n<figure id=\"attachment_6635\" aria-describedby=\"caption-attachment-6635\" style=\"width: 1024px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" class=\"size-full wp-image-6635\" src=\"https:\/\/www.fie.undef.edu.ar\/ceptm\/wp-content\/uploads\/2020\/10\/Robot-from-above-200825-A-WL997-231-1024x559-1.jpg\" alt=\"\" width=\"1024\" height=\"559\" srcset=\"https:\/\/www.fie.undef.edu.ar\/ceptm\/wp-content\/uploads\/2020\/10\/Robot-from-above-200825-A-WL997-231-1024x559-1.jpg 1024w, https:\/\/www.fie.undef.edu.ar\/ceptm\/wp-content\/uploads\/2020\/10\/Robot-from-above-200825-A-WL997-231-1024x559-1-300x164.jpg 300w, https:\/\/www.fie.undef.edu.ar\/ceptm\/wp-content\/uploads\/2020\/10\/Robot-from-above-200825-A-WL997-231-1024x559-1-768x419.jpg 768w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><figcaption id=\"caption-attachment-6635\" class=\"wp-caption-text\">The Army\u2019s experimental \u201cOrigin\u201d robot during Project Convergence<\/figcaption><\/figure>\n<p>In other words, the individual robot doesn\u2019t have to constantly transmit real-time, high-res video of everything it sees to some distant human analyst or AI master brain. 
Sending that much data back and forth is too big a strain on <a href=\"https:\/\/breakingdefense.com\/2020\/09\/improvised-mode-the-army-network-evolves-in-project-convergence\/\" target=\"_blank\" rel=\"noopener noreferrer\">low-bandwidth tactical networks<\/a>, which are often disrupted by terrain, technical glitches, and enemy jamming. Instead, the robot can identify the potential target itself, with its onboard AI, and just transmit the essential bits \u2013 things like the type of vehicles spotted, their numbers and location, and what they\u2019re doing.<\/p>\n<p>\u201cYou want to reduce the amount of information that you pass on the network to a tweet, as small as possible, so you\u2019re not clogging the pipes,\u201d Coffman told me.<\/p>\n<p>But before the decision is made to open fire, he emphasized, a human being has to look at the sensor feed long enough to confirm the target and give the order to engage.<\/p>\n<p>\u201cThere\u2019s always a human that is looking at the sensor image,\u201d Coffman said. \u201cThen the human decides, \u2018yes, I want to prosecute that target.\u2019\u201d<\/p>\n<p>\u201cCould that be done <a href=\"https:\/\/breakingdefense.com\/2019\/11\/bipartisan-ai-commission-dod-should-consider-truly-autonomous-weapons\/\" target=\"_blank\" rel=\"noopener noreferrer\">automatically, without a human in the loop<\/a>?\u201d he said. \u201cYeah, I think it\u2019s technologically <em>feasible<\/em> to do that. But the United States Army [is] an <a href=\"https:\/\/breakingdefense.com\/2020\/09\/13-nations-meet-on-ethics-for-military-ai\/\" target=\"_blank\" rel=\"noopener noreferrer\">ethics-based organization<\/a>. 
There will be a human in the loop.\u201d<\/p>\n<p><strong>Source:<\/strong> <a href=\"https:\/\/breakingdefense.com\/2020\/09\/army-robots-hunt-tanks-in-project-convergence\/?utm_campaign=Breaking%20Defense%20Land&amp;utm_medium=email&amp;_hsmi=96373537&amp;_hsenc=p2ANqtz-_j3Wwz-8AFr44VnEzk3VJTFkIobzMhnCXM3vUzCep6NTno3LSxiDFVzsX6X3T2savrjEkQs4LC7jv55kdW4-PHECWNIw&amp;utm_content=96373537&amp;utm_source=hs_email\" target=\"_blank\" rel=\"noopener noreferrer\"><em>https:\/\/breakingdefense.com<\/em><\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>As part of the so-called PROJECT CONVERGENCE, in which the US ARMY develops and tests a multitude of autonomous ground and air systems, as well as sensors and&hellip; <\/p>\n","protected":false},"author":1,"featured_media":6631,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":[],"categories":[18,11,2],"tags":[],"_links":{"self":[{"href":"https:\/\/www.fie.undef.edu.ar\/ceptm\/index.php?rest_route=\/wp\/v2\/posts\/6630"}],"collection":[{"href":"https:\/\/www.fie.undef.edu.ar\/ceptm\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.fie.undef.edu.ar\/ceptm\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.fie.undef.edu.ar\/ceptm\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.fie.undef.edu.ar\/ceptm\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=6630"}],"version-history":[{"count":1,"href":"https:\/\/www.fie.undef.edu.ar\/ceptm\/index.php?rest_route=\/wp\/v2\/posts\/6630\/revisions"}],"predecessor-version":[{"id":6636,"href":"https:\/\/www.fie.undef.edu.ar\/ceptm\/index.php?rest_route=\/wp\/v2\/posts\/6630\/revisions\/6636"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.fie.undef.edu.ar\/ceptm\/index.php?rest_route=\/wp\/v2\/media\/6631"}],"wp:attachment":[{"href":"https:\/\/www.fie.undef.edu.ar\/ceptm\/index.p
hp?rest_route=%2Fwp%2Fv2%2Fmedia&parent=6630"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.fie.undef.edu.ar\/ceptm\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=6630"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.fie.undef.edu.ar\/ceptm\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=6630"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}