{"id":13599,"date":"2023-11-17T08:29:20","date_gmt":"2023-11-17T11:29:20","guid":{"rendered":"https:\/\/www.fie.undef.edu.ar\/ceptm\/?p=13599"},"modified":"2023-11-17T08:29:20","modified_gmt":"2023-11-17T11:29:20","slug":"la-inteligencia-artificial-ya-esta-fusionando-con-la-robotica-el-resultado-podrian-ser-armas-mas-poderosas","status":"publish","type":"post","link":"https:\/\/www.fie.undef.edu.ar\/ceptm\/?p=13599","title":{"rendered":"Artificial intelligence is already being melded with robotics, and the result could be more powerful weapons"},"content":{"rendered":"<p>Interest in incorporating robots into the security and military spheres has grown strongly in recent years. It is a sector being explored in many countries, and several companies are developing dedicated programmes. Products that operate lethal systems are being evaluated, and in many cases questioned, when they possess \u201cfull autonomy\u201d. Moreover, the fusion of robots with Artificial Intelligence (AI), enabling systems to carry out attack missions or manage the threat-neutralisation cycle, individually or in teams, without human intervention, is a capability that many military forces covet for the advantage it offers in minimising their own casualties. However, it is also a cause for concern, given the risks and threats to security that improper and indiscriminate use of these means could bring.<\/p>\n<hr \/>\n<p>Interest in the incorporation of robots into security, policing and military operations has been steadily increasing over the last few years. 
It\u2019s an avenue already being explored in both\u00a0<a href=\"https:\/\/edition.cnn.com\/2022\/02\/19\/us\/robot-dogs-us-mexico-border-patrol-cec\/index.html\" target=\"_blank\" rel=\"noopener\">North America<\/a>\u00a0and\u00a0<a href=\"https:\/\/www.bbc.co.uk\/news\/uk-england-bristol-60145220\">Europe<\/a>.<\/p>\n<p>Robot integration into these areas could be seen as analogous to the inclusion of dogs in policing and military roles in the 20th century. Dogs have served as guards, sentries, message carriers and mine detectors, among other roles.<\/p>\n<p>Utility robots, designed to play a support role to humans, are mimicking our four-legged companions\u00a0<a href=\"https:\/\/www.cnbc.com\/2021\/12\/26\/robotic-dogs-taking-on-jobs-in-security-inspection-and-public-safety-.html\" target=\"_blank\" rel=\"noopener\">not only in form<\/a>, but in function as well. Mounted with surveillance technology and able to ferry equipment, ammunition and more as part of resupply chains, they could significantly minimise the risk of harm to human soldiers on the battlefield.<\/p>\n<p>However, utility robots would undoubtedly take on a different dimension if weapons systems were added to them. Essentially, they would become land-based variants of the\u00a0<a href=\"https:\/\/www.af.mil\/About-Us\/Fact-Sheets\/Display\/Article\/104470\/mq-9-reaper\/\" target=\"_blank\" rel=\"noopener\">MQ-9 Reaper drone aircraft<\/a>\u00a0currently in use by the US military.<\/p>\n<p>In 2021, the company Ghost Robotics\u00a0<a href=\"https:\/\/www.thedrive.com\/the-war-zone\/42717\/robot-dogs-can-now-have-6-5mm-assault-rifles-%20mounted-on-their-backs\" target=\"_blank\" rel=\"noopener\">showcased one of its four-legged robots, called Q-UGV<\/a>, that had been armed with a\u00a0<a href=\"https:\/\/www.thedrive.com\/the-war-zone\/42717\/robot-dogs-can-now-have-6-5mm-assault-rifles-%20mounted-on-their-backs\" target=\"_blank\" rel=\"noopener\">Special Purpose Unmanned Rifle 4<\/a>. 
The showcase event leaned into the weaponisation of utility robots.<\/p>\n<p>It is important to note that each aspect of this melding of weaponry and robotics operates in a different way. Although the robot itself is semi-autonomous and can be controlled remotely, the mounted weapon has no autonomous capability and is fully controlled by an operator.<\/p>\n<p>In September 2023, US Marines\u00a0<a href=\"https:\/\/www.thedrive.com\/the-war-zone\/42717\/robot-dogs-can-now-have-6-5mm-assault-rifles-%20mounted-on-their-backs\" target=\"_blank\" rel=\"noopener\">conducted a proof of concept test<\/a>\u00a0involving another four-legged utility robot. They measured\u00a0<a href=\"https:\/\/www.marines.mil\/Photos\/igphoto\/2003324126\/\" target=\"_blank\" rel=\"noopener\">its ability to<\/a>\u00a0\u201cacquire and prosecute targets with a M72 Light Anti-Tank Weapon\u201d.<\/p>\n<p>The test reignited the ethics debate about the use of automated and semi-automated weapon systems in warfare. It would not be such a big step for either of these platforms to incorporate AI-driven threat detection and the capability to \u201clock on\u201d to targets. In fact, sighting systems of this nature\u00a0<a href=\"https:\/\/www.thedrive.com\/the-war-zone\/42717\/robot-dogs-can-now-have-6-5mm-assault-rifles-mounted-on-their-backs\" target=\"_blank\" rel=\"noopener\">are already available on the open market<\/a>.<\/p>\n<p><iframe loading=\"lazy\" title=\"U.S. 
Marines test fire the M72 LAW with a Robotic Goat #roboticgoat #Unitree\" src=\"https:\/\/www.youtube.com\/embed\/hdyIB_bLeKA\" width=\"754\" height=\"424\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\"><\/iframe><\/p>\n<p>In 2022, a\u00a0<a href=\"https:\/\/bostondynamics.com\/news\/general-purpose-robots-should-not-be-weaponized\/\" target=\"_blank\" rel=\"noopener\">half-dozen leading robotics companies<\/a>\u00a0signed an open letter hosted on the website of Boston Dynamics, which created a dog-like utility robot called Spot. In the letter, the companies came out against the weaponisation of commercially available robots.<\/p>\n<p>However, the letter also said the companies did not take issue \u201cwith existing technologies that nations and their government agencies use to defend themselves and uphold their laws\u201d. On that point, it\u2019s worth considering whether the horse has already bolted with regards to the weaponisation of AI. Weapons systems with intelligent technology integrated into robotics\u00a0<a href=\"https:\/\/www.newscientist.com\/article\/2282656-israel-used-worlds-first-ai-guided-combat-drone-swarm-in-gaza-attacks\/\" target=\"_blank\" rel=\"noopener\">are already being used in combat<\/a>.<\/p>\n<p>This month, Boston Dynamics publicised a video showing how the company had added the AI chatbot ChatGPT\u00a0<a href=\"https:\/\/www.youtube.com\/watch?v=djzOBZUFzTw&amp;t=3s\" target=\"_blank\" rel=\"noopener\">to its Spot robot<\/a>. The machine can be seen responding to questions and conversation from one of the company\u2019s engineers using several different \u201cpersonalities\u201d, such as an English butler. 
The responses come from the AI chatbot, but Spot mouths the words.<\/p>\n<p><iframe loading=\"lazy\" title=\"Making Chat (ro)Bots\" src=\"https:\/\/www.youtube.com\/embed\/djzOBZUFzTw\" width=\"754\" height=\"424\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\"><\/iframe><\/p>\n<p>It\u2019s a fascinating step for the industry and, potentially, a positive one. But while Boston Dynamics may be maintaining its pledge not to weaponise its robots, other companies may not feel the same way. There\u2019s also the potential for misuse of such robots by people or institutions that lack a moral compass. As the open letter hints: \u201cWhen possible, we will carefully review our customers\u2019 intended applications to avoid potential weaponisation.\u201d<\/p>\n<p><strong>UK stance<\/strong><\/p>\n<p>The UK has already taken a stance on the weaponisation of AI with its\u00a0<a href=\"https:\/\/assets.publishing.service.gov.uk\/media\/614db4d1e90e077a2cbdf3c4\/National_AI_Strategy_-_PDF_version.pdf\" target=\"_blank\" rel=\"noopener\">Defence Artificial Intelligence Strategy<\/a>, published in 2022. The document expresses the intent to rapidly integrate artificial intelligence into Ministry of Defence systems to strengthen security and modernise the armed forces.<\/p>\n<p>Notably, however,\u00a0<a href=\"https:\/\/www.gov.uk\/government\/publications\/ambitious-safe-responsible-our-approach-to-the-delivery-of-ai-enabled-capability-in-defence\/ambitious-safe-responsible-our-approach-to-the-delivery-of-ai-enabled-capability-in-defence#annex-b-the-ministry-of-defence-ai-ethics-advisory-panel\" target=\"_blank\" rel=\"noopener\">an annex to the strategy document<\/a>\u00a0specifically recognises the potential challenges associated with lethal autonomous weapons systems.<\/p>\n<p>For example, real world data is used to \u201ctrain\u201d AI systems, or improve them. With ChatGPT, this is gathered from the internet. 
While it helps AI systems become more useful, all that \u201creal world\u201d information can also pass on flawed assumptions and prejudices to the system itself. This can lead to algorithmic bias (where the AI favours one group or option over another) or inappropriate and disproportionate responses by the AI. As such, sample training data for weapons systems needs to be carefully scrutinised with ethical warfare in mind.<\/p>\n<p>This year, the House of Lords\u00a0<a href=\"https:\/\/committees.parliament.uk\/committee\/646\/ai-in-weapon-systems-committee\/\" target=\"_blank\" rel=\"noopener\">established an AI in Weapon Systems select committee<\/a>. Its brief is to see how armed forces can reap the benefits of technological advances, while minimising the risks through the implementation of technical, legal and ethical safeguards. The sufficiency of UK policy and international policymaking is also being examined.<\/p>\n<p>Robot dogs aren\u2019t aiming weapons at opposing forces just yet. But all the elements are there for this scenario to become a reality, if left unchecked. The fast pace of development in both AI and robotics is creating a perfect storm that could lead to powerful new weapons.<\/p>\n<p>The\u00a0<a href=\"https:\/\/www.theguardian.com\/technology\/2023\/nov\/02\/five-takeaways-uk-ai-safety-summit-bletchley-park-rishi-sunak\" target=\"_blank\" rel=\"noopener\">recent AI safety summit in Bletchley Park<\/a>\u00a0had a positive outcome for AI regulation, both in the UK and internationally. However, there were signs of a philosophical split between the summit goals and those of the AI in Weapon Systems committee.<\/p>\n<p>The summit was geared towards defining AI, assessing its capabilities and limitations and creating a global consensus with regard to its ethical use. It sought to do so via a declaration, very much like the Boston Dynamics open letter. Neither, however, is binding. 
The committee seeks to clearly and rapidly integrate the technology, albeit in accordance with ethics, regulations and international law.<\/p>\n<p>Frequent\u00a0<a href=\"https:\/\/www.theguardian.com\/technology\/2023\/may\/18\/uk-will-lead-on-guard-rails-to-limit-dangers-of-ai-says-rishi-sunak\" target=\"_blank\" rel=\"noopener\">use of the term<\/a>\u00a0\u201c<a href=\"https:\/\/www.datacenterdynamics.com\/en\/news\/global-ai-safety-summit-kicks-off-in-uk-with-bletchley-declaration\/\" target=\"_blank\" rel=\"noopener\">guard rails<\/a>\u201d in relation to the Bletchley summit and\u00a0<a href=\"https:\/\/www.gov.uk\/government\/publications\/ai-safety-summit-2023-the-bletchley-declaration\/the-bletchley-declaration-by-countries-attending-the-ai-safety-summit-1-2-november-2023\" target=\"_blank\" rel=\"noopener\">declaration<\/a>\u00a0suggests voluntary commitments. And UK prime minister Rishi Sunak has stated that countries should\u00a0<a href=\"https:\/\/www.theguardian.com\/technology\/2023\/nov\/03\/rishi-sunak-elon-musk-ai-summit-what-we-learned\" target=\"_blank\" rel=\"noopener\">not rush to regulate<\/a>.<\/p>\n<p>The nobility of such statements wanes in consideration of the enthusiasm in some quarters for integrating the technology into weapons platforms.<\/p>\n<p><strong>Source:<\/strong> <a href=\"https:\/\/theconversation.com\/ai-is-already-being-melded-with-robotics-one-outcome-could-be-powerful-new-weapons-216576\" target=\"_blank\" rel=\"noopener\"><em>https:\/\/theconversation.com<\/em><\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Interest in incorporating robots into the security and military spheres has grown strongly in recent years. 
It is&hellip; <\/p>\n","protected":false},"author":1,"featured_media":13600,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":[],"categories":[18,2,23],"tags":[],"_links":{"self":[{"href":"https:\/\/www.fie.undef.edu.ar\/ceptm\/index.php?rest_route=\/wp\/v2\/posts\/13599"}],"collection":[{"href":"https:\/\/www.fie.undef.edu.ar\/ceptm\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.fie.undef.edu.ar\/ceptm\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.fie.undef.edu.ar\/ceptm\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.fie.undef.edu.ar\/ceptm\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=13599"}],"version-history":[{"count":1,"href":"https:\/\/www.fie.undef.edu.ar\/ceptm\/index.php?rest_route=\/wp\/v2\/posts\/13599\/revisions"}],"predecessor-version":[{"id":13601,"href":"https:\/\/www.fie.undef.edu.ar\/ceptm\/index.php?rest_route=\/wp\/v2\/posts\/13599\/revisions\/13601"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.fie.undef.edu.ar\/ceptm\/index.php?rest_route=\/wp\/v2\/media\/13600"}],"wp:attachment":[{"href":"https:\/\/www.fie.undef.edu.ar\/ceptm\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=13599"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.fie.undef.edu.ar\/ceptm\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=13599"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.fie.undef.edu.ar\/ceptm\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=13599"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}