{"id":720,"date":"2015-12-09T08:17:05","date_gmt":"2015-12-09T11:17:05","guid":{"rendered":"https:\/\/www.nachodelatorre.com.ar\/mosconi\/?p=720"},"modified":"2015-12-09T08:17:05","modified_gmt":"2015-12-09T11:17:05","slug":"mit-drone-flies-autonomously-while-avoiding-obstacle","status":"publish","type":"post","link":"https:\/\/www.fie.undef.edu.ar\/ceptm\/?p=720","title":{"rendered":"MIT Drone Flies Autonomously While Avoiding Obstacle"},"content":{"rendered":"<p>In just about every video featuring drones making aggressive maneuvers around obstacles there\u2019s some amount of \u201ccheating\u201d going on. By that we mean the drones are typically relying on an external\u00a0motion-capture system, as well as beefy offboard computers\u00a0and a rock-solid wireless link. For doing research on\u00a0aggressive maneuvers and other drone capabilities,\u00a0it\u2019s totally fine to \u201ccheat\u201d like that. But at some point you\u2019ll want your\u00a0drones to be able to\u00a0fly anywhere and not just\u00a0inside the controlled environment of a very expensive robotics lab.<!--more--><\/p>\n<p>With that goal in mind\u2014and\u00a0just US $1700 in hardware\u2014<a href=\"https:\/\/github.com\/andybarry\/flight\">MIT PhD student Andrew Barry<\/a>\u00a0has managed to fire a fixed-wing drone at some trees and not hit them, using only two cellphones\u2019 worth of onboard computing hardware and real-time image processing.<\/p>\n<p>Whoa.<\/p>\n<p><iframe loading=\"lazy\" src=\"https:\/\/www.youtube-nocookie.com\/embed\/cZE01bJIgvQ?rel=0&amp;showinfo=0\" width=\"420\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\"><\/iframe><\/p>\n<p>Real-time image processing using hardware small enough, and efficient enough, to be used on a drone has always been a challenge. 
Cameras that have the necessary frame rate and resolution to enable you to clearly see obstacles in the first place pour out a\u00a0humongous\u00a0number of pixels, each one of which needs to be analyzed to determine whether the drone has to worry about it. Barry\u2019s insight was that a fast-flying drone doesn\u2019t have to care about almost everything that it can see: it only needs to be concerned with a relatively small volume at a fixed distance in front of it.<\/p>\n<p>To put this in context, consider what you\u2019re doing when you run on rough ground. If you\u2019re anything like me (which I flatter myself as \u201cnormal\u201d), you\u2019re probably perpetually looking at an area of ground a few meters in front of where you currently are as you run. The area closer to your feet you\u2019ve already seen and analyzed, and you know how to run over it without faceplanting. The area farther away isn\u2019t important because you\u2019re not there yet. All you really need to pay attention to is this narrow range in front of you and you\u2019ll be able to run indefinitely as long as <a href=\"https:\/\/youtu.be\/_d8ROhH3_vs?t=7s\">there are no abrupt environmental changes<\/a>.<\/p>\n<div class=\"imgWrapper rt med-lrg\"><img src=\"http:\/\/spectrum.ieee.org\/img\/Screen%20Shot%202015-11-03%20at%2032247%20PM-1446582198229.png\" alt=\"img\" \/><\/div>\n<p>MIT\u2019s drone flies pretty much exactly like you run. Using stereo filtering from a pair of 376-by-240-pixel resolution, 120-frames-per-second cameras spaced 34 centimeters\u00a0apart, the drone focuses its attention (for robots, this equates to obstacle avoidance algorithms) on pixels that are about 10 meters away and nothing else. It saves these pixels in memory, and the next image (taken 8.3 cm later if the drone is flying at 10 meters per second) adds more pixels beyond the previous set. 
In this way, the drone can very efficiently build up a 3D map of what\u2019s directly in front of it, and take action based on that map. This technique is called \u201cpushbroom stereo detection,\u201d because the detection area is like a three-dimensional broom that\u2019s being pushed forward.<\/p>\n<figure class=\"xlrg\"><img src=\"http:\/\/spectrum.ieee.org\/img\/mit-drone2-1446592899134.png\" alt=\"img\" \/><figcaption class=\"hi-cap\">Image: MIT<\/figcaption><\/figure>\n<p>The drone only remembers pixels for a second or two, so it\u2019s not building a map of the area that it\u2019s flying through (although it certainly could). The obstacle avoidance itself is dynamic, reactive, and computed entirely on the drone, which searches through an existing library of trajectories that it knows to be stable and chooses the best one (you can see this happening in the video). Because the detection horizon for obstacles is so short, the drone might not have enough time to take an effective evasive maneuver if (say) it approaches a building, but for trees and other relatively small and discrete obstacles, it seems like it should be able to continue avoiding things indefinitely. As the researchers point out, the detection horizon is primarily constrained by computer processing power, <a href=\"http:\/\/spectrum.ieee.org\/static\/special-report-50-years-of-moores-law\">so as that improves<\/a>, they\u2019ll be able to scan multiple depths to plan more complex paths around multiple obstacles at varying distances.<\/p>\n<p>This is a combination of related research on both the high-speed maneuvering around obstacles in flight and the pushbroom stereo obstacle detection. 
We\u2019ve covered this research in the past (2012 and 2014, respectively), but here are the videos again, because it\u2019s very cool to see the individual pieces that resulted in this new capability:<\/p>\n<div class=\"fluid-width-video-wrapper\"><iframe loading=\"lazy\" id=\"fitvid1\" src=\"http:\/\/www.youtube.com\/embed\/voN9CCmzxYk\" width=\"300\" height=\"150\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\"><\/iframe><\/div>\n<div class=\"fluid-width-video-wrapper\"><iframe loading=\"lazy\" id=\"fitvid2\" src=\"http:\/\/www.youtube.com\/embed\/cZE01bJIgvQ\" width=\"300\" height=\"150\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\"><\/iframe><\/div>\n<p>Incidentally, this is exactly what makes us think that <a href=\"http:\/\/spectrum.ieee.org\/automaton\/robotics\/aerial-robots\/amazon-prime-air-package-drone-delivery\">delivery drones aren\u2019t<\/a> <a href=\"http:\/\/spectrum.ieee.org\/automaton\/robotics\/aerial-robots\/walmart-delivery-drones\">going to be ready for a while<\/a>: getting stuff like this to work properly takes lots of incremental steps, and most of them aren\u2019t easy. It\u2019s possible that companies are throwing enough people and money at the problem to make substantial progress, but we\u2019re just not seeing anything like this from the people who promise drone delivery.<\/p>\n<p>So it\u2019s exciting to see real progress from MIT, and we\u2019d love to be wrong about the near future of drone delivery. If that\u2019s going to happen, it\u2019s going to take more research like this.<\/p>\n<p><strong>Source:<\/strong> <em><a href=\"http:\/\/spectrum.ieee.org\/automaton\/robotics\/aerial-robots\/mit-drone-avoids-obstacles\" target=\"_blank\" rel=\"noopener noreferrer\">http:\/\/spectrum.ieee.org<\/a><\/em><\/p>\n","protected":false},"excerpt":{"rendered":"<p>In just about every video featuring drones making aggressive maneuvers around obstacles there\u2019s some amount of \u201ccheating\u201d going on. 
By that we mean the drones&hellip; <\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":[],"categories":[2,29],"tags":[],"_links":{"self":[{"href":"https:\/\/www.fie.undef.edu.ar\/ceptm\/index.php?rest_route=\/wp\/v2\/posts\/720"}],"collection":[{"href":"https:\/\/www.fie.undef.edu.ar\/ceptm\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.fie.undef.edu.ar\/ceptm\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.fie.undef.edu.ar\/ceptm\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.fie.undef.edu.ar\/ceptm\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=720"}],"version-history":[{"count":0,"href":"https:\/\/www.fie.undef.edu.ar\/ceptm\/index.php?rest_route=\/wp\/v2\/posts\/720\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.fie.undef.edu.ar\/ceptm\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=720"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.fie.undef.edu.ar\/ceptm\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=720"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.fie.undef.edu.ar\/ceptm\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=720"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}