{"id":478,"date":"2015-09-22T12:05:32","date_gmt":"2015-09-22T15:05:32","guid":{"rendered":"https:\/\/www.nachodelatorre.com.ar\/mosconi\/?p=478"},"modified":"2015-09-22T12:05:32","modified_gmt":"2015-09-22T15:05:32","slug":"multifab-3-d-prints-a-record-10-materials-at-once-no-assembly-required","status":"publish","type":"post","link":"https:\/\/www.fie.undef.edu.ar\/ceptm\/?p=478","title":{"rendered":"MultiFab\u201d 3-D prints a record 10 materials at once, no assembly required"},"content":{"rendered":"<blockquote><p>Printer from Computer Science and Artificial Intelligence Lab uses machine vision and 3-D scanning to self-correct and directly embed components.<\/p><\/blockquote>\n<p>3-D printing is great, assuming you&#8217;re printing one material for one purpose, and that you\u2019re fine with a few do-overs. But the technology is still far behind in reliably producing a variety of useful objects, with no assembly required, at a moderate cost.<!--more--><\/p>\n<p>In recent years, companies have been working to tackle some of these challenges with \u201cmulti-material\u201d 3-D printers that can fabricate many different functional items. Such printers, however, have traditionally been limited to three materials at a time, can cost as much as $250,000 each, and still require a fair amount of human intervention.<\/p>\n<p>But this week, researchers at MIT\u2019s Computer Science and Artificial Intelligence Laboratory (CSAIL) say that they\u2019ve found a way to make a better, cheaper, more user-friendly printer. 
In a paper accepted at the SIGGRAPH computer-graphics conference, a CSAIL team presented a 3-D printer that can print an unprecedented 10 different materials at once by using 3-D-scanning techniques that save time, energy, and money.<\/p>\n<p><iframe loading=\"lazy\" src=\"https:\/\/www.youtube-nocookie.com\/embed\/poRFPjiB9vw?rel=0&amp;showinfo=0\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\"><\/iframe><\/p>\n<p>Delivering resolution at 40 microns \u2014 or less than half the width of a human hair \u2014 the \u201cMultiFab\u201d system is the first 3-D printer to use 3-D-scanning techniques from machine vision, which offers two key advantages in accuracy and convenience over traditional 3-D printing.<\/p>\n<p>First, MultiFab can self-calibrate and self-correct, freeing users from having to do the fine-tuning themselves. For each layer of the design, the system\u2019s feedback loop 3-D scans the print, detects errors, and then generates so-called \u201ccorrection masks.\u201d This approach allows the use of inexpensive hardware while ensuring print accuracy.<\/p>\n<p>Second, MultiFab gives users the ability to embed complex components, such as circuits and sensors, directly onto the body of an object, meaning that it can produce a finished product, moving parts and all, in one fell swoop.<\/p>\n<p>&#8220;The platform opens up new possibilities for manufacturing, giving researchers and hobbyists alike the power to create objects that have previously been difficult or even impossible to print,&#8221; says Javier Ramos, a research engineer at CSAIL who co-authored the paper with members of professor Wojciech Matusik\u2019s <a href=\"http:\/\/cfg.mit.edu\/\" target=\"_blank\" rel=\"noopener noreferrer\">Computational Fabrication Group<\/a>.<\/p>\n<p>The researchers have used MultiFab to print everything from smartphone cases to light-emitting diode lenses \u2014 and they envision an array of applications in consumer electronics, 
microsensing, medical imaging, and telecommunications, among other things. They also plan to experiment with embedding motors and actuators that would make it possible to 3-D print more advanced electronics, including robots.<\/p>\n<p>MultiFab was built using low-cost, off-the-shelf components that cost around $7,000 total. Besides Ramos and Matusik, the team includes lead author and former CSAIL postdoc Pitchaya Sitthi-Amorn, former graduate students Joyce Kwan and Justin Lan, research scientist Wenshou Wang, and graduate student Yu Wang of Tsinghua University.<\/p>\n<p><strong>Why multi-material printing is hard<\/strong><\/p>\n<p>There are many technical challenges to creating a printer like MultiFab: Different materials require different pressures and temperatures, so printing something complex usually involves printing all individual pieces separately, and then assembling them by hand.<\/p>\n<p>But with MultiFab, you simply put the components into the platform and the printer does the rest. Cameras automatically scan the components&#8217; three-dimensional geometries and use that information to print other objects around them. For example, you can put an iPhone into the printer, and program the system to print a case that is directly affixed onto the phone.<\/p>\n<p>Other multi-material printers work via \u201cextrusion\u201d technologies, using nozzles that squirt out melted material, which then hardens, to build an object layer by layer. Such techniques, while sufficient for certain uses, often lead to low-resolution finished items.<\/p>\n<p>MultiFab, on the other hand, mixes together microscopic droplets of photopolymers, which are then sent through inkjet printheads similar to those in office printers. 
The computationally intensive process, which involves crunching dozens of gigabytes of visual data, can be much more easily scaled to larger objects and multiple materials.<\/p>\n<p><strong>What\u2019s next?<\/strong><\/p>\n<p>Ramos says that he could imagine printers like MultiFab being used by researchers, manufacturers, and consumers.<\/p>\n<p>Companies could edit and finalize designs faster, allowing them to bring products to market sooner. Big-box stores that have already installed single-material 3-D printers could graduate to multi-material ones, for use by casual hobbyists and small business owners alike.<\/p>\n<p>\u201cPicture someone who sells electric wine openers, but doesn\u2019t have $7,000 to buy a printer like this. In the future, they could walk into a FedEx with a design and print out batches of their finished product at a reasonable price,\u201d Ramos says. \u201cFor me, a practical use like that would be the ultimate dream.\u201d<\/p>\n<p>The work was supported in part by a grant from MIT&#8217;s Deshpande Center for Technological Innovation, the Defense Advanced Research Projects Agency, and the National Science Foundation.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Printer from Computer Science and Artificial Intelligence Lab uses machine vision and 3-D scanning to self-correct and directly embed components. 
3-D printing is great, assuming&hellip; <\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":[],"categories":[2,23,29],"tags":[],"_links":{"self":[{"href":"https:\/\/www.fie.undef.edu.ar\/ceptm\/index.php?rest_route=\/wp\/v2\/posts\/478"}],"collection":[{"href":"https:\/\/www.fie.undef.edu.ar\/ceptm\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.fie.undef.edu.ar\/ceptm\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.fie.undef.edu.ar\/ceptm\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.fie.undef.edu.ar\/ceptm\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=478"}],"version-history":[{"count":0,"href":"https:\/\/www.fie.undef.edu.ar\/ceptm\/index.php?rest_route=\/wp\/v2\/posts\/478\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.fie.undef.edu.ar\/ceptm\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=478"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.fie.undef.edu.ar\/ceptm\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=478"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.fie.undef.edu.ar\/ceptm\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=478"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}