What happens when you want to take a tangible object and replicate it through 3D printing? If you are like most people, you will either try to find a CAD file for that object somewhere on the internet, a search that will probably leave you empty-handed, or you will turn to the most common method currently available: 3D scanning. 3D scanners, however, are quite pricey, and unless you are using a really high-quality device, you will probably need to do quite a bit of cleanup on the virtual version of the object once it has been scanned into your computer.
Researchers at the Centro de Investigaciones en Óptica (CIO) in Mexico appear to have come up with a better solution, one that could be quite a bit more affordable, yet even more reliable, than current 3D scanning technology.
The process, referred to as “Co-phased 360-degree profilometry,” involves five things: a camera, two light projectors, the subject (positioned so that it can be rotated a full 360 degrees), and a computer. It builds on a technique outlined in a classic research paper by Takeda et al. in 1982, which produces 3D models of quasi-cylindrical solids but fails to create reliable models of non-cylindrical or more detailed objects. The Takeda et al. method uses a single light projector to shine a pattern of parallel lines onto an object. A camera then captures images of the object with those lines bent over its surface, and a computer interprets the bends in order to reconstruct a 3D model. That simply isn’t enough for most objects, though, since many feature details that the single-projector method is not capable of modeling.
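For readers curious about the underlying idea, here is a minimal sketch, in Python, of the kind of single-projector fringe analysis described in the Takeda et al. paper: the photographed fringe pattern is band-pass filtered around its carrier frequency in the Fourier domain, and the recovered phase encodes how the surface bends the lines. The function name, its parameters, and the NumPy-based implementation are illustrative assumptions, not code from either paper.

```python
import numpy as np

def wrapped_phase_ftp(fringe_image, carrier_freq, bandwidth):
    """Recover the wrapped phase of a deformed fringe pattern via
    Fourier-transform fringe analysis (single-projector, Takeda-style).

    fringe_image : 2D array, roughly a + b*cos(2*pi*carrier_freq*x + phi(x, y))
    carrier_freq : fringe carrier frequency in cycles per pixel (assumed known)
    bandwidth    : half-width of the band-pass filter around the carrier
    """
    rows, cols = fringe_image.shape

    # FFT along the direction in which the projected lines repeat (x / columns)
    spectrum = np.fft.fft(fringe_image, axis=1)
    freqs = np.fft.fftfreq(cols)

    # Keep only the +carrier lobe of the spectrum, which carries the phase phi
    mask = (np.abs(freqs - carrier_freq) < bandwidth).astype(float)
    filtered = spectrum * mask[np.newaxis, :]

    # Back to the spatial domain: a complex signal whose angle is carrier + phi
    analytic = np.fft.ifft(filtered, axis=1)

    # Remove the carrier and return the phase wrapped to (-pi, pi]
    x = np.arange(cols)
    carrier = np.exp(1j * 2 * np.pi * carrier_freq * x)
    return np.angle(analytic * np.conj(carrier)[np.newaxis, :])
```

The wrapped phase map is what a later step would unwrap and convert into height, which is where a single projector runs into trouble on detailed or self-shadowing objects.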
The new method, developed by researchers Manuel Servin, Guillermo Garnica, and J. M. Padilla, adds a second projector, so that one projector sits at an identical angle on each side of the camera. Photos are taken with the camera, first with one projector beaming its light pattern onto the object and then with the second doing the same from the opposite side. The object is then rotated slightly and the exact same process is repeated, continuing until enough photographs have been taken to model the object completely (a sketch of the capture loop follows below). It all rests on a set of equations that I can’t even begin to understand.
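To make that capture procedure concrete, here is a hypothetical sketch of the acquisition loop. The hardware callbacks (capture_frame, set_projector, rotate_turntable) and the 36-step count are assumptions made for illustration; they are not details from the paper.

```python
def capture_sequence(capture_frame, set_projector, rotate_turntable, num_steps=36):
    """Hypothetical acquisition loop for the two-projector, turntable setup
    described above: at every turntable position, one photo is taken under the
    left projector and one under the right, then the object is rotated by a
    small fixed angle until a full 360 degrees has been covered.

    capture_frame()        -> 2D image array                (camera, assumed)
    set_projector(side)    -> None, side in {'left', 'right'}  (assumed)
    rotate_turntable(deg)  -> None                          (turntable, assumed)
    """
    pairs = []
    step_deg = 360.0 / num_steps
    for _ in range(num_steps):
        set_projector('left')
        left_frame = capture_frame()
        set_projector('right')
        right_frame = capture_frame()
        pairs.append((left_frame, right_frame))
        rotate_turntable(step_deg)
    return pairs
```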
The reason this method is so successful at modeling a three-dimensional object is similar to the reason 3D glasses trick our brains into thinking we are seeing a three-dimensional object when in fact we are not. In 3D photography, two camera angles are used, one representing each of our eyes. When the footage is combined into a movie or photograph, each eye sees only one of the two images, because the 3D glasses cancel the other one out. Each eye receives a slightly different view, one from the right and one from the left, tricking the brain into believing it is actually seeing the scene in three dimensions.
In the case of this 3D modeling experiment, we aren’t exactly tricking our brains, but we are providing the computer with detailed data about how the projected light bends as seen from both sides of the camera. Like our brains wearing 3D glasses, the computer can work out the three-dimensional characteristics of the object from the two views.
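As a very rough illustration of how the two projectors’ measurements might be merged, the sketch below sums the demodulated complex fringe signals from each side, so whichever projector actually illuminates a given point dominates the result there. This is an assumption-laden simplification of the general idea, not the authors’ published algorithm.

```python
import numpy as np

def merge_two_projector_phases(analytic_left, analytic_right):
    """Illustrative merge of the demodulated (complex) fringe signals obtained
    under the left and right projectors.

    Each complex signal's magnitude tracks the fringe contrast (low where that
    projector's light is shadowed by the object) and its angle is the wrapped
    phase. Summing the two lets the well-lit side dominate wherever the other
    side is in shadow, giving one usable phase map over more of the surface.
    """
    combined = analytic_left + analytic_right
    wrapped_phase = np.angle(combined)   # phase, still wrapped to (-pi, pi]
    reliability = np.abs(combined)       # rough per-pixel confidence
    return wrapped_phase, reliability
```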
In the research paper, the team was able to successfully model a rather complex plastic skull using this method.
This could go a long way toward creating affordable ways to fabricate 3D printable models of tangible objects, rather than spending several thousand dollars on a 3D scanner that probably won’t deliver results as good as this new method promises. With the right computer software, it seems a machine capable of what these researchers have demonstrated could be built by the DIY community at quite an affordable price. It will be interesting to see how this evolves over the next few years.
My question is, “How long until everyone has access to technology like this, allowing them to create virtual versions of just about anything that exists in the tangible world?” That would mean being able to view and print 3D models of virtually anything on the planet at the touch of a button.
Will this new method make 3D scanners obsolete anytime soon? Most definitely not, but it does provide yet another option for us to consider. What do you think of this new 3D modeling technique? Discuss in the Co-phased 360-degree profilometry forum thread on 3DPB.com.