There is a growing trend, fueled by several companies, that I have yet to take part in but certainly look forward to trying sometime in the future: the 3D printing of 'mini-mes.' Several companies scan individuals and then print out miniature versions of them (in rare cases, full-size versions), usually in a sandstone material. Recently, Amazon even got in on a version of this service by teaming with Mixee Labs to let customers personalize bobblehead dolls to their likeness. Although no scanning was involved in Amazon's case, the move has brought the technology further into the mainstream.
Many of these printed 'mini-mes' are reasonably accurate, but there is one feature that scanners and printers typically struggle to capture: hair. That goes not only for the individual strands sticking up in varying directions, but for the overall patterns of the hair on a person's head.
Disney researchers, including Derek Bradley, associate research scientist at Disney Research Zurich; Dr. Thabo Beeler, also of Disney Research Zurich; and Jose I. Echevarria, a Ph.D. student who interned at the Disney lab, and Dr. Diego Gutierrez, both of the University of Zaragoza, Spain, have developed a method to deal with this common issue, enabling them to capture stylized hair within a 3D scan and reproduce that style in a printed figurine. The key to their approach is "a novel multi-view stylization algorithm, which extends feature-preserving color filtering from 2D images to irregular manifolds in 3D, and introduces abstract geometric details that are coherent with the color stylization," states the research paper.
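Feature-preserving color filtering is, in essence, a bilateral filter: each sample is averaged with its neighbors, weighted by both spatial closeness and color similarity, so smooth regions flatten out while sharp color boundaries survive. The sketch below transplants that generic 2D-image idea to an irregular vertex neighborhood, roughly the setting the paper describes. It is only an illustration of this class of filter, not Disney's actual algorithm, and all function and parameter names here are my own.

```python
import numpy as np

def bilateral_vertex_filter(positions, colors, neighbors, sigma_s=0.05, sigma_c=0.1):
    """Feature-preserving color filter on an irregular manifold (sketch).

    Each vertex color is replaced by a weighted average of its own and its
    neighbors' colors, with weights falling off with both spatial distance
    (sigma_s) and color difference (sigma_c). Large color jumps get near-zero
    weight, so edges between tonal regions are preserved rather than blurred.
    """
    out = np.empty_like(colors)
    for i, nbrs in enumerate(neighbors):
        idx = np.array([i] + list(nbrs))  # include the vertex itself
        d_s = np.linalg.norm(positions[idx] - positions[i], axis=1)
        d_c = np.linalg.norm(colors[idx] - colors[i], axis=1)
        w = np.exp(-(d_s / sigma_s) ** 2) * np.exp(-(d_c / sigma_c) ** 2)
        out[i] = (w[:, None] * colors[idx]).sum(axis=0) / w.sum()
    return out
```

On hair data, repeated passes of a filter like this would flatten fiber-level color noise into the broad tonal regions a sandstone figurine can actually reproduce, which is the intuition behind the stylization step.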
The researchers argue that a person's hairstyle is a defining characteristic, almost as important as the face itself. Yet until now, no scanning method has been able to capture the flow, colors, and patterns of complicated hairstyles, usually leaving printed figurines without these defining features.
Inspired by artistic sculptures of hair, both in the real world and in CG models, their method takes a vastly complicated feature of a person, the hairstyle, and simplifies it while still capturing its main defining features. The process starts with data collection: multiple SLR cameras photograph the person's head. A multi-view stereo reconstruction algorithm then builds a coarse proxy surface of the hair from these images. That proxy surface is used to extract key details about the color and shape of the person's hair. It doesn't capture characteristics fiber by fiber; instead, it captures the general color and a simplified version of the texture and flow, which is easily printable.
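The "coarse proxy" idea, replacing dense fiber-level scan data with a simplified surface, can be illustrated with a generic voxel-grid downsampling of a point cloud. This is a stand-in for intuition only: the paper's reconstruction uses multi-view stereo, not this routine, and the names below are invented for the sketch.

```python
import numpy as np

def voxel_downsample(points, voxel=0.01):
    """Collapse a dense point cloud to one averaged point per voxel.

    A generic simplification step: fiber-scale detail finer than the voxel
    size is discarded, while the overall shape of the cloud survives --
    the same trade-off a coarse proxy surface makes for printable hair.
    """
    keys = np.floor(points / voxel).astype(np.int64)   # voxel index per point
    _, inv = np.unique(keys, axis=0, return_inverse=True)
    n = inv.max() + 1
    sums = np.zeros((n, points.shape[1]))
    counts = np.zeros(n)
    np.add.at(sums, inv, points)                       # accumulate per voxel
    np.add.at(counts, inv, 1)
    return sums / counts[:, None]                      # centroid of each voxel
```

Choosing the voxel (or proxy-resolution) size is the knob that trades fidelity against printability: too fine and the print inherits unprintable noise, too coarse and the hairstyle's defining shapes wash out.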
As you can see from the images within this article, the researchers have very successfully mimicked the various hairstyles of individuals, as well as facial hair and even fur on animals or clothing. They also scanned and printed two individuals with four different hairstyles each, showing that the hairstyle remains a defining feature of each individual, in the photos as well as in the printed figurines.
The researchers conclude that their "method generates hair as a closed-manifold surface, yet contains the structural and color elements stylized in a way that captures the defining characteristics of the hairstyle."
Certainly this research could go a long way toward the commercialization of 3D printing, as well as other applications. Let us hear your thoughts on these new methods in the 3D printed hair texture forum thread on 3DPB.com, and check out the video provided by Disney Research, which further demonstrates the capabilities of this technology.
[Source: Disney Research]