
3D Printing, AI & the Future of Traceability: University of Illinois’s Bill King on 3D Printer “Fingerprints”


One of the most powerful aspects of academic research is its propensity to yield discoveries that weren’t even on the researchers’ radar at the start of an investigation. Such circumstances led a University of Illinois (UI) research team to the realization that 3D printed parts carry signatures of the machines that produced them, and that those signatures are detectable by AI.

Led by UI professor of mechanical science and engineering Bill King, the team recently published the results of their study in an npj Advanced Manufacturing paper, “Additive manufacturing source identification from photographs using deep learning.” King was as surprised as anyone else to see what the project demonstrated.

“When I first saw it, I didn’t believe it. We were looking to do something else,” King told me. “This was just a little side exploration, but it felt like serendipity. Once we saw it, we were like, this could be a big deal. I really wanted it to work! So I told the team that if we were going to do it, we had to do it the right way — we had to be 100 percent sure — because people are, rightly, going to be really skeptical.

“That’s why we designed the very elaborate study that we conducted: 9,000 parts, different suppliers, with some cooperating with us and some unaware of what we were doing, different machines, different processes, different materials and part designs. We thought of everything we could throw at it, and built the model to work for everything.”

AI-detected “fingerprints” in 3D printed parts: Four 3D printed parts made on four different printers. A deep learning model can determine the source machine of each part (Scale bar is 5 mm).

In total, the team used 21 different machines representing four unique AM processes:

  • Digital light synthesis (DLS), using Carbon printers;
  • Multi jet fusion (MJF), using HP printers;
  • Stereolithography (SLA), using the Formlabs ecosystem;
  • Fused deposition modeling (FDM), using printers made by Stratasys.

The team sourced a total of 9,192 parts, printed in six distinct materials across three separate designs. Of these, 2,100 parts were used to train the software, and 1,050 were ultimately tested in the experiment.

The study’s overwhelming success is encapsulated in one number: 98.5 percent, the accuracy the AI model achieved in tracing parts back to specific printers. Additionally, for just over half of the printers used (12 of the 21), the model identified parts without making a single error.

Researchers trained an AI model to match small sections of 3D printed parts to the printer, process, and material used to make them.
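The article doesn’t spell out the paper’s exact architecture or training recipe, but the workflow King describes, classifying photographs of part surfaces by the machine that made them, maps onto a standard transfer-learning setup. Below is a minimal sketch, assuming part photos organized into folders by printer ID and a pretrained ResNet-18 backbone; the directory layout, model choice, and hyperparameters are illustrative assumptions, not details from the study.

```python
# Hedged sketch: fine-tune a pretrained CNN to predict the source printer
# from photographs of 3D printed parts. The data layout, backbone, and
# hyperparameters are assumptions for illustration; the published study's
# pipeline may differ.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

NUM_PRINTERS = 21  # the study covered 21 machines across four AM processes

# Standard ImageNet-style preprocessing for the part photographs.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Hypothetical layout: parts_by_printer/train/<printer_id>/<photo>.jpg
train_set = datasets.ImageFolder("parts_by_printer/train", transform=preprocess)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)

# Start from a pretrained backbone and replace the classification head.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, NUM_PRINTERS)

device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):
    for images, labels in train_loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```

Accuracy figures like the study’s 98.5 percent would then be measured on held-out parts the model never saw during training.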

Notably, these weren’t parts made “in the lab.” The team worked directly with Chicago-based service bureau SyBridge Technologies (a close partner of Carbon) and ordered the rest of the parts from suppliers who weren’t aware that the parts were being tested for an experiment:

“About half the parts in the study were made by SyBridge, who we were collaborating with, and for the other half, we just ordered parts from contract manufacturers without telling them what we were doing,” said King. “When those parts showed up, we took them out of the box and photographed them right away.

“It works the same whether or not the factory knows what you’re doing. I think that’s one of the biggest findings from the study: the manufacturers don’t have to know, and they don’t have to help. You, as the customer and as the user of the technology, get all the benefit without the supplier’s participation, without them even having to understand what you’re doing.”

From a practical standpoint, the capability could yield a seemingly endless stream of potential use cases:

Professor William P. King.

“Everybody who works in manufacturing has a story about a supplier changing something without permission, and that’s true for all production processes including AM,” noted King. “But supply chains are based on trust. The AI model can tell you if the supplier is continuing to use the machine you approved, if they did maintenance on the machine, if the supplier outsourced the parts, etc. Suddenly, you can see multiple layers into your supply chain.

“You could really use this capability for anything, and I think it has great commercial utility — I see it being commercialized. My vision is that ultimately, you could walk up to a part that’s sitting in your factory, or in a warehouse or on a loading dock, take a photograph with your phone, and your phone tells you where the part came from.

“In terms of where the AI model could make the biggest immediate impact, there’s three industries that require 100 percent inspection: aerospace, medical, and nuclear energy. In those industries, suppliers are already inspecting every single part at every step along the way. Since there’s already such a comprehensive existing audit trail, I think those supply chains are particularly primed for this technology to be incorporated into the audit trail.”
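King’s point-and-shoot scenario amounts to running a trained classifier on a single photograph and reporting the most likely source machine. Here is a minimal sketch of that inference step, reusing the hypothetical model and preprocessing from the training sketch above; the confidence threshold and file name are illustrative assumptions.

```python
# Hedged sketch: classify a single part photograph with a trained model
# (continuing the illustrative classifier from the earlier sketch).
import torch
from PIL import Image

def identify_printer(model, preprocess, class_names, photo_path, threshold=0.9):
    """Return (printer_id, confidence) for one part photo; threshold is illustrative."""
    model.eval()
    device = next(model.parameters()).device
    image = preprocess(Image.open(photo_path).convert("RGB")).unsqueeze(0).to(device)
    with torch.no_grad():
        probs = torch.softmax(model(image), dim=1).squeeze(0)
    confidence, idx = probs.max(dim=0)
    printer = class_names[int(idx)] if confidence >= threshold else "unknown source"
    return printer, float(confidence)

# Hypothetical usage with the objects defined in the training sketch:
# printer, conf = identify_printer(model, preprocess, train_set.classes,
#                                  "loading_dock_part.jpg")
# print(f"Predicted source: {printer} ({conf:.1%})")
```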

While King sees the software’s broadest commercial appeal in its use with industrial-grade machines, there is also a case to be made that the technology could have a major impact on parts made with desktop 3D printers:

“If you had a database of printers, you could analyze the parts on a ghost gun and trace it back to where it was made,” King said. “The same goes for any sort of illicit good — if law enforcement made a large seizure, they could figure out what parts were made by a specific organization, for instance.”

Parts used in the study were made with different printers, materials, and designs across four 3D printing processes.

As the source identification project moves into its next phase, King is also busy getting a large-format metal AM research center up and running at UI. Announced in early May, the new site is backed by over $8 million in Department of Defense (DoD) funding and will focus on parts made using additive friction stir deposition (AFSD) as well as wire-based directed energy deposition (DED). Specifically, the research will target the ground vehicle supply chain for the US Army:

“The short-term goal is to be able to make spare parts, since that’s a real pain point for the Army. But as the branch starts to design new vehicles and platforms, Army engineers want to be able to take advantage of AM for that, as well. And private industry is obviously interested in that, too.

“There are all kinds of benefits to vehicle design in terms of survivability, lightweighting and efficiency, things of that nature, and both the Army and the automotive sector are interested in how AM-enabled design freedom can open up new possibilities for mechanical performance. That’s a longer cycle, though. The spare parts are more of a ‘right now’ sort of thing.”

And although King has no immediate plans to fold the source identification research into the work being done at the new facility, he is certainly interested in bringing AI into the mix:

“We think that AI has a major role to play helping us to figure out the process and property relationships of these newer AM technologies,” affirmed King. “The processes are super complicated, and what we want to be able to do is develop those material science relationships while minimizing the build time and minimizing the number of specimens required to get the data that we need. So that’s where AI comes in. We can bring in physical vision sensors combined with modeling and simulation, and use data science approaches to help us flesh out the properties of the materials.

“What we’re trying to do is replace metal fabrications that require castings and forgings. That capability has really eroded in the U.S. over the last several decades. As we think about the future of supply chains and how the global economy has started to evolve into an environment with multiple different power centers, it’s going to be really important for the U.S. to make metal parts. There’s a really compelling case that some of it could pivot to being made with AM instead of us having to rebuild the old ways of doing things.”
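Minimizing the number of test specimens, as King describes, is the kind of problem often approached with active learning: a surrogate model proposes the next process condition to build and test wherever its predictions are least certain. The sketch below illustrates the idea with a single process parameter and a stand-in property measurement; the parameter ranges, kernel settings, and measure_property() function are illustrative assumptions, not details of the UI program.

```python
# Hedged sketch: active learning over one process parameter (e.g., deposition
# speed) to map a material property with as few test specimens as possible.
# All values and the measure_property() stand-in are illustrative assumptions.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def measure_property(speed):
    """Stand-in for building and testing a specimen at this parameter."""
    return 400.0 - 0.5 * (speed - 60.0) ** 2 + np.random.normal(0, 5.0)

candidates = np.linspace(20, 100, 81).reshape(-1, 1)  # candidate speeds (mm/s)
tested_x = [np.array([20.0]), np.array([100.0])]      # two seed specimens
tested_y = [measure_property(20.0), measure_property(100.0)]

gp = GaussianProcessRegressor(kernel=RBF(length_scale=10.0), alpha=25.0)

for _ in range(8):  # budget of eight additional specimens
    gp.fit(np.vstack(tested_x), np.array(tested_y))
    _, std = gp.predict(candidates, return_std=True)
    next_x = candidates[np.argmax(std)]   # test where the model is least certain
    tested_x.append(next_x)
    tested_y.append(measure_property(float(next_x[0])))

print(f"Mapped the property curve with {len(tested_y)} specimens")
```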

Images courtesy of the University of Illinois, Miles Bimrose, and npj Advanced Manufacturing


