Topology optimization is an important tool within the “generative design” toolbox that can be used as a component of a larger engineering and product design workflow. Let’s explore how topology optimization exists – in fact thrives – within a framework that utilizes field data, implicit geometry, and block architecture.
Topology Optimization and Generative Design
Let’s start by clarifying some terms often seen in the industry, most notably “generative design” and how it is viewed with respect to topology optimization. A brief internet search of the two terms yields mixed results, and in general the terms appear to be used interchangeably. After converging on more precise definitions, however, it becomes clear that some people simply want traditional topology optimization, while others envision a more “generative” technique.
Topology optimization has been known for decades and is becoming more practical thanks in part to advances in computational technology and additive manufacturing. The approach is typically formulated as a conventional optimization problem, where an algorithm minimizes a chosen objective function subject to several constraints: for example, minimizing the weight of a structural bracket subject to constraints on stress, deflection, or, in the case of additive manufacturing, overhang angle. Quite complex topology optimization problems can be formulated, involving multi-objective, multiphysics applications with millions or even billions of design variables. Whatever the scale and complexity, the optimization process conceptually operates in a continuous design space and converges on an optimized design while satisfying the known constraints.
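Stated generically, and independent of any particular software, a density-based formulation takes the familiar constrained-optimization form (the example objective and constraints are illustrative, not a specific product's formulation):

```latex
\begin{aligned}
\min_{\boldsymbol{\rho}}\;\; & f(\boldsymbol{\rho})
  && \text{e.g., structural weight or compliance} \\
\text{subject to}\;\; & g_j(\boldsymbol{\rho}) \le 0,\quad j = 1,\dots,m
  && \text{e.g., stress, deflection, overhang angle} \\
& 0 \le \rho_e \le 1,\quad e = 1,\dots,n
  && \text{one density variable per element}
\end{aligned}
```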
What is Generative Design?
Generative design is the larger computational process and workflow built by an engineer for their specific product and application, of which topology optimization is simply one function that can be called upon. What in part makes a design platform “generative” is the way its connected workflows can be used to explore the design space, perhaps cycling through different materials, manufacturing processes, functional requirements, or even the most basic assumptions built into the process by the engineer. Whether this happens through simple parameter sweeps or more advanced design-of-experiments methods, each iteration of the process generates a new design. We could stop there, but it is worth noting that these fully connected processes essentially generate data, and that is the key ingredient for the even more advanced algorithms now emerging.
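As a sketch of what “generative” means in practice, the loop below sweeps a few materials and volume fractions through a single topology-optimization function call. The function, material names, and parameter values are placeholders for illustration, not a specific vendor API.

```python
from itertools import product

def run_topology_optimization(material, volume_fraction):
    """Placeholder for a single topology-optimization 'function call'.
    In a real workflow this would build the FE model, run the optimizer,
    and return a density field plus performance metrics."""
    return {"material": material, "volume_fraction": volume_fraction, "status": "converged"}

# Inputs the engineer wants to explore (illustrative values only).
materials = ["AlSi10Mg", "Ti-6Al-4V", "316L stainless"]
volume_fractions = [0.2, 0.3, 0.4]

# A simple parameter sweep: every combination generates a new candidate design,
# and the collected results become the data that feeds further exploration.
results = [run_topology_optimization(m, vf) for m, vf in product(materials, volume_fractions)]
print(f"{len(results)} candidate designs generated")
```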
The Details of Topology Optimization
Considering topology optimization as a “function call” and a subset of generative design, let’s explore some topology optimization possibilities within an advanced software platform. The examples in this discussion were generated using the Solid Isotropic Material with Penalization (SIMP) method implemented within nTop Platform for a basic compliance minimization problem. For each element in the design space, the optimizer will attempt to find a density value that minimizes compliance (i.e., maximizes stiffness) while satisfying a simple volume constraint.
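At the heart of the SIMP method is a simple material interpolation that ties each element’s density to its stiffness. The snippet below is a minimal sketch of that interpolation with typical, assumed values for the penalization exponent and minimum stiffness; it is not the implementation used by any particular tool.

```python
import numpy as np

def simp_youngs_modulus(rho, E0=1.0, E_min=1e-9, p=3.0):
    """SIMP interpolation: element stiffness as a function of element density.
    E0 is the solid material's modulus, E_min a small value that keeps the
    stiffness matrix well conditioned, and p a penalization exponent
    (commonly around 3) that makes intermediate densities structurally
    inefficient, nudging the optimizer toward mostly solid/void designs."""
    rho = np.asarray(rho, dtype=float)
    return E_min + rho**p * (E0 - E_min)

# Intermediate densities contribute far less stiffness than their material
# cost, which is what drives the layouts toward 0 or 1.
print(simp_youngs_modulus([0.0, 0.5, 1.0]))  # approx. [1e-09, 0.125, 1.0]
```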
The simple structural bracket example shown in Figure 1 highlights a few key concepts of the optimization process and results. First, a discretized domain is provided as the design space, materials are defined, and boundary conditions are applied much like any other finite-element model setup. In this example a volume fraction of 30% was applied, meaning the volume of the optimal design must be less than or equal to 30% of the original.
Interpolation of Data Fields
Interpreting the results of density-based topology optimization is where most software applications begin to diverge from one another. Interpolation schemes, such as SIMP, are used to convert the element densities into continuous design variables better suited for optimization algorithms. As a result, element densities range from 0 (no material) to 1 (solid material) and can take any value in between. This is shown in Figure 1, where the red regions correspond to the highest density, values gradually decrease toward zero density (elements not shown), and intermediate results blend between the two. While a rough representation of the optimal material distribution starts to appear, one implication of these interpolation schemes is that there is no definitive boundary in the result, and the user is usually left to pick a threshold value between 0 and 1.
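To make the thresholding issue concrete, the sketch below counts how much material survives at a few candidate thresholds. The density field here is random placeholder data standing in for an optimizer result; the thresholds are arbitrary.

```python
import numpy as np

# 'rho' stands in for the element-density field returned by the optimizer
# (values in [0, 1]); random placeholder data is used for illustration.
rng = np.random.default_rng(0)
rho = rng.random((40, 40, 40))

# The retained volume depends directly on the threshold the user picks,
# which is why the choice is not always obvious.
for t in (0.3, 0.5, 0.7):
    kept = np.mean(rho >= t)
    print(f"threshold {t:.1f} -> {kept:.0%} of elements treated as solid")
```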
At this point, most software will extract an isosurface at the selected threshold value to form a rough approximation of a boundary which can be used as inspiration for final geometry reconstruction. Before we get to geometry construction, note that the most basic result of the density-based optimization process is a spatially varying set of scalar values, a density “field.” Figure 2 shows a planar section cut through the original design space which has been filled with an isotruss lattice structure. Here, the density field from the topology optimization is used to directly drive the thickness of the lattice, resulting in thicker lattices in the regions of higher density.
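A minimal sketch of the field-driven idea behind Figure 2: a hypothetical linear ramp maps local density to lattice beam thickness, with bounds chosen as illustrative printable limits rather than values taken from the article.

```python
import numpy as np

def lattice_thickness(rho, t_min=0.4, t_max=2.0):
    """Map local density (0 to 1) to lattice beam thickness in millimeters.
    t_min and t_max are illustrative bounds, e.g., a minimum printable
    feature size and a maximum beam diameter for the chosen unit cell."""
    rho = np.clip(np.asarray(rho, dtype=float), 0.0, 1.0)
    return t_min + rho * (t_max - t_min)

# Higher-density regions of the optimization result get thicker lattice members.
print(lattice_thickness([0.1, 0.5, 0.9]))  # [0.56, 1.2, 1.84]
```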
This example demonstrates the fundamental idea that topology optimization results, as fields, are naturally represented in this kind of platform and can be used like any other in the software. While this is a very simple demonstration with an isotropic material model, it is the key technique underlying some soon-to-come physics-based cellular material design capabilities.
While the usual goal of topology optimization is to ultimately find some optimal shape, the raw result is typically an inspirational representation and not a definitive geometry ready for verification or manufacturing. It is at this point that an engineer must usually intervene to manually reconstruct a geometric shape, which can be a major bottleneck in the workflow. While some powerful tools exist for doing this manually, a process that is neither automated nor traceable will likely not keep pace with the rapid, iterative nature of modern engineering workflows.
Geometry Reconstruction
The geometry reconstruction problem is addressed by leveraging implicit-modeling technology. With only a few inputs, an implicit geometric representation can be derived straight from a topology optimization result. The results of the simple structural bracket example are shown in Figure 3.
Until recently, manual reconstruction of geometry after topology optimization has generally been viewed as a major impediment. Using implicit technology, however, a smooth geometry is generated in seconds and can immediately be used in downstream modeling, simulation, and manufacturing functions. This smoothing capability provides an integrated, continuous, and automated topology-optimization pipeline.
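One common way to think about this step, shown here as a generic sketch rather than nTop’s implementation, is to treat the smoothed density field, shifted by the chosen threshold, as an implicit function whose zero level set is the part boundary; standard tools such as SciPy and scikit-image can then extract a surface mesh for downstream use.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage import measure

# Placeholder "density field" on a regular grid: a clipped radial ramp
# standing in for a real topology-optimization result.
x, y, z = np.mgrid[-1:1:60j, -1:1:60j, -1:1:60j]
rho = np.clip(1.2 - np.sqrt(x**2 + y**2 + z**2), 0.0, 1.0)

# Treat the smoothed field, shifted by the chosen threshold, as an implicit
# function: negative outside the part, positive inside, zero on the boundary.
threshold = 0.5
implicit = gaussian_filter(rho, sigma=1.5) - threshold

# Extract the zero level set as a triangle mesh for downstream modeling,
# simulation, or manufacturing steps.
verts, faces, normals, values = measure.marching_cubes(implicit, level=0.0)
print(f"reconstructed boundary: {len(verts)} vertices, {len(faces)} triangles")
```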
An Evolving Landscape
Whether you’re new to topology optimization or an experienced veteran, you’ve most likely noticed that the technology is moving at a rapid pace. Start-ups have been built around the technology, and larger simulation providers are showing renewed interest. My company is actively engaged with a number of research institutions studying the topic and is amazed by the number of novel techniques being developed. While such continuous change may pose a challenge, block architecture provides an extensible and robust API that can evolve with the improving technology. Much like blocks can represent geometry functions, they are also used to construct optimization problems within an extensible framework, allowing engineers to configure the software around the key aspects of their product and workflow.
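Conceptually, a block-based setup might look like the sketch below, where each stage of the workflow is a small, swappable unit. The block names and composition are purely illustrative and not nTopology’s actual block types or API.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Block:
    """A conceptual 'block': a named, reusable step in the workflow."""
    name: str
    run: Callable[[Dict], Dict]

def compose(blocks: List[Block]) -> Callable[[Dict], Dict]:
    """Chain blocks into a workflow; the output of one block feeds the next,
    so individual blocks can be swapped as optimization techniques evolve."""
    def workflow(state: Dict) -> Dict:
        for block in blocks:
            state = block.run(state)
        return state
    return workflow

# Illustrative workflow: design space -> boundary conditions -> optimizer -> smoothing.
bracket_workflow = compose([
    Block("design_space",        lambda s: {**s, "mesh": "bracket_design_space"}),
    Block("boundary_conditions", lambda s: {**s, "loads": "bearing_load"}),
    Block("topology_optimize",   lambda s: {**s, "density_field": "rho"}),
    Block("implicit_smoothing",  lambda s: {**s, "geometry": "implicit_body"}),
])
print(bracket_workflow({}))
```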
As previously stated, a common goal of topology optimization is to find an optimal shape that is nearly ready for verification and manufacturing; in practice, however, the raw result from this method is best viewed as an inspirational representation, or as one of many acceptable solutions. Either way, there are still data interpretation and geometry construction challenges that are not easily addressed through conventional techniques. However, when the result is represented as a density field, it can robustly drive geometric parameters such as lattice thickness. In addition, geometry reconstruction is no longer a bottleneck thanks to implicit-modeling technology. These two characteristics in particular allow for a more flexible, automated way of utilizing topology optimization as a viable solution for product design.
Trevor Laughlin is the Director of Product Management at nTopology.