We may think of printed-circuit boards (PCBs) as electronic subsystems, but they exhibit many mechanical characteristics as well, raising concerns over PCB quality, reliability, and longevity.
Older PCB designs distributed their functions, spreading thermal and mechanical stresses across each board. The relatively uniform substrate conditions simplified the analysis needed to predict performance variations between design revisions and let prototypes reproduce final-product behavior earlier in development than they can today. Testing the prototypes suggested changes, often allowing iteration to the desired result in only a few cycles. Prototyping costs, although not trivial, generally remained manageable.
Reducing Prototype Cycles
As logic shrank, however, board designs evolved into a handful of dense, complex components, such as microprocessors, memories, and field-programmable gate arrays (FPGAs), connected by traces and simpler devices. The resulting uneven circuit density significantly complicated the algorithms needed to model board behavior accurately. The dense components consumed considerable power and ran hot, so the board surface beneath them heated and expanded. Combined with cooler temperatures elsewhere, that local expansion and other thermal stresses could deform the board and degrade its performance.
Without precise, comprehensive simulation tools, engineers had to create many prototype generations, dramatically increasing the effort necessary to hone the design into a viable, practical product. The time and cost of those steps encouraged developers to reduce the number of prototype generations by performing more of the work in simulation.
Sophisticated simulation software lets engineers run “what-if?” scenarios without building a physical prototype for each iteration. This recent white paper from ANSYS, a company claiming “the best and broadest portfolio of simulation software,” emphasizes the advantages of simulation for minimizing power losses, incorporating cooling techniques such as heat sinks to reduce thermal stress on a board under development, and then testing the results.
The challenge begins with the bare board, consisting of layers of dissimilar conducting and nonconducting materials with differing coefficients of thermal expansion (CTEs) that cause the board to respond unevenly to changes in temperature. The paper contends that the board can bend as temperatures change, much as the bimetal strip in an old-style thermostat bends when heated, possibly breaking solder joints or delaminating traces or the board itself. Adding heat sinks and other heat-control hardware can mitigate the effect somewhat but generally can’t eliminate it.
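To get a feel for the magnitude of that bimetal-strip effect, here is a minimal sketch (my own illustration, not taken from the white paper) that applies Timoshenko's classic two-layer curvature formula to a copper-on-FR-4 laminate with assumed, typical material properties:

```python
# Rough estimate of thermal bowing in a two-layer laminate (bimetal-strip model).
# All material values are illustrative; real PCBs have many layers and need FEA.

def bimetal_curvature(alpha1, alpha2, t1, t2, e1, e2, delta_t):
    """Timoshenko's formula for the curvature (1/m) of a two-layer strip
    heated uniformly by delta_t; the sign indicates which way the laminate bows."""
    h = t1 + t2          # total thickness
    m = t1 / t2          # thickness ratio
    n = e1 / e2          # modulus ratio
    num = 6.0 * (alpha2 - alpha1) * delta_t * (1.0 + m) ** 2
    den = h * (3.0 * (1.0 + m) ** 2 + (1.0 + m * n) * (m ** 2 + 1.0 / (m * n)))
    return num / den

# Copper foil over an FR-4 core, heated 40 degrees C above ambient (assumed values).
kappa = bimetal_curvature(
    alpha1=17e-6,   # copper CTE, 1/degC
    alpha2=14e-6,   # FR-4 in-plane CTE, 1/degC
    t1=35e-6,       # 1-oz copper thickness, m
    t2=1.5e-3,      # core thickness, m
    e1=110e9,       # copper modulus, Pa
    e2=22e9,        # FR-4 in-plane modulus, Pa
    delta_t=40.0,
)

# Sagitta (bow height) over a 100-mm span with that curvature.
span = 0.1
bow = abs(kappa) * span ** 2 / 8.0
print(f"curvature ~ {abs(kappa):.4f} 1/m, bow over 100 mm ~ {bow * 1e6:.0f} um")
```

Real boards have many layers, nonuniform copper, and uneven heating, which is why the paper relies on full 3-D simulation rather than closed-form estimates like this one.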
Even after you reduce temperature gradients as much as possible, dense components can still deform the board surface. Modern microprocessors, for example, can reach peak currents of 100 A. Characterizing that behavior in simulation lets you modify the thickness or composition of one or more layers to minimize the deleterious effects without actually building the board. Design changes that emerge from the simulation feed into the prototype build, so the first physical sample of the board conforms more closely to the desired performance than it could without virtual manipulation of the design beforehand.
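To see why layer thickness and composition matter at such currents, consider a back-of-the-envelope sketch (illustrative geometry and copper weights of my choosing, not figures from the paper) of the resistance, voltage drop, and I²R heating in a short run of power plane:

```python
# Back-of-the-envelope I2R loss in a power-plane segment at different copper weights.
# Illustrative geometry; a real plane has irregular shapes, vias, and current spreading.

RHO_CU = 1.72e-8          # copper resistivity, ohm*m (near room temperature)
OZ_THICKNESS = 35e-6      # 1-oz copper is roughly 35 um thick

def plane_resistance(length_m, width_m, copper_oz):
    """DC resistance of a rectangular copper strip of the given copper weight."""
    thickness = copper_oz * OZ_THICKNESS
    return RHO_CU * length_m / (width_m * thickness)

current = 100.0           # peak current, A (figure cited in the article)
for oz in (0.5, 1.0, 2.0):
    r = plane_resistance(length_m=0.05, width_m=0.02, copper_oz=oz)  # 50 mm x 20 mm strip
    print(f"{oz:>3} oz: R = {r * 1e3:.2f} mOhm, drop = {current * r * 1e3:.0f} mV, "
          f"loss = {current ** 2 * r:.1f} W")
```

Doubling the copper weight halves the drop and the dissipation in this segment, which is the kind of trade-off the simulation lets you explore before committing to a stackup.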
The white paper outlines how to create a simulation from your CAD information and massage it to shorten the path to a successful product. Although the author describes the process using ANSYS simulation tools, the principles apply regardless of the platform.
Moving to Simulation
Whatever layout tools you use to create your ECAD geometries, you must translate the results into a full simulation description. That description includes the individual board layers, material properties, and voltage-regulator modules (VRMs), as well as microprocessors, FPGAs, and other ICs, arbitrary power and ground planes, vias, and signal traces, and it produces schematics and virtual prototypes for review.
Reviewing the schematic identifies key components, attaches voltage sources, sets up power sources and sinks, divides the circuit into functional blocks, and flags the workhorse devices that consume most of the board’s power.
Power-integrity and DC IR-drop (DCIR) analyses pinpoint regions of excessive current and other hot spots. Running a series of “what-if” scenarios helps predict, and thereby minimize, failure-prone regions.
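As a toy illustration of what a DC IR-drop analysis computes (not the ANSYS implementation), the sketch below models a power plane as a small resistor grid, places a hypothetical VRM at one corner and a 100-A load at the opposite corner, and solves the nodal equations for the worst-case drop:

```python
# Toy DC IR-drop analysis: a power plane modeled as an N x N resistor grid,
# solved by nodal analysis (G * v = i). Illustrative values, not the ANSYS solver.
import numpy as np

N = 20                       # grid nodes per side
R_CELL = 0.5e-3              # resistance between adjacent nodes, ohms (assumed)
G_CELL = 1.0 / R_CELL

def node(ix, iy):
    return iy * N + ix

n_nodes = N * N
G = np.zeros((n_nodes, n_nodes))
for iy in range(N):
    for ix in range(N):
        a = node(ix, iy)
        for bx, by in ((ix + 1, iy), (ix, iy + 1)):   # right and up neighbors
            if bx < N and by < N:
                b = node(bx, by)
                G[a, a] += G_CELL
                G[b, b] += G_CELL
                G[a, b] -= G_CELL
                G[b, a] -= G_CELL

i = np.zeros(n_nodes)
i[node(N - 1, N - 1)] = -100.0      # load sinking 100 A at the far corner

# Pin the VRM node at 0 V so the system is well-posed; the other node voltages
# are then the IR drop relative to the VRM output.
vrm = node(0, 0)
keep = [k for k in range(n_nodes) if k != vrm]
v = np.zeros(n_nodes)
v[keep] = np.linalg.solve(G[np.ix_(keep, keep)], i[keep])

print(f"worst-case IR drop ~ {abs(v.min()) * 1e3:.0f} mV below the VRM output")
```

A production solver extracts its conductance matrix from the actual plane geometry and via structure, but the underlying computation is the same kind of large linear solve.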
Based on those results, an automated, iterative thermal analysis minimizes the board’s thermal stresses, reducing the chances of board failure.
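The flavor of that iterative what-if loop can be shown with a deliberately crude lumped thermal-resistance model (all values assumed for illustration; the paper’s analysis is a full 3-D simulation): sweep a few cooling options and report which ones keep the junction temperature within its limit.

```python
# Simple "what-if" loop over cooling options using a lumped thermal model:
# Tj = Tambient + P * (R_junction_case + R_sink). Illustrative numbers only.

T_AMBIENT = 45.0      # worst-case ambient, degC
POWER = 25.0          # component dissipation, W
R_JC = 0.4            # junction-to-case resistance, degC/W
T_LIMIT = 105.0       # maximum allowed junction temperature, degC

# Candidate cooling options and assumed case-to-ambient resistances (degC/W).
options = {
    "bare board": 4.0,
    "small passive heat sink": 2.2,
    "large passive heat sink": 1.5,
    "heat sink + forced air": 0.8,
}

for name, r_sink in options.items():
    tj = T_AMBIENT + POWER * (R_JC + r_sink)
    verdict = "OK" if tj <= T_LIMIT else "too hot"
    print(f"{name:<26} Tj ~ {tj:5.1f} degC  ({verdict})")
```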
Manually adjusting the results at this point, guided by device manufacturers’ specifications and the designer’s judgment, keeps the board and its components within their operating limits. Temperature profiles confirm that the board’s actual behavior meets its performance goals.
The data prepared for thermal analysis must include any heat sinks or other dissipation features the designer has added; the software then assembles the complete product for simulation. Importing the board’s layer metallization matters because the copper distribution contributes significantly to the final temperature profile. The simulation produces a next-stage temperature map and analyzes it for thermal stress, deformation, and elastic strain, including the bending moments that arise from the dissimilar material layers in the bare board.
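One reason the imported metallization matters: copper conducts heat roughly a thousand times better than FR-4, so even thin, partially covered copper layers dominate in-plane heat spreading. The sketch below estimates the effective in-plane conductivity of a hypothetical four-layer stackup with assumed coverage fractions:

```python
# Why imported metallization matters: estimate the effective in-plane thermal
# conductivity of a 4-layer stackup, weighting each copper layer by its coverage.
# Illustrative numbers; a real analysis uses the actual artwork.

K_COPPER = 390.0     # W/(m*K)
K_FR4 = 0.3          # W/(m*K), assumed for the dielectric

# (thickness in metres, copper coverage fraction) per layer; dielectric fills the rest.
layers = [
    (35e-6, 0.60),    # top signal layer, 60% copper
    (35e-6, 0.90),    # ground plane
    (35e-6, 0.85),    # power plane
    (35e-6, 0.55),    # bottom signal layer
]
dielectric_total = 1.46e-3    # total dielectric thickness, m

# In-plane conduction paths act in parallel: k_eff = sum(k_i * t_i) / sum(t_i).
copper_sum = sum(t * cov * K_COPPER + t * (1 - cov) * K_FR4 for t, cov in layers)
total_thickness = sum(t for t, _ in layers) + dielectric_total
k_eff = (copper_sum + dielectric_total * K_FR4) / total_thickness
print(f"effective in-plane conductivity ~ {k_eff:.0f} W/(m*K)")
```

With the copper included, heat spreads laterally away from hot components far more effectively than the bare dielectric would suggest, which is why omitting the artwork distorts the predicted temperature map.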
The source article features a much more detailed description of each process step. It also links to a series of short videos that provide clarification and additional information for implementing the technique.