
Thermal analysis moves into the 21st century


Introduction


In the last decade, we've come a long way in the application of thermal analysis to the design of electronics, and there's no sign of the pace of innovation slowing. Engineers are still being challenged to build faster, smaller and cheaper products in ever-decreasing design times.

 

Fortunately, the engineering software industry has been able to respond by providing tools that help designers both analyze and understand the complex fluid flow and heat transfer mechanisms within their equipment.

 

In this article, I'll give a personal perspective on the future of thermal analysis. I'll start by looking at changes in the underlying technology, then hazard some predictions on what the next 5 to 10 years will have to offer. In particular, I'll talk about:

 

  • How advances in hardware and software are affecting thermal analysis tools.
  • The realization of the dream of practical component characterization.
  • Integration of thermal analysis with EDA and MCAD systems, and with other analysis tools.
  • The impact of the internet and the emergence of web-based applications.

 

Thermal analysis today: from IC package to equipment room, and everything in between.

 


Figure 1a. Package-level analysis: Pentium® II processor and heatsink.

 


Figure 1b. Board-level analysis: temperatures in the motherboard of the Motorola PowerPC® reference platform "Yellowknife".

 


Figure 1c. System-level analysis: airflows through a PC chassis.

 


Figure 1d. Room-level analysis: a telecommunications switching room.

 

 

Advances in hardware and software

 

Perhaps the biggest changes we are seeing today result from the ready availability of processing power that, 10 years ago, would have been unimaginable. Ironically, the very same Moore's law that predicts the growth in processor speed also predicts the increases in dissipated power that are heightening the need for thermal modeling!

 

Today's computers are orders of magnitude faster than their ancestors of a decade ago. The results can be seen everywhere, from increasingly sophisticated solvers to the high-end 3D graphics that are now commonplace. The software itself has also improved: tuning of algorithms has yielded significant reductions in processing time, and in certain areas more radical techniques have produced worthwhile gains.

 

For instance, a Monte Carlo method has recently been employed in one tool to determine radiation exchange factors for the highly cluttered geometries typical of electronics systems, with order-of-magnitude improvements in run time compared to more conventional analytical techniques optimized for sparsely populated geometries.
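Purely to illustrate the principle (and not the actual algorithm in any particular tool), the sketch below estimates a single exchange (view) factor between two parallel unit squares by firing cosine-weighted rays from one surface and counting the fraction that strike the other. The same hit-counting idea scales to cluttered geometries with shadowing, where analytical formulas break down.

```python
import math
import random

def view_factor_mc(h=1.0, n_rays=200_000, seed=1):
    """Monte Carlo estimate of the view factor F12 between two parallel,
    directly opposed unit squares separated by distance h.
    Rays leave surface 1 from a uniformly sampled point with a
    cosine-weighted direction; F12 is the fraction that hit surface 2."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_rays):
        # uniform emission point on surface 1 (the z = 0 plane)
        x0, y0 = rng.random(), rng.random()
        # cosine-weighted direction about the surface normal (+z)
        phi = 2.0 * math.pi * rng.random()
        sin_t = math.sqrt(rng.random())          # sin(theta) = sqrt(u) gives cosine weighting
        cos_t = math.sqrt(1.0 - sin_t * sin_t)
        dx, dy, dz = sin_t * math.cos(phi), sin_t * math.sin(phi), cos_t
        # intersect the ray with the plane z = h and test against the target square
        t = h / dz
        x1, y1 = x0 + t * dx, y0 + t * dy
        if 0.0 <= x1 <= 1.0 and 0.0 <= y1 <= 1.0:
            hits += 1
    return hits / n_rays

print(view_factor_mc())   # ~0.20 for unit squares at h = 1 (analytic value is about 0.1998)
```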

 

Computer operating systems are also evolving and, like it or not, the trend is inexorable: Windows NT has all but won the battle for the engineering desktop. In fact, a recent poll of thermal analysts attending the 1999 FLOTHERM user conference showed that over 80% of them had access to Windows NT on their desktops, compared with 30% having access to Solaris machines (the next biggest group).

 

Although there is much talk about the emergence of Linux, it is hard to believe that it will find more than a niche role in engineering companies, as a server platform and, in the context of thermal analysis, possibly as a "calculation engine". It is unlikely to displace Windows NT from the desktop.

 

The prevalence of Windows NT is also fueling the growth of "Intel-based" hardware, including compatible processors from AMD and Cyrix. Increasingly, we see multi-processor machines becoming standard on engineers' desktops. To take full advantage of the additional processing power they offer, software is being rewritten to be multi-threaded and even, wherever possible, fully parallelized (Figure 2).

 


Figure 2. Multi-threading (top) and parallelization (bottom).
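As a toy illustration of the domain-decomposition idea behind Figure 2, the sketch below splits one Jacobi sweep of a steady-conduction calculation into horizontal strips that are updated concurrently. It is a stand-in only; a production CFD solver would do this in compiled, fully parallel code rather than Python.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def jacobi_sweep_parallel(T, n_threads=4):
    """One Jacobi sweep of the 2D Laplace (steady conduction) equation,
    with the interior rows split into strips updated concurrently.
    Each strip writes a disjoint set of rows of the output array."""
    new = T.copy()
    strips = np.array_split(np.arange(1, T.shape[0] - 1), n_threads)

    def update(rows):
        for i in rows:
            new[i, 1:-1] = 0.25 * (T[i - 1, 1:-1] + T[i + 1, 1:-1] +
                                   T[i, :-2] + T[i, 2:])

    with ThreadPoolExecutor(max_workers=n_threads) as pool:
        list(pool.map(update, strips))
    return new

# 64 x 64 plate with one fixed-temperature edge; iterate a fixed number of sweeps
T = np.zeros((64, 64))
T[0, :] = 100.0
for _ in range(500):
    T = jacobi_sweep_parallel(T)
print(round(T[32, 32], 2))    # interior temperature after 500 sweeps
```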

 

But speed itself isn't the ultimate goal. It merely allows a designer to explore a wider design space in order to identify problem areas and to develop an optimum solution. Using thermal modeling to run "what-if" scenarios makes it much easier and faster for engineers to determine the effects of repositioning objects such as vents, fans and heatsinks in a given product design before physical prototyping begins.
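To give a flavour of what such a what-if sweep looks like, the sketch below runs a crude resistance-network estimate of junction temperature for several candidate fan flow rates. The heatsink correlation and upstream heating figures are assumed values for illustration only, not data from any real product.

```python
# Toy "what-if" sweep: estimate the junction temperature of a 15 W device
# for a handful of candidate fan flow rates using a crude resistance network.

AIR_DENSITY = 1.16          # kg/m^3 at roughly 35 degC
AIR_CP = 1007.0             # J/(kg K)

def junction_temp(power_w, inlet_c, flow_m3s, upstream_w=40.0, theta_jc=0.8):
    """Junction temperature for one fan setting (all resistances in K/W)."""
    # local ambient = inlet temperature + heat picked up from upstream components
    m_dot = AIR_DENSITY * flow_m3s
    local_ambient = inlet_c + upstream_w / (m_dot * AIR_CP)
    # assumed sink-to-air resistance falling with volumetric flow (illustrative fit)
    theta_sa = 0.5 + 0.004 / flow_m3s
    return local_ambient + power_w * (theta_jc + theta_sa)

for cfm in (10, 20, 30, 40):
    flow = cfm * 4.719e-4                     # CFM -> m^3/s
    tj = junction_temp(power_w=15.0, inlet_c=35.0, flow_m3s=flow)
    print(f"{cfm:2d} CFM -> Tj ~ {tj:5.1f} degC")
```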

 

Unfortunately, the very success of thermal analysis creates another problem: the sheer volume of data that can (and will) be produced. Managing this data will become an increasingly taxing problem, and it will be addressed by a new breed of tools that not only allow rapid generation of parametric cases but also facilitate the post-processing and comparison of multiple runs, so that the designer can concentrate on optimizing the product design.

 

Finally, while optimization techniques have already been applied to problems such as stress analysis and coupled thermal/stress analysis, applications involving fluid motion have proven far more difficult because of the strong non-linear coupling and the large number of calculations involved. The continuing pace of computer hardware development will change this, making it possible to apply optimization techniques to practical design problems such as the positioning of fans and the design of heatsinks.
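The sketch below shows the shape of such an automatic-optimization loop, choosing a fin count that minimizes junction temperature. The cost function here is a toy relation (more fins add area but choke the airflow) standing in for the full CFD run that would be evaluated at each design point in practice; all the numbers are assumptions for illustration.

```python
def junction_temp(n_fins, power_w=30.0, ambient_c=40.0):
    """Toy heatsink model: junction temperature as a function of fin count."""
    h = 25.0                                  # W/(m^2 K), assumed film coefficient
    area_per_fin = 2 * 0.05 * 0.03            # two sides of a 50 mm x 30 mm fin
    # effective heat transfer degrades as the fin gaps close up (assumed relation)
    blockage = 1.0 / (1.0 + 0.002 * n_fins ** 2)
    theta_sa = 1.0 / (h * blockage * area_per_fin * n_fins)
    return ambient_c + power_w * theta_sa

# brute-force search over a discrete design variable; a real tool would
# wrap a full flow/thermal solution in a smarter optimizer
best = min(range(2, 41), key=junction_temp)
print(best, round(junction_temp(best), 1))    # optimum fin count and its Tj
```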

 

My predictions:

  • Increases in processing power will continue to fuel the explosive growth in analysis capabilities. Today's fastest workstations will be obsolete by the end of 2003, replaced by machines with power we can only dream of today!
  • Windows NT (and its descendants) will - despite the emergence of Linux in specialized applications - dominate engineers' desktops.
  • Parallel processing and multi-threading will become commonplace in analysis tools as multi-processor "Wintel" machines become the norm.
  • Watch out for a new range of data management tools, which will help users both organize analysis data and post-process parametric runs, allowing them to harness the raw power of the new hardware.
  • Later in the next decade, keep an eye out for the first practical examples of automatic optimization involving fluid motion.

 

Practical component characterization

It has been known for a long time that existing θja component characterization techniques are inadequate. The errors are too great and, with the advent of ball grid array (BGA) and chip scale package (CSP) technology, the effect of the board cannot be ignored. System designers want and need a quick estimating technique.

 

The European DELPHI and SEED projects have mapped out a way forward by identifying both a methodology for determining environmentally independent component thermal characteristics and experimental techniques to validate the models. But it will be some time before the full infrastructure to support these techniques is available, including:

 

  • Analysis tools to determine the models.
  • Implementation of the models in system-level design tools.
  • Support for the models in component libraries and board layout files.

 

An intermediate step is likely to precede the adoption of the full DELPHI models. Two-resistor (θjc + θjb) models - although the subject of some controversy - are part-way to full DELPHI models and can be determined either by analysis tools or by experimental techniques. Although they are not fully environmentally independent, they are a significant improvement over existing methods (θja) and can already be used in system-level analysis tools and included in board layout files.
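As a concrete illustration, the sketch below evaluates junction temperature from a two-resistor model, with the junction-to-case and junction-to-board paths acting in parallel between the junction and ambient. The case-to-ambient and board-to-ambient resistances are assumed values; in reality they depend on the system environment (heatsink, airflow, board copper), which is exactly why the two-resistor data alone is not fully environmentally independent.

```python
def junction_temp_two_resistor(power, t_ambient,
                               theta_jc, theta_jb,
                               theta_ca, theta_ba):
    """Junction temperature from a two-resistor compact model.

    theta_jc, theta_jb : junction-to-case and junction-to-board resistances
                         (the package-only, vendor-style data), in K/W.
    theta_ca, theta_ba : case-to-ambient and board-to-ambient resistances;
                         system-dependent, assumed here for illustration.
    The two paths act in parallel between the junction and ambient."""
    r_top = theta_jc + theta_ca            # junction -> case -> ambient
    r_bottom = theta_jb + theta_ba         # junction -> board -> ambient
    r_parallel = (r_top * r_bottom) / (r_top + r_bottom)
    return t_ambient + power * r_parallel

# Example: 2 W BGA at 45 degC local ambient, assumed resistances in K/W
print(junction_temp_two_resistor(2.0, 45.0,
                                 theta_jc=8.0, theta_jb=12.0,
                                 theta_ca=20.0, theta_ba=25.0))
```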

 

My predictions:

 

  • Environmentally independent component thermal models will become commonplace during the middle years of the next decade.
  • In the meantime, 2-resistor models will provide a partial solution to system designers' need for a quick estimating technique.

 

Integration with EDA, MCAD and other analysis tools

The true benefits of vendor-supplied component characterization data will only be felt when thermal analysis tools work closely with the electronic design automation (EDA) tools used to develop board layouts. Past efforts to integrate these tools were hindered by the lack of a common standard for data transfer, with each EDA vendor using its own proprietary data format. Fortunately, the emergence of IDF in the mid-1990s makes this a far less difficult task. IDF files are not without their problems, including:

 

  • Sparse support for component thermal data (nothing at level 2 and only 2-resistor models at level 3).
  • Excessive detail, requiring intelligent filtering by the system-level analysis tool.

 

However, IDF level 2 is supported by all major EDA vendors, and the picture is far brighter than it was two or three years ago.

 

Finally, I should add a concern that has been voiced many times with no satisfactory answer to date: where do we get accurate power figures? If we're going to spend a great deal of effort characterizing components and building accurate models of the system, we ought to do something more sophisticated than assume the maximum rated power for every component! Thermal analysts must start looking critically at the accuracy of the tools used to provide these figures, and develop ways to import the values from EDA systems into their simulations.
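A minimal sketch of what such an import might look like, assuming a simple refdes/power CSV exported from the EDA or power-estimation flow. The file layout is hypothetical; whatever format a given flow can actually export would be mapped in the same way, with a conservative fallback for components that have no estimate.

```python
import csv
import io

# Hypothetical export: per-reference-designator power estimates in watts.
power_csv = """refdes,power_w
U1,4.5
U2,0.8
U7,2.1
"""

RATED_DEFAULT_W = 0.25                  # fallback when no estimate is available

placement = ["U1", "U2", "U7", "U9"]    # refdes list taken from the board layout

# build a refdes -> power lookup from the exported estimates
power = {row["refdes"]: float(row["power_w"])
         for row in csv.DictReader(io.StringIO(power_csv))}

# per-component heat sources for the system-level thermal model
sources = {ref: power.get(ref, RATED_DEFAULT_W) for ref in placement}
print(sources)
```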

 

A similar dilemma is faced in integrating thermal analysis tools with MCAD. MCAD file formats have always been troublesome, as anyone who has tried to move data from one system to another will attest. However, the past few years have seen the situation improve, with the stabilization of IGES as a global format and the emergence of alternatives such as SAT (for ACIS-based tools), STL and even VRML. Somewhat surprisingly, the predicted trend towards STEP (ISO 10303) has not materialized - at least not among design engineers in the electronics industry. I would expect IGES to continue to dominate this area as we move into the next century.

 

But filtering out excessive detail in the MCAD file (radii, fillets, draft angles and small holes, for example) remains the most critical problem. A certain degree of simplification can be achieved in the CAD tool itself, but sometimes the features are so deeply embedded in the solid model that this is difficult.

 

So a new breed of interface tools, based on solid modeling kernels such as ACIS, has been developed to enable the thermal analyst to load, simplify and then enhance the MCAD geometry ready for analysis (Figure 3).

 





Figure 3. Use of an advanced MCAD interface to simplify IGES data: (top) before simplification and (bottom) after simplification and thermal analysis.

 

A further improvement is the gradual introduction of "healing technology": advanced intelligence built into the software that can read poorly composed MCAD files and "mend" surfaces that don't quite match up. Such tools are essential if thermal analysis tools are to be closely integrated with MCAD systems.
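To give a feel for what "mending" means in practice, here is a toy vertex-welding routine that snaps nearly coincident vertices of a triangle soup onto shared points so that neighbouring facets become properly connected. Real healing kernels do far more (edge stitching, gap filling, surface re-trimming), so this is only a sketch of the simplest part of the idea.

```python
import math

def weld_vertices(triangles, tol=1e-3):
    """'Heal' a triangle soup by snapping vertices within `tol` of each other
    onto a shared point. triangles is a list of 3-tuples of (x, y, z) points;
    returns (vertices, faces) with faces indexing the welded vertex list."""
    vertices, faces = [], []
    grid = {}                               # spatial hash: cell -> vertex indices

    def cell(p):
        return tuple(int(math.floor(c / tol)) for c in p)

    def find_or_add(p):
        kx, ky, kz = cell(p)
        # check the 27 neighbouring cells for an existing vertex within tolerance
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                for dz in (-1, 0, 1):
                    for idx in grid.get((kx + dx, ky + dy, kz + dz), []):
                        if math.dist(p, vertices[idx]) <= tol:
                            return idx
        vertices.append(p)
        grid.setdefault((kx, ky, kz), []).append(len(vertices) - 1)
        return len(vertices) - 1

    for tri in triangles:
        faces.append(tuple(find_or_add(v) for v in tri))
    return vertices, faces

# two facets that should share an edge but are offset by a 0.0001 gap
tris = [((0, 0, 0), (1, 0, 0), (0, 1, 0)),
        ((1, 0.0001, 0), (1, 1, 0), (0, 1, 0.0001))]
verts, faces = weld_vertices(tris, tol=1e-3)
print(len(verts), faces)     # 4 shared vertices instead of 6 disconnected ones
```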

 

Thermal analysis doesn't exist in isolation from other analysis tools. In some areas the links are strong and have long been acknowledged. For example, in the IC packaging industry the calculation of thermally induced stresses has always been a critical part of the design process, to ensure that package reliability figures are met. Such coupled analysis using FEA tools is commonplace at package level; its extension to board level may be a natural consequence of the increasing integration of EDA and thermal analysis tools through the medium of IDF (see above).
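For a sense of why the thermal and stress calculations are so tightly coupled, here is a back-of-envelope estimate of the stress driven by CTE mismatch between a silicon die and an organic substrate over a temperature excursion. The material values are typical textbook numbers, and the fully constrained bound is a deliberate simplification; a real package study would use coupled FEA on the actual stack-up.

```python
# Fully constrained upper bound: sigma = E * (alpha_sub - alpha_si) * delta_T

E_SI = 130e9            # Young's modulus of silicon, Pa (typical value)
ALPHA_SI = 2.6e-6       # CTE of silicon, 1/K
ALPHA_SUB = 17e-6       # CTE of an FR-4-like organic substrate, 1/K (assumed)

delta_t = 80.0          # K, e.g. a power-cycling excursion from 25 to 105 degC
mismatch_strain = (ALPHA_SUB - ALPHA_SI) * delta_t
sigma_bound = E_SI * mismatch_strain

print(f"strain = {mismatch_strain:.2e}, stress bound = {sigma_bound / 1e6:.0f} MPa")
```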

 

Another area where the links are strong is EMI/EMC. At the package level, a design that is excellent for electrical performance may be thermally poor, or far too expensive to manufacture. Trade-offs are inevitable, and tools are beginning to come to market that allow package designers to use a common data definition to assess the effect of design changes on both thermal and electrical performance.

 

There are other areas of electronics where thermal and EMI/EMC issues are closely linked. In the telecommunications and networking industries, a cabinet of high-speed switching equipment may be installed in close proximity to other equipment and so must be shielded to prevent both emission and reception of electromagnetic radiation. Unfortunately, shielding the shelves has an adverse effect on natural convection cooling, which is increasingly a design requirement.

 

Since there are many similarities in both the physics and the problem definition, I would expect this to be exploited over the next ten years as combined EMI/EMC and thermal analysis software is produced, based on a common data definition.

 

My predictions:

 

  • IDF 3 will become commonplace for EDA integration. IDF 4 will remain in the shadows because of its radical differences from IDF 2 and 3.
  • IGES will remain the primary file exchange format for MCAD data; STEP might emerge in the latter part of the decade.
  • MCAD healing and simplification technology will become an intrinsic part of thermal analysis software.
  • Stress and thermal analysis will converge in board-level analysis tools as EDA integration increases.
  • Look for the extension of thermal analysis to include EMI/EMC - particularly in telecommunications and networking applications - based on a common underlying data model and GUI.

 

The impact of the internet

You can't pick up a paper today without reading about the impact of the internet on businesses such as banking, share trading and bookselling, and the internet is beginning to make its mark in the field of thermal modeling too. Since its origins in the early 1990s, the web has passed through many stages. Today, forward-thinking software companies use it to keep their users fully up to date with support information, software patches, bug lists and modeling advice.

 

More recently, thermal models of parts such as processors, fans and heatsinks have been posted on web sites for thermal analysts to download and use in their models (Figure 4). In addition, libraries of commercially sensitive thermal data are available directly from leading manufacturers such as Intel and Motorola.

 





Figure 4. Examples of library data from the web: (top) an axial flow fan from Papst and (bottom) high-performance heatsinks from Johnson Matthey Electronics.

 

But perhaps the most radical changes are only just beginning, with the development of web-based applications. These applications, the first of which appeared in late 1998, are accessible to users around the world through a standard web browser. They can be tailored for specific industry sectors, involve no installation and present no configuration headaches (Figure 5).

 





Figure 5. A web-based application to create and analyze IC package thermal models: (top) VRML visualization of CBGA, PBGA and disk fin heatsink and (bottom) web form for creating a CBGA.

 

So what does the internet hold for the future? Anticipating the evolution of the internet is even more difficult than predicting changes in thermal management, but here are some trends that I'm seeing today:

 

  • Increasing use of VRML (Virtual Reality Modeling Language) as a method of viewing geometry and (eventually) analysis results.
  • The vast potential of XML, the Extensible Markup Language (www.xml.org), to integrate thermal analysis tools with new generations of PDM systems based on internet standards, for example "Windchill" from PTC (www.ptc.com); a small sketch of such a record follows this list.
  • Web-hosted applications for specific thermal analysis tasks such as heatsink design (e.g., R-Tools from R-Theta [www.r-theta.com]), component thermal analysis (e.g., FLOPACK from Flomerics [www.flopack.com]), or healing flawed MCAD data files (e.g., 3dmodelserver from Spatial Technology [www.3dmodelserver.com]).
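For example, a component thermal model could travel between tools as a small XML record along the lines sketched below. The element and attribute names are made up purely for illustration; no standard schema of this form is implied.

```python
import xml.etree.ElementTree as ET

# Build a hypothetical XML record carrying a two-resistor component model.
comp = ET.Element("component", refdes="U7", package="PBGA-256")
model = ET.SubElement(comp, "thermal_model", type="two_resistor")
ET.SubElement(model, "theta_jc", units="K_per_W").text = "8.0"
ET.SubElement(model, "theta_jb", units="K_per_W").text = "12.0"
ET.SubElement(comp, "power", units="W").text = "2.0"

print(ET.tostring(comp, encoding="unicode"))
```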

 

My predictions:

 

  • More web-based applications will be introduced for parts of the thermal analysis process, complementing desktop analysis tools.
  • Wide availability of library data will be ensured through vendor and central web sites.
  • VRML and XML will become the standards for graphics and data exchange respectively.

 

Conclusions

So what will tomorrow's thermal analyst be using? Although computing power will yield faster solutions and smoother graphics, I believe that tomorrow's analysis tools will still be recognizable to today's engineers. Some of the functionality may migrate to the web, but this will be transparent to the end user.

 

The main difference will be seen in the increasing number of data sources that the analyst will be able to draw on, including imported geometry and board layout files, web-based tools for specialized applications, and more accurate component thermal models in public and corporate libraries. The end result: models that today take one or two hours to create will take minutes.

 

Some final thoughts

If I seem to have concentrated on areas such as integration with other design and analysis tools, it is because I see these as the areas that will change the most over the next five to ten years. Today's analysis tools - whatever their underlying technical basis, gridding system and turbulence model - are all more than capable of calculating flow and temperature fields with a degree of accuracy that is more than acceptable for design purposes. After all, what value is there in a calculation accurate to 0.1°C when the input power levels are 100% out!

 

The real advances of the next decade will come in areas that tangibly and significantly enhance the productivity of thermal designers: making it easier and quicker to assimilate and simplify the varied data sources contributing to an analysis, and to manage and digest the resulting volumes of data.

 

But if anything is certain about the future, it is that it is unpredictable! When one tries to look ahead in an industry as dynamic as today's electronics design industry, one is doomed to almost inevitable failure. If nothing else, I hope this article sparks some thoughts on your own application of thermal analysis to the very real and pressing problems of designing today's electronic systems. And if you have any predictions of your own, please let me know.

 

Steve Addison, Ph.D.
Director of Marketing

Flomerics Inc.
2 Mount Royal Avenue
Marlborough, MA 01752

 
