Thermal Interface Materials (TIMs) play a key role in thermal management of electronic components: they enhance heat transfer from a heat-generating component to a heat dissipater, or heat sink. One important consideration when selecting a TIM for your application is the material's ability to transfer heat, which is typically specified as thermal conductivity and/or thermal impedance.
Across the industry, manufacturers often publish thermal conductivity in units of W/(m·K) and thermal impedance in units of °C·in²/W on their datasheets. So what is the difference between these two, and how should you weigh them when selecting a TIM?
Thermal conductivity is a material property that describes a given material's ability to conduct heat: the higher a material's thermal conductivity, the better a thermal conductor it is. In a homogeneous material, this property is independent of size, shape, or orientation. Because it ignores application-specific effects such as interfaces, thermal conductivity is an idealized value.
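To illustrate how thermal conductivity enters a heat-flow calculation, the sketch below applies Fourier's law for one-dimensional, steady-state conduction through a flat layer (Q = k·A·ΔT/t). The pad dimensions and conductivity value are hypothetical, chosen only for illustration.

```python
def conductive_heat_flow(k_w_per_mk, area_m2, thickness_m, delta_t_k):
    """Fourier's law for 1-D steady-state conduction: Q = k * A * dT / t (watts)."""
    return k_w_per_mk * area_m2 * delta_t_k / thickness_m

# Hypothetical TIM pad: k = 3 W/(m*K), 20 mm x 20 mm footprint,
# 0.5 mm thick, with a 10 K temperature drop across it.
q_watts = conductive_heat_flow(3.0, 0.02 * 0.02, 0.0005, 10.0)
print(q_watts)  # -> 24.0 W through the pad
```

Note that doubling the thickness halves the heat flow for the same temperature drop, which is why thin bond lines are generally preferred.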
To understand thermal impedance, we must first understand thermal resistance and thermal contact resistance. Thermal resistance is the opposition a material's bulk offers to conductive heat flow; for a uniform layer, it is proportional to the layer's thickness divided by its thermal conductivity. Thermal contact resistance is the additional resistance that arises at each interface, where microscopic surface roughness and trapped air impede heat flow.
Therefore, the thermal impedance of a material is the sum of its bulk thermal resistance and all contact resistances. The lower a material's thermal impedance, the better a thermal conductor it is in that application. It follows that factors such as surface roughness, surface flatness, clamping pressure, the presence of an adhesive, material non-homogeneity, and material thickness all have a large impact on thermal impedance. Thus, thermal impedance is a better "real world" thermal property, as it accounts for more variables specific to the application.
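The sum described above can be sketched as a short calculation: bulk resistance (thickness divided by conductivity, per unit area) plus the contact resistance at each interface. The pad thickness, conductivity, and contact-resistance values below are hypothetical placeholders, not datasheet figures.

```python
def thermal_impedance(thickness_m, k_w_per_mk, contact_resistances_km2_per_w):
    """Area-normalized thermal impedance: bulk term t/k plus all contact terms.

    Returns K*m^2/W (numerically equal to degC*m^2/W for a temperature difference).
    """
    bulk = thickness_m / k_w_per_mk
    return bulk + sum(contact_resistances_km2_per_w)

# Hypothetical 0.5 mm pad with k = 3 W/(m*K) and two interfaces,
# each contributing 2e-5 K*m^2/W of contact resistance.
theta_si = thermal_impedance(0.0005, 3.0, [2e-5, 2e-5])  # K*m^2/W
theta_in2 = theta_si * 1550.0031  # convert: 1 m^2 = 1550.0031 in^2 -> degC*in^2/W
```

Because the contact terms depend on surface finish and clamping pressure, the same pad can show quite different impedance in different assemblies, which is why datasheets often plot impedance versus pressure.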
In summary, when comparing different TIMs for a specific application, you can begin with thermal conductivity for general comparisons, but thermal impedance versus pressure data will reflect your "real world" conditions far more accurately.