A correction scheme for thermal conductivity measurement using the comparative cut-bar technique based on 3D numerical simulation

Measurement Science and Technology, Volume 25, Number 5, 2014.

Changhu Xing1, Charles Folsom1, Colby Jensen1, Heng Ban1 and Douglas W Marshall2

1 Mechanical & Aerospace Engineering Department, Utah State University, Logan, UT, 84322, USA 
2 Idaho National Laboratory, Idaho Falls, ID, 83415, USA.

 

ABSTRACT

The comparative-axial-heat-flow (cut-bar) technique is a steady-state method for measuring the thermal conductivity of solids. Since its origination in the 1950s, it has largely been replaced by faster transient methods. However, certain sample characteristics, such as particular geometries or composite materials that require a bulk measurement, may preclude the use of transient techniques. In the cut-bar technique, the sample is sandwiched between two reference materials (meter bars) of known thermal conductivity to form the test stack. By measuring the temperature distribution along the test stack, which is controlled by a system of heaters and a “guard”, the unknown thermal conductivity of the sample may be calculated by applying the one-dimensional (1D) Fourier’s law (see figure). Three main drawbacks of the technique have been identified: uncertainty in the reference-material data, the effect of interfacial thermal contact resistance, and the effect of the thermal-conductivity mismatch between reference and sample. The first implies that the measurement uncertainty can never be less than the reference-material uncertainty. The remaining two effects essentially amount to increased deviation from the assumed 1D heat flow, even with guarding. A combination of experimental and computational studies found that the currently recommended guarding schemes cannot produce accurate results when the sample thermal conductivity differs markedly from that of the reference material. However, for a linear guard-temperature distribution applied to a given configuration, the bias error caused by thermal-conductivity mismatch can be minimized at a particular guard-temperature profile, termed the optimal guarding condition. Simulation explains the mechanism: the optimal guard essentially restores 1D heat flow through the test stack.
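The ideal 1D analysis described above can be sketched as a short calculation. This is a minimal illustration, not the paper's implementation: the function name, the use of two meter-bar temperature gradients, and the choice to average them are all assumptions for a stack of uniform cross-section.

```python
def sample_conductivity(k_meter, grad_top, grad_bottom, grad_sample):
    """Ideal 1D Fourier's-law evaluation for the cut-bar stack.

    With equal cross-sections and no lateral losses, the axial heat
    flux q = k * dT/dx is the same in the meter bars and the sample,
    so k_sample = k_meter * (dT/dx)_meter / (dT/dx)_sample.

    k_meter     -- meter-bar thermal conductivity, W/(m*K)
    grad_top    -- temperature gradient in the upper meter bar, K/m
    grad_bottom -- temperature gradient in the lower meter bar, K/m
    grad_sample -- temperature gradient across the sample, K/m
    """
    # Average the two meter-bar gradients to estimate the stack flux
    # (illustrative choice; a real analysis might check their agreement
    # as a diagnostic for lateral heat loss).
    q = k_meter * 0.5 * (grad_top + grad_bottom)  # W/m^2
    return q / grad_sample
```

For example, a 30 W/(m·K) meter bar with 1000 K/m gradients bracketing a sample showing 2000 K/m would yield a sample conductivity of 15 W/(m·K). Any 3D heat flow in the real rig biases the measured gradients, which is precisely the error the guarding and correction schemes address.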
Experimentally, the developed guarding scheme was verified for a range of thermal-conductivity mismatches. As an alternative to the optimal guarding condition, a correction scheme was developed to minimize the system-induced bias error. The correction scheme compensates for any 3D heat flow caused by non-optimal guarding through a parallel simulation of the measurement system. The scheme was validated experimentally by measuring four samples covering sample-to-meter-bar thermal-conductivity ratios of ~0.15-4; the samples at the extremes of this range were certified reference materials. Applying either the optimal guarding condition or the correction methodology can reduce the measurement uncertainty to the level of the meter-bar thermal-conductivity uncertainty, and doing so becomes increasingly important as the sample-to-meter-bar mismatch grows.
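The correction idea of iterating a measurement against a forward model of the rig can be sketched as a fixed-point loop. This is a toy illustration only: the paper's scheme uses a full 3D numerical simulation of the measurement system, whereas `apparent_from_true` below is a hypothetical one-line bias model standing in for that simulation.

```python
def apparent_from_true(k_true, k_meter, bias_strength=0.05):
    """Toy forward model (NOT the paper's 3D simulation): assume the
    conductivity mismatch skews the apparent (1D-analysis) value
    slightly toward the meter-bar conductivity."""
    return k_true + bias_strength * (k_meter - k_true)

def corrected_conductivity(k_apparent, k_meter, tol=1e-9, max_iter=100):
    """Iterate a candidate true conductivity until the forward model
    reproduces the measured apparent value."""
    k = k_apparent  # initial guess: take the 1D result at face value
    for _ in range(max_iter):
        # Update by the residual between measured and modeled
        # apparent conductivity; converges when the model matches.
        k_next = k + (k_apparent - apparent_from_true(k, k_meter))
        if abs(k_next - k) < tol:
            return k_next
        k = k_next
    return k
```

With the toy bias model, a true conductivity of 10 W/(m·K) against a 30 W/(m·K) meter bar reads as 11 W/(m·K) apparent, and the loop recovers the true value. Substituting a real 3D simulation for the forward model, run in parallel with the measurement, is the essence of the correction scheme described above.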
