Interview with: Lindsay Chapman from the National Physical Laboratory in the UK about thermal analysis.



Laurie Winkless:

If we could get started by maybe introducing yourself and telling us a little bit about your research, and your background?

Lindsay Chapman:

Well, I work at the National Physical Laboratory as a senior research scientist. My background is in materials science and engineering. At the National Physical Laboratory, we’re trying to develop techniques for accurately measuring the properties of materials. My background is in high temperature measurement specifically, and so I look at measurement techniques that provide properties for input into models, whether they be processing or performance models, where we can optimize the properties of components, or alloys, by optimizing the composition or the structure of the materials in use.

Laurie Winkless:

That’s perfect. It’s a really interesting area actually, because I know that NPL is the National Metrology Laboratory for the UK, so the expertise on measurement goes well beyond just materials. But obviously, because of your background and your research area, why don’t we talk a little bit more about the measurement challenges, specifically in thermal analysis, and particularly, I guess, at higher temperatures. So could you maybe tell me about some of the measurement techniques that you use within your work at NPL, and give us a flavor of why thermal analysis at elevated temperatures brings with it so many challenges?

Lindsay Chapman:

Well, thermal analysis is used to provide properties that are commonly used to model the production or performance of components that are actively used in service, and so these either tell you how to optimize the production, or optimize the performance, and for those, everything has to operate at a temperature. Some temperatures are more elevated than others, and some obviously are below ambient. Some of the issues come from the fact that, of course, as soon as you try to measure something, you’re not exactly replicating the conditions that it would experience in reality. So we’re trying to develop techniques that will allow us to evaluate the properties as closely as we can to those conditions. The challenge comes from the limitations of the apparatus that we can either buy, or build and maintain.
So for example, thermal conductivity is a critical property for process modelling and performance modelling, but at the moment, in order to measure it directly, there is a temperature limitation of about 500 degrees Celsius. This means that, when you want to obtain values above that, you have to use a combination of techniques. Now, at NPL, we have one of the most accurate capabilities for determining thermal conductivity at or below 500 degrees Celsius. It’s UKAS-accredited, and has a very low uncertainty of measurement. However, when you start using multiple techniques in order to determine the properties needed to make a calculation of thermal conductivity, you are introducing extra uncertainties. So a common approach would be to measure density with respect to temperature, so you need thermal expansion; specific heat, again with respect to temperature; and also thermal diffusivity with respect to temperature.
All of those require different measurement techniques. For density, you can measure it by the Archimedean method or you can use pycnometry, and for thermal expansion, in order to determine the density at higher temperatures, you can use, for example, a piston dilatometer. However, when you want to measure specific heat, you have to use a separate calorimeter, and when you try to determine thermal diffusivity, there are a few different techniques: thermography, or the one that I use, which is laser flash. All of these different techniques use different sizes of sample. They also use different conditions when you’re making the measurement. So thermal expansion, typically you would measure at a heating rate of between one and two degrees Celsius per minute. Laser flash, for thermal diffusivity, requires the sample to be stabilized at temperature before you rapidly heat one face of the sample to determine how quickly that heat travels through it. And then, for specific heat, there are various different techniques. A commonly-used technique, which has the benefit of rapidity, is differential scanning calorimetry; however, this is often carried out at a ramp rate of 5, 10 or 20 degrees Celsius a minute. So before you even start looking at the materials that you’re measuring, you’ve got different conditions within the apparatus, and all of these conditions may actually bear no resemblance to what happens during processing. So you could have cooling rates of hundreds of degrees Celsius per minute, or per second, depending on the manufacturing technique.
So the challenge, when you’re providing values for this, is not only getting the number, but getting a high-quality number out of the apparatus. As I was taught by Professor Ken Mills when I first started working at high temperatures, there are two rules of high temperatures: the first rule is that everything reacts with everything else; the second rule is that they react quickly. My own personal addendum to that is that, once it’s started, it’s very difficult, when it’s in an apparatus, for you to do anything to stop it. So when you’re making measurements on these materials, you have to be very sure that the materials you’re measuring are not going to react with the containment system. For some methods of calorimetry you can get away without having a container, because you can use levitation techniques, but normally there will be contact between the material you’re trying to measure and the measurement apparatus.
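
To make the combination of properties Lindsay describes concrete: thermal conductivity is typically calculated as the product of thermal diffusivity, density and specific heat capacity. The short Python sketch below illustrates that calculation and how the separate measurement uncertainties combine; all of the numerical values and uncertainty figures are illustrative assumptions, not NPL data.

```python
# Minimal sketch (not NPL's code): combining separately measured quantities
# to estimate thermal conductivity above the direct-measurement limit,
# using the standard relation lambda = alpha * rho * cp.
# All numbers below are illustrative placeholders, not measured values.

import math

alpha = 5.0e-6   # thermal diffusivity, m^2/s (e.g. from laser flash)
rho   = 8.0e3    # density, kg/m^3 (e.g. from Archimedean method + dilatometry)
cp    = 6.0e2    # specific heat capacity, J/(kg K) (e.g. from DSC)

thermal_conductivity = alpha * rho * cp   # W/(m K)

# Each input carries its own relative standard uncertainty; for a simple
# product of independent quantities they combine in quadrature.
u_rel = {"alpha": 0.03, "rho": 0.005, "cp": 0.02}   # assumed values
u_rel_lambda = math.sqrt(sum(u**2 for u in u_rel.values()))

print(f"lambda ≈ {thermal_conductivity:.1f} W/(m K) "
      f"± {100 * u_rel_lambda:.1f} % (k=1)")
```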

Laurie Winkless:

And Lindsay, in these different techniques, then, if you’ve got lots and lots of different techniques, and lots of different sample sizes, so you may not even be measuring the same piece of material within each system, just a very simple question is: how do you measure temperature within these systems? Is there always a thermocouple involved?

Lindsay Chapman:

There is usually a temperature measurement system. It depends on the type of technique. So for thermal expansion, you will normally have a thermocouple quite close to the sample, but not touching the sample, because that in itself could alter the temperature measurement. In thermal diffusivity measurements, you use two different techniques to measure the temperature, because you have the stable temperature before you apply the laser pulse to the sample, which is determined by the thermocouple, and then, of course, you’ve got the temperature detection on the opposite face of the sample, which is determined by an infra-red detector. That, of course, has a different measurement range from the thermocouple, so all of those aspects have to be calibrated separately.
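
For readers unfamiliar with the laser flash method, the rear-face temperature rise recorded by the infra-red detector is commonly converted into a diffusivity value using the classic Parker relation. The sketch below shows that relation with made-up example numbers; it is an idealized, adiabatic illustration without the pulse-width and heat-loss corrections a real analysis would apply.

```python
# Illustrative sketch only (not the instrument's analysis software): the
# classic Parker relation used in laser-flash analysis, which estimates
# thermal diffusivity from the sample thickness and the time for the
# rear-face temperature (seen by the IR detector) to reach half its
# maximum rise. Numbers are made up for illustration.

def parker_diffusivity(thickness_m: float, t_half_s: float) -> float:
    """Thermal diffusivity in m^2/s from the adiabatic Parker model."""
    return 0.1388 * thickness_m**2 / t_half_s

# Example: a 2 mm thick sample whose rear face reaches half-rise after 0.11 s
alpha = parker_diffusivity(2.0e-3, 0.11)
print(f"alpha ≈ {alpha:.2e} m^2/s")
```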

The other problem you have in specific heat measurement, for example in differential scanning calorimetry, is that because of the problems of sample containment and reaction, you have to isolate the sample, and often at high temperatures you’re very restricted in the sample containment you can use. So for example, you might want to use platinum pans, because they can withstand high temperatures, but they will react with an alloy at relatively low temperatures and potentially destroy the apparatus, so it’s common to use alumina, or a similarly inert ceramic material, to make the measurement. These have the disadvantage that, at higher temperatures, they can effectively become transparent, so you’re changing the characteristics of the crucible with respect to the temperature determination throughout the measurement. If you use a combination of those two containment systems to protect the apparatus, but also to prevent the radiation effects from becoming dominant, then you’re introducing significant layers between what’s actually happening in the sample and the temperature determination, which will be outside all of these crucibles. So it is possible, and we’ve done work, to try to model what’s going on inside the apparatus, to take into account these different layers where the sample is situated, to fully characterize the apparatus that we’re using, and to try to minimize the uncertainties associated with that temperature measurement.
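
As a very rough illustration of why those extra layers matter, one can treat the sample and the crucibles as thermal resistances in series between the sample and the point where temperature is sensed. The sketch below does only that; the layer thicknesses, conductivities, geometry and heat flow are all assumed values, and the apparatus models Lindsay describes also account for radiation and contact effects that this deliberately ignores.

```python
# A deliberately oversimplified sketch (not the model from the interview):
# treating the sample, an inner platinum liner and an outer alumina crucible
# as 1-D thermal resistances in series, just to show how each extra layer
# separates the sample from the point where temperature is actually sensed.
# Thicknesses and conductivities are assumed, order-of-magnitude values.

layers = [
    # (name, thickness in m, thermal conductivity in W/(m K))
    ("sample",           1.0e-3, 25.0),
    ("platinum liner",   0.2e-3, 72.0),
    ("alumina crucible", 0.5e-3, 30.0),
]

area = 2.0e-5  # m^2, assumed contact area

# Series conduction: R = sum(thickness / (k * A)); a steady heat flow Q
# then produces a temperature offset dT = Q * R between the sample and
# the outer crucible surface where the temperature is determined.
R_total = sum(t / (k * area) for _, t, k in layers)
Q = 0.05  # W, assumed heat flow
print(f"temperature offset ≈ {Q * R_total:.3f} K across the layers")
```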

Laurie Winkless:

So then, is modelling one of your key tools, in terms of trying to establish a good measurement practice across all of these techniques, with their huge number of variables?

Lindsay Chapman:

Modelling is certainly one of the tools that we’re trying to use to understand the apparatus. I think, from a good practice point of view, it’s best to start with analyzing the individual components that you’re trying to measure, to get the best possible value for the uncertainty of those measurements. So for example, we calibrate our thermocouples on a regular basis, but we also evaluate the uncertainty in the sample thickness measurement, in the time response of the equipment and of the analogue-to-digital converter, and in all of the separate measurements that go into making the calculation of the value. But yes, when it comes to what’s actually physically going on in the apparatus, then modelling is a helpful tool. We recently published a paper, written with my colleague Louise Wright, where we’re trying to model the calorimeter. But there are two aspects to any of that kind of modelling. For the actual instrument, we can obtain diagrams, we can determine the boundary conditions, and we can make measurements; we can actually determine the dimensions, for example. However, the second part of the modelling, which we like to attempt, is to model what’s happening within the sample, because where the sample meets the apparatus is going to have an impact on the heat transfer through the sample, and also on the temperature measurement; and of course, different emissivities of samples, if you’re using the thermography sort of method, will influence the temperature measured from that sample. So it’s important to have modelling of the apparatus to support it, but, from a measurement institute point of view, the starting point has always got to be the initial input parameters.
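
As an illustration of the kind of bottom-up uncertainty budget described here, the sketch below combines assumed component uncertainties for a laser flash measurement, where the thickness term enters twice because diffusivity scales with thickness squared. The individual figures are placeholders, not NPL’s accredited values.

```python
# A small sketch of a bottom-up uncertainty budget, using the laser-flash
# case as an example (alpha ∝ L**2 / t_half, so the thickness term enters
# twice). The component uncertainties are assumed values for illustration.

import math

u_rel_thickness = 0.001   # relative std. uncertainty of sample thickness L
u_rel_timebase  = 0.002   # relative std. uncertainty of the half-rise time
                          # (includes the ADC / detector time response)
u_rel_baseline  = 0.005   # relative contribution from baseline/pulse corrections

# Sensitivity coefficients from alpha = 0.1388 * L**2 / t_half:
contributions = {
    "thickness (x2)": 2 * u_rel_thickness,
    "time base":      u_rel_timebase,
    "corrections":    u_rel_baseline,
}

u_rel_alpha = math.sqrt(sum(c**2 for c in contributions.values()))
for name, c in contributions.items():
    print(f"{name:15s}: {100 * c:.2f} %")
print(f"combined (k=1)  : {100 * u_rel_alpha:.2f} %")
```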

Laurie Winkless:

It’s really interesting. I wasn’t aware that there were so many different uncertainties within your system, and then, of course, within your sample too. So I just have a really quick question, which is: if you’re trying to model, say, the system that you’re using, and you’ve purchased it from a manufacturer, what’s your relationship like with those manufacturers? Do they see you as difficult customers, because you ask questions of their kit that others don’t? Or do they see it more as a collaboration, and that they learn from you?

Lindsay Chapman:

More often, it’s seen as a collaboration, because of course, if they can demonstrate that their apparatus has been investigated in this way, and can be shown to be very effective at making these measurements, then it’s a selling point for them. It does become difficult sometimes when you have software involved in making those measurements, which obviously is proprietary, and there’s IP to be considered. So there does come a point where you have to appreciate that they have a commercial interest that they want to keep away from the measurement institute, but there are certainly opportunities for collaboration across different manufacturers, and through, for example, key comparison exercises. In Germany there was a huge comparison exercise in which several institutes took part, but also several manufacturers volunteered their apparatus as well, to make measurements on the same material, to compare the properties that were being measured on a potential reference material. This is an example of how science is challenging and unusual, in that you’re often competing and collaborating with exactly the same people. You’re competing for funding, but you also have to collaborate in order to validate your results.

Laurie Winkless:

Yeah, that’s a really good point actually. It’s true: you all want to agree on a value, you all want to get a real value and not just a generic number, so you do have to collaborate. I wonder, then, is that something that NPL does a lot of? Is this part of NPL’s role, in establishing good, low uncertainties within thermal analysis systems, for example?

Lindsay Chapman:

Absolutely. The BIPM has a huge amount of activity in this area, and also, through EURAMET, NPL are involved in various collaborative projects, generally across all of the activities at NPL, and in the materials division, yes, we’ve completed comparison activities to look at reference materials for high temperature. Think of the application of trying to measure accurately for the engineering associated with, for example, the safety case for nuclear power stations: if you have good reference materials for the thermal analysis of the engineering materials used to build power stations, then it’s more likely that these will be built safely, and will get approval to be built from the design point of view. So it’s very important that you have good collaboration with the measurement institutes around the world, really, and I’m about to participate in another international key comparison with colleagues from Japan, France, Korea and China, in order to look at suitable reference materials for the laser flash apparatus.
But it does become quite difficult, when you’re looking at what’s ideal for a reference material, to then relate that to what I would call engineering materials, because the qualities that a reference material requires are that it’s homogeneous, that it’s hopefully inert throughout the temperature range, so that it doesn’t change over the entire temperature range that you’re going to use it over, and that it also doesn’t react with the apparatus, so that the containment problem is easily solved. However, when it comes to measuring the engineering materials, you’ve got very complex alloys sometimes, or ceramic materials, that aren’t going to behave in the same way as a reference material, and we don’t want them to. The question then becomes: can we be sure that the technique that we’ve characterized to work accurately for reference materials is also going to behave in the same way when it comes to measuring the materials used for engineering applications? So, for example, a nickel alloy has a complex chemistry to start with, and is considered unlikely to be in chemical equilibrium even at room temperature. We optimize the properties of nickel alloys by various methods: by additions to the composition, sometimes by removing elements from the composition, but also by heat treatment effects. What we need to be sure about, when we’re measuring these alloys, is that we’re not introducing new effects through the measurement technique that will unduly influence the values that we’re trying to determine.

Laurie Winkless:

That’s a very interesting point, actually. So you could potentially change the material just by measuring it? It almost seems like a quantum effect, in some ways. But I was just wondering: we’ve talked about the system, and we’ve talked a little bit about the materials, the manufacturers that you have done some work with, and the international collaborations you have. What about the end users of these materials, of these nickel alloys, for example? I’m guessing they’re kind of aeronautical, engineering-type companies. Do you do any work with the end users of these materials, too?

Lindsay Chapman:

Yes, it’s very important to demonstrate, for a materials-based project, that you’re measuring something which is needed for the real world, and, as well as the sort of aero-engine manufacturers, there are also power generation applications, and also marine applications. We’ve done a lot of work on the processing of all different kinds of alloys, which would use a similar technique, but perhaps they are trying out new compositions. We have end users who manufacture, for example, medical implants that are metallic, and so we do have to demonstrate that we have interest from end users in order to make our projects viable. And, of course, we need to make sure that our techniques are available before end users realize that they may be useful. So we have to be working on apparatus, for example, at temperatures that end users aren’t yet pushing for. For a lot of the measurements I do, there’s a lot of interest around 1,200 or 1,300 degrees Celsius, but I’m trying to optimize my apparatus so that we can use it up to 2,000 degrees Celsius, because as they strive for efficiency, in particular in engines, whether it’s power generation or aero engines, they’re going to try to push the engines to work at higher temperatures to increase the efficiency, and so we need to be sure that we’ve got the measurement techniques to be able to measure those materials when the manufacturers decide that that’s where their research is going to take them.

Laurie Winkless:

Yes, so you’re trying to keep NPL ahead of the game, really, which is excellent. It’s so great speaking to you, Lindsay. I was just wondering, for the listeners of the Materials Today podcast, if any of them might have the opportunity to hear you speak more on this topic, and on your research, any time in the future?

Lindsay Chapman:

Well, there are two opportunities, one of which is more accessible than the other. I’ll be speaking at the Optimum conference in September, which is Optimizing Performance Through Integrated Modelling of Microstructure, the Institute of Materials, Minerals and Mining conference. There’s also the European Conference on Thermophysical Properties, but I suspect that that’s a niche interest for a lot of the listeners.

Laurie Winkless:

Perfect, thank you so much. Now, before we finish up, I have to ask you a final big question that we ask all of those we interview on this podcast.  So, in your opinion, what are the other hot topics (no pun intended!) in materials science?

Lindsay Chapman:

Considering my background was in engineering, and although I find science to be absolutely fascinating, for me the application of science is the important thing, and we are currently experiencing the climate change that has come about because of our adventures dating from the Industrial Revolution. So, in terms of how we are going to respond to that challenge, whether it’s how we are going to ensure that the population has water, or how we ensure that we have enough power generation to meet our energy needs in the future, by whatever method that is, whether it be nuclear or wind or wave or solar, or indeed fusion, the engineering challenges associated with that will be where a lot of materials science will play a huge part, as we try to optimize the techniques we currently have, and also develop things like thermoelectrics, which we’re trying to work with more and more.