Setting the standard
8 Aug 2010 by Evoluted New Media
ISO13320:2009, the revised standard for laser diffraction particle size analysis, offers valuable advice for anyone seeking to optimise their use of this technology – Alan Rawle and Paul Kippax tell us how
Fast, non-destructive and fully automatable, laser diffraction particle size analysis is extremely well established and widely applied to many different particulate systems. Making routine measurements is now often simply a matter of loading a sample and pressing ‘go’. Along the way, however, users have to consider a range of factors, including instrument hardware, measurement methodology, results verification and optical model selection.
Following ten years of significant technological advance, ISO13320:2009 replaces the original laser diffraction standard issued in 1999. But what are the changes, and how do they relate to everyday use of laser diffraction?
Knowing how a laser diffraction system works helps when assessing hardware differences. As a sample passes through a laser beam, the particles in it scatter light. Small particles scatter light relatively weakly and to wide angles, while large particles scatter more strongly at narrow angles. The resulting scattering pattern (Figure 1) is converted to a particle size distribution using an optical model of light behaviour.
According to the new standard, laser diffraction is applicable to particles from 0.1 µm to 3000 µm. Hardware features listed as improving analysis within this range (and sometimes extending it) include additional light sources as well as forward scatter and backscatter light detectors.
As a first principles technique, laser diffraction requires no calibration. The standard does, however, stress the need to verify system performance, normally by measuring an appropriate standard. Reference material requirements remain unchanged: materials should “possess sufficient background data and a robust, written sample dispersion/measurement protocol suitable for laser diffraction analysis”. Non-spherical reference particles are permitted provided they are within an aspect ratio limit of 1:3. Certified reference materials (CRMs) consisting of spherical particles with a polydisperse distribution (x90/x10 in the range 1.5 to 10) are preferred, where x10 is the particle diameter below which 10% of the particle population lies, and so on. Known optical properties are essential.
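The percentile parameters referred to here can be illustrated with a short sketch. The distribution below is purely hypothetical, chosen so the x90/x10 ratio falls inside the 1.5–10 window the standard prefers for CRMs; the function simply interpolates percentile diameters from a cumulative volume distribution, which is the conventional definition, not text taken from the ISO itself.

```python
import numpy as np

def percentile_diameters(diam_um, vol_frac, qs=(0.10, 0.50, 0.90)):
    """Interpolate xQ percentile diameters from a volume-weighted size
    distribution (diam_um ascending, vol_frac summing to 1)."""
    cum = np.cumsum(vol_frac)  # cumulative volume undersize
    return {f"x{int(q * 100)}": float(np.interp(q, cum, diam_um)) for q in qs}

# Hypothetical polydisperse distribution (illustrative numbers only)
diam = [5.0, 10.0, 20.0, 40.0, 80.0]        # particle diameters, µm
vol = [0.10, 0.25, 0.30, 0.25, 0.10]        # volume fraction per size class
p = percentile_diameters(diam, vol)
ratio = p["x90"] / p["x10"]                 # polydispersity check, 1.5-10 preferred
print(p, ratio)  # roughly x10 = 5 µm, x50 = 15 µm, x90 = 40 µm, ratio = 8
```

For this made-up distribution the x90/x10 ratio comes out at about 8, so a CRM with this shape would sit comfortably within the preferred polydispersity range.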
Where the new ISO13320 does depart from the original version is in its definition of accuracy acceptance criteria. Since laser diffraction is a volume-based measurement technique, sampling errors for large particles cause greater uncertainty in the x90 than in the x10, and the revised acceptance criteria for reference materials reflect this. If CRM analysis fails to produce results that meet the criteria, then instrument performance is unacceptable. Many manufacturers already meet this specification with existing CRMs, but the revised acceptance criteria set the standard for instrument performance into the future.
Successful laser diffraction measurement requires appropriate method development, with sampling, sample preparation and measurement all very important. Here advice in the standard is much improved, reflecting the improved application knowledge of recent years.
It reinforces, for example, the need to ensure that the sample really represents the bulk. The largest errors in laser diffraction measurements are often traced back to sampling issues. These can be critical when measuring large particles or if a specification is based on size parameters close to the extremes of the distribution.
For sample dispersion, ISO13320:2009 highlights the importance of determining whether a fully dispersed or an agglomerated sample is to be preferred; something that will be application dependent. Where dispersion is necessary, optimum conditions can be established by monitoring particle size as a function of energy input, which for dry powders usually involves a ‘pressure/particle size’ titration. According to the standard, this will ideally identify a region where the particle size is nearly constant over a range of pressures, suggesting agglomerate dispersion without particle break-up. However, it makes clear that this is rarely achieved, and it is important to reference dry results against measurements made using wet dispersion, to check for break-up and/or milling of the primary particles.
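The plateau-hunting step of a pressure/particle size titration can be sketched in a few lines. This is an illustrative helper, not anything prescribed by the standard: the 2% stability tolerance and the titration data are assumed values, and a real method would still be cross-checked against wet-dispersion results as described above.

```python
def find_plateau(pressures, x50, rel_tol=0.02):
    """Return the (start, end) pressures of the longest run of consecutive
    titration points whose x50 changes by less than rel_tol between
    neighbours - a candidate 'dispersed but not milled' region.
    rel_tol = 0.02 (2%) is an assumed tolerance, not an ISO value."""
    runs, start = [], 0
    for i in range(1, len(x50)):
        if abs(x50[i] - x50[i - 1]) / x50[i - 1] > rel_tol:
            if i - start >= 2:          # a run needs at least two points
                runs.append((start, i - 1))
            start = i
    if len(x50) - start >= 2:
        runs.append((start, len(x50) - 1))
    if not runs:
        return None
    s, e = max(runs, key=lambda r: r[1] - r[0])
    return pressures[s], pressures[e]

# Hypothetical dry titration: x50 (µm) measured at increasing dispersion pressure (bar)
pressures = [0.5, 1, 2, 3, 4]
x50_um = [12.0, 8.5, 6.1, 6.0, 5.2]
print(find_plateau(pressures, x50_um))  # → (2, 3): size is stable between 2 and 3 bar
```

In this made-up dataset the size falls steeply at low pressure (agglomerates breaking up), stabilises between 2 and 3 bar, then falls again at 4 bar, which would suggest primary particle break-up beyond the plateau.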
For a wet dispersion, energy input is quantified by sonication time. Reference is made to ISO14887, which describes how to achieve fully controlled dispersion. The problem of excessive energy input remains; the standard notes that microscopy is useful for assessing the state of dispersion in wet systems (see Figure 2).
Finally, ISO13320:2009 includes a useful new appendix devoted specifically to the topic of achieving optimum measurement precision. This includes a recommendation to test at least five independent samples to assess the precision of a new method and to compare the precision achieved with requirements for product performance, to confirm suitability.
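Precision across repeat samples is conventionally summarised as a coefficient of variation on each reported percentile. The sketch below shows that calculation for five hypothetical x50 results; the numbers are invented for illustration, and any pass/fail threshold would come from the user's own product requirements, as the appendix advises.

```python
import statistics

def cv_percent(values):
    """Coefficient of variation (%) across repeat measurements:
    sample standard deviation as a percentage of the mean."""
    return 100 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical x50 results (µm) from five independently prepared samples
x50_repeats = [52.1, 51.8, 52.6, 51.9, 52.3]
print(round(cv_percent(x50_repeats), 2))  # CV of about 0.62%
```

The same calculation would be repeated for x10 and x90, where sampling effects typically make the spread wider than at the median.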
The ultimate step in laser diffraction analysis is conversion of the raw data into a particle size distribution. Successful deconvolution relies on an appropriate description of light behaviour: either Mie theory or the Fraunhofer approximation to it.
ISO13320:2009 provides a detailed description of both models, including the underlying assumptions applied in each case. In particular the assumptions relating to Fraunhofer, especially with respect to particle shape, are described more fully than in the previous version. It provides a complete summary of application for each model and emphasises the need to measure optical properties in every case. Pertinent considerations in optical model selection include: the particle imaginary refractive index (RI) (is it transparent or absorbing?); the refractive index difference between the particle and dispersant; and particle size.
Mie theory is confirmed as the preferred model for wide dynamic range measurements, providing similar results to Fraunhofer at large particle sizes and improved accuracy at small ones. Fraunhofer may be suitable for smaller (5–10 µm, for example) absorbing particles if the RI difference is high, but can be problematic, even for larger particles (>50 µm), if they are transparent and the RI difference is low. It is important to recognise the unpredictability of the errors introduced by the Fraunhofer approximation, which may result in inaccuracies in the reported particle size, the amount of material in each size fraction, or both.
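The inverse relationship between particle size and scattering angle that underpins both models can be seen directly in the textbook Fraunhofer (Airy) diffraction pattern for an opaque disc. The sketch below uses a He-Ne wavelength of 0.633 µm as an assumed illustrative value; it is a standard optics formula, not code from the standard or any instrument.

```python
import numpy as np
from scipy.special import j1  # Bessel function of the first kind, order 1

def fraunhofer_intensity(theta_rad, diameter_um, wavelength_um=0.633):
    """Normalised Fraunhofer (Airy) scattering intensity for an opaque
    disc of the given diameter; He-Ne laser wavelength assumed."""
    x = np.pi * diameter_um / wavelength_um * np.sin(theta_rad)
    x = np.where(x == 0, 1e-12, x)  # avoid 0/0 at theta = 0
    return (2 * j1(x) / x) ** 2

def first_minimum_deg(diameter_um, wavelength_um=0.633):
    """Angle of the first Airy minimum: sin(theta) = 1.22 * lambda / d."""
    return np.degrees(np.arcsin(1.22 * wavelength_um / diameter_um))

# Smaller particles scatter into wider angles:
print(first_minimum_deg(100.0))  # ~0.44 degrees for a 100 µm particle
print(first_minimum_deg(5.0))    # ~8.9 degrees for a 5 µm particle
```

Note that this pattern depends only on the particle diameter; it is precisely because the Fraunhofer model ignores refraction and absorption within the particle that it breaks down for transparent, low-RI-contrast materials, where Mie theory is required.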
Distributions for a refractory material produced from the same dataset using the two different models are shown in Figure 3. The aim was to detect the proportion of coarse and fine material. Using Mie theory the amount of fine material in the sample is determined correctly (78% < 80 µm) but the Fraunhofer approximation generates a particle size distribution that is clearly different. It overestimates the fine fraction in the sample because it cannot properly account for refraction of light in the particle phase (the particles were transparent).
Today, laser diffraction is the most widely used of the particle sizing technologies, with applications throughout industry and academia. ISO13320:2009 substantially updates the laser diffraction analysis standard, including much improved advice on method development and optical model selection, and reflecting a marked growth in application knowledge over the past decade. There is now substantial guidance to enable users to get the most from an investment in a laser diffraction analyser. Established systems such as the Mastersizer 2000 already enable the guidance to be fully implemented with respect to sampling, dispersion and measurement. Users can therefore take full advantage of the latest advice. The challenge now for manufacturers is to lessen the applications development load so that users can very simply and easily measure the parameters of interest for their products.