An expert at your shoulder

August 7, 2007
In Depth

Most modern analytical systems help the user to make good measurements, with some advertising 'single click' analysis. Generating meaningful data, however, is not simply about making a measurement. Now 'expert systems' are providing a helping hand to get that little bit extra out of your analysis

MAKING a good particle size measurement has much in common with all analytical techniques. Underpinning the measurement is the development and use of a methodology designed to give confidence that results produced are meaningful. Probably the most important requirement in the development of such a method is the experience and knowledge of an expert in the subject.

The common elements for most techniques include:

• obtaining a representative sample of material
• sample preparation
• selecting a suitable technique to measure the required parameter
• experiment design
• data collection
• data analysis

All of these elements are important, but the first five [see box] can eventually be incorporated into a defined method, which can become a standard measurement procedure or standard operating procedure.

The difficulty usually arises when we come to analysis of the data, because assessing the quality of the data requires considerable experience and training in the analytical technique and instrument being used. The example we take here is light scattering for nanoparticle size analysis. With many combinations of diagnostic parameters to consider, less experienced users can find it challenging to pinpoint the most important information.

Critical factors in good measurement

Representative sampling
This is probably the most important, yet studiously ignored part of any analytical method. It is obvious that any technique can only report measurements of the material with which it is presented. However, the difficulty of representatively selecting a gram, a milligram, or less from a large batch often results in the use of poor techniques. Errors at this stage can be one or two orders of magnitude higher than any errors produced by the analytical technique itself.
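
The scale of subsampling error can be illustrated with a short Monte Carlo sketch. The batch composition and particle "sizes" below are hypothetical numbers chosen purely for illustration: a batch in which 1% of particles are ten times larger, where very small subsamples scatter widely around the true mean while larger subsamples do not.

```python
import random

random.seed(1)

# Hypothetical batch: 1% coarse fraction dominates the mean "size".
batch = [1.0] * 9900 + [10.0] * 100
true_mean = sum(batch) / len(batch)

def subsample_mean(n):
    """Mean of a random subsample of n particles from the batch."""
    return sum(random.sample(batch, n)) / n

# Repeat the subsampling many times at two subsample sizes.
small = [subsample_mean(20) for _ in range(1000)]
large = [subsample_mean(2000) for _ in range(1000)]

def spread(means):
    """Range of the subsample means: a crude measure of sampling error."""
    return max(means) - min(means)

print(true_mean)       # 1.09
print(spread(small))   # wide scatter: tiny subsamples often miss the coarse fraction
print(spread(large))   # much narrower scatter
```

The point of the sketch is only that the sampling step, not the analytical instrument, sets the error floor when subsamples are too small to represent a heterogeneous batch.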

Sample preparation
Many samples will require some method of preparation to meet the requirements of the measurement technique or the information being sought. This can be the removal of oversize particles, dispersion so the primary particle size is measured, or dilution. As anyone who has managed to flocculate a stable dispersion knows, the dilution medium can be critical. Knowledge of the concentration and constituents of the continuous phase of the sample will assist in replicating this as the dilution medium.

Suitability of technique
Selecting a suitable technique for the analytical parameter required will depend on the information needed. The method designer must be aware not only of the specifications of the analytical system proposed, but also any drawbacks to the technique. This is important because most specifications given for a system will be sample dependent. In other words, a particular material may meet some of the requirements of the technique, but possibly not all. An awareness of the sensitivity of the technique to these issues, and the effect on the error in the result, is important. An interesting point to consider is the actual information being sought in terms of its accuracy and detail. Real understanding at this stage can be valuable for method development, especially when the full spectrum of possible data is not required.

Method development
Method development should involve validation or calibration of the technique or method used. A good modern analytical instrument will provide the necessary guidance.

Data collection
Again, a good analytical system will entirely automate data collection.

Webopedia, an online dictionary of computing terms, has a good definition: 'A computer application that performs a task that would otherwise be performed by a human expert'.

The ideal situation for any analysis technique would be to have an 'expert' available to consult at all times, giving advice on the quality of the data and, if necessary, how to improve it. As this would be an expensive luxury, the next best option is to distil the experience of the expert and, instead of compiling it as an indigestible book, incorporate it into a program that gives the appropriate information at the time it is actually required. This advice should comment not just on individual measurements, but on groups of measurements, to spot trends. The success of such an expert system can be judged by whether the data that passes its scrutiny would be acceptable to an experienced user of the system.
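
The distillation idea can be sketched as a small rule set in Python. The parameter names and thresholds below (count rate, polydispersity index, correlation intercept) are hypothetical values invented for this example, not those of any real instrument: each rule encodes one piece of expert judgement, and a separate check looks across a group of repeat measurements for trends that no single measurement would reveal.

```python
def check_measurement(m):
    """Apply simple expert rules to one measurement; return advice strings."""
    advice = []
    if m["count_rate_kcps"] < 100:
        advice.append("Low count rate: consider a higher sample concentration.")
    if m["polydispersity_index"] > 0.7:
        advice.append("Very broad distribution: result may not be meaningful.")
    if m["intercept"] < 0.6:
        advice.append("Poor intercept: check for dust or multiple scattering.")
    return advice or ["Good data quality."]

def check_series(measurements):
    """Look across a group of repeats for trends a single check would miss."""
    sizes = [m["z_average_nm"] for m in measurements]
    if len(sizes) >= 3 and all(b > a for a, b in zip(sizes, sizes[1:])):
        return ["Size rising between repeats: sample may be aggregating."]
    return []

# Three repeat measurements, each individually acceptable...
repeats = [
    {"count_rate_kcps": 250, "polydispersity_index": 0.12, "intercept": 0.90, "z_average_nm": 105.0},
    {"count_rate_kcps": 245, "polydispersity_index": 0.14, "intercept": 0.90, "z_average_nm": 118.0},
    {"count_rate_kcps": 240, "polydispersity_index": 0.15, "intercept": 0.88, "z_average_nm": 134.0},
]
for m in repeats:
    print(check_measurement(m))   # each repeat alone passes the rules
print(check_series(repeats))      # but the group reveals a rising-size trend
```

The design choice worth noting is the separation of per-measurement rules from group-level rules: the per-measurement checks judge data quality in isolation, while the series check is what lets the system "spot trends" as an experienced operator would.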

Modern analytical and university laboratories have to deal with two significant issues. The first is staff turnover. By the time a student, for example, has become sufficiently trained to be confident in a technique, it is time to write up the project and start the next one.
The second is the cost of staff in a laboratory. Laboratory personnel today are expected to make measurements using a wide range of techniques. Inevitably this makes it difficult, perhaps impossible, to become expert in each one. Comments from an expert – or an expert system – should therefore increase confidence in the results, or where there are problems in the data they should give suggestions for improvements.

Using light scattering techniques, the Zetasizer Nano from Malvern Instruments can measure the size, zeta potential and molecular weight of colloidal dispersions, emulsions and molecules in solution. Sample preparation has been simplified because samples can be measured at high concentration, reducing the need for dilution, and performing a measurement is a simple automated procedure.

Quality reports introduced in recent software releases allow examination of the data and warn if a parameter exceeds an acceptability threshold. These quality tests have been extended further by taking the experience of a number of experts in light scattering and incorporating this knowledge into the software, which continuously monitors the data produced. This is one step closer to the ideal of an 'expert at your shoulder'. Reports carry a simple 'good data quality' message or, where a comment on the data is deemed necessary, the most likely cause of the problem and a suggested solution. The software can, for example, suggest whether extending the measurement time would improve the data, or indicate if the sample is changing in some way during a series of measurements. This gives the operator a single place to look to confirm data quality. If a 'good' data comment is displayed, the user can be confident that the measurements are reliable and repeatable, and that the sample is suited to the technique.
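
The "would a longer measurement help?" question can be sketched with basic statistics. This is a simplified, hypothetical model, not the software's actual logic: it assumes purely random repeat-to-repeat noise, so the standard error of the mean falls as one over the square root of the number of sub-runs; if instead the repeats drift monotonically, the sample is changing and longer averaging will not help.

```python
import math
import statistics

def extend_advice(sizes_nm, target_se_nm=1.0):
    """Advise whether extending the measurement (more sub-runs) would help.

    Assumes random noise only, so the standard error of the mean size
    scales as 1/sqrt(n). A monotonic drift is treated as a changing
    sample, which more averaging cannot fix.
    """
    # A steadily rising size suggests aggregation, not noise.
    if len(sizes_nm) >= 3 and all(b > a for a, b in zip(sizes_nm, sizes_nm[1:])):
        return "Sample appears to be changing during the series; longer averaging will not help."
    sd = statistics.stdev(sizes_nm)
    se = sd / math.sqrt(len(sizes_nm))
    if se <= target_se_nm:
        return "Good data quality."
    # Invert se = sd / sqrt(n) to estimate the sub-runs needed.
    needed = math.ceil((sd / target_se_nm) ** 2)
    return f"Extend the measurement: roughly {needed} sub-runs needed for +/-{target_se_nm} nm."

print(extend_advice([100.1, 99.8, 100.3, 100.0, 99.9]))  # tight repeats
print(extend_advice([100.0, 110.0, 120.0]))              # steady drift
print(extend_advice([90.0, 110.0, 95.0, 108.0]))         # noisy repeats
```

Distinguishing noise from drift is the key step: the same spread in repeat results calls for opposite advice depending on whether it is random or systematic.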

A well designed expert system will incorporate the knowledge and understanding of experts in the field to assist with the use of a system or in subsequent data evaluation. Not only can it deliver cautionary messages about issues in the data that a less experienced user may miss, but it can also provide information to help optimise the measurement, and hence get the most from the technique.

By Malcolm Connah. Malcolm is the product manager for the Nanometrics group at Malvern Instruments that includes a range of technologies for nanomaterials characterisation.
