Sampling – the four essentials in summary

  1. Physical sampling
  2. Sampling in time
  3. Presenting the sample to the analyser
  4. Optical scanning

1. Physical sampling

Revealing the true chemical composition of a sample by FT-NIR is not trivial, even though it may be very easy to reach some kind of result in seconds. It is even less trivial to document that you not only know what is in the sample, but that this new information speaks for an entire field, tank, truckload, or process state.

To understand the essence of proper sampling, only one fundamental principle needs to be followed. It is written here in slightly simplified form for clarity in this introduction:

“All particles/parts must have the same (non-zero) probability of influencing the spectrum on which a later prediction will be made”

The practical world will present us with a few more challenges than can be fixed with one rule. Let us look at a few of them to understand what a Quant or InSight Pro system must handle. For a full understanding we encourage a deeper study of the Theory of Sampling.

Sampling can also be viewed as the steps one needs to undertake to battle the unavoidable heterogeneous nature of most products and processes.

Heterogeneity will often present itself in two basic forms. In the Theory of Sampling (TOS) these are denoted DH and CH, short for distributional vs content heterogeneity.

Distributional heterogeneity simply means that a given concentration is not constant over time, whereas content heterogeneity means that at a given time (t) the particles are not evenly distributed within the sample. This is illustrated below.
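The two heterogeneity types above can be sketched with a small simulation. All numbers and function names here are invented for illustration, not TOS notation or real process data:

```python
import random

random.seed(1)

def process_concentration(t_minutes):
    """Distributional heterogeneity (DH): the process mean drifts over time.
    Illustrative only - a slow drift around a 5.0 % target."""
    return 5.0 + 0.8 * ((t_minutes % 60) / 60.0 - 0.5)

def draw_particles(mean, n=1000):
    """Content heterogeneity (CH): particles within ONE sample scatter
    around that sample's mean (assumed Gaussian scatter)."""
    return [random.gauss(mean, 0.5) for _ in range(n)]

# DH: two samples taken 30 minutes apart have different true means.
assert process_concentration(0) != process_concentration(30)

# CH: even at a single instant, the particles in one sample are not identical.
particles = draw_particles(process_concentration(0))
assert max(particles) - min(particles) > 0
```

The point of the sketch is only the distinction: DH lives on the time axis, CH lives inside a single extracted sample.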

Another fundamental challenge is what we could call the accessibility challenge. To truly follow the first and only principle – the equal-probability paradigm – all particles must be equally accessible, and that is far from true in most cases.

The Theory of Sampling operates with dimensionality in sampling and defines four dimensions, of which three are practical (1D, 2D and 3D) and one, called 0D, is an ideal but hypothetical scenario.

A tank, truck, ship or pile is seen as three-dimensional, whereas a green field is seen as 2D, and finally we generally think of smaller pipes as one-dimensional.

Needless to say, if you place yourself at the edge of a ship and wish to acquire a single sample, you are more likely to sample the top surface than any other place; hence what is hidden at the bottom will stay hidden to you, but will eventually reach the process as an unknown factor.

Anyone who has ever sat on a harvester knows this, and everybody else can quickly build a good intuition for the fact that the protein or moisture content is not the same all over the field. The low, pitted stretch may show a higher moisture content than the parts growing in the sandy areas of the field. Taking just one single sample will not tell us what the field may yield in total or per hectare.

Sampling a conveyor belt or a powder stream is truly a challenge which must be met with respect and proper science to yield the highest-quality analytical results.

In summary, the highest quality analytical results demand that:

  • Samples extracted for analysis are representative
  • All mass reduction (sub-sampling), handling and preparation follows strict protocols which ensure representativity
  • Sample presentation to the analyser is representative
  • Optical scanning of the sample is representative

All of the above is necessary to ensure that the FT-NIR spectrum obtained is representative, and only chemometric models based on representative samples, spectra and variation should be used for calibration and measurements.

A common major error is to extract one sample of X grams which fits directly in the analyser. The error can be divided into several sub-errors.

Firstly, we should never just “grab a sample” and trust this single point in time. It would be as wrong as asking the first voter in a public referendum and making that answer the official result. We all know that hundreds if not thousands of voters need a say before we get an idea of the outcome of the election. Sadly, quite often the process gets “asked” only once.

If we especially wish to battle content heterogeneity at a given time (see below), we need a composite sample – this is also a fundamental rule. Take a few sub-samples from “all over”, mix them, and from the mix take one or maybe a few sub-samples and run them through the analyser. The average result will be a much better estimate of the current process state!
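The composite-sample rule above can be sketched numerically. The lot, its moisture level and scatter are invented for illustration; only the principle – that averaging several increments beats one grab sample – comes from the text:

```python
import random
import statistics

random.seed(42)

# A heterogeneous lot: true moisture varies from spot to spot around 14 %.
true_mean = 14.0
lot = [random.gauss(true_mean, 1.5) for _ in range(10_000)]

def grab_sample(lot):
    """One single increment - the practice the text warns against."""
    return random.choice(lot)

def composite_sample(lot, n_increments=10):
    """Several increments taken 'all over', mixed, then analysed as one."""
    increments = random.sample(lot, n_increments)
    return statistics.mean(increments)

grab_errors = [abs(grab_sample(lot) - true_mean) for _ in range(1000)]
composite_errors = [abs(composite_sample(lot) - true_mean) for _ in range(1000)]

# The composite sample is, on average, a much better estimate of the lot.
assert statistics.mean(composite_errors) < statistics.mean(grab_errors)
```

With ten increments per composite, the average estimation error drops by roughly a factor of three (the familiar 1/√n behaviour of an average).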

This is typically not done – “we have no time” – but why waste time on getting to the wrong result fastest? That will never be a priority of Q-Interline!

2. Sampling in time

Sampling in time may be relevant for both laboratory units like the Quant and for the InSight Pro in-line systems. The core question is how often one should analyse to know enough to control the process and hence the product quality and overall yield and income. There are multiple answers to this question and many considerations to entertain, but here we will stick to a sampling perspective of the challenge.

In many practical settings a sample is taken every X minutes or hours, regardless of the process variation and only rarely based on statistics.

In principle this theme is about handling distributional heterogeneity and accepting the fact that processes are not stable.

Let us review the example below. The data is a simple simulation of a process scenario in which three sinusoidal curves have been overlaid and an element of noise added. The total time is 120 minutes.

The process is, in principle, stable over time around a zero offset from target, but a certain dynamic is present, and the timing of sampling becomes critical.

If a sample is drawn after 20 minutes (red arrow), the QC function would conclude that the process is running some 12% off target. If the operators were then to tune down the process, it would no longer reach a zero-average error over the two hours.

If we sample more often, the process dynamics may be revealed (green arrows), and with better insight it is possible to make better decisions. If the process is not stable over time, this argues for an InSight Pro solution over the laboratory units, and vice versa.
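A scenario of this kind can be reproduced with a short script. The three sine periods, the amplitudes and the noise level below are my own guesses, not the parameters behind the original figure:

```python
import math
import random
import statistics

random.seed(0)

def process_offset(t):
    """Offset from target at minute t: three sinusoids plus noise
    (all parameters assumed for illustration)."""
    signal = (8 * math.sin(2 * math.pi * t / 120)
              + 5 * math.sin(2 * math.pi * t / 37)
              + 3 * math.sin(2 * math.pi * t / 11))
    return signal + random.gauss(0, 1)

# One grab sample at t = 20 min can land far from the long-run average ...
single = process_offset(20)

# ... whereas sampling every 5 minutes over the 120 min reveals the dynamics
# and averages out close to the true zero offset.
frequent = [process_offset(t) for t in range(0, 120, 5)]
average = statistics.mean(frequent)
```

A single reading reflects wherever the sinusoids happen to be at that instant; the average of the frequent samples sits near zero, matching the "stable around zero offset" description of the process.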

3. Presenting the sample to the analyser

Acquiring a perfectly representative sample in time, space and all dimensionalities will be of little or no use if, in the end, the sample is not presented to the analyser in a proper way.

It may seem that by going on-line we skip a lot of problems, but keep in mind that an on-line analyser sees the process through a relatively small probe, which can be likened to a man studying the world through binoculars – we had better point him in the right direction. Likewise, a process analyser must have its probes and cells placed in a proper position to sample the process correctly. Think of powder sampling and how to avoid the fines or the coarser particles being overrepresented in the measurement.

Our laboratory Quant units can handle nearly all sample types through the proper choice of sampling device (accessory). We offer accessories supporting very different physical setups, from clear transmission to diffuse transmission and diffuse reflection in several variations.

Physical principle     Sample types        Q-Interline ref.
Transmission           Clear liquids       Vial sampling
Diffuse transmission   Milk products       GO
Diffuse reflection     Dairy products      Cup, Petri sampling
                       Soil and compost    Bottle sampling
                       Silage and hay      Spiral sampling

The InSight Pro unit can, with five different probe and cell configurations, be adapted to a wide range of process streams, applying the same physical principles as the Quant units.

Physical principle     Sample types                        Q-Interline ref.
Transmission           Clear liquids like edible oils,     Probe sampling
                       water, chemicals                    Cross pipe cell
Diffuse transmission   Milk products                       Cross pipe cell
Diffuse reflection     Dairy (butter, cheese)              Diff. refl. probe
                       Semi-fine powders                   Spoon probe

4. Optical scanning

Optical scanning is the final discipline to master in sampling: acquiring optimal spectral data which, combined with premium-quality chemical reference data, will yield great results with a minimum of effort.

All Q-Interline systems are a bit like a camera. You may go with the settings we have found optimal for most cases of the specific application (auto), or you may venture into a world of settings and options (manual). The latter is for the specialist only, but it may indeed be better than auto in some cases – just like a camera.

Very much in line with the “public referendum” in the chapter on physical sampling, here in optical sampling we should also apply the fundamental principles: all particles should have the same chance of becoming part of the spectrum, and we need to accept CH.

This means that all sampling options from Q-Interline intended for heterogeneous samples are designed not to look at a small spot but rather at the entire sample. For on-line systems the sample moves past the probes and cells, and by tuning the observation time we create a spectrum of a given “area”. The laboratory models will typically spin or even tumble the sample and in this way deliver on the promise to obey the fundamental sampling rules!

Once the sample is moving, we still have a number of parameters to play with.

The first is the so-called resolution. This is a fundamental difference between the base Quant engine of Q-Interline and products based on diode-array or dispersive technology, where the resolution is fixed.

Resolution is the ability to separate two adjacent peaks in the spectrum. Be careful: the bands in the NIR region are very broad and overlapping and can only very rarely be resolved, so overdoing the resolution will only lead to noisy data. All Q-Interline systems can run with resolutions from 2 to 64 cm-1. Most Q-Interline applications use a resolution of 32 cm-1, which means a datapoint spacing of 16 cm-1 – more than enough to capture all details at very low noise.

The next parameter is the observation time and the number of scans to co-add into the final spectrum. The number of scans per second varies with resolution, so again there is a trade-off: coarser resolution gives more scans per second and hence better sampling, and vice versa. Below are a few examples.

Resolution    Scans per min.

The number of scans can be freely set for on-line systems, but we recommend setting the observation time so that repeatability becomes a minor factor in the total error budget.
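The benefit of a longer observation time follows standard co-addition statistics: averaging N scans reduces random noise by roughly √N. A minimal sketch, with invented absorbance and noise figures rather than instrument specifications:

```python
import random
import statistics

random.seed(7)

TRUE_ABSORBANCE = 0.500      # the "clean" spectral value at one wavenumber
SINGLE_SCAN_NOISE = 0.010    # assumed rms noise of a single scan

def coadded_value(n_scans):
    """Average of n noisy scans; the noise shrinks roughly as 1/sqrt(n)."""
    scans = [TRUE_ABSORBANCE + random.gauss(0, SINGLE_SCAN_NOISE)
             for _ in range(n_scans)]
    return statistics.mean(scans)

def empirical_noise(n_scans, trials=2000):
    """Standard deviation of the co-added result over many repeats."""
    return statistics.pstdev([coadded_value(n_scans) for _ in range(trials)])

# Co-adding 16 scans cuts the noise to roughly a quarter (1/sqrt(16)).
assert empirical_noise(16) < empirical_noise(1) / 2
```

This is why the observation time can be chosen so that spectral repeatability becomes a minor term in the total error budget: co-adding buys noise reduction at the cost of time.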

For Quant units with a spinning accessory, the number of scans and the observation time are set to match one or possibly two full rotations, but NEVER a fraction of a rotation! If we run 1.5 turns in a Petri sampler, half the sample gets twice the say, and that would be a gross violation of good sampling practice – an increment weighting error (IWE) if one studies the Theory of Sampling.
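The whole-rotation rule can be sketched as a small helper that matches the co-add count to full turns. The spin rate and scan rate below are hypothetical values, not Quant specifications:

```python
def scans_for_full_rotations(spin_rpm, scans_per_second, max_rotations=2):
    """
    Choose a number of co-added scans matching 1, 2, ... FULL sample
    rotations, never a fraction - otherwise part of the sample is weighted
    twice (an increment weighting error in Theory of Sampling terms).
    """
    seconds_per_rotation = 60.0 / spin_rpm
    options = []
    for rotations in range(1, max_rotations + 1):
        n_scans = round(rotations * seconds_per_rotation * scans_per_second)
        options.append((rotations, n_scans))
    return options

# Hypothetical example: a 20 rpm spinner scanned at 8 scans per second.
# One full rotation takes 3 s -> 24 scans; two rotations -> 48 scans.
choices = scans_for_full_rotations(spin_rpm=20, scans_per_second=8)
assert choices == [(1, 24), (2, 48)]
```

Any co-add count between these values would correspond to a fractional rotation and over-weight part of the sample.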
