Being certain of the uncertainties
Secure the instrument's performance
You have already secured the benefits of quick sample measurement with technologies such as FT-IR and FT-NIR. Taking a sample and measuring specific content can now be achieved in minutes rather than days. Your company has improved its ability to produce goods according to specifications and is saving significant money using fast and reliable instruments.
The instruments, however, need supervision. They must be checked and re-calibrated at regular intervals in order to maintain the precision required to sustain the business case. Some results feed Big Data interpretations and internal data flows, and lost data quality almost certainly means bad data out.
AnalyticTrust is the Veracity in the 4V Big Data model
AnalyticTrust customers have shown that savings of between 0.1% and 1% on raw materials and finished products are possible if the instruments can be trusted to be continuously precise and accurate.
Making sure the “must-dos” are done
In daily operation AnalyticTrust manages a customisable schedule of verification activities, very much like a workflow management tool. Validation must be performed at regular intervals and reminders are sent to operators. In cases of non-compliance, escalation emails are sent to management.
In addition, AnalyticTrust integrates directly with instruments and computers, receiving the results of measurements and storing them in the instrument database. This allows AnalyticTrust to analyse the validation results and predict instrument degradation and failure. Automatic performance degradation detection is used to request unscheduled instrument calibration or parts replacement.
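AnalyticTrust's detection logic is not described here, but the underlying idea can be sketched: fit a trend to recent validation results and extrapolate to the date on which the monitored metric would cross its acceptance limit. A minimal illustration in Python (the function name, data and limit are hypothetical, not AnalyticTrust's API):

```python
from datetime import date, timedelta

def predict_limit_crossing(dates, values, limit):
    """Fit a least-squares line through (day, value) points and
    estimate the date on which the upward trend crosses `limit`.
    Returns None if the metric is not drifting toward the limit."""
    days = [(d - dates[0]).days for d in dates]
    n = len(days)
    mean_x = sum(days) / n
    mean_y = sum(values) / n
    sxx = sum((x - mean_x) ** 2 for x in days)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(days, values))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    if slope <= 0:  # metric is stable or improving
        return None
    return dates[0] + timedelta(days=round((limit - intercept) / slope))
```

With weekly validation results drifting upward by 0.1 per week from 1.0 and a limit of 2.0, the sketch predicts a crossing roughly ten weeks after the first measurement, which is exactly the kind of early warning that lets calibration or parts replacement be scheduled before results become unusable.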
5 critical aspects of Validation
AnalyticTrust helps you answer 5 critical questions
- Is the hardware okay?
- How about the lab precision?
- What is the influence of sampling errors?
- How precise is the analyser?
- What level of agreement can I trust between the lab and the analyser?
These five answers give you a detailed understanding of the analyser's performance in its wider context. They also guide your response: if the hardware or the lab is showing deviations, do not start recalibrating the analyser – fix the hardware and tune the lab method first.
Trusted results – every time.
Sampling and analysis must be carried out at regular intervals to ensure that the product content always follows the specifications. To ensure analytically correct measurements and validated test results, quality monitoring, scheduled control of the analytical equipment and regular calibration and adjustment of the instruments are extremely important. All FT-NIR and FT-IR instruments require ongoing maintenance and adjustment. With AnalyticTrust it is now possible to ensure the simple, fast and safe management of all your analytical instruments.
Easy and safe monitoring
AnalyticTrust monitors the analyser's hardware performance and may be set to automatically secure all data from instruments, sampling, calibrations, applications and reference laboratories. AnalyticTrust has a built-in, SPC-based alert system that automatically sends messages when results and trends deviate from the criteria in the QA plan. You can immediately adjust the analyser and return to the highest product quality.
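The core of such an alert system is a simple classification of each new result against control limits derived from the QA plan. A sketch of the idea, assuming Shewhart-style warning and alarm zones (the thresholds and function name are illustrative, not AnalyticTrust's implementation):

```python
def check_result(value, target, sigma):
    """Classify a measurement against Shewhart-style control limits:
    'warning' beyond 2 sigma from the target, 'alarm' beyond 3 sigma."""
    deviation = abs(value - target)
    if deviation > 3 * sigma:
        return "alarm"   # out of control: trigger an immediate message
    if deviation > 2 * sigma:
        return "warning" # drifting: flag for attention
    return "ok"
```

In practice the target and sigma would come from the instrument's historical validation data, and an "alarm" classification would trigger the email messages described above.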
Multiple test settings
With AnalyticTrust it is easy to set up tests for individual product types by selecting the type of test, the analysis frequency and the high-low limits for the instrument's performance. Choose between the traditional test types, where you look directly at the agreement between the instrument's performance and the reference method. Alternatively, you can select a series of tests that examine part of the total analysis plan, such as the sampling process, uncertainty from the reference method and instrument stability – areas that each affect the accuracy and precision of the results. Finally, you can implement the same tests globally throughout your organisation.
Put your structure in one place
Organise AnalyticTrust to align with your analytical organisation and structure. List your production sites and labs the way you prefer. Add your instruments including all relevant information, such as serial number, software version, support agreement and anything else that you find important. Define your applications for each instrument and select whether all products and parameters are equally important. Aim too high and end up going too low – know the feeling? Set the users up with roles and privileges. These could be operators, managers, administrators or specialists.
Establish your local and global plans
Now that you have completed the basic information, you design the structure of your QA plans. You can have as many plans as you need. A QA plan is the way of arranging validation in schedules – both local and global – defining which instruments it applies to, which applications, how often and how much.
You can adapt the QA-plans to contain validation of any mix of:
- Reference laboratory performance data from your own labs or from external service providers
- Primary sampling robustness
- Instrument repeatability
- Hardware performance
- Overall instrument agreement with reference methods
- Periodic calibration samples
- Daily pilot samples
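A QA plan of this kind is, in essence, a named schedule that groups validation tasks, their frequencies and their acceptance limits per instrument. A sketch of such a structure in Python (all names, intervals and limits are made up for illustration, not taken from AnalyticTrust):

```python
from dataclasses import dataclass, field

@dataclass
class ValidationTask:
    kind: str            # e.g. "daily pilot sample", "hardware performance"
    interval_days: int   # how often the task must be performed
    low: float           # lower acceptance limit
    high: float          # upper acceptance limit

@dataclass
class QAPlan:
    name: str
    scope: str                       # "local" or "global"
    instruments: list
    tasks: list = field(default_factory=list)

# Example: a local plan mixing three of the validation types listed above
plan = QAPlan(
    name="Powder line NIR",
    scope="local",
    instruments=["NIR-01", "NIR-02"],
    tasks=[
        ValidationTask("daily pilot sample", 1, low=-0.15, high=0.15),
        ValidationTask("instrument repeatability", 7, low=0.0, high=0.05),
        ValidationTask("reference agreement", 30, low=-0.3, high=0.3),
    ],
)
```

A global plan would simply carry `scope="global"` and list instruments across sites, which is how the same tests can be rolled out throughout an organisation.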
Increase your daily productivity
The QA plans are executed by the built-in QA scheduler, which creates an optimal sequence of actions. These actions appear in task lists, so work can be planned optimally.
Should the operator miss a due date for a task, a reminder email is sent. When the job is completed, the results are entered in the fields offered by the system. In this way the system automatically keeps track of what to do and when.
If tasks are not performed as specified by the QA plan, an email alert is escalated to the site manager and, eventually, all the way to the global QA manager.
At all organisational levels, a login is available with relevant status information, including an executive summary at the top level. All the data flowing to the top-management dashboards through the underlying data structures are validated, so their quality is suitable for management-level decisions on optimising work processes.
Total analysis quality management tool
Once collected, all data are immediately processed and presented in control charts of various types. All charts can be configured with limits and alerts will automatically be sent to the relevant personnel.
All charts use standard statistics such as standard deviations, correlations and means, but even more powerful is the use of statistical process control, with configurable alerts on significant trends and tendencies – the system offers you a chance to react before a deviation becomes a real problem.
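Trend-based SPC alerts typically rest on run rules such as the Western Electric / Nelson rules; for example, six or more consecutive points steadily increasing or decreasing signal a trend even while every individual point is still within limits. A minimal sketch of that rule (an illustration of the general technique, not AnalyticTrust's code):

```python
def has_trend(values, run_length=6):
    """Nelson rule 3: flag a run of `run_length` consecutive points
    that are strictly increasing or strictly decreasing."""
    for i in range(len(values) - run_length + 1):
        window = values[i:i + run_length]
        if all(a < b for a, b in zip(window, window[1:])):
            return True   # steady upward drift
        if all(a > b for a, b in zip(window, window[1:])):
            return True   # steady downward drift
    return False
```

This is why SPC-style monitoring can raise an alert before any single result breaches a limit: the drift itself is the signal.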