Presentation Profile

*KEYNOTE* Activities of The NELAC Institute (TNI) to Improve Data Quality

Currently Scheduled: 10/12/2022, 1:00 PM - 2:00 PM
Room: Exhibit Hall A4

Main Author
Jerry Parr - The NELAC Institute

Abstract Number: 99
Abstract:

Part 1: How a Quality Management System Improves Data Quality and Laboratory Performance

Since its inception, individuals active in the US National Environmental Laboratory Accreditation Program (NELAP) have struggled to find information to convince others that implementing a Quality Management System (QMS) according to the requirements in the TNI standard, Management and Technical Requirements for Laboratories Performing Environmental Measurements, improves data quality. Over the years, various surveys and studies have been conducted to try to prove this assertion. However, the problem has been, and always will be, that we never know the true concentration of a contaminant in a sample. All measurements are only estimates of the true concentration, and quality control data do not always correlate directly with the measurement of the actual contaminant. In 2019, TNI’s Advocacy Committee initiated a new effort to document improved laboratory performance and trust in the data provided by accredited laboratories. This presentation will summarize the outcomes of the Committee’s efforts and provide real examples of why implementing a QMS based on the TNI standard makes a real difference in both data quality and laboratory performance.

Part 2: Evaluating the Goodness of Instrument Calibration for Chromatography Procedures

Instrument calibrations in analysis methods using gas chromatography (GC) or liquid chromatography (LC) are usually created using either an average response factor (RF) or a linear (sometimes quadratic) regression equation. The quality, or “goodness,” of an average RF calibration is measured by relative standard deviation (RSD), whereas linear regression has historically been measured by the correlation coefficient (r) or coefficient of determination (r²). This presentation will demonstrate that the goodness of a calibration measured by RSD is quite different from that measured by r/r², and that while one is suitable as a calibration measure, the other is not. Relative standard error (RSE) is a way of extending the use of relative measures of calibration quality beyond the average RF calibration type.
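To illustrate how these measures differ, the minimal sketch below computes the RSD of an average RF calibration, the r² of a linear fit, and the RSE of that same linear fit. The calibration data and variable names are hypothetical, and the RSE calculation assumes the commonly used form with n − p degrees of freedom (p = 2 terms for a linear fit); it is not drawn from the presentation itself.

```python
import numpy as np

# Hypothetical calibration data: concentrations and instrument responses
conc = np.array([1.0, 5.0, 10.0, 25.0, 50.0, 100.0])      # e.g., ug/L
resp = np.array([980, 5150, 9900, 25600, 49400, 101000])  # peak areas

# --- Average response factor (RF) calibration and its RSD ---
rf = resp / conc                        # response factor at each level
rsd = 100 * rf.std(ddof=1) / rf.mean()  # percent relative standard deviation

# --- Linear regression calibration and its r^2 ---
slope, intercept = np.polyfit(conc, resp, 1)
pred_resp = slope * conc + intercept
ss_res = np.sum((resp - pred_resp) ** 2)
ss_tot = np.sum((resp - resp.mean()) ** 2)
r2 = 1 - ss_res / ss_tot

# --- Relative standard error (RSE) of the linear calibration ---
# Back-calculate each concentration from its response, compare to the true
# concentration, and pool the relative errors over n - p degrees of freedom.
calc_conc = (resp - intercept) / slope
p = 2  # number of terms in a linear fit
rse = 100 * np.sqrt(np.sum(((calc_conc - conc) / conc) ** 2) / (len(conc) - p))

print(f"Average RF RSD: {rsd:.1f}%")
print(f"Linear fit r^2: {r2:.4f}")
print(f"Linear fit RSE: {rse:.1f}%")
```

Because RSD and RSE are built from relative errors at each calibration level while r² pools absolute residuals dominated by the highest standards, the two kinds of measures can rate the same calibration very differently.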
