Flowmeters 101

Flowmeters 101 – Turbine and PD meters 

Flowmeters play a vital role in sanitary processing. They are used to measure incoming raw materials, incoming water supply, CIP solutions, ingredients in your formulation, finished product, and even wastewater leaving the plant. Given their use in critical applications, choosing the right type of meter with the correct level of accuracy for your application can make the difference in the quality of your product and save you thousands of dollars in lost revenue or profit.

Before we begin, let's cover a few basics of flow. Both gas and liquid flow can be measured as volumetric or mass flow rates, such as gallons per minute or pounds per minute, respectively. These measurements are related to each other by the density of the product. In engineering terms, the volumetric flow rate is usually given the symbol Q and the mass flow rate the symbol ṁ. For a fluid having a density ρ, mass and volumetric flow rates are related by ṁ = ρ · Q. In sanitary processing, one will typically find mechanical flowmeters (positive displacement, turbine), electromagnetic, and Coriolis flowmeters.
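To make the ṁ = ρ · Q relationship concrete, here is a minimal sketch of the conversion, assuming water at roughly 8.34 lb/gal (an illustrative density; substitute your product's actual value):

```python
# Minimal sketch: converting volumetric flow to mass flow (m_dot = rho * Q).
# Assumes water at ~8.34 lb/gal; substitute your product's actual density.

def mass_flow_lb_per_min(volumetric_gpm: float, density_lb_per_gal: float = 8.34) -> float:
    """Return mass flow rate (lb/min) from volumetric flow (gal/min) and density (lb/gal)."""
    return density_lb_per_gal * volumetric_gpm

if __name__ == "__main__":
    q = 50.0  # gal/min
    print(f"{q} gal/min of water is about {mass_flow_lb_per_min(q):.1f} lb/min")
```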

Turbine Flowmeters
Turbine flowmeters use the mechanical energy of the fluid to rotate a “pinwheel” (rotor) in the flow stream. Blades on the rotor are angled to transform energy from the flow stream into rotational energy. The rotor shaft spins on bearings. When the fluid moves faster, the rotor spins proportionally faster.

Shaft rotation can be sensed mechanically or by detecting the movement of the blades. Blade movement is often detected magnetically, with each blade or embedded piece of metal generating a pulse. Turbine flowmeter sensors are typically located external to the flowing stream to avoid the material-of-construction constraints that would result if wetted sensors were used. When the fluid moves faster, more pulses are generated. The transmitter processes the pulse signal to determine the flow of the fluid. Transmitters and sensing systems are available to sense flow in both the forward and reverse flow directions.
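Because each blade pass generates one pulse, turbine meters are typically characterized by a K-factor, the number of pulses per unit volume. The sketch below shows the basic arithmetic for turning a pulse count into a flow rate; the K-factor shown is a hypothetical placeholder, since the real value comes from the meter's calibration.

```python
# Minimal sketch: converting turbine-meter pulses to flow rate using a K-factor.
# K_FACTOR is a hypothetical value; use the K-factor from your meter's calibration sheet.

K_FACTOR = 870.0  # pulses per gallon (illustrative only)

def flow_rate_gpm(pulse_count: int, sample_time_s: float, k_factor: float = K_FACTOR) -> float:
    """Return flow rate in gal/min from pulses counted over sample_time_s seconds."""
    gallons = pulse_count / k_factor
    return gallons * (60.0 / sample_time_s)

if __name__ == "__main__":
    # 1,450 pulses counted in a 10-second window
    print(f"Flow rate: {flow_rate_gpm(1450, 10.0):.2f} gal/min")
```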


If you want more information, contact us by phone or email. 

Weights and Scales

Weights and Scales 

The measurement of ingredients in processing is fundamental, and it is important that the accuracy of the measurement is fit for purpose, in other words, that it meets the requirements of the application. However, every measurement is inexact and requires a statement of uncertainty to quantify that inexactness.

Accurate measurement enables us to:
• Maintain quality control during production processes
• Calibrate instruments and achieve traceability to a national measurement standard
• Develop, maintain and compare national and international measurement standards

Successful measurement depends on:
• Accurate instruments
• Traceability to national standards
• An understanding of uncertainty
• Application of good measurement practice

Weighing scales are devices used to determine weight and are divided into two main categories: spring scales and balance beam scales. Balance beam scales are the oldest type and measure weight using a fulcrum or pivot and a lever, with the unknown weight placed on one end of the lever and a counterweight applied to the other end. When the lever is balanced, the unknown weight and the counterweight are equal. Spring scales were introduced in the 1760s as a more compact alternative to the popular steelyard balance. Spring scales work on the principle of the spring, which deforms in proportion to the weight placed on the load-receiving end. Strain gauge scales became popular in the 1960s and use a special type of spring called a load cell. Strain gauge scales are the most commonly used in today’s market, but electronic force restoration balances are used in laboratory and high-precision applications. When discussing weights and scales, one question that often gets asked is “What’s the difference between accuracy and precision?” In short, accuracy describes how close a measurement is to the true value, while precision describes how repeatable that measurement is.

For example, a scale with an IP-54 rating is “protected against dust and splashing water.” The “5” means that protection from dust is not total, but dust does not enter in sufficient quantity to interfere with satisfactory operation of the equipment. The “4” means water splashed against the enclosure from any direction shall have no harmful effect. The highest IP rating for a scale is an IP-69K rating. This rating means that a strong water jet directed at the scale from four directions must not have any harmful effects: a jet nozzle is aimed at the scale at 0°, 30°, 60° and 90° while the scale sits on a rotating table, with water at 176° ± 8°F, from 4-6 inches away, at 1,250-1,500 psi. The test time is 2 minutes.
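The two characters after “IP” can be read independently: the first grades protection against solids and the second against liquids. As a rough illustration (descriptions abbreviated, and only the ratings mentioned above included), a small lookup like the following decodes a rating:

```python
# Minimal sketch: decoding an IP rating into its solids and liquids protection levels.
# Descriptions are abbreviated; consult the applicable IP code standard for full definitions.

SOLIDS = {
    "5": "Dust-protected: ingress not fully prevented, but not enough to interfere with operation",
    "6": "Dust-tight: no ingress of dust",
}
LIQUIDS = {
    "4": "Protected against water splashing from any direction",
    "9K": "Protected against high-pressure, high-temperature water jets",
}

def decode_ip(rating: str) -> tuple[str, str]:
    """Split a rating such as 'IP54' or 'IP69K' into its two protection descriptions."""
    digits = rating.upper().removeprefix("IP")
    first, second = digits[0], digits[1:]
    return (SOLIDS.get(first, "see the IP code standard"), LIQUIDS.get(second, "see the IP code standard"))

if __name__ == "__main__":
    print(decode_ip("IP54"))
    print(decode_ip("IP69K"))
```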

If you want more information, contact us by phone or email. 

Shrink Processing Equipment Maintenance Costs

How To Shrink Processing Equipment Maintenance Costs  

A key to keeping a processing facility running at optimum performance is routine upkeep on equipment. Preventative maintenance on your plant’s equipment is an excellent starting point to reduce costs, but there are other ways to lower overhead and keep your plant highly efficient. Here are a few questions to ask.

What’s Vital, What’s Not?

Prior to making a decision to purchase processing equipment, speak with its manufacturer and your plant’s operators and engineers about it. Be sure to discuss whether or not suggested preventative maintenance is in fact needed, as unneeded upkeep can cause equipment malfunctions. Further, be sure to review maintenance procedures annually, making adjustments as necessary. By following this strategy, your company can get the most out of its equipment while minimizing downtime caused by breakdowns.

Is Maintenance REALLY Needed?
Processing equipment manufacturers offer suggested timeframes for performing maintenance and even rebuilding machines. However, these recommendations may not align with your facility’s actual needs. If a particular piece of equipment’s manufacturer suggests maintenance every three months, but the equipment only runs for a few sporadic hours in that time frame, is the effort of upkeep really needed? Make sure your equipment is on a maintenance schedule that reflects its actual use.

When’s The Best Time For Maintenance?
Routine upkeep on your facility’s equipment should be just that: on a scheduled routine. Examine the productivity patterns of each machine and schedule maintenance around downtimes. This will allow your plant to take equipment out of service at a time that won’t significantly hinder efficiency.

Can Employees Be More Efficient?
Any time your facility can keep maintenance and repair work in house, it saves money. It may be beneficial to train employees to do these tasks on a routine basis. This takes the burden of routine repairs off senior staff, or off a paid third party. Production employees should be able to clean processing equipment, complete inspections of machines and parts, and examine equipment for uncharacteristic behavior. Should a machine begin to act in an uncharacteristic manner, an expert should be brought in to examine it and plan a course of action.

Is There An Overall Upkeep Plan?
Preventative maintenance is only one form of caring for your facility’s equipment. A complete maintenance plan for your facility’s processing equipment should include two other forms of upkeep: predictive maintenance, which uses best practices and prearranged plans to determine when a machine will need attention, and reactive maintenance, the unplanned but necessary fixes and repairs. Using all three maintenance types together can help control costs while maximizing production time and minimizing downtime.

Here are five tips to keep your equipment in top form:
1. Upkeep
A great place for processors to start is to simply follow the equipment manufacturers’ recommendations for planning preventative maintenance. Regularly inspecting all of the equipment’s components, replacing worn-out parts, and upgrading various components to higher-quality alternatives can help extend the functional life of the equipment and avoid costly breakdowns. Additionally, by keeping a detailed log for each piece of equipment, processors can ensure that preventative maintenance practices are being properly followed. Don’t forget to regularly lubricate your equipment with the proper grade of lubricant.

2. Routine Calibration
Equipment gauges can naturally fall out of alignment over time, which can cause issues such as disproportionate mixing
or inaccurate weighing of products. By regularly calibrating equipment, processors can bring gauges back into alignment
and restore accuracy throughout the production line. To be sure all equipment continues to maintain a high level of
performance, processors should aim to regularly calibrate their equipment at least once per month.

3. Keep Extra Parts
Even with diligent preventative maintenance, equipment can still experience breakdowns. In the event of a breakdown,
processors can save valuable time and reduce operational downtime by stocking the spare parts recommended by the
equipment manufacturer. Having part replacements on hand will enable processors to get the equipment up and
running again and limit the losses of any breakdowns that may occur.

4. Operator Education
As with many elements of production, there is a right way and a wrong way for processing equipment to be operated.
Incorrect operation will likely cause unnecessary increases in wear and tear, reducing the functional life of the
equipment. Worse still, improper operation can result in outright equipment breakage, which can result in costly
replacement or repair. Spending the upfront time and cost to properly train employees is an investment that will pay off
in the long run. When operators are properly trained in the setup and orientation of processing equipment, production
efficiency can be improved and equipment will last longer.

5. Inspection
Aside from preventative maintenance and calibration, equipment needs to be inspected on a regular basis. One of the
best ways to approach inspection is to create checklists for what to look for in the way of wear and tear and inspect all
equipment components thoroughly. Although inspection may cost time, close inspection can catch potential
breakdowns before they happen, limiting downtime and mitigating repair costs.

If you want more information, contact us by phone or email. 

Retention Pond Mixer – Case study

Retention Pond Mixer – Case Study  

Does your plant have a basin retention pond behind the plant? If so, how does your submersible mixer and motor stand up to these harsh conditions?

A toothpaste manufacturer was struggling to find a solution for keeping their retention basins mixed. These basins are loaded with abrasive materials, wastewater, and CIP solution. As seen in the picture, their old submersible mixer and motors were struggling in this harsh environment. Plant associates were servicing and often replacing the motors every few months.

Our M.G. Newell sales associate recommended a motor from Stainless Motors, Inc. (SMI). SMI is a US-based manufacturer of stainless-steel washdown motors, gear reducers, and couplings; they developed the first stainless washdown electric motor offered on the market. We took the old prop and shroud from one of the customer’s mixer setups and attached it to a new SMI stainless washdown motor. The motor has been installed and running 24/7 for nearly 2 years with NO problems and no maintenance needed. Plant personnel can simply check the amp draw for feedback on motor performance.

In our busy day-to-day operations, we sometimes lose sight of the time and energy spent on maintenance. The SMI motor was slightly more expensive up front; however, plant associates acknowledge that the ROI occurred within the first 6 months. It also means less time spent digging into this muck.


We realize in this “challenged” economy that everyone is looking for ways to tighten their process parameters and keep costs down. Contact one of our associates to see how We Make It Work Better.


If you want more information, contact us by phone or email. 

Purpose of a Calibration

Purpose of a Calibration 

There are three main reasons for having instruments calibrated:
1. To ensure readings from an instrument are consistent with other measurements.
2. To determine the accuracy of the instrument readings.
3. To establish the reliability of the instrument i.e. that it can be trusted.

Traceability: relating your measurements to others
The results of measurements are most useful if they relate to similar measurements, perhaps made at a different time, in a different place, by a different person, or with a different instrument. Such measurements allow manufacturing processes to be kept in control from one day to the next and from one factory to another. Manufacturers and exporters require such measurements to know that they will satisfy their clients’ specifications.

Most countries have a system of accreditation for calibration laboratories. Accreditation is the recognition by an official accreditation body of a laboratory’s competence to calibrate, test, or measure an instrument or product. The assessment is made against criteria laid down by international standards. Accreditation ensures that the links back to the national standard are based on sound procedures.

Uncertainty: how accurate are your measurements?

Ultimately all measurements are used to help make decisions, and poor quality measurements result in poor quality decisions. The uncertainty in a measurement is a numerical estimate of the spread of values that could reasonably be attributed to the quantity. It is a measure of the quality of a measurement and provides the means to assess and minimize the risk and possible consequences of poor decisions.

For example, we may want to determine whether the diameter of a lawn mower shaft is too big, too small, or just right. Our aim is to balance the cost of rejecting good shafts and of customer complaints if we were to accept faulty shafts, against the cost of an accurate but over-engineered measurement system. When making these decisions, the uncertainty in the measurement is as important as the measurement itself. The uncertainty reported on your certificate is information necessary for you to calculate the uncertainty in your own measurements.

Reliability: can I trust the instrument?
Many measuring instruments read directly in terms of the SI units and have a specified accuracy greater than needed for most tasks. With such an instrument, where corrections and uncertainties are negligible, the user simply wants to know that the instrument is reliable. Unfortunately, a large number of instruments are not. Approximately one in six of all the instruments sent to MSL for calibration is judged to be unreliable or unfit for purpose in some way. This failure rate is typical of that experienced by most calibration laboratories and is not related to the cost or complexity of the instrument. Reliability is judged primarily by the absence of any behavior that would indicate that the instrument is or may be faulty.

Achieving Traceability in your measurements
Many quantities of practical interest such as color, loudness and comfort are difficult to define because they relate to human attributes. Others such as viscosity, flammability, and thermal conductivity are sensitive to the conditions under which the measurement is made, and it may not be possible to trace these measurements to the SI units. For these reasons the international measurement community establishes documentary standards (procedures) that define how such quantities are to be measured so as to provide the means for comparing the quality of goods or ensuring that safety and health requirements are satisfied.

To make a traceable measurement three elements are required:

1. An appropriate and recognized definition of how the quantity should be measured,
2. A calibrated measuring instrument, and
3. Competent staff able to interpret the standard or procedure, and use the instrument.

For those who buy their measurement services from other companies it pays to purchase from a laboratory that has been independently assessed as being technically competent to provide the measurement services.

Adjustment: what a calibration is not
Calibration does not usually involve the adjustment of an instrument so that it reads ‘true’. Indeed adjustments made as a part of a calibration often detract from the reliability of an instrument because they may destroy or weaken the instrument’s history of stability. The adjustment may also prevent the calibration from being used retrospectively. When MSL adjusts an instrument it normally issues a calibration report with both the ‘as received’ and ‘after adjustment’ values.

What a calibration certificate contains
Your calibration certificate must contain certain information if it is to fulfil its purpose of supporting traceable
measurements. This information can be divided into several categories:

• it establishes the identity and credibility of the calibrating laboratory;
• it uniquely identifies the instrument and its owner;
• it identifies the measurements made; and
• it is an unambiguous statement of the results, including an uncertainty statement.

In some cases the information contained in your certificate might seem obvious but ISO Guide 25 grew out of the experience that stating the obvious is the only reliable policy.

If you want more information, contact us by phone or email. 

Load Cell Basics

Load Cells 101

Weights play a significant role in our lives, more than we might realize. Knowing the weight of a substance is often the most useful measurement in private and industrial settings alike. For example, we use weights to price food at the grocery store’s self-checkout line or to track our health at home. Weights also provide precise measurements for ingredients, agricultural products, medical products, and much more. How do we know what things weigh? Load cells save the day.

WHAT IS A LOAD CELL?

A load cell is not a scale or a balance, but a transducer or sensor, which measures mechanical force and converts the energy of that force into a digital or analog measurable output. The force applied to the load cell is proportional to the magnitude of the output. A load cell can use different methods to translate force into a weight measurement. This paper will cover designs according to the type of output signal generated: hydraulic, pneumatic, and strain gauge. The most common load cells used in industrial weighing are strain gauge load cells.

TYPES OF LOAD CELLS
Hydraulic:
The word hydraulic should let us know that this sensor works by using a fluid, whether water or oil. A hydraulic load cell uses that liquid to measure the mechanical force of an object. A change in the pressure of the internal liquid translates into weight.

Hydraulic load cells consist of:

– An elastic diaphragm
– A piston with a loading platform on top of the diaphragm
– Oil or water that will be inside the piston
– A Bourdon tube pressure gauge

When a load is placed on the loading platform, the piston applies pressure to the liquid contained inside it. The pressure increase of the liquid is proportional to the applied force or weight. After calibrating the pressure, you can accurately measure the force or weight applied to the hydraulic load cell. The pressure reading can be read on an analog gauge or it can be converted into an electric signal by a pressure sensor. If the load cells have been properly installed and calibrated, accuracy can be within 0.25% of full scale or better, acceptable for most process weighing applications. Because this sensor has no electric components, it is ideal for use in hazardous areas. Typical hydraulic load cell applications include tank, bin, and hopper weighing. For maximum accuracy, the weight of the tank should be obtained by locating one force sensor at each point of support and summing their outputs.
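That last point, one sensor per support with their outputs summed, is simple bookkeeping. The sketch below illustrates it with hypothetical readings and a tare (empty-vessel) weight:

```python
# Minimal sketch: gross and net weight of a vessel supported on multiple load points.
# Readings and tare weight are hypothetical values for illustration.

def net_weight(cell_readings_lb: list[float], tare_lb: float) -> float:
    """Sum the reading from each support point and subtract the empty-vessel (tare) weight."""
    gross = sum(cell_readings_lb)
    return gross - tare_lb

if __name__ == "__main__":
    readings = [1210.0, 1185.0, 1230.0]  # one sensor at each of three support points
    print(f"Net product weight: {net_weight(readings, tare_lb=900.0):.0f} lb")
```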

Pneumatic:
Since it is pneumatic, we know that it will deal with air pressure. A pneumatic load cell consists of an elastic diaphragm attached to a platform surface where the weight will be measured. An air regulator limits the flow of air pressure to the system, and a pressure gauge reads it. Thus, when an object is placed on a pneumatic load cell, the cell uses pressurized air or gas to balance out the weight of the object. The air required to balance the weight determines how much the object weighs. The pressure gauge can convert the air pressure reading into an electrical signal. Pneumatic load cells handle relatively small weights and use multiple dampener chambers to provide higher accuracy than a hydraulic device can. In some designs, the first dampener chamber is used as a tare weight chamber. Pneumatic load cells are often used to measure relatively small weights in industries where cleanliness and safety are of prime concern. The advantages of this type of load cell include being inherently explosion-proof and insensitive to temperature variations. Additionally, they contain no fluids that might contaminate the process if the diaphragm ruptures. Disadvantages include a relatively slow speed of response and the need for clean, dry, regulated air or nitrogen.

Strain Gauge:
A strain gauge load cell is a transducer whose electrical resistance changes when it is under stress or strain. The electrical resistance is proportional to the stress or strain placed on the cell, making it easy to calibrate into an accurate measurement. The response of the strain gauge is linear, so it can be converted into a force and then a weight if needed. A strain gauge load cell is made up of four strain gauges in a Wheatstone bridge configuration. A Wheatstone bridge is an electrical circuit that measures an unknown electrical resistance by balancing two legs of a bridge circuit, one of which contains the unknown component. The Wheatstone bridge circuit provides incredibly accurate measurements. The strain gauges in the Wheatstone bridge are bonded onto a beam which deforms when weight is applied.
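In practice, a strain gauge load cell’s bridge output is specified as a rated output in millivolts per volt of excitation at full capacity, and the weight is recovered by linear scaling. The sketch below illustrates that arithmetic; the rated output, excitation voltage, and capacity are illustrative values, not the specifications of any particular cell:

```python
# Minimal sketch: converting a Wheatstone-bridge signal (mV) to weight for a strain gauge load cell.
# Rated output, excitation, and capacity are hypothetical; use your cell's data sheet values.

RATED_OUTPUT_MV_PER_V = 2.0   # mV/V at full capacity (illustrative)
EXCITATION_V = 10.0           # bridge excitation voltage
CAPACITY_LB = 500.0           # full-scale capacity of the cell

def weight_from_signal(signal_mv: float) -> float:
    """Scale the measured bridge signal linearly to weight, assuming a zeroed (tared) cell."""
    full_scale_mv = RATED_OUTPUT_MV_PER_V * EXCITATION_V  # 20 mV at capacity
    return (signal_mv / full_scale_mv) * CAPACITY_LB

if __name__ == "__main__":
    print(f"A 7.5 mV signal corresponds to about {weight_from_signal(7.5):.0f} lb")
```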


How to Choose a Load Cell for Your Application
Determining which load cell your application requires depends on how sensitive and accurate your application needs to be. A strain gauge type of load cell would be first in line when it comes to accuracy and sensitivity. While still useful in certain applications, pneumatic and hydraulic load cells would be the less sensitive and accurate types.

If you want more information, contact us by phone or email. 

5 Questions to Ask your Newell Automation Controls Engineer

5 Questions to Ask your Newell Automation Controls Engineer

We get it – figuring out what kind of system upgrade you really need can be tough, especially when you’re not even sure which questions to start with.  You know you want to improve the efficiency and reliability of your production processes.  You need someone with proven expertise in integrated automation solutions. 

We’ve pulled together a few key questions that can reveal a lot about how an automation partner will support you, your process, and your long-term goals. These questions are designed to help you understand not just what a provider can do, but how they think, communicate, and deliver throughout the entire project.

Process Optimization & Data Collection: What specific methods or technologies do you employ to capture critical process data, and how is this data used to drive efficiency improvements and cost reduction for the customer?

Project Life Cycle & Support: Can you describe the typical project engagement process, from initial design and panel fabrication to start-up, commissioning, and ongoing post-implementation support?

Customization & Specific Requirements: Given our specific industry requirements (e.g., sanitary standards), how does your team ensure that the automation solutions are customized to meet our precise needs and compliance standards?

Troubleshooting & Risk Management: What is your philosophy for handling intermittent faults or unexpected downtime during commissioning and production, and how do you mitigate potential system failures?

System Integration & Legacy Equipment: How does Newell Automation approach integrating new automation technologies with a customer’s existing or legacy infrastructure, especially when dealing with mixed-vendor equipment?

Newell Automation’s expertise in UL® Certified control panel design, PLC/HMI programming, and system integration covers many industries and technologies.  While you may have questions about your upgrade, there should be no question that Newell Automation is your best partner!

Flowmeter Maintenance – Case Study

Flow Meter – Case Study 

Have you ever seen an error message pop up on your equipment? Did you bang your head against the wall trying to figure out what the error message meant? A brewery in Tennessee was having issues with the flow meter and/or controls system on their kegging line. Kegs were overfilling, then underfilling. The readings on the flowmeter were erratic, and they were having to start and stop the process manually. The process was so troublesome that the brewer was intentionally overfilling kegs just to make sure they were not shorting their customers. The brewer was getting an ‘Error 900’ message, but after scouring their paperwork and the internet, no one could find that error message ANYWHERE! The kegging system was a European system with a European flow meter, and the brewer was resigned to the fact that he was going to have to pay a technician to come from Europe to help identify and fix the flowmeter.

The M.G. Newell salesman and calibration technician stepped in to take a look at the process. When they first arrived, the system was down for cleaning, so they took that opportunity to research the process and the equipment. The next day, they returned and spent an hour observing the process when they spotted the problem. The solution was short! Literally, a short – a loose wire on the back of the flowmeter. The M.G. Newell salesman made suggestions on how to identify the exact wire and how to fix it. Two days later, a follow-up text from the brewer stated that they were up and running consistently – no overfilling, no underfilling. They were hitting their fill target exactly! He saved money on the service call, and he doesn’t have to give away extra beer for free.

We all get frustrated when things don’t run the way we expect. Our engineers, sales team, and calibration technicians have many years of experience across a wide range of equipment and processes. You won’t find much that they haven’t seen before, and we are happy to share that experience with you. Contact one of our associates to see how We Make It Work Better.

If you want more information, contact us by phone or email. 

Extending the Life of your Thermometer

Extending the Life of your Thermometer 

Quality sanitary thermometers are built to provide a long service life.

Here are a few tips to help you get the most out of your instrumentation.

Environmental Conditions:
The ambient temperature can have a negative impact on the performance of your thermometer. Electronic thermometers tend to have a lower ambient operating range, typically -40° to 160°F, than mechanical types like bimetal thermometers, which can operate in ambient temperatures of up to 200°F. Most quality sanitary thermometers are hermetically sealed and are suited for use in environments where humidity or moisture is high. If your thermometer is submerged or subjected to high-pressure spray and is not rated for those conditions, water damage will likely result.

Vibration:
Vibration is a main cause of loss of accuracy and failure for sanitary thermometers. A silicone-filled case should be used in applications where high vibration is present. The fluid helps damp the internals of the thermometer, improving readability and helping prolong its life. Use of silicone fill should be avoided where strong oxidizing agents such as chlorine, nitric acid, and/or hydrogen peroxide are present.

Out of Range:
The measuring range should be selected so that the system temperature falls at approximately the mid-point of the scale. Care should be exercised with mechanical thermometers (bimetal, gas, and vapor tension) to ensure that they are not exposed to temperatures higher or lower than the measuring range, thus preventing damage to the bimetal element and other components. Bimetal thermometers should not be exposed continuously to process temperatures over 800°F, to avoid damaging the bimetal element.

Process Fluid:
The type of process fluid may have a damaging effect on the thermometer wetted parts. The use of a thermowell for applications with corrosive or caustic fluids, or those contained under pressure, will protect the stem of the thermometer and also allow it to be removed from the process without shutting down the system.

Impact:
For applications that are prone to impact, a lens material such as acrylic, polycarbonate, or shatterproof glass will greatly reduce the risk of damage.

If you want more information, contact us by phone or email. 

Calibration Principles

Calibration Principles 

Calibration is the activity of checking, by comparison with a standard, the accuracy of a measuring instrument of any type. It may also include adjustment of the instrument to bring it into alignment with the standard. Even the most precise measurement instrument is of no use if you cannot be sure that it is reading accurately – or, more realistically, that you know what the error of measurement is. Let’s begin with a few definitions:
• Calibration range – the region between the limits within which a quantity is measured, received, or transmitted, expressed by stating the lower and upper range values.
• Zero value – the lower end of the calibration range
• Span – the difference between the upper and lower range values
• Instrument range – the capability of the instrument; may be different than the calibration range

For example, an electronic pressure transmitter may have an instrument range of 0–750 psig and output of 4-to-20 milliamps (mA). However, the engineer has determined the instrument will be calibrated for 0-to-300 psig = 4-to-20 mA. Therefore, the calibration range would be specified as 0-to-300 psig = 4-to-20 mA. In this example, the zero input value is 0 psig and zero output value is 4 mA. The input span is 300 psig and the output span is 16 mA.
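Because the calibration range maps linearly to the output, each input pressure corresponds to a predictable current. The sketch below shows that arithmetic for the 0-to-300 psig = 4-to-20 mA example above:

```python
# Minimal sketch: linear 4-20 mA scaling for the calibration range described above
# (0-300 psig input = 4-20 mA output).

LOWER_RANGE_PSIG = 0.0
UPPER_RANGE_PSIG = 300.0
ZERO_OUTPUT_MA = 4.0
SPAN_OUTPUT_MA = 16.0  # 20 mA - 4 mA

def expected_output_ma(pressure_psig: float) -> float:
    """Return the ideal transmitter output (mA) for a given input pressure (psig)."""
    input_span = UPPER_RANGE_PSIG - LOWER_RANGE_PSIG
    fraction = (pressure_psig - LOWER_RANGE_PSIG) / input_span
    return ZERO_OUTPUT_MA + fraction * SPAN_OUTPUT_MA

if __name__ == "__main__":
    for p in (0.0, 150.0, 300.0):
        print(f"{p:5.0f} psig -> {expected_output_ma(p):.1f} mA")
```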

Ideally, a product would produce test results that exactly match the sample value, with no error at any point within the calibrated range; plotted on a calibration curve, these ideal results form a straight line. Without calibration, however, an actual product may produce test results different from the sample value, with a potentially large error. Calibrating the product can improve this situation significantly. During calibration, the product is “taught,” using the known values of two calibrators, what result it should provide. The process eliminates the errors at these two points, in effect moving the “before calibration” curve closer to the ideal results line. The error has been reduced to zero at the calibration points, and the residual error at any other point within the operating range is within the manufacturer’s published linearity or accuracy specification.

Every calibration should be performed to a specified tolerance. The terms tolerance and accuracy are often used incorrectly. In ISA’s The Automation, Systems, and Instrumentation Dictionary, the definitions for each are as follows:
• Accuracy – the ratio of the error to the full scale output or the ratio of the error to the output, expressed in percent span or percent reading, respectively.
• Tolerance – permissible deviation from a specified value; may be expressed in measurement units, percent of span, or percent of reading.

It is recommended that the tolerance, specified in measurement units, be used for the calibration requirements performed at your facility. By specifying an actual value, mistakes caused by calculating percentages of span or reading are eliminated. Also, tolerances should be specified in the units measured for the calibration. Calibration tolerances should be determined from a combination of factors.

These factors include:
• Requirements of the process
• Capability of available test equipment
• Consistency with similar instruments at your facility
• Manufacturer’s specified tolerance

The term Accuracy Ratio was used in the past to describe the relationship between the accuracy of the test standard and the accuracy of the instrument under test. A good rule of thumb is to ensure an accuracy ratio of 4:1 when performing calibrations. This means the instrument or standard used should be four times more accurate than the instrument being checked. In other words, the test equipment (such as a field standard) used to calibrate the process instrument should be four times more accurate than the process instrument. With today’s technology, an accuracy ratio of 4:1 is becoming more difficult to achieve. Why is a 4:1 ratio recommended? Ensuring a 4:1 ratio will minimize the effect of the accuracy of the standard on the overall calibration accuracy. If a higher level standard is found to be out of tolerance by a factor of two, for example, the calibrations performed using that standard are less likely to be compromised. The out-of-tolerance standard still needs to be investigated by reverse traceability of all calibrations performed using the test standard. However, our assurance is high that the process instrument is within tolerance.
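As a quick way to apply the 4:1 rule of thumb, compare the process instrument’s calibration tolerance with the accuracy of the test standard, in the same units; the values below are hypothetical:

```python
# Minimal sketch: checking the 4:1 accuracy ratio described above.
# Tolerance and standard accuracy values are hypothetical.

def accuracy_ratio(instrument_tolerance: float, standard_accuracy: float) -> float:
    """Return the ratio of the instrument's tolerance to the standard's accuracy (same units)."""
    return instrument_tolerance / standard_accuracy

if __name__ == "__main__":
    ratio = accuracy_ratio(instrument_tolerance=2.0, standard_accuracy=0.4)  # e.g., psig
    verdict = "meets" if ratio >= 4 else "does not meet"
    print(f"Accuracy ratio: {ratio:.1f}:1 -> {verdict} the 4:1 rule of thumb")
```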

Traceability

Last but not least, all calibrations should be performed traceable to a nationally or internationally recognized standard. For example, in the United States, the National Institute of Standards and Technology (NIST) maintains the nationally recognized standards. Traceability is defined by ANSI/NCSL Z540-1-1994 as “the property of a result of a measurement whereby it can be related to appropriate standards, generally national or international standards, through an unbroken chain of comparisons.” Note this does not mean a calibration shop needs to have its standards calibrated with a primary standard. It means that the calibrations performed are traceable to NIST through all the standards used to calibrate the standards, no matter how many levels exist between the shop and NIST. Traceability is accomplished by ensuring the test standards we use are routinely calibrated by “higher level” reference standards. Typically, the standards we use in the shop are sent out periodically to a standards lab which has more accurate test equipment. The standards from the calibration lab are periodically checked for calibration by “higher level” standards, and so on, until eventually the standards are tested against primary standards maintained by NIST or another internationally recognized standard.

The calibration technician’s role in maintaining traceability is to ensure the test standard is within its calibration interval and that its unique identifier is recorded on the applicable calibration data sheet when the instrument calibration is performed. Additionally, when test standards are calibrated, the calibration documentation must be reviewed for accuracy and to ensure it was performed using NIST-traceable equipment. M.G. Newell offers a variety of calibration services that keep your operations consistent and cost effective. Contact your local account manager for rates and plan options.

If you want more information, contact us by phone or email.