Multi-Point Calibration
API MODEL 200A NO/NO2/NOX – NITROGEN OXIDES ANALYZER MULTI-POINT CALIBRATION
The multi-point calibration of the Model 200A NOX analyzer is performed with the Dasibi Model 5008 Multi-Gas Calibrator at ten selected concentration points. The calibration gas is a super-blend cylinder manufactured by Scott-Marrin, diluted to an NIST-traceable NO concentration. Recorder data are taken from either an ESC 8816 or ESC 8800 data logger. The calibration spreadsheet for the data record is on the network drive at P:\airmon\forms\airmon.for\mc-noxca. The multi-point calibration is regularly scheduled every three months, after instrument repairs, and after any condition that could affect the calibration or the performance of the instrument.
Prior to calibrating the API NOX analyzer, review the status of all TEST values on the front panel display; they should be within their nominal operating ranges. If any of these values are out of range, or if any fault condition exists, it must be corrected before proceeding with the calibration. These instructions refer to designated sections on particular performance problems throughout the procedure.
Do not confuse the PPB and PPM concentration units used in this procedure. The API analyzers are set up with gas range units in PPB, while the Multi-Gas Calibrators and the 8800/8816 data loggers are configured with gas level units in PPM. To convert a PPB value to the PPM value displayed on the data logger, move the decimal point three places to the left. In practice, everything may be set up in PPM.
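The PPB/PPM relationship described above (move the decimal three places) can be sketched as a pair of helper functions. This is only an illustration of the arithmetic, not part of the procedure; the function names are my own.

```python
def ppb_to_ppm(ppb: float) -> float:
    """Convert parts per billion to parts per million: move the decimal
    point three places to the left (divide by 1000)."""
    return ppb / 1000.0

def ppm_to_ppb(ppm: float) -> float:
    """Convert parts per million to parts per billion (multiply by 1000)."""
    return ppm * 1000.0

# The analyzer displays 400 PPB; the 5008 calibrator and the 8800/8816
# data loggers show the same concentration as 0.400 PPM.
print(ppb_to_ppm(400.0))  # 0.4
```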
1. Disable channels 5, 6, and 7 (NO-NO2-NOX) in the data logger if this is at a field site.
2. The analyzer should be operating several hours (preferably overnight) before calibration so it is fully warmed up and operation has stabilized.
3. Unplug the sample pump temporarily to replace the in-line filter with a new filter. If the unit has a flow valve on exhaust port, close the valve to replace the filter.
Perform Analog Output test
4. Verify that the analyzer NO/NO2/NOX display output voltage matches the voltage input of the data logger (DAS) NO/NO2/NOX readings.
a) The Analog Output test outputs a step voltage pattern and verifies that the ADC/DAC calibration values reflect the measured test channel output voltage (data logger or DVM).
b) A DVM (digital voltmeter) should be used for the test to confirm the correct output voltage against the DAS measurements, preferably the 8800 data logger. Make all measurements at the analyzer output terminal on the back panel.
c) From the ESC 8816 logger Home Menu, select and enter Real-Time Display Menu/Display Raw Readings. For the ESC 8800, press F3 and key in 00 for Repeat Raw 00.
d) Display the Analog Output test on the analyzer by pressing the buttons in sequence SETUP-MORE-DIAG-ENTR-NEXT-ENTR.
e) Observe that the instrument displays 0% and scrolls through 20%, 40%, 60%, 80%, and 100%.
f) Observe the ESC 8816 logger channel display: NO (A05)= x.xxxx v, NO2 (A06)= x.xxxx, NOX (A07)= x.xxxx. For the 8800 logger, find channels 05, 06, and 07, each displaying .xxxxD (x as a numeric value).
g) Confirm that the percentage values on the analyzer match the logger voltage readings: 0%= 0.000v, 20%= 0.200v, 40%= 0.400v, etc. for the 8816 logger, and 0%=.0000D, 20%=.2000D, 40%=.4000D, etc. for the 8800 logger.
h) The scrolling on the analyzer display can be stopped by pressing the button below the percent display to observe the comparison closely; press the percent button again to continue.
i) If the voltages do not match, a D/A calibration has to be performed. Refer to Section 2 I 1-9.
j) Exit back to the sample menu on the instrument and set the logger back to Display Readings w/flags for the 8816. Press F4 and key in 00 for Repeat Read 00 on the 8800 data logger.
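The expected voltages in step 4g follow directly from the step percentage, assuming the 0-1 volt output range this procedure uses. A minimal sketch of that mapping (illustration only):

```python
def expected_output_volts(percent: int) -> float:
    """Expected analog-output test voltage for a given step percentage,
    assuming a 0-1 V full-scale output range (step 4g)."""
    return percent / 100.0

# Print the full step pattern the analyzer scrolls through in step 4e.
for pct in (0, 20, 40, 60, 80, 100):
    print(f"{pct:3d}% -> {expected_output_volts(pct):.3f} V")
```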
Perform Leak Check
4. Disconnect the sample line from the sample port if this is at a field site.
5. Turn off instrument power switch and disconnect pump power.
6. Remove the top cover of the analyzer.
7. Cap the sample inlet port, remove the DFU filter from the ozone generator air inlet at the ¼-inch fitting, and cap the air inlet.
8. Turn on the instrument, apply pump power, and set the TEST function of the analyzer to display RCEL.
9. Close the shut-off valve between the pump and the exhaust port.
10. Monitor the RCEL pressure for 5 minutes from the point the valve was closed to measure the reaction cell pressure. If the pressure changes more than 1 in Hg in 5 minutes, there is a leak. Turn off the instrument power switch, disconnect pump power, and tighten all flow-line fittings and seals to eliminate the leak. Repeat steps 8 to 10 to re-check the pressure change over 5 minutes. If the instrument still has a leak, refer to Section A II for the Pressure Method Leak Check.
11. When the leak check has passed, remove the caps from the sample inlet port and the ozone generator air inlet, and re-install the DFU filter.
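The pass/fail criterion in step 10 (no more than 1 in Hg of change over the 5-minute hold) can be sketched as a simple check. Illustration only; the function name is my own.

```python
def rcel_leak_check_passed(start_in_hg: float, end_in_hg: float) -> bool:
    """Vacuum leak check (steps 9-10): the reaction cell pressure may
    change by no more than 1 in Hg over the 5-minute hold."""
    return abs(end_in_hg - start_in_hg) <= 1.0

# Example: RCEL drifted from 6.0 to 6.4 in Hg over 5 minutes -> pass.
print(rcel_leak_check_passed(6.0, 6.4))  # True
```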
Perform Pressure Check - RCEL Pressure Check and Adjustments
12. Set the TEST function of the analyzer to display SAMP for sample pressure.
13. Obtain an ambient pressure reading either from the monitoring site (channel 10) or by calling the local weather station for the current uncorrected barometric pressure, (602) 379-4630. If you call the National Weather Service, identify yourself as being with Air Quality. For the shop bench, use either the bench digital temperature/pressure gauge or the laboratory barometer. The laboratory barometer reading should be converted to uncorrected pressure by subtracting 1.18 from the corrected measurement during the summer season and 1.175 during the winter season.
14. Disconnect the power from the pump or close the shut-off valve.
15. Allow the pressure reading to stabilize and verify that the SAMP display matches the obtained ambient pressure. If not, adjust the R1 trimpot on the Flow/Pressure Sensor board to the obtained ambient pressure.
16. Prior to performing these adjustments, check and take any necessary corrective action on the pressure and flow systems: clean the inlet filter, verify the orifices and orifice screen filters are not plugged, tighten the flow fittings, and confirm the pump is producing enough vacuum.
17. The RCEL reaction cell pressure can be checked or adjusted at this moment. Set the TEST function of the analyzer to display RCEL.
18. Verify that the RCEL display matches the obtained ambient pressure. If not, adjust the R2 trimpot on the Flow/Pressure Sensor board to the obtained ambient pressure.
19. Apply power back to the pump or open the shut-off valve.
20. The RCEL nominal operating range is 4 to 10 in Hg.
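The seasonal barometer conversion in step 13 can be sketched as follows, assuming the 1.18/1.175 offsets are in inches of mercury (the units are not stated explicitly in the procedure). Illustration only.

```python
def station_pressure_in_hg(corrected_in_hg: float, season: str) -> float:
    """Convert the laboratory barometer's corrected (sea-level) reading
    to uncorrected station pressure, per the seasonal offsets in step 13.
    Offsets assumed to be in inches of mercury."""
    offset = 1.18 if season == "summer" else 1.175  # winter otherwise
    return corrected_in_hg - offset

# Example: a corrected reading of 29.92 in Hg during summer.
print(station_pressure_in_hg(29.92, "summer"))
```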
Perform Ozone Flow Check and Adjustment
21. Set the TEST function of the analyzer to display OZONE FL. The OZONE FL nominal operating range is 65 to 95 cc/m.
22. If the flow is out of the acceptable range, it should be adjusted. Enter the selected function by pressing the buttons in sequence on the analyzer: SETUP-MORE-DIAG-ENTR, find SIGNAL I/O and press ENTR-JUMP, key in 32, then ENTR.
23. The display features “32) OZONE FLOW=xxxx.x MV”.
24. Adjust the R3 trimpot on the Flow/Pressure Sensor board of the analyzer to read 2000.0 MV. This reading sets the approximate flow value.
25. EXIT back to the sample menu and display OZONE FL. The flow reading should be between 65 and 95 cc/m.
Perform Sample Flow Check and Adjustment
26. Set the TEST function of the analyzer to display SAMP FLW. The nominal operating flow range is 420 to 580 cc/m.
27. If the sample flow is not within the acceptable range, perform the sample flow check and adjustment using a mass flowmeter.
28. Connect the flowmeter (BIOS) to the analyzer sample inlet. Flow correction for standard temperature and pressure is not required. Obtain the Vavg average reading if using a BIOS DC-2 flowmeter.
29. Observe the BIOS flow measurement, obtain the 20th average reading, and record the flow reading from the analyzer. Subtract the analyzer SAMP FLW reading from the BIOS flow value. This resulting value will be applied to the sample flow setting.
30. Display the sample flow adjustment by pressing the buttons in sequence SETUP-MORE-VARS-ENTR-JUMP, key in 4, then ENTR.
31. The display features “4) SFLOW_SET=xxxx CCM”.
32. Press EDIT, add or subtract the resulting value from step 29 to the reading shown, key in the new value, and press ENTR. The analyzer sample flow updates to the new value.
33. Press ENTR twice to step through the low flow and high flow settings and check that the new value does not exceed the warning limits.
34. Press EXIT to return to the sample menu and display SAMP FLW to verify the flow now matches the BIOS average reading.
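The arithmetic in steps 29 and 32 (apply the BIOS-minus-analyzer difference to the SFLOW_SET value) can be sketched as follows. Illustration only; the function name and example numbers are my own.

```python
def corrected_sflow_set(current_set_ccm: float, bios_avg_ccm: float,
                        analyzer_flw_ccm: float) -> float:
    """Apply the step-29 difference (BIOS average minus analyzer SAMP FLW
    reading) to the current SFLOW_SET value, as done manually in step 32."""
    correction = bios_avg_ccm - analyzer_flw_ccm
    return current_set_ccm + correction

# Hypothetical example: analyzer reads 510 cc/m while the BIOS average is
# 495 cc/m, so the current setting of 500 is lowered by 15.
print(corrected_sflow_set(500.0, 495.0, 510.0))  # 485.0
```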
Calibration
Before starting the multi-point calibration, carefully check the analyzer's condition, such as the slope and offset values, to determine whether a Factory Calibration is needed: it is recommended if these values are not in the acceptable range, or if the instrument cannot be zeroed or spanned during the course of the calibration. Refer to Section A III, Factory Calibration (PMT Calibration). Verify that the instrument's NO OFFS and NOX OFFS are within the –10 to 150 nominal range, and that NO SLOPE and NOX SLOPE are within the .700 to 1.300 nominal range. If the slope and offset values are outside of the acceptable range, an instrument Factory Calibration has to be performed.
35. Connect a calibration gas line to the sample inlet. Note that the line should have a T-connection with a vent line at least three feet long.
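The slope/offset acceptance criteria above can be sketched as a single check. Illustration only, using the nominal ranges quoted in this procedure; names are my own.

```python
SLOPE_RANGE = (0.700, 1.300)   # nominal NO/NOX SLOPE range
OFFSET_RANGE = (-10.0, 150.0)  # nominal NO/NOX OFFS range

def factory_cal_indicated(slope: float, offset: float) -> bool:
    """True when a slope or offset value falls outside its nominal range,
    meaning a Factory (PMT) Calibration should be performed."""
    slope_ok = SLOPE_RANGE[0] <= slope <= SLOPE_RANGE[1]
    offset_ok = OFFSET_RANGE[0] <= offset <= OFFSET_RANGE[1]
    return not (slope_ok and offset_ok)

print(factory_cal_indicated(1.050, 45.0))  # False: both in range
print(factory_cal_indicated(0.500, 45.0))  # True: slope out of range
```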
36. For a shop bench calibration, connect the calibration line from the 5008 to a T-connection and then to the sample inlet. The other end of the T-connection should have a vent line approximately three feet long to avoid pressurizing the unit.
37. From the Main Menu of 5008 Multi-Gas Calibrator, begin zero air flow by entering CONTROL/SELECT LEVEL (2,1).
38. Enter the level assigned to zero air (00), then ENT.
39. The display will ask, “Are You Sure?” Press up arrow to enter “y”, then ENT.
40. Observe the 5008 display for flow; zero air flow is usually 10.0 LPM.
41. Press the CAL button on the analyzer front panel. The CAL LED should be blinking and the display changes to the calibration menu.
42. If ZERO is not displayed, the reading is too far out of adjustment to perform a reliable calibration. This condition must be corrected before continuing the calibration.
43. While the instrument is stabilizing on zero air, record the appropriate information about the calibration on the NO/NO2/NOX Calibration spreadsheet: date, location, serial number, etc.
44. Allow the analyzer to sample zero air until it stabilizes, when NOX STB reaches 0.2 PPB, to initiate the zero calibration.
45. After a stable reading has been obtained, press ZERO and ENTR. The analyzer has now zeroed the NO and NOX values.
46. Check the analyzer NO OFFS and NOX OFFS to ensure they are still in the acceptable range after the zero adjustment.
47. If the DAS zero values are acceptable, record the analyzer NO/NO2/NOX display values and the DAS NO/NO2/NOX values on the NO/NO2/NOX Calibration spreadsheet.
48. The obtainable DAS value is within ±0.5 ppb.
49. Press the EXIT button on the analyzer to complete the zero adjustment and return to the instrument sample mode.
50. Enter SELECT LEVEL (1) on the 5008. Press (21, ENT) to select the .400-ppm NO concentration.
51. Press (Up Arrow, ENT) to change “n” to “y” and initialize the .400-ppm NO gas flow.
52. Verify the actual gas concentration being produced by the 5008 Multi-Gas Calibrator. This concentration value will be entered in the analyzer at step 54b-c) and recorded at step 57 on the calibration spreadsheet under INPUT (PPM). This is important because the selected .400-ppm gas may not produce exactly that concentration; use the produced gas value shown on the 5008 menu.
53. Press the CAL button on the analyzer front panel. The CAL LED should be blinking and the display changes to the calibration menu.
54. Enter the span gas value of the 400-ppb gas in the analyzer to set and store the NO/NOX values. Span calibration is at 80% of full scale.
a) Press the CONC button on the front panel; the display changes to the CONCENTRATION MENU of NOX, NO, CONV, EXIT.
b) Press the NO button, enter the value 00399.0, then press ENTR.
c) On the Concentration Menu, press the NOX button, enter the value 00400.0, then press ENTR.
d) Press the EXIT button to route back to the first calibration menu.
55. Allow the analyzer to stabilize on the 400-ppb gas for 15 to 25 minutes, or until the analyzer NOX STB reading reaches 0.2 PPB. If the SPAN button is not displayed, the span is too far out of adjustment to perform a reliable calibration; this condition must be corrected before continuing the calibration.
56. After a stable reading has been obtained, press SPAN and ENTR. The analyzer has now calibrated the NO and NOX values at 400 ppb. Check the analyzer NO SLOPE and NOX SLOPE to ensure they are still in the acceptable range after the span adjustment.
57. If the DAS span value of 400 ppb (0.400 ppm) is acceptable, record the analyzer NO/NO2/NOX display values and the DAS NO/NO2/NOX values on the NO/NO2/NOX Calibration spreadsheet.
58. The obtainable DAS value is within ±0.5 ppb.
59. Press the EXIT button on the analyzer to complete the span adjustment and return to the instrument sample mode.
60. Enter SELECT LEVEL (1) on 5008. Enter the level assigned for NO concentration of .450 ppm (10), then ENT.
61. Observe the actual NO concentration being produced and record it at step 63 on the calibration spreadsheet under INPUT (ppm) NOX.
62. Allow the analyzer to stabilize while sampling 450 ppb (approximately 10 minutes), or until NOX STB reaches a reading of 0.0 PPB. There is no adjustment to the analyzer.
63. Record the analyzer NO/NO2/NOX display values and DAS NO/NO2/NOX values to NO/NO2/NOX calibration spread sheet.
64. Repeat steps 60-63 for NO concentrations of .180, .090, and .055 ppm with no further adjustments to the analyzer. The assigned levels for these concentrations on the 5008 are 22, 23, and 24, respectively. These calibration points compare the calculated concentrations with the recorded DAS values to determine whether the analyzer response is linear.
65. Enter SELECT LEVEL (1) on 5008. Enter the level assigned for NO2 concentration of .400 ppm. (11), then ENT.
66. Press (Up Arrow, ENT) to change “n” to “y” and initialize the .400-ppm gas flow.
67. Observe the actual NO2 concentration being produced by the 5008; it will be entered at step 69c) and recorded at step 73 on the calibration spreadsheet under INPUT (ppm) NO2.
68. Press the CAL button on the analyzer front panel. The CAL LED should be blinking and the display changes to the calibration menu.
69. Enter the produced NO2 concentration of 400 ppb in the analyzer to set and store the NO2 value.
a) Press the CONC button on the front panel to show the CONCENTRATION MENU of NOX, NO, CONV, EXIT.
b) Press the CONV button to show the CONVERTER EFFICIENCY MENU of NO2, CAL, SET, EXIT.
c) Press the NO2 button, enter the value 00400.0, then press ENTR.
d) The converter efficiency has a limited valid range and should be between .960 and 1.02 to be considered efficient. If it is outside these limits, the Moly converter should be replaced.
e) Press the SET button to verify the converter efficiency value.
70. There are two methods of performing the converter efficiency (CE) calibration, and a third method of setting the CE value to compensate the CE ratio. The first method is resetting the previous converter value to 1.0000 in step 69e).
a) After the CE value is set, press CAL-CONC-CONV-CAL.
b) Allow the analyzer to stabilize at 400 ppb NO2; the ENTR button will appear when the CE is within the valid range limit.
c) Press ENTR to calculate the CE ratio.
d) Recheck the CE ratio in step 69d-e) for the valid range.
71. The second method is to let the analyzer compute the efficiency by pressing CAL-CONC-CONV-CAL.
a) This sequence follows step 69c); start the sequence with CONC-CONV-CAL.
b) Allow the analyzer to stabilize at 400 ppb NO2.
c) Determine whether the NO2 value of 400 ppb is within ±5 ppb on the analyzer.
d) Press ENTR to calculate the CE ratio from the previous value and calibrate the analyzer NO2 at 400 ppb.
e) Recheck the CE ratio in step 69d-e) for the valid range.
72. The third method is setting the set value for the converter efficiency ratio in step 69d-e).
a) This method ensures the value stays within the .960 to 1.02 CE ratio limits as long as the current set value reflects a linear calibration.
b) This method can be performed if the CE ratio is near either end of the valid range, or if methods one and two in steps 70 and 71 do not achieve acceptable results.
c) Change the set value in step 69c).
d) After the CE value is set, press CAL-CONC-CONV-CAL.
e) Press ENTR to calculate the new CE ratio at 400 ppb NO2.
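Conceptually, the CE ratio the analyzer computes is its measured NO2 response over the known NO2 input, checked against the .960-1.02 valid range from step 69d. The 200A does this internally; the sketch below is only an illustration of the arithmetic, and the function names are my own.

```python
CE_RANGE = (0.960, 1.02)  # valid converter-efficiency range (step 69d)

def converter_efficiency(measured_no2_ppb: float, input_no2_ppb: float) -> float:
    """Approximate CE ratio: the analyzer's measured NO2 response divided
    by the known NO2 input concentration. (The 200A computes this
    internally during CAL-CONC-CONV-CAL.)"""
    return measured_no2_ppb / input_no2_ppb

def ce_valid(ce: float) -> bool:
    """True when the CE ratio lies within the valid range."""
    return CE_RANGE[0] <= ce <= CE_RANGE[1]

# Example: the analyzer reads 396 ppb against a 400-ppb NO2 input.
ce = converter_efficiency(396.0, 400.0)
print(ce, ce_valid(ce))  # 0.99 True
```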
73. Record the analyzer NO/NO2/NOX display values and the DAS NO/NO2/NOX values on the Gas-Phase Titration section of the NO/NO2/NOX Calibration spreadsheet.
74. Press the EXIT button on the analyzer to complete the NO2 adjustment and return to the instrument sample mode.
75. Enter SELECT LEVEL (1) on 5008. Enter the level assigned for NO2 concentration of .180 ppm (12), then ENT.
76. Observe the actual NO2 concentration being produced and record it at step 53 on the calibration spreadsheet under INPUT (ppm) NO2.
77. Allow the analyzer to stabilize while sampling 180 ppb (approximately 10 minutes), or until NOX STB reaches a reading of 0.0 PPB. There is no adjustment to the analyzer.
78. If the DAS NO2 value is acceptable, record the analyzer NO/NO2/NOX display values and DAS NO/NO2/NOX values to Gas-Phase Titration of NO/NO2/NOX calibration spread sheet.
79. Repeat steps 50-53 for NO2 concentrations of .090 and .055 ppm with no further adjustments to the analyzer. The levels assigned for the .090 and .055 ppm NO2 concentrations on the 5008 are 13 and 14, respectively.
80. The calibration spreadsheet mc-noxca.xls will calculate linear regression values from the step 40 equations against the values obtained in steps 48 and 53 to represent the results of all ten input data points.
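The spreadsheet's linear-regression computation can be sketched as an ordinary least-squares fit of the DAS-recorded responses against the input concentrations. The data below are hypothetical example values, not from an actual calibration.

```python
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept, as the mc-noxca.xls
    spreadsheet computes for the calibration points."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical input vs. DAS-recorded concentrations (ppm) for the NO points.
inputs = [0.000, 0.055, 0.090, 0.180, 0.400, 0.450]
das    = [0.001, 0.056, 0.089, 0.181, 0.399, 0.452]
slope, intercept = linear_fit(inputs, das)
print(slope, intercept)  # near 1.0 and 0.0 for a linear analyzer
```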
81. If the calibration is successful, return to the Main Menu of 5008.
82. Enter CONTROL/STOP CONTROL (2,4).
83. Remove the calibration line and reattach the sample line to the sample port.
84. Enable channels 5, 6, and 7 (NO-NO2-NOX) on the data logger.
SECTION 2
This section outlines corrective actions for performance problems. The procedures that follow require the analyzer top cover to be removed for adjustments and measurements.
I D/A Calibration on analyzer
1. Operate the logger to display “Real-Time Raw Readings” on the 8816, or display Repeat Raw 00 on the 8800 logger by selecting F3 and keying in 00.
2. Remove the top cover of the analyzer.
3. A DVM (digital voltmeter) should be used for the test to confirm the correct output voltage against the DAS measurements, especially with the 8800 data logger. Make all measurements from the analyzer output terminal.
4. Set the analyzer to read D/A CALIBRATION by pressing SETUP-MORE-DIAG-ENTR-NEXT (twice)-ENTR.
5. Press ADC to start the calibration. The analyzer display will read “ADJUST ZERO: A/D=xx.xmV” (x as a voltage value). The displayed voltage should be approximately 10% of full scale on the 0-1 volt range.
6. Observe data logger channel 07 (NOX) and compare its value with the analyzer voltage reading; they should be the same. If they are not, adjust the zero trimpot R27 on the analyzer V/F board until the value matches the logger NOX channel 07 voltage. When the voltage values match, press ENTR and the calibration advances to the next stage.
7. The analyzer display will read “ADJUST GAIN: A/D=xx.xxmV” (x as a voltage value). The display shows a voltage at about 90% of the full-scale DAC output range.
8. Observe data logger channel 07 (NOX) and compare its value with the analyzer voltage reading; they should be the same. If they are not, adjust the gain trimpot R31 on the analyzer V/F board until the value matches the logger NOX channel 07 voltage. When the voltage values match, press ENTR and the DAC calibration is complete.
9. Exit back to the sample menu and set the logger back to Display Readings w/flags for the 8816. Press F4 and key in 00 for Repeat Read 00 on the 8800.
II Pressure Method Leak Check
1. The pressure method leak check pressurizes the unit with zero air at 15 PSI; the unit has to hold that pressure for five minutes while monitored with a Leak Checker Box.
2. Turn off power to the analyzer.
3. Cap the exhaust port, remove the DFU filter from the ozone generator air inlet at the ¼-inch fitting, and cap the air inlet.
4. Connect a tubing line from the Leak Checker Box port labeled “Analyzer” to the analyzer sample inlet.
5. Connect a tubing line from a zero air source (5008 Gas Calibrator or zero gas bottle) to the Leak Checker Box port labeled “Zero Air”.
6. The flowmeter knob on the Leak Checker should be closed (clockwise) before introducing the airflow. This prevents accidentally over-pressurizing the instrument's pneumatic system when the zero air source valve is opened.
7. Open the valve from the zero air source, slowly turn the flowmeter knob counterclockwise, and observe the pressure gauge on the Leak Checker increase; when it reaches 15 PSI, close the flowmeter knob.
8. Note the point where the pressure gauge stops and hold the pressurization for 5 minutes. If the gauge drops more than 2 PSI within 5 minutes, there is a leak. Apply Snoop liquid leak detector to all seals and fittings to locate the leak.
9. Tighten the seals or fittings until the leak stops. Repeat the leak check pressurization until there is no pressure drop of more than 2 PSI on the gauge within 5 minutes.
10. When the pressure method leak check is completed, close the air source valve, remove the cap from the exhaust port, remove the cap from the ozone generator air inlet and re-install the DFU filter, disconnect the flow lines, and wipe off all liquid solution from the seals and fittings before applying power to the analyzer.
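The pressure-side criterion in steps 8-9 (no more than a 2 PSI drop from the initial gauge reading over the 5-minute hold) can be sketched as a check over the recorded gauge readings. Illustration only; the function name is my own.

```python
def pressure_leak_check_passed(readings_psi: list[float]) -> bool:
    """Pressure method leak check (steps 8-9): pass when the gauge never
    drops more than 2 PSI below its initial value during the 5-minute
    pressurized hold."""
    return readings_psi[0] - min(readings_psi) <= 2.0

# Example: gauge logged once per minute after closing the flowmeter knob.
print(pressure_leak_check_passed([15.0, 14.8, 14.5, 14.2, 14.0, 13.8]))  # True
print(pressure_leak_check_passed([15.0, 14.0, 13.0, 12.5, 12.0, 11.5]))  # False
```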
III Factory Calibration (PMT Calibration)
1. The Factory Calibration (PMT Calibration) should be performed if any of the following conditions occur:
a. Before starting a complete multi-point calibration.
b. When slope and offset values are outside of the acceptable range.
c. After changing the output voltage ranges.
d. After a DAC calibration when the V/F board is out of calibration.
e. When unable to zero or span the instrument.
f. When the PMT reading does not agree with the span gas concentration and overall linearity during the calibration.
g. When the Electric Test or Optic Test readings do not respond with correct values.
2. For instruments whose slope and offset values are far out of range or that have overall linearity trouble, the exception is to reset the NO/NOX slope and NO/NOX offset values in the instrument menu variables.
a. Display the slope by pressing the buttons in sequence SETUP-MORE-VARS, key in 929, then ENTR-JUMP, key in 37, then ENTR.
b. The display features “37) NOX_SLOPE1=x.xxx PPB/mV”; press EDIT, key in “1.000” to change the value, then ENTR.
c. Press NEXT to display the offset value “38) NOX_OFFSET1=xmV”, then EDIT.
d. Change the offset value by keying in “0000.0”, then press ENTR.
e. The next two displays feature NO_SLOPE1 and NO_OFFSET1 at 39) and 40). Edit NO_SLOPE1 to 1.000 and NO_OFFSET1 to 0000.0, following the same pattern as steps 2.b and 2.d for the NOX slope and offset.
f. This editing changes the NO/NOX slope to 1.000 and the NO/NOX offset to 0000.0.
g. EXIT to the sample menu and display the slope and offset to confirm the changes made in the variables are registered.
3. To start the PMT factory calibration, remove the top cover to access the Preamp board.
4. Input .450 ppm of NO span gas from the 5008 Gas Calibrator to the sample inlet. On the 5008, select gas level 10, the .450-ppm level.
5. Set the analyzer to display NORM PMT voltage.
6. Calculate the expected NORM PMT voltage reading as follows:
a. NORM PMT voltage = input span gas concentration multiplied by 2.
b. A span gas concentration of 450 ppb should give a voltage reading of 900 mV.
7. If the NORM PMT voltage reading does not settle at 900 mV, voltage adjustments should be made to correspond to the input gas concentration in step 6b.
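The step-6 formula is a single multiplication and can be sketched as follows. Illustration only; the function name is my own.

```python
def expected_norm_pmt_mv(span_gas_ppb: float) -> float:
    """Expected NORM PMT reading in mV: twice the span gas concentration
    in PPB (step 6a), so 450 ppb of NO should read about 900 mV."""
    return 2.0 * span_gas_ppb

print(expected_norm_pmt_mv(450.0))  # 900.0
```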
8. Adjust S2, the HVPS coarse switch, on the Preamp board to the setting that produces a NORM PMT signal closest to the voltage calculated in step 6. Then adjust S1, the HVPS fine switch, to the setting that produces a NORM PMT signal closest to the calculated voltage. Note: Use caution when making HVPS adjustments not to exceed 860 volts; a warning message may appear if the Preamp circuit exceeds the voltage limit or PMT sensitivity. To check the high voltage value, display HVPS to ensure the voltage does not exceed 860 volts, then select the NORM PMT display again.
9. Adjust R19 on the Preamp board to trim the reading to the final fine adjustment of the calculated NORM PMT voltage in step 6.
10. When the expected NORM PMT value is met, input zero air gas.
11. Input zero air and allow the analyzer to sample it until it stabilizes, when STABIL reaches 0.2, then press CAL-ZERO-ENTR to calibrate the zero.
12. While zero air is being input, check and adjust the Electric Test and Optic Test signal levels on the Preamp board. Whenever the HVPS or PMT Preamp circuit is changed, the signal levels will change, most likely the Optic Test.
13. Set the analyzer to display the PMT voltage and press the buttons in sequence SETUP-MORE-DIAG-ENTR-NEXT until ELECTRIC TEST is displayed, then ENTR.
14. The PMT reading should be 2000 mV; if not, adjust R27 on the Preamp board until the PMT reads 2000 mV, then press EXIT.
15. Press the PREV button until OPTIC TEST is displayed, then ENTR.
16. The PMT reading should be 2000 mV; if not, adjust R25 on the Preamp board until the PMT reads 2000 mV, then press EXIT to the sample menu.
17. This concludes the PMT factory calibration; the multi-point calibration at 400 ppb and the other selected points should now be performed.
18. Refer to step 42 to continue the calibration.