and
o District data collection through the survey.

In a similar program, the EPA also was collecting air toxics test data at districts throughout the State. The EPA sent teams to Ventura, Santa Barbara, Sacramento, and the Bay Area. EER sent teams to San Joaquin, South Coast, San Diego, Mojave, Monterey, Mendocino, Colusa, Yolo, and Placer. Teams were sent to the major districts identified. Through these activities, most of the test data available in the State has been collected. Table 1 lists the number of tests collected at each district. 799 tests were collected at 19 of 31 districts.
A test includes the quantification of air toxics and other emissions from a device or group of interconnected devices operating under one condition. A report provides the results of one or more tests.

3.0 Screening

One of the key goals of this project was to develop emission factors which can be used to accurately assess air toxics emissions from key sources. Before developing emission factors, it is critical to eliminate test results of unknown accuracy and device types which are not of primary interest. This allows for the best use of project resources. This section describes the procedures used to screen tests. Results from the initial screening also are provided. It should be noted that the initial screening of petroleum sources was conducted by CARB.
3.1 Procedures

Tables 2a-c list the data types extracted from each test report for the screening analysis. The information is divided into three categories:

o Report Information (Table 2a). This information describes the complete report.

o Device Information (Table 2b). This information describes each device type tested. Each report can have multiple devices. For example, a report may include results for a boiler test and an IC engine test.

o Substance Information (Table 2c). This information describes measurements conducted on the fuel and air emissions for each device. For example, a boiler test may have included metals analysis of the fuel and PAH analysis of the stack emissions.
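As a purely illustrative aid, these three categories nest naturally as a record structure; the field names below are hypothetical and do not reflect the project's actual database schema.

    from dataclasses import dataclass, field

    # Hypothetical sketch of one screening record; the names are
    # illustrative, not the project's actual schema.
    @dataclass
    class Substance:
        name: str              # e.g., "Formaldehyde"
        sample_point: str      # "fuel" or "stack emissions"

    @dataclass
    class Device:
        device_type: str       # e.g., "boiler", "IC engine"
        substances: list[Substance] = field(default_factory=list)

    @dataclass
    class Report:
        report_id: str
        devices: list[Device] = field(default_factory=list)  # one report may cover several devices

A boiler test with metals analysis of the fuel and PAH analysis of the stack, for example, would appear as one Device entry holding two Substance entries.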
3.2 Results

All the information described in Section 3.1 has been extracted from each petroleum industry test report collected by CARB. Section 2 of Volume 3 lists the device type, test documentation information available, and substances quantified for the fuel and air emissions. Tables 3a and b summarize the test documentation information available and substances quantified for several device/process and material categories. Table 3a lists the percent of tests in each device/process and material category which provided device, QA/QC, and method descriptions; sample, lab, calibration, location, and blank data; and included enough information to develop emission factors.
The devices and processes listed as required are those given in AB 2588 Appendix D. Appendix D specifies source testing requirements for key device and process types. The greatest number of required tests were conducted on catalytic crackers. It should be noted that catalytic reformers were included with the catalytic crackers. Of the 9 tests for this category, 22% provided enough information to develop emission factors, as shown in Table 3a. The fuel or raw material feed rate or production rate, together with the emission rate, is required to calculate an emission factor. The main substances quantified for catalytic crackers are metals, formaldehyde, and benzene, as shown in Table 3b.
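Conceptually, the calculation is a simple normalization. The minimal sketch below assumes the emission rate and the process (feed or production) rate are reported on the same time basis; the units shown are only an example, not a prescription from this report.

    def emission_factor(emission_rate, process_rate):
        """Normalize an emission rate by a process rate.

        Illustrative sketch: an emission_rate in lb/hr and a
        process_rate in MMBtu/hr of fuel fired yield an emission
        factor in lb/MMBtu. Both rates must share a time basis.
        """
        if process_rate <= 0:
            raise ValueError("process rate must be positive")
        return emission_rate / process_rate

    # Example: 0.02 lb/hr of benzene from a unit firing 150 MMBtu/hr
    # gives an emission factor of about 1.3e-4 lb/MMBtu.
    factor = emission_factor(0.02, 150.0)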
Tables 3a and b also list several device and process types for which source testing is not required. Testing may have been conducted on these sources because of district requirements, a lack of accurate estimation techniques, or other facility-specific concerns. Of the 161 tests evaluated, 134 are non-required tests. Gas-fired internal combustion engines have the most source test data of any non-required category. Formaldehyde was quantified in 29% of the internal combustion engine tests.

3.3 Source Prioritization

Developing emission factors of known quality requires a detailed evaluation of each test collected. To conduct an effective evaluation, the test should include device and method descriptions, and sample, laboratory, QA/QC, calibration, sample location, and blank data.
In addition, the test must provide representative process rate information so that emission rates can be normalized to develop emission factors. To evaluate the quality of the collected tests, seven queries or search cases were run, as shown in Table 4. Search cases 1-3 were for required tests and search cases 4-6 were for non-required tests. Search cases 1-6 required that process rates be available; tests with process rates can be used to develop emission factors. Search case 7 included tests without process rates. Search cases 1 and 4 required that all supporting information be provided; these tests have the information necessary to develop emission factors of known quality. Search cases 2 and 5 did not require that QA/QC, calibration, or location data be provided. In general, this information is not of primary concern when assessing the accuracy of emission data. Search cases 3, 6, and 7 did not require any supporting information for the emission results; test data matching these conditions cannot be effectively evaluated for quality. Table 4 shows that the number of tests matching the search cases ranged from 2 (search case 1) to 60 (search case 6). Overall, search cases 1, 2, 4, and 5 would provide the best pool of data for emission factor development. In general, search cases 3 and 6 do not provide enough information to validate the results.
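Read this way, Table 4 amounts to a decision rule over three attributes of each test: whether the device type is required under AB 2588 Appendix D, whether a process rate is available, and how complete the supporting information is. The sketch below is a paraphrase of that logic, not the actual database queries; the three-level grading of supporting information is our assumption.

    def search_case(required, has_process_rate, support):
        """Classify a test into search cases 1-7 (illustrative paraphrase).

        required:         device/process listed in AB 2588 Appendix D
        has_process_rate: process rate reported (needed for emission factors)
        support:          "full"    - all supporting information provided
                          "partial" - all but QA/QC, calibration, and location data
                          "none"    - no supporting information
        """
        if not has_process_rate:
            return 7                       # cannot yield emission factors
        level = {"full": 0, "partial": 1, "none": 2}[support]
        return (1 if required else 4) + level

    # Example: a non-required test with a process rate but no supporting
    # information falls into search case 6.
    assert search_case(False, True, "none") == 6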
As mentioned previously, 161 petroleum tests were collected by CARB. Each of these tests is listed in Section 2 of Volume 3. Search case 7 tests cannot be used to develop emission factors and will not be considered for this project. In addition, fugitive sources such as bulk terminals were not considered because the focus of this project is on combustion sources. Several data sets also were collected which did not provide any useful air toxics emissions data and thus were eliminated from further consideration. Table 5 lists the final set of sources which will be validated in detail and included in the emission factor calculations. It should be noted that some of these tests may be eliminated based on the detailed validation results. As shown in the Search Case column of Table 5, some search case 3 and 6 sources were selected. In general, these sources were selected to increase the size of the sample for emission factor development or to ensure that a wide range of source types was represented. Search case 3 and 6 tests do not provide all of the information necessary to evaluate data quality. Table 5 also lists supporting information including Number of Tests, Control Device Used, Report Information, Report Year, Sample Location, and Substance Group, as described in Tables 2a-c. In the Report Information columns, a yes (Y) or no (N) indicates whether the report provides the specific type of information requested, such as Device Description (DD). In the Substance Group columns, a yes (Y) is entered if substances in the group were quantified. For example, if arsenic and chromium were quantified in a test, a Y would be listed in the Metals (M) Substance Group column. A blank in the Substance Group column indicates no substances in the group were quantified. Table 6 lists tests which will not be included in the emission factor development process. These tests were eliminated for one of the following reasons: emission factors cannot be developed (search case 7); the source is not a combustion source; or no air toxics information was provided.
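The elimination logic can be summarized as a short filter. The sketch below is a hypothetical paraphrase of these three criteria, with illustrative argument names; it is not code used in this project.

    def elimination_reason(search_case, is_combustion, has_air_toxics):
        """Return why a test is excluded from Table 5, or None to keep it.

        Hypothetical paraphrase of the exclusion criteria above.
        """
        if search_case == 7:
            return "no process rate; emission factors cannot be developed"
        if not is_combustion:
            return "not a combustion source (e.g., bulk terminal fugitives)"
        if not has_air_toxics:
            return "no useful air toxics emissions data"
        return None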
4.0 Detailed Validation

The detailed validation procedures include checking that the correct sampling and analysis procedures were used, qualifying significant problems such as high field blanks, checking calculations, and evaluating the accuracy of the test results. All methods needed to quantify the substances listed in AB 2588 Appendix D have been reviewed, including:
CARB Methods Reviewed

11 - Hydrogen Sulfide (1983)
12 - Inorganic Lead (March, 1986)
15 - Hydrogen Sulfide (June, 1983)
101-A - Mercury (1986)
104 - Beryllium (1986)
106 - Vinyl Chloride (June, 1983)
410A/B - Benzene (March, 1986)
421 - Hydrogen Chloride (January, 1987 and December, 1991)
422 - Volatile Halogenated Organics (January, 1987 and December, 1991)
423 - Inorganic Arsenic (January, 1987)
424 - Cadmium (1987)
425 - Total and Hexavalent Chromium (January, 1987 and September, 1990)
428 - PCDD/PCDF and PCB (March, 1988 and September, 1990)
429 - PAH (September, 1989)
430 - Aldehydes (September, 1989 and December, 1991)
433 - Nickel (1989)
436 - Trace Metals (March, 1991 and 1992)
EPA MMT - Trace Metals (1989)

Section 3 of Volume 3 provides validation procedures for each method. These procedures were developed using experience gained conducting air toxics source tests and reviewing AB 2588 test reports, EPA and CARB test method documentation, and CARB method review sheets.
Primary parameters were identified to ensure critical data quality indicators were checked. The primary parameters provide an overall assessment of data quality but may not provide an indication of why a particular problem occurred. For example, if a method required field, reagent, and method blanks, only the field blank was considered a primary parameter because it indicates the total interference and/or contamination resulting from the field and laboratory activities. However, the field blank does not indicate whether the contamination resulted from the field or laboratory activities. For this project, it was more important to evaluate the overall quality of the emissions data.
Only those parameters provided in the test reports in the form required by the method were checked. For example, if the method required that field blank levels over 20% be flagged, the flags were transferred from the test report to the emission factor database. However, if the field blank levels were reported but not divided by the sample value, the ratios were not calculated. Instead, a notation was made to indicate that field blanks were collected and analyzed but the results were not flagged appropriately. The only exceptions to this rule were CARB Methods 430 and 436. For these two methods, field blank ratios were calculated because they were rarely reported by the contractors.
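As a rough illustration of the 20% convention described above, the sketch below forms a field blank ratio and flags it. The threshold comes from the text, but the function itself is hypothetical and is not the project's validation code.

    def flag_field_blank(blank_level, sample_value, threshold=0.20):
        """Flag a result whose field blank exceeds the threshold.

        Hypothetical sketch: ratio = field blank level / sample value.
        A ratio above 20% suggests field and/or laboratory contamination
        large enough to qualify the reported emission result.
        """
        if sample_value <= 0:
            return None, True            # no usable ratio; flag for review
        ratio = blank_level / sample_value
        return ratio, ratio > threshold

    # Example: a blank at 30% of the sample value is flagged.
    ratio, flagged = flag_field_blank(0.3, 1.0)   # -> (0.3, True)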
All versions of the methods in effect during the AB 2588 program have been reviewed. This allowed review of source test data which was gathered using older test methods. Review sheets were not prepared for the fuel analyses because these results were not extracted from the reports. Fuel analyses do not provide a direct measurement of emissions from a source but are used instead to estimate emissions. The emission factors developed during this program had to provide accurate assessments of emissions.

4.1 Detailed Validation Results Summary

o 72 tests were selected for detailed data validation and extraction. Of the 72 tests, 33 were “old” tests evaluated in a previous project funded by the CARB to develop emission factors.

o 2 of the 39 “new” tests were eliminated from the detailed review process.

o 70 tests had data validated and extracted.

This section presents the results of the method validation and calculation checks discussed above for the 72 tests selected for air toxics emission factor development. A test includes the quantification of air toxics and other emissions from a device or group of interconnected devices operating under one condition. Specifically, this section first chronicles the major problems associated with the data validation and extraction procedures and then discusses the details of the method validation results.
Overall, of the 39 “new” tests that passed the initial screening and were selected to undergo the method validation and calculation check procedures, 2 tests were eliminated from the process. One test was excluded from the detailed validation screening because emission factors could not be calculated from the reported data. Originally, this test passed the initial screening because at first glance it appeared to report all the necessary data to develop a validated emission factor, but a more detailed review of the reports revealed data deficiencies. The other test was dropped because the device tested was determined not to be a petroleum device. Considering the 2 tests dismissed from the original group of 72, a total of 70 tests (33 “old” tests and 37 “new” tests) had data validated, checked, and extracted.

4.1.1 Validation Problems and Calculation Check Failures

o Of all tests validated, the most common problem was the lack of a full set of internal standards used during Method 429 (PAH) analyses.

o Of the 5 calculation check failures documented, all were Method 430 (HCHO) failures.
Major problems encountered during the method validation and extraction procedures are documented in Table 7. The table lists all 72 tests according to Report ID, Device ID, Number of Tests, Contractor ID, Device Type, Material Used, Review Date, Comment, and Calculation Check Status. These criteria are discussed in detail below:

o Report ID: This is the number that was assigned to a device or similar group of devices in each document during the initial screening phase. Similar devices share the same primary characteristics (e.g., all are internal combustion engines). The report ID is a four-digit number followed by a letter. The four-digit number distinguishes different documents. A unique letter is assigned to each device or group of devices in a document. If, for example, a document contained results for two boilers and an internal combustion engine, the devices would be given the same four-digit number, but each would have its own letter identifier (e.g., #A for the two boilers and #B for the ICE).
o Device ID: This three-digit number is assigned to each device or group of interconnected devices upon entry into the database. Some facilities have a group of devices which emit to a common stack; for example, a facility may have six steam generators exhausting to one stack. These six steam generators would receive a single device ID. Each engineer entering data had his/her own assigned set of device ID numbers so the person responsible for validating and extracting the results from a particular test could be tracked. In many cases, the report ID and device ID can be used to reference a device or group of interconnected devices. In some cases, however, a report ID references multiple devices. For example, Report ID 2409A references 14 devices (Device IDs 114 to 127), as shown in Table 7.

o Number of Tests: As mentioned earlier, a test includes the quantification of air toxics and other emissions from a device or group of interconnected devices operating under one condition. A condition is defined as a set of operating constraints which are fixed during a test. For example, one condition would be a boi