NIST’s Support of Rad/Nuc Standards Development
Dr. Leticia Pibida
Physicist
National Institute of Standards and Technology
Introduction
• NIST worked with the Department of Homeland Security (DHS) Science and Technology (S&T) Directorate on standards, including:
− Development of consensus standards
− Development of test and evaluation protocols for testing laboratories
− Reporting of test results from testing against ANSI standards
− Validation of ANSI standards
− System integration efforts
• NIST works with the Department of Homeland Security (DHS) Domestic Nuclear Detection Office (DNDO) in:
− Development of government standards
− Development of test designs for operational test-campaigns
− Data analysis of test-campaign results
− Development of an “Equipment List” for radiation detector users
Standards Reviewed
• ANSI N42.32 Chair: Joe McDonald
– American National Standard Performance Criteria for Alarming Personal Radiation Detectors for Homeland Security
• ANSI N42.33 Chair: Morgan Cox
– American National Standard for Portable Radiation Detection Instrumentation for Homeland Security
• ANSI N42.34 Chair: Peter Chiaro
– American National Standard Performance Criteria for Hand-held Instruments for the Detection and Identification of Radionuclides
• ANSI N42.35 Chair: Leticia Pibida / Brian Rees
– American National Standard for Evaluation and Performance of Radiation Detection Portal Monitors for Use in Homeland Security
• Revisions published January 2007
• Include lessons learned from equipment testing
• Development of data sheets for data acquisition at test labs
http://standards.ieee.org/getN42/index.html
Recently Published Standards
• ANSI N42.37 Chair: Morgan Cox / Alex Boerner (April 2007)
– Standard for Training Homeland Security Emergency Responders in the Uses and Maintenance of Radiation Detection Instruments
• ANSI N42.38 Chair: Peter Chiaro (January 2007)
– Standard for Spectroscopy-Based Portal Monitors Used for Homeland Security
• ANSI N42.42 Chair: George Lasche / Leticia Pibida (March 2007)
– Data format standard for radiation detectors used for Homeland Security
• ANSI N42.43 Chair: Peter Chiaro (April 2007)
– Standard for Mobile and Transportable Systems, Including Cranes, Used for Homeland Security Applications
Standards Under Development
• ANSI N42.39 Chair: Alan Thompson / Joe McDonald
– Standard for Performance Criteria for Neutron Detectors for Homeland Security
• ANSI N42.41 Chair: David Gilliam
– Performance Criteria for Active Interrogation Systems Used for Homeland Security
• ANSI N42.48 Chair: Peter Chiaro
– American National Standard Performance Requirements for Spectroscopic Personal Radiation Detectors (SPRDs) for Homeland Security
• ANSI N42.49 Chair: Morgan Cox / Joe McDonald
– American National Standard for Performance Criteria for Personal Emergency Radiation Detectors (PERDs) for Exposure Control
Standards Under Development (cont.)
• ANSI N42.44 Chair: Mike Barrientos (TSL/SRA)
– Performance and evaluation of checkpoint cabinet x-ray imaging security-screening systems
• ANSI N42.45 Chair: Lok Koo (TSL/SRA) / Jim Connelly (L3 Corporation)
– Evaluating the image quality of x-ray computed tomography security-screening systems
• ANSI N42.46 Chair: Stacy Wright (TMEC) / Jim Lamers (NTMI)
– Measuring the performance of imaging x-ray and gamma-ray systems for cargo and vehicle security screening
• ANSI N42.47 Chair: Frank Cerra
– Measuring the imaging performance of x-ray and gamma-ray systems for security screening of humans
Proposed Standards
• ANSI N42.XX Chair: Ed Groeber
– Standard, or addendum to existing standards, regarding the use of radiation detection instruments under extreme conditions
• ANSI N42.XX Chair: TBD
– Performance criteria for handheld survey meters used in response and recovery for homeland security applications
• ANSI/ASTM Chair: TBD
– Standards for sampling and forensics in radiological events
International Standards Work
Working group WG B15
• Radiation protection instrumentation – Highly sensitive hand-held instruments for detection of radioactive material
• IEC 62327 - Radiation protection instrumentation - Hand-held instruments for the detection and identification of radionuclides and for the indication of ambient dose equivalent rate from photon radiation.
• IEC 62484 - Radiation protection instrumentation - Spectroscopy-based portal monitors used for the detection and identification of illicit trafficking of radioactive material.
• IEC 62401- Radiation protection instrumentation - Alarming Personal Radiation Devices (PRD) for detection of illicit trafficking of radioactive material
Equipment Testing Against Standards
[Flow diagram among DHS, NIST, manufacturers, and testing labs: task management and report (DHS–NIST); sign agreement and equipment arrangements (NIST–manufacturers); send equipment for testing (manufacturers–testing labs); coordinate testing and send test results for review and report (NIST–testing labs).]
ANSI N42 Standards General Tests
• Radiological tests: exposure rate, background, false alarm, gamma and neutron response, radionuclide identification (strongly depend on detector type)
• Environmental tests: temperature, humidity, sealing (similar for all types of detectors)
• Mechanical tests: mechanical shocks, vibration, drop test (strongly depend on detector type)
• Electromagnetic tests: external magnetic fields, radio frequency, conducted disturbances (burst and radio frequencies), surges and oscillatory waves, electrostatic discharges (similar for all types of detectors)
Equipment Testing Round 1
• Number of Companies = 28
• Number of PRDs (Pagers) = 18
• Number of Survey Meters = 23 (33 with probes)
• Number of Radionuclide Identifiers = 7
• Number of Portals = 14
• Total number of instruments to test = 172 (202 with probes)
Test report published in the Responder Knowledge Base (RKB): www.rkb.mipt.org
Labs involved: NIST, ORNL, PNNL, LLNL, and LANL
Implications of Round 1 Results and Report
For DHS
• This information could assist DHS in making informed procurement choices and in making more effective use of grant money.
• Improve ANSI/IEEE standards’ requirements
For Users
• Compare test results to deploy appropriate equipment
For Manufacturers
• Improve instrument performance to meet DHS needs
Equipment Testing Round 2
• Number of Companies = 11
• Number of PRDs (Pagers) = 16
• Number of Survey Meters = 9
• Number of Radionuclide Identifiers = 8
• Number of Portals = 3
• Total number of instruments to test = 105
Test 2 Deadlines
• April 25 equipment submission
• April 29 update of T&E protocols and data sheets
• May 2 start equipment testing
• April 2006, Complete equipment testing
Labs involved: NIST, ORNL and PNNL
Format of Report
Table of Requirements: ANSI N42.3x + Testing and Evaluation Protocol
– General
– Radiological
– Environmental
– Electrical and Electromagnetic
– Mechanical
Detailed tables for each instrument
Summary Table – all instruments
Pass / Fail / Conditional
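The report layout above (per-category requirement tables, a summary table per instrument, and Pass / Fail / Conditional outcomes) can be sketched as a small data structure. This is an illustrative model only, not NIST's actual report software; the class names and the roll-up rule (any Fail dominates, then Conditional, otherwise Pass) are assumptions.

```python
from dataclasses import dataclass, field
from enum import Enum

class Result(Enum):
    PASS = "Pass"
    FAIL = "Fail"
    CONDITIONAL = "Conditional"

# Requirement categories, following the report layout described above.
CATEGORIES = ["General", "Radiological", "Environmental",
              "Electrical and Electromagnetic", "Mechanical"]

@dataclass
class InstrumentReport:
    """Detailed table for one instrument: one outcome per requirement."""
    instrument: str
    # (category, requirement) -> Result
    results: dict = field(default_factory=dict)

    def summary(self) -> Result:
        """Roll detailed outcomes up into the one-line summary-table entry:
        any Fail dominates, then Conditional, otherwise Pass."""
        outcomes = set(self.results.values())
        if Result.FAIL in outcomes:
            return Result.FAIL
        if Result.CONDITIONAL in outcomes:
            return Result.CONDITIONAL
        return Result.PASS

report = InstrumentReport("RIID-1")
report.results[("Radiological", "Radionuclide identification")] = Result.PASS
report.results[("Environmental", "Temperature")] = Result.CONDITIONAL
print(report.summary())  # Result.CONDITIONAL
```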
Example: Test Results RIIDs Round 2
Equipment List for Radiation Detectors
• NIST is working with DNDO to develop a system for testing commercially available equipment
• Equipment that passes the required tests will be listed in an “Equipment List” to assist users during purchasing and deployment
• Equipment testing will be split into 3 tiers
– Tier 1: Testing against consensus standards (ANSI/IEEE standards)
– Tier 2: Testing against government standards (to be developed by DNDO)
– Tier 3: Scenario testing (testing performed by DNDO with user input)
• Tier 1 testing will be carried out by NVLAP-accredited laboratories
– NVLAP Handbook final available at: http://ts.nist.gov/ts/htdocs/210/214/214.htm
– For information on how to apply: http://www.nist.gov/nvlap
• Tier 2 and Tier 3 testing could take place at assigned DNDO laboratories
• Post-market surveillance
– Equipment testing
– Supplier’s declaration of conformity
Operational/Scenario Testing Efforts
NIST is working with DNDO and users in the development of test designs and data analysis for operational/scenario test-campaigns
Completed test campaigns:
– Anole: included testing of RIIDs and backpacks. Results available at www.rkb.mipt.org
– Bobcat: included testing of Personal Radiation Detectors (pager type). Results will be available soon
Future test campaigns:
– Crawdad: will include testing of equipment used in maritime applications
Additional Testing at NIST
• Validation of ANSI standards for radiological requirements for N42.32, N42.33, N42.34, N42.35, N42.38, N42.43 (for backpacks)
• Will start validation of the N42.49 standard for personal radiation dosimeters
• Testing of portal monitors at NIST C-Gate
– Develop new software for optimization of source detection for gross-count systems
• Radiation detection sensor integration with robots
– DHS/NIST-sponsored evaluation exercise at the Maryland Task Force 1 Training Facility
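The source-detection software for gross-count systems mentioned above typically reduces to choosing an alarm threshold above the Poisson background. The sketch below is a generic k-sigma decision rule, not NIST's actual algorithm; the background rate, counting interval, and k value are made-up example numbers.

```python
import math

def alarm_threshold(background_cps, interval_s, k=4.0):
    """Decision threshold for a gross-count detector: mean background
    counts in the interval plus k standard deviations. For a Poisson
    background, sigma = sqrt(mean)."""
    mean = background_cps * interval_s
    return mean + k * math.sqrt(mean)

def alarms(counts, background_cps, interval_s, k=4.0):
    """Flag each measurement interval whose counts exceed the threshold."""
    threshold = alarm_threshold(background_cps, interval_s, k)
    return [c > threshold for c in counts]

# Example: 300 cps background in 0.5 s intervals gives a mean of 150
# counts, sigma ~ 12.2, so the k = 4 threshold is about 199 counts.
print(alarm_threshold(300.0, 0.5))
print(alarms([205, 150, 180], 300.0, 0.5))
```

Raising k lowers the false alarm rate at the cost of detection sensitivity, which is the trade-off the optimization work targets.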
Robot/Sensor Testing: Lessons Learned and Needs
• For sensor plug-and-play capability on robots, need to:
– Define wire connections in the robot
– Define a 2-way communication protocol between sensor and robot
– Transmit ANSI N42.42 data format (XML) files to save spectra and data
– Define the display in the robot control unit
• Cameras are not very useful, as sensor screens were not readable in sunlight
• Need audible capability – rad sensors need to meet 85 dB at 30 cm
• Sensor mounting capabilities will depend on mission type
• Need high-sensitivity instruments with gross counting and/or ID capabilities
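The N42.42 point above is about moving spectra around as XML. A minimal sketch of writing a spectrum in that spirit is below; note the element names are simplified placeholders of my own, not the actual N42.42 schema, which defines its own elements and namespaces.

```python
import xml.etree.ElementTree as ET

def spectrum_to_xml(channel_counts, live_time_s, real_time_s):
    """Serialize a gamma spectrum as XML in the spirit of ANSI N42.42.
    Element names here are illustrative placeholders only; the real
    standard defines its own schema and namespaces."""
    root = ET.Element("RadMeasurement")
    ET.SubElement(root, "LiveTime").text = str(live_time_s)
    ET.SubElement(root, "RealTime").text = str(real_time_s)
    spectrum = ET.SubElement(root, "Spectrum")
    # Channel counts as a space-separated list, one value per channel.
    ET.SubElement(spectrum, "ChannelData").text = " ".join(
        str(c) for c in channel_counts)
    return ET.tostring(root, encoding="unicode")

xml_text = spectrum_to_xml([0, 12, 48, 31, 5],
                           live_time_s=60.0, real_time_s=61.2)
print(xml_text)
```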
Summary of N42.43 Backpack Testing
• Sensitivity of present backpacks does not meet the ANSI standard requirement for all tested parameters
– Need to review ANSI standard requirements
– Need to interact with manufacturers to ensure optimal system performance
• Suggestions for standard revision:
– For the radionuclides specified in the standard, the testing distance should be closer than 3 m.
– Only 180° tests are required, to reduce the angular-dependence testing time.
– Neutron moderator thickness could be reduced (after a complete study).
– Phantom type (water, PMMA) and size might be specified in the standard.
• Alarm thresholds for the backpacks should be investigated so as to meet the false alarm rate requirement. Users should be aware of instrument performance changes at different alarm thresholds.
Source Development and Calibration
• Supply and calibration of gamma-ray and neutron sources to DOE labs, for use in equipment testing against ANSI N42.35 and N42.38.
• Developed new 232Th (14 Ci) and 226Ra (8 Ci) sources for use in testing against N42.38.
• Helped source manufacturers with the design, calibration, and development of new sources
– Commercialization by AEA Technology (QSA Global) provides private-sector participation
• Sources can be made available to users for instrument checking
Source Activities (ANSI/IEEE N42.35)

Radionuclide              Activity at time of testing (MBq)
                          Minimum   Nominal   Maximum
57Co  (T1/2 = 270 d)        2.75      3.44      4.13
60Co  (T1/2 = 5.2714 y)     0.119     0.148     0.177
133Ba (T1/2 = 10.51 y)      0.681     0.851     1.021
137Cs (T1/2 = 30.07 y)      0.474     0.592     0.710
228Th (T1/2 = 1.9116 y)     0.208     0.259     0.310
241Am (T1/2 = 432.2 y)     13.7      17.1      20.5
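The minimum/maximum activity window in the table above has to account for source decay between calibration and testing. A decay-correction sketch follows; the half-lives are taken from the table, while the calibration activity and elapsed time are made-up example numbers.

```python
# Half-lives from the N42.35 table above, converted to days.
HALF_LIFE_DAYS = {
    "57Co": 270.0,
    "60Co": 5.2714 * 365.25,
    "133Ba": 10.51 * 365.25,
    "137Cs": 30.07 * 365.25,
    "228Th": 1.9116 * 365.25,
    "241Am": 432.2 * 365.25,
}

def activity_at_test(a0_mbq, nuclide, elapsed_days):
    """Decay-correct a calibrated activity to the time of testing:
    A(t) = A0 * 2**(-t / T_half)."""
    return a0_mbq * 2.0 ** (-elapsed_days / HALF_LIFE_DAYS[nuclide])

# A 57Co source calibrated at the 4.13 MBq table maximum falls to
# 2.065 MBq after one half-life (270 d), below the 2.75 MBq table
# minimum, so short-lived sources need periodic recalibration checks.
print(activity_at_test(4.13, "57Co", 270.0))
```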
Source Activities (ANSI/IEEE N42.38)

Radionuclide                                 Activity at time of testing (MBq)
                                             Minimum   Nominal   Maximum
57Co, for ID (T1/2 = 270 d)                    0.444     0.555     0.666
57Co, for gross counting (T1/2 = 270 d)        0.148     0.185     0.222
60Co (T1/2 = 5.2714 y)                         0.207     0.259     0.311
133Ba, for ID (T1/2 = 10.51 y)                 0.266     0.333     0.399
133Ba, for gross counting (T1/2 = 10.51 y)     0.089     0.111     0.133
137Cs (T1/2 = 30.07 y)                         0.474     0.592     0.710
192Ir (T1/2 = 73.83 d)                         0.178     0.222     0.266
226Ra (T1/2 = 1600 y)                          0.237     0.296     0.355
232Th (T1/2 = 1.405 × 10^10 y)                 0.414     0.518     0.622
241Am (T1/2 = 432.2 y)                         1.391     1.739     2.087
NCRP Work
NIST was part of the National Council on Radiation Protection and Measurements (NCRP) working group that developed NCRP Commentary No. 19:
– Key Elements of Preparing Emergency Responders for Nuclear and Radiological Terrorism
Present Gaps and Concerns
• Standards for response and recovery efforts
• Implementation of the “Equipment List” program
• Manufacturers need to provide a test result summary for each instrument shipped to a user
• Acceptance testing performed by users on purchased equipment before deployment
– Users need to screen for dead-on-arrival instruments
• Calibration and maintenance of equipment once in the field
– Users should have a mechanism to ensure that instrument performance is not degraded with use
• Functionality tests for deployed instruments
– Users should have a routine procedure to check for malfunctioning instruments
• Develop a system to ensure that software upgrades performed by manufacturers do not affect users
– Users need to be aware of changes
Conclusions
• Much work still needs to be done
• User input is critical to address existing needs
• Coordination between different agencies and user communities is key to success
• Dissemination of information to users about existing standards and technology is needed to ensure the deployment of appropriate instrumentation to match the mission
• Performance, operational, and field testing results should be integrated to provide users with all the information necessary to evaluate and compare instruments prior to acquisition