Conclusions

Testing results were inconclusive as to the feasibility of using this program for automatic cavity inspection. While sufficient detection and false positive rates were not attained in this project, it may be possible to reach them by implementing certain improvements in the training procedure and by changing certain program parameters.

Possible improvements to the cascade training procedure include using more positives and negatives and letting the training program run longer. Changing the values of certain arguments of cvHaarDetectObjects was found to affect the detection rate, the false positive rate, and the program run time.
Automatic Surface Defect Detection
in Superconducting Radio Frequency Cavities
using C++ and OpenCV
Daniel Iriks
Mentor: Grigory Eremeev
2014 REU, Jefferson Lab / Old Dominion University
Jefferson Lab – SRF R&D
Santa Rosa Junior College, Santa Rosa, CA / Sonoma State University, Rohnert Park, CA
Thank you so, so very much to…
Hari Areti; Crystal Baker; Telesha Brown; Stephen Bueltmann; Document Control Group; Grigory Eremeev; fellow summer 2014 student interns; Steve Gagnon; Jefferson Lab; National Science Foundation; Amit Arup Nayak; Old Dominion University; Quark Café staff; James and Samantha, my predecessors on this project; Santa Rosa Junior College: in general, my recommenders, and those who helped with the REU application; Sonoma State University; Josh Spradlin; SURA Residence Facility, management and staff; Lisa Surles-Law; Team Test Lab; Jan Tyler; U.S. Department of Energy, Office of Science; W&M’s Surface Characterization Lab and the staff there (also my sincerest apologies!); anyone I neglected to mention
SRF cavities in the middle. Source: http://education.jlab.org/sitetour/ccentercavity.l.jpg
Manual inspection set-up. Source: the research paper of my predecessor on this project, Samantha.
Thomas Jefferson National Accelerator Facility, also known as Jefferson Lab, conducts research, development, and production of superconducting radio frequency cavities for particle accelerators. When cooled to very low temperatures, these cavities become superconducting, allowing for highly efficient operation and high accelerating capability.
In cavity production, serious albeit tiny surface defects can remain after polishing and cleaning, and these defects limit accelerator efficiency and performance capabilities. They cause resistance, wasting energy through heat. If enough heat is generated, the temperature increases beyond the range necessary for superconductivity.
A cascade is trained using positives (pictures with the object to be detected) and negatives (pictures without it). My mentor provided me with niobium samples both with and without the defect. Positives and negatives were obtained using a digital microscope in the College of William & Mary’s Surface Characterization Lab in the Applied Research Center at Jefferson Lab. The more positives and negatives, the better the detection performance. I obtained about a thousand positives and about two thousand negatives. A cascade training program was used. It takes the positives and negatives as input and outputs the trained cascade. It runs for quite some time (13 days for this project, and it could have gone longer!).
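The poster does not name the training tool, but with OpenCV's toolchain of that era, training a Haar cascade looked roughly like the following; the file names, window size, stage count, and sample counts are placeholders, not this project's values:

```shell
# Pack the positive samples (listed in positives.txt with the defect's
# bounding boxes) into the .vec file the trainer expects.
opencv_createsamples -info positives.txt -vec positives.vec \
    -num 900 -w 24 -h 24

# Train the cascade; negatives.txt lists the background images.
# Training can take days, and more stages means a longer run.
opencv_haartraining -data cascade_dir -vec positives.vec \
    -bg negatives.txt -npos 900 -nneg 1800 -nstages 20 -w 24 -h 24
```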
The manual inspection process for these defects is very time-consuming; for large production projects it is therefore used only to investigate surface areas indicated by testing.
At the start of this project, the program could be applied to one picture at a time; it output a copy of the picture with rectangles drawn around detected defects and detected one type of defect with promising success. In this project, three features were added: automatic application to multiple pictures, recording of the names of pictures with detected defects, and defect sizing. The program was also trained to detect a different type of defect, caused by the formation of hydrogen precipitates in the metal.
Testing was conducted on four folders. One contained 90 negatives; the other three were positive folders, separated according to apparent detection difficulty, containing 50 positive test images and 1709 test defects in total. Testing showed that the program had a detection rate of 53% and a false positive rate of 29%, that the average calculated size was 5 times larger than the actual value with a percent relative average deviation of 30%, and that the average program run time was 2.1 seconds per image. 11 of the 90 negative test images had false positive detections, with 18 false positives among them in total.
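The headline rates above reduce to simple ratios of the per-image tallies. A minimal sketch follows; it assumes the false positive rate is the fraction of all detections that did not correspond to a real defect, since the poster does not spell out the definition:

```cpp
#include <cassert>

// Detection rate: fraction of the known test defects that were detected.
double detection_rate(int detected_defects, int total_defects) {
    return static_cast<double>(detected_defects) / total_defects;
}

// False positive rate, read here as the fraction of all detections
// that did not correspond to a real defect (an assumption; the poster
// does not give the exact definition).
double false_positive_rate(int false_positives, int total_detections) {
    return static_cast<double>(false_positives) / total_detections;
}
```

With the project's 1709 test defects, a 53% detection rate corresponds to roughly 900 detected defects; the counts passed in would come from the per-image tallies described above.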
A typical positive.
A typical negative.
The purpose of this project was to further develop and test an approach to automating defect detection, which could increase cavity production efficiency by decreasing the time and labor necessary for inspection and testing. This approach was to use the programming library OpenCV and the language C++ to write a computer program that would inspect pictures of cavity surfaces and detect defects. While sufficient detection and false positive rates were not attained in this project, it may be possible to do so by implementing certain improvements in the detection training procedure and making changes to certain program parameters.
Introduction
Method

Results
The indicated surface feature did not count as a defect because it did not satisfy the requirement of having a major axis of at least 10 µm. The other requirements were to be a surface feature caused by the formation of hydrogen precipitates in the metal and to be completely surrounded by negative space.
The indicated detection counted as a false positive for two reasons: more than 50% of the contained defect area was contained in other, true positives, and it contained defects separated by more than 10 µm.
This was a negative image, so any detection would be a false positive.
The detection here is a false positive specifically because it does not contain at least 50% of a defect.
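The 50%-containment criterion can be sketched as a rectangle-overlap check. The `Rect` helper type below is hypothetical, and the defect's area is approximated here by its bounding rectangle; only the "at least 50% of a defect" rule comes from the criteria above:

```cpp
#include <algorithm>
#include <cassert>

// Axis-aligned rectangle in image coordinates (hypothetical helper type).
struct Rect { int x, y, w, h; };

// Area of the intersection of two rectangles (0 if they do not overlap).
int overlap_area(const Rect& a, const Rect& b) {
    int ox = std::max(0, std::min(a.x + a.w, b.x + b.w) - std::max(a.x, b.x));
    int oy = std::max(0, std::min(a.y + a.h, b.y + b.h) - std::max(a.y, b.y));
    return ox * oy;
}

// A detection counts as a true positive only if it contains at least
// 50% of the defect's area; otherwise it is a false positive.
bool is_true_positive(const Rect& detection, const Rect& defect) {
    int defect_area = defect.w * defect.h;
    return 2 * overlap_area(detection, defect) >= defect_area;
}
```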
The program approximates defect area using the area of the rectangle drawn around the defect. To test this, the detected defect closest to the center in each test positive was examined. The rectangle’s dimensions and the defect’s major and minor axes (shown in blue in the image to the left) were measured, and their areas were compared.
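One plausible way to make that comparison concrete is to treat the defect as an ellipse with the measured major and minor axes (area = π·a·b/4) and take the ratio of the rectangle's area to it. This is a sketch of the comparison, not necessarily the exact procedure used:

```cpp
#include <cassert>

const double PI = 3.14159265358979323846;

// Approximate defect area as an ellipse from its measured major and
// minor axes (full axis lengths, in micrometers).
double ellipse_area(double major_axis, double minor_axis) {
    return PI * major_axis * minor_axis / 4.0;
}

// Ratio of the detection rectangle's area to the measured defect area.
double size_ratio(double rect_w, double rect_h,
                  double major_axis, double minor_axis) {
    return (rect_w * rect_h) / ellipse_area(major_axis, minor_axis);
}
```

For instance, a 20 µm × 20 µm rectangle around a circular defect 10 µm across gives a ratio of about 5.1, the same order as the factor-of-5 overestimate reported in the results.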
The program also counts and outputs how many defects are detected within certain area ranges. The histogram to the right is constructed using this data from one of the positive test folders. The size ranges are scaled by one-fifth, since the calculated area was found to be five times larger than the actual value on average. From this data, it can also be determined that the average defect area for this folder (using the scaled values) was 90 µm² with a standard deviation of 50 µm².
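The per-folder mean and standard deviation follow directly from the calculated areas once the 1/5 calibration scale is applied. A minimal sketch (the sample data in any caller would be illustrative, not this folder's actual measurements):

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Mean of the calculated areas after applying the calibration scale
// (here 1/5, since calculated areas averaged five times the true value).
double scaled_mean(const std::vector<double>& areas, double scale) {
    double sum = 0.0;
    for (double a : areas) sum += a * scale;
    return sum / areas.size();
}

// Population standard deviation of the scaled areas.
double scaled_stddev(const std::vector<double>& areas, double scale) {
    double mean = scaled_mean(areas, scale);
    double sq = 0.0;
    for (double a : areas) {
        double d = a * scale - mean;
        sq += d * d;
    }
    return std::sqrt(sq / areas.size());
}
```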
In order to test detection and false positive rates, defects and false positives in each test positive were counted (the program outputs the number of detections). Quantitative criteria were established for defect and false positive identification.
The primary OpenCV function used was cvHaarDetectObjects. The program applies this function to every picture in a folder. The function detects defects in each picture using a cascade, which can be thought of as a learned model of what the object to be detected looks like.
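For reference, a minimal sketch of how such a detection pass might look with the OpenCV 1.x C API; the file names and parameter values here are illustrative, not the ones used in this project, and the last four arguments are the tuning knobs whose values were found to affect the detection rate, false positive rate, and run time:

```cpp
#include <opencv/cv.h>
#include <opencv/highgui.h>

int main() {
    // Hypothetical file names: load the trained cascade and one image.
    CvHaarClassifierCascade* cascade =
        (CvHaarClassifierCascade*)cvLoad("defect_cascade.xml");
    IplImage* image = cvLoadImage("surface.jpg");
    CvMemStorage* storage = cvCreateMemStorage(0);

    CvSeq* detections = cvHaarDetectObjects(
        image, cascade, storage,
        1.1,                       // scale_factor: step between window sizes
        3,                         // min_neighbors: overlapping hits to keep
        CV_HAAR_DO_CANNY_PRUNING,  // flags: skip low-edge regions for speed
        cvSize(24, 24));           // min_size: smallest window scanned

    // Draw a rectangle around each detection, as the program does.
    for (int i = 0; i < (detections ? detections->total : 0); i++) {
        CvRect* r = (CvRect*)cvGetSeqElem(detections, i);
        cvRectangle(image, cvPoint(r->x, r->y),
                    cvPoint(r->x + r->width, r->y + r->height),
                    CV_RGB(255, 0, 0), 2);
    }
    cvSaveImage("surface_detected.jpg", image);

    cvReleaseMemStorage(&storage);
    cvReleaseImage(&image);
    return 0;
}
```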
Histogram horizontal axis: defect area (µm²).