ORNL Review - Volume 14, Number 2


THE COVER: Productive interaction between government-sponsored research and industrial development has long been a goal of both the public and private sectors. In his State of the Laboratory address early this year, Director Herman Postma cited the instances in which such interaction has been effected over the past year. Moreover, he made the point that each participant instigates action and progress in the other, resulting in what he termed "technology push and market pull," the theme he chose for his talk.

Editor BARBARA LYON

Staff Writer CAROLYN KRAUSE

Consulting Editor ALEX ZUCKER

Art Director BILL CLARK

Publication Staff: Technical Editing/Cindy Sullivan, Deborah Stevens, and Ralph Sharpe; Typography/Susanna West; Makeup/Betty Jo Williams; ORNL Photography and Reproduction Departments.

The Review is published quarterly and distributed to employees and others associated with the Oak Ridge National Laboratory. The editorial office is in Building 4500-North, Oak Ridge National Laboratory, P.O. Box X, Oak Ridge, Tennessee 37830. Telephone: Internal 4-7183; commercial (615) 574-7183; FTS 624-7183.

ISSN 0048-1262

Oak Ridge National Laboratory

review VOLUME 14, NUMBER 2

1 State of the Laboratory Technology Push/Market Pull By HERMAN POSTMA

20 Spiraling Down the Watershed How Flowing-Water Ecosystems Cycle Nutrients By JERRY ELWOOD

32 Artificial Intelligence Is Coming Applications at ORNL By CARROLL JOHNSON

39 Health Risk Analysis Science, Politics, and Public Concern By EMILY COPENHAVER and PHIL WALSH

DEPARTMENTS
30 Books
38 Take a Number
46 Information Meeting Highlights
48 Awards and Appointments

SPRING 1981

OAK RIDGE NATIONAL LABORATORY OPERATED BY UNION CARBIDE CORPORATION FOR THE DEPARTMENT OF ENERGY

State of the Laboratory-1980 Technology Push/Market Pull

By HERMAN POSTMA

This annual summary always gives me a chance to highlight trends and changes in the nature of our work and in the external factors that influence what we do. Coming each year not long after the President's budget message, this talk reminds us that evolution and change, particularly in matters as vital to our national well-being as science and energy, are among the constants with which we live.

The themes of the State-of-the-Laboratory talks in recent years have emphasized a number of important and continuing changes: our work in nontraditional areas, the broadening range of sponsors and collaborators, and our more active role as a consultant to outside organizations and as a manager of R&D in areas where we have particular expertise.

TECHNOLOGY PUSH/MARKET PULL

Continuing the focus on new missions and areas of interaction, the theme I have chosen is "Technology Push/Market Pull." This is one way to characterize the dynamic relationship between R&D projects at ORNL and the ultimate users of these ideas and innovations. Moreover, it suggests two of the principal types of interactions by which our missions are fulfilled.

By technology push, I mean R&D results or inventions for which there is no predetermined outside interest or application. The development itself, by virtue of the improvement it makes or the potential it creates, is responsible for a "push," the positive stress on the outside world that causes things to happen. Our technology utilization/commercialization program may also supply some of this initial impetus. An example of technology push at ORNL is biotechnology, with its potential for contributing in novel ways to energy conservation and production as well as environmental effluent control.

Ben Benjamin (left) and Bill Rodgers led a research effort that allayed concerns that the process solvent used in pilot-plant tests of a Kerr-McGee liquefaction process was reacting chemically with coal-derived materials and thus was lost.

Market pull represents the dynamic opposite. Here, the outside need for a product or process of a clearly defined type prompts the R&D effort. In this case, the Laboratory, as an institution with unique problem-solving capabilities and resources, focuses on items designed or invented specifically to fill a predetermined need or to satisfy a particular customer.

Our work on synfuels, supporting many facets of a major national commitment to reducing U.S. dependence on imported oil, is an example of how ORNL is responding to a strong market pull. Another is technical support for the Nuclear Regulatory Commission.

Clearly, both forces are at work in many projects at ORNL, as when the initial push resulting from technological development is reciprocated by a strong pull as efforts toward commercialization begin. In fact, many will argue that a pull was earlier a push, and vice versa.

Coal Liquefaction

The vast national commitment to the development of synthetic fuels is evident as many projects move from the pilot-plant phase to large-scale demonstration. This is perhaps the most significant recent example of a strong market pull on programs at the Laboratory. Most of the DOE-supported coal liquefaction projects are being carried out by private contractors and managed through DOE's Oak Ridge Operations. ORNL and Nuclear Division Engineering are providing technical support to ORO in areas where staff skills and Laboratory facilities constitute a specialized resource for the government and its contractors.

The SRC-I demonstration plant, to be sited in western Kentucky, is critically dependent on a new Kerr-McGee technology for separating mineral matter and unreacted coal from the liquefaction products. Essentially complete recovery of the process solvent in pilot-plant tests of this separation process has proven difficult, and there were concerns that the solvent might be reacting chemically with the coal-derived materials and thus be irretrievably lost. In response, the process engineering skills of Bill Rodgers and his colleagues in the Chemical Technology Division were effectively coupled with the 14C labeling and analysis capabilities of Ben Benjamin and his group in the Chemistry Division to prove, unequivocally, that chemical reactions of the solvent are of negligible concern. This technical assurance afforded the contractors a much firmer design basis on which to proceed.

Another example from operating liquefaction pilot plants is a serious, unexpected corrosion problem in equipment where the coal liquids are distilled. The problem threatened the viability of both the SRC-I and -II demonstration plants, but a strong collaborative effort involving Jim Keiser, Rod Judkins, and Vivian Baylor of the Metals & Ceramics Division and three sections of the Analytical Chemistry Division with the pilot-plant operators provided insight into the causes of the corrosion problem. These investigators suggested several feasible approaches toward solving this problem. The private contractor has characterized ORNL's contribution as the "premier effort" in solving this problem.

During late September, at the H-Coal Liquefaction Pilot Plant (Catlettsburg, Kentucky), a heat exchanger failed during normal start-up operation, resulting in a shutdown. Failure was attributed to blowout of 3 of 128 U-tubes in the system used to cool the process vapors. Sections of the failed materials were retrieved immediately by the Metals and Ceramics Division for examination, and within two days the plant operators were notified of the cause of failure: stress-corrosion cracking in which chloride ions played a contributing role. Once the cause was known, the operators were able to take remedial action.

Still another contribution relates to demonstrating the environmental acceptability of the new coal-liquefaction technology. The Energy, Environmental Sciences, Health and Safety Research, and Information divisions are preparing the environmental impact statements for DOE's coal conversion demonstrations. These are pioneering documents that call forth scientific, technological, and political skills in identifying potential impacts. One such impact is the disposal of solid wastes, principally the ash and slag that originate in the feed coal. Coordinated research in Environmental Sciences is providing information on leachates. Another concern is over possible health effects of organic chemicals present in the products, process streams, and wastes of coal liquefaction processes. Here, the Laboratory has conducted a broad, critical review of research methods and results relating to the mutagenicity (and potential carcinogenicity) of these materials, which is being used to assess possible environmental and health hazards and necessary mitigating measures.

This pioneering role in the environmental assessment of the emerging synfuels technologies closely parallels our work ten years ago when we prepared the first nuclear-power-plant impact statements in compliance with the then new National Environmental Policy Act.

Jim Keiser prepares equipment for tests to screen materials for resistance to stress corrosion cracking. Keiser and his colleagues recently found the causes of the corrosion problem that threatened two Solvent Refined Coal demonstration plants.

Magnetic Beneficiation of Coal

A coal-related push is the use of magnetic fields to remove the many inorganic rock and mineral constituents that reduce the heating value of coal, complicate handling and transport, and contribute to airborne pollution and waste-disposal problems. The principle behind magnetic coal cleaning is that sulfur-bearing iron pyrites and ash-forming minerals are weakly attracted by a magnetic field, whereas coal particles are mildly repelled. ORNL was the first to show that magnetic methods can remove almost all of the liberated ash and pyritic sulfur while recovering 90% of the energy content of the feed coal.
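To make that separation principle concrete, here is a minimal sketch in Python. The susceptibility and field values are purely illustrative assumptions, not figures from the ORNL work: the force on a small weakly magnetic particle in a field gradient is F = (chi * V / mu0) * B * dB/dx, so its sign, and hence the direction the particle moves, follows the sign of the susceptibility chi.

    # Illustrative sketch (not ORNL's design code): the force on a small
    # particle in a field gradient is F = (chi * V / mu0) * B * dB/dx,
    # so its direction follows the magnetic susceptibility chi.
    MU0 = 4e-7 * 3.141592653589793  # vacuum permeability, T*m/A

    def magnetic_force(chi, volume_m3, b_tesla, dbdx_tesla_per_m):
        """Force (N) on a weakly magnetic particle; sign gives direction."""
        return (chi * volume_m3 / MU0) * b_tesla * dbdx_tesla_per_m

    # Representative (assumed) susceptibilities: pyrite is weakly
    # paramagnetic (chi > 0), coal is weakly diamagnetic (chi < 0).
    for name, chi in [("pyrite", +5e-5), ("coal", -1e-5)]:
        f = magnetic_force(chi, volume_m3=1e-12, b_tesla=5.0, dbdx_tesla_per_m=50.0)
        side = "toward" if f > 0 else "away from"
        print(f"{name}: F = {f:+.2e} N ({side} the high-field region)")

The opposite signs are the whole trick: one pass through the field gradient deflects mineral matter one way and coal the other.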

Gene Hise and Alan Holman, however, have demonstrated that the required magnetic field shape and force magnitude can be produced with a superconducting solenoid. Since this field is virtually independent of the diameter of the coil, it is possible to generate the separating force in any volume desired. The technique, called open-gradient magnetic separation, can separate materials having a positive magnetic susceptibility as well as or better than existing high-gradient magnetic separation methods. This technique also has the unique capability of separating materials with negative susceptibility, thus significantly extending its applications to ores, minerals, and industrial materials and wastes which have positive as well as negative components. Already, two manufacturers are actively pursuing magnet development.

This heat exchanger that failed at the H-Coal Liquefaction Pilot Plant in Kentucky was found by ORNL to be a victim of stress corrosion cracking in which chloride ions played a role. The missing section was sent to the Laboratory for analysis.

Conservation Technology

For the near term, the most important technology push from ORNL may be in energy conservation. Several developments deserve mention as examples.

• The heat-pump water heater, recently cited in EPRI Journal, is already on the market, with several firms offering units expected to use about half the energy required by resistance-type water heaters for much of the United States.

• Two other subcontracted projects reduce energy use in refrigerator-freezers. A prototype motor-compressor unit is 30% more energy-efficient than the best units previously on the market; and refrigerator-freezer prototypes have been demonstrated that use one-half the energy of the 1978 sales-weighted average of comparable units, without sacrificing user convenience. Both the heat-pump water heater and refrigerator-freezer were developed by subcontractors under a DOE program managed by ORNL's Virgil Haynes.

• ANFLOW technology for municipal wastewater treatment, to be demonstrated in Knoxville in a 200,000-L/d plant under the management of Chuck Hancher and Wes Shumate, meets the Environmental Protection Agency's secondary treatment requirements with less than half the energy consumption of conventional treatment systems.

Extensive tests on the Annual Cycle Energy System over a three-year period have shown it to be a conservation leader in heating, cooling, and providing hot water for buildings. The system, now under the direction of Leonard Abbatiello, uses just 57% of the energy of the control house equipped with one of the best air-to-air heat pumps available.

This year, another appliance-related technology push resulted from connecting ORNL's heat-pump model to a state-of-the-art numerical optimizer developed by Ray Ellison. The purpose was to demonstrate how a sophisticated computer program, based on the physics underlying the technology, can be used as an alternative to traditional "cut-and-try" design methods. One result for a short-term improvement case shows that a steady-state coefficient of performance of 3.97 can be obtained, compared with today's best of 3.1. Ellison's model, on which this work is based, has now been installed on the computers at 11 industrial sites, 8 universities, and 6 other research institutions.
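A rough check of what such a gain means, simple arithmetic rather than anything from Ellison's optimizer: for the same delivered heat, electricity use scales as the inverse of the COP. The COP of 2 assumed below for the heat-pump water heater is illustrative, chosen to match the "about half the energy" claim made earlier.

    # Back-of-envelope check (illustrative, not the ORNL optimizer):
    # for the same delivered heat Q, electricity use scales as Q / COP.
    def relative_energy(cop_new, cop_old):
        """Energy used at cop_new as a fraction of energy used at cop_old."""
        return cop_old / cop_new

    # Optimized heat pump vs. the best contemporary unit (COPs from the text).
    print(f"3.97 vs 3.1: {relative_energy(3.97, 3.1):.0%} of the energy")

    # A heat-pump water heater with an assumed COP near 2 uses roughly half
    # the energy of a resistance heater (COP = 1), matching the claim above.
    print(f"heat-pump water heater: {relative_energy(2.0, 1.0):.0%}")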

Nuclear Reactor Regulation

Perhaps one of our most important nuclear-related contributions in recent times occurred this year. One of several puzzling aspects of the Three Mile Island accident, investigated by the Kemeny Commission, was the almost negligible release of radioiodine. Only about 15 Ci of 131I were released, compared to about 8 million Ci of 133Xe, even though the inventories of the two fission products were approximately equal at the time of the accident. The expected release of radioiodine was much larger than the actual release, mainly because the staff had recalled the relative releases of the two nuclides experienced in the 1957 Windscale accident in the United Kingdom. Resolving this discrepancy was important in view of the supposed high volatility of iodine, which made it the dominant radiotoxic species in reactor site-safety analyses.

Tony Malinauskas and Dave Campbell in the Chemical Technology Division, with collaborators at Los Alamos, set out to review the evidence. They observed that an oxidizing atmosphere prevailed at Windscale, whereas the accident at TMI occurred under chemically reducing conditions. This suggested that fission product iodine is released from fuel as a metal iodide (probably cesium iodide) rather than as molecular iodine, which is contrary to the assumption on which current regulatory guides and licensing models are based. Since cesium iodide is significantly less volatile than molecular iodine, it became clear that release of radioiodine may have been grossly overestimated by the licensing models. This is especially true when one considers that light-water-reactor accidents necessarily occur under reducing conditions.

Gene Hise checks the open-gradient magnetic separation device that can clean powdered coal by separating sulfur-bearing iron pyrites and ash-forming minerals from coal particles having high energy content. OGMS also has applications for ores, minerals, and industrial materials.

Their findings, if verified by a Nuclear Regulatory Commission review now under way, can have a significant impact on nuclear reactor safety and the regulatory process. Governor Babbitt of Arizona, a Kemeny Commission member who chairs the Presidential Nuclear Safety Oversight Committee, stated in a December letter to President Carter, "This technical question should be resolved on an expedited basis, for it bears directly on fundamental assumptions underlying some of the most important regulatory issues facing the nation."

Biodenitrification

Liquid nitrate wastes are a continuing environmental problem in several industries, including the production of nuclear fuel. Biological denitrification (i.e., using microorganisms to remove the nitrate contaminant) was successfully implemented at the Y-12 Plant in 1976 using conventional technology. Work then began under Chuck Scott and Chuck Hancher in the Chemical Technology Division and under Chet Francis in the Environmental Sciences Division on advanced systems. The organisms involved, simple bacteria from garden soil, were shown to function over a range of operating conditions, including moderate radiation fields. They attained a 100-fold increase in the denitrification rate using a new concept, a fluidized-bed bioreactor, in which the active agents were attached to particles of coal.

Doug Lee (top), Chuck Hancher, and Chuck Scott (standing) examine components of the ORNL fluidized-bed bioreactor that has been shown to reduce nitrate concentrations in wastewater to levels acceptable for drinking water.

Now the development has proceeded to the pilot-plant stage, where such treatment of actual and simulated nitrate waste solutions from the Portsmouth Gaseous Diffusion Plant and the Fernald Feed Materials Processing Center has reduced nitrate concentrations to levels acceptable for drinking water. As a result, both Portsmouth and Fernald have included advanced designs of the systems in planned expansions of their waste-treatment facilities. In November, ORNL dismantled and shipped the pilot plant to Portsmouth, where the plant will treat nitrate wastes until the new centrifuge enrichment facility is on stream and requires more capacity. Here again, technological innovation has solved a serious problem by using a concept that promises to have broad application to liquid waste treatment and environmental control.

Ion Beam and Laser Processing

Most often it is the limits inherent in materials that present a barrier to technological advance. The Solid State Division has been at the forefront of applying ion beams and lasers to materials processing. The properties of materials are literally tailored to meet the requirements of the desired applications. Here, as elsewhere, the influences of push and pull are often impossible to distinguish in the vigorous interplay with outside interests. Among the techniques in which ORNL has played a key role are:

• ion implantation doping - a precisely controlled method for introducing impurities into solids to change their electronic properties;

• pulsed laser annealing - a method of giving materials heating and cooling rates 1000 times as high as those achieved by normal metallurgical methods; and

• ion beam and laser mixing - two recent methods for inducing materials interactions in thin films on solid surfaces.

These techniques were developed by Woody White, Bill Appleton, Jim Williams, and the Chemistry Division's Gene Kelly.

Possible applications include fabrication of supersaturated substitutional alloys of silicon with 500 times as many dopant atoms in electrically active sites and with unique semiconducting properties; production of thin films of amorphous mixtures, new metastable alloys, and high temperature superconductors; creation of near-surface corrosion-resistant alloys; and the reduction of friction and wear in bearing steels, thereby conserving critical materials.

Laser processing of materials used in photovoltaic conversion of solar energy is another impressive example of the progress and potential in this technology. By combining ion implantation and laser annealing with other processing steps using lasers, Dick Wood's group now routinely obtains efficiencies of about 17% in single-crystal silicon solar cells in the laboratory. (The best obtained elsewhere by the most sophisticated techniques is about 18%.) Solar cells made by the low-cost, laser-induced diffusion technique, for which Jagdish Narayan and Rosa Young hold a patent, currently have efficiencies of about 14%. In this method, a thin layer of dopant is deposited on the silicon substrate and diffused by melting the near-surface region with pulses from a Q-switched laser. Improved fabrication techniques should soon lead to even higher efficiencies.

Some of the most exciting developments in laser processing of solar cells have occurred with polycrystalline materials. Wood's group has established that the diffusion of impurities along grain boundaries and the segregation of dopants can be controlled by laser techniques. These results led recently to the demonstration of grain boundary passivation by lithium diffusion. This should result in substantial savings in energy, time, and money. As a result, laser processing systems are already on the market and advanced systems are being designed and built.

Bill Appleton works in the Solid State Division accelerator lab where the techniques of ion beam and laser mixing were developed to tailor the properties of materials to meet the requirements of desired applications.

Fuel Alcohols from Fermented Material

In the Chemistry Division, Alicia Compere and Bill Griffith have developed a low-energy process for removing alcohols from aqueous solutions such as those produced by fermentation. Called Fualex (for fuel alcohol extraction), it appears to be a promising alternative to distillation. Their tests show that alcohol mixtures obtained from dilute solutions may be incorporated into fossil fuels or further purified for other uses, such as enhanced oil recovery or as replacements for petrochemicals. In the process, they add a small amount of a surfactant and one or more hydrocarbons, such as heptane, toluene, gasoline, diesel, or other fuel oils, to a fermentation broth or similar aqueous solution. This results in the formation of an emulsion enriched in alcohols, which can be removed and used either for fuel mixing or purified further for other applications. Fualex requires little of the energy needed for conventional methods of distilling and dewatering ethanol for use in fuels. It also may increase safety and decrease pollution, since the low toxicity of the surfactants is demonstrated by their regular use in foods and personal-care products.


Dick Wood, Rosa Young, and Jagdish Narayan test solar cells made at ORNL using the low-cost, laser-induced diffusion technique. These cells have been found to have efficiencies as high as 14%.

Protection of Synfuels Workers

Elements of both market pull and technology push are evident in contributions by ORNL's Life Sciences Synthetic Fuels and Fossil Energy programs to worker protection in the emerging synfuels industry. The Health and Safety Research, Chemical Technology, and Health divisions have participated in the development and application of a whole new family of monitoring instruments for detecting coal-conversion contaminants. These instruments are now being used in industrial hygiene programs at coal gasification and liquefaction demonstration plants. Three of these are portable systems: DUVAS, a derivative ultraviolet absorption spectrometer for specific aromatic compounds, developed by Alan Hawthorne and Dick Gammage; a fluorescence spill spotter for general surface contamination, developed under the direction of Dan Schuresko; and Tuan Vo-Dinh's fluorescence light pipe luminoscope, whose primary function is to detect and measure small amounts of residual skin contamination.

Vo-Dinh also has demonstrated the effectiveness of two spectroscopic techniques for trace analysis of organic compounds: room temperature phosphorescence and synchronous luminescence. Most other methods involve a prefractionation step prior to analysis, but the RTP and SL techniques can be used directly to determine the PNA compounds present in the raw coal product diluted in ethanol. A companion to these highly sensitive spectroscopic methods is a small integrating vapor monitor worn in the pocket as a personnel exposure meter. Developed by Vo-Dinh and Gammage, it is simply a holder for a disk of filter paper that adsorbs polynuclear aromatic hydrocarbon vapors from the air. RTP is used to identify and quantify selected PNA vapors such as phenanthrene, fluoranthene, and pyrene. The luminescence spectra also can be recorded synchronously for additional selectivity.

Preservation of Early Embryos

Possibly the most significant technology push in our life science areas may result from Peter Mazur and Stan Leibo's pioneering work in the Biology Division on preserving animal cells, organs, and even embryos by freezing. The feasibility of preserving mammalian ova and embryos for extended periods was first demonstrated here in 1972. Mouse embryos survived freezing to and thawing from -196°C, a temperature low enough to produce what can fairly be termed suspended animation. The tests showed that frozen and thawed ova and embryos developed into living animals when transferred into suitable foster mothers. The foster mothers used in these tests included rats, rabbits, hamsters, sheep, goats, and cattle.

Consider two important practical implications of this development, the first related to mouse embryos. Natural reproduction of all biological life leads inevitably to changes of the species by genetic drift. This means that experiments requiring long-term examination of whole-animal responses are conducted against a continuously changing genetic background. Since embryos can be stored easily in the frozen state for extended periods, it now is possible to reduce the rate of genetic drift of mouse strains used in such studies. Already, three mouse embryo banks have been established in this country and the United Kingdom, and the International Committee for Standardized Genetic Nomenclature lists the existence of several mouse substrains carrying the notation "p" for "preserved."

With cattle embryos, selective breeding using frozen and thawed bull sperm for artificial insemination has led to an enormous increase in the quality and productivity of beef and dairy cattle, but it drastically reduces the genetic information carried in cattle. Freezing of embryos, which carry the complete genome of the male and female parents, offers an inexpensive way to preserve the entire available genetic heritage of twentieth century cattle. This suggests that artificial inovulation will soon have as large an impact on cattle breeding as artificial insemination does now.

TECHNICAL HIGHLIGHTS

So much for my theme of technology push and market pull. I want now to mention some examples of other outstanding technical accomplishments during the past year. The first come from the basic energy sciences.

NMR Studies of Coal

One of our goals in the basic energy sciences program is to improve the scientific basis for synfuels processes such as coal liquefaction. This requires a much more detailed understanding of coal structure and of structure/reactivity relationships, a task of major proportions for the organic chemist. This year in the Chemistry Division, Ed Hagaman has applied a superconducting nuclear magnetic resonance device to the problem, with excellent results. NMR is especially well suited to identifying and characterizing molecular configurations containing carbon, hydrogen, and oxygen. Structurally, coal consists of large units formed by fusing several benzene rings together. These units, called polynuclear aromatic structures, are in turn linked together by carbon chains, called alkyl bridges, or by ether-oxygen bridges, to form a sort of random polymeric structure. The NMR probe can follow changes in this large random structure during reactions that break down the complex polymers into simpler chemical units. Spectra show the rearrangements of aromatic units and the dramatic results of the completed reaction: broken methylene bridges. The solid-state NMR technique offers a more accurate structural picture of raw coal and its chemical derivatives because the coal does not have to be dissolved before analysis.

Neva Harrison adjusts a bench-scale fermenter used as part of the experimental Fualex process to extract fuel alcohols with improved energy efficiency.

Electron Impact on Ions

Another fundamental problem of vital technological importance is understanding how electrons interact with ions in fusion plasmas and the coronas of stars. Dave Crandall, R. A. Phaneuf, and other members of the Physics Division, with colleagues at the Joint Institute for Laboratory Astrophysics in Colorado, are studying important events that happen when a beam of electrons collides with a beam of highly charged ions. In particular, they have examined two processes by which the electron beam strips the ions of an additional electron: direct ionization and autoionization. In the former, the impact of the electron beam knocks an electron out of the atomic shell. Autoionization occurs when an inner-shell atomic electron is excited to a higher-energy orbit and then relaxes, releasing its surplus energy by ejecting another bound electron. The problem is to determine which process will occur in a specific ion as a function of electron beam energy. Preliminary results indicate that the probability of autoionization increases with heavier, highly charged ions that have few outer-shell electrons and more numerous inner-shell electrons. This insight is expected to influence physicists' understanding of the ionic charge-state distribution in hot plasmas, which for fusion is closely coupled to the radiative power loss when heavy-ion impurities are present; this physics is vital to understanding the role of impurities in limiting fusion reactions in tokamak plasmas.

This pocket exposure meter developed at ORNL contains filter paper that adsorbs vapors of polynuclear aromatic hydrocarbons from coal conversion processes. The contaminated paper is then analyzed by room temperature phosphorescence to determine if the meter wearer was exposed to hazardous levels of vapors in his work environment.

Delayed Lasing

Alan Hawthorne adjusts DUVAS, a device he helped to develop that can monitor the atmosphere of coal conversion plants for hazardous gases and vapors.

Inventiveness in the analytical application of lasers continues, as evidenced by the development of a technique called delayed lasing by Mike Ramsey and Bill Whitten in the Analytical Chemistry Division. Optical absorption spectrometry is an important analytical tool based on the fact that the light a substance absorbs at different wavelengths is characteristic of that substance. Conventionally, optical absorbance is determined by measuring the intensity of light before and after it passes through a sample. The problem is that some materials absorb so little light that there are large measurement uncertainties. The solution, using a laser, involves measuring time intervals instead of the intensity of transmitted light. The technique is based on the fact that a critical number of atoms in the laser crystal must be excited before a laser beam is emitted. With a sample present in the laser cavity and absorbing photons, the onset of lasing is delayed by a time interval that is linearly proportional to the absorbance. Ramsey's and Whitten's experiments have shown that an absorbance as small as 10⁻³ can be measured, and improved experimental design should allow measurements down to 10⁻⁵. Because of the technique's sensitivity, potential applications seem likely in the detection of trace quantities of most substances, including both gaseous and liquid pollutants.
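Because the onset-of-lasing delay described above is linearly proportional to absorbance, a measured delay converts directly to an absorbance once the instrument is calibrated. The sketch below illustrates only the arithmetic; the calibration constants are hypothetical, not Ramsey and Whitten's values.

    # Illustrative reduction of delayed-lasing data (assumed calibration
    # values): with the onset of lasing delayed by a time linearly
    # proportional to absorbance, t_delay = t0 + k * A, a measured delay
    # converts to absorbance once t0 and k are known.
    def absorbance_from_delay(t_delay_s, t0_s, k_s_per_abs_unit):
        """Invert the linear delay-vs-absorbance relation for A."""
        return (t_delay_s - t0_s) / k_s_per_abs_unit

    # Hypothetical calibration: 5 us empty-cavity delay, 2 us per 1e-3 absorbance.
    t0, k = 5.0e-6, 2.0e-3
    print(absorbance_from_delay(7.0e-6, t0, k))  # -> 1e-3, the quoted sensitivity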

Insect Toxicology

Turning to the life sciences, we find that meaningful evaluation of chemical toxicity to plants and animals is a challenge not only to ORNL in its effort to predict deleterious effects of energy-related pollutants but also to the entire science of toxicology. Both the scientific community and the public have questioned the validity of biological assays that report toxic effects based on exceedingly high concentrations of toxicants applied in unnatural ways to specially selected strains of organisms. But now an assay developed by Barbara Walton and Gerry O'Neill in the Environmental Sciences Division holds promise for more realistic assessment of the effects of synthetic fuels on the embryonic development of soil invertebrates. This assay, in which common crickets deposit eggs in sand contaminated with chemicals, is noteworthy because the method of exposure is environmentally based. Further, Walton and O'Neill have shown the assay to be sensitive to very low concentrations of teratogens. The structural abnormalities seen in treated insects, typically the formation of extra compound eyes and/or heads, are easy to detect. Some of the abnormal insects have been reared to the adult stage, and a small percentage of abnormal insects were found among their offspring. These findings are interesting and important because synthetic fuels from coal yield positive results in the assay, but the natural petroleum products tested thus far do not produce extra eyes or heads. (This is also what generally has been observed in the Biology Division using mice, bacterial cells in culture, and other test systems.) The de novo generation of extra body parts after treatment of crickets is a phenomenon previously unreported for insects. Apart from its practical application as an assay, this opens new possibilities for looking experimentally at more basic questions of cellular determination and differentiation during embryonic development.

Dan Schuresko checks the operation of the fluorescence spill spotter used to detect surface contamination by trace amounts of polynuclear aromatic hydrocarbons from coal liquefaction plants.

Peter Mazur works in his lab where pioneering studies have been done on preserving cells, organs, and embryos of animals by freezing. (See back cover.)

Paleoecology and Climate

The addition of excess carbon dioxide to the atmosphere through burning of fossil fuels is expected to induce changes in the global climate within the next 50 years. Some models predict global warming of at least 2°C, which would affect the distribution and abundance of commercially important trees as well as displace crop and natural vegetation zones. Our best analogs for predicting future biotic responses are the climate changes that took place during warm periods of the Quaternary, 5000 to 8000 years ago. The techniques available for interpreting past climatic and vegetational change include (1) variance in the widths of tree rings, (2) chronological changes in the abundances of fossil pollen grains extracted from lake sediments, and (3) simulation modeling of forest succession under hypothesized changes.


Ed Hagaman uses the nuclear magnetic resonance device to learn the detailed structure of coal molecules before and after reactions that break down the complex polymers into simpler chemical units. This work provides a basic scientific understanding of synfuels processes such as coal liquefaction.

In the Environmental Sciences Division, Hazel Delcourt and Allen Solomon have coupled the use of fossil-pollen chronologies with computer modeling in order to reconstruct quantitatively the vegetation changes in the region over the past 16,000 years. They have determined changing frequencies of tree species during late-glacial and postglacial times from radiocarbon-dated sediment samples taken from Anderson Pond in Tennessee's White County and several ponds in Alabama. The results show that during the mid-postglacial period of climatic warming, vegetative composition changed from species-rich mixed hardwood forest to dry oak-hickory forest. Their model simulations can reconstruct characteristics of vegetation which would not be interpreted from pollen evidence alone, including the dynamics of insect-pollinated tree species and detailed structural characteristics of ecological communities. As a consequence of climatic warming, the total biomass and thus carbon storage of Middle Tennessee forests were simulated to have been lower 5000 to 8000 years ago than they are now. Today, forests of this region might provide more of a buffer against induced climate change because of their greater CO2-absorbing capacity. Working with T. J. Blasing, Delcourt and Solomon are performing broader regional analysis of paleoecologic evidence which should improve predictions of future CO2-induced climatic and vegetational changes.


Radiopharmaceutical Development

Although still in the preclinical stage, a new class of radioactive imaging agents, developed by the Nuclear Medicine Technology group led by Russ Knapp of the Health and Safety Research Division, shows promise for improved medical diagnosis of heart disease, adrenal disorders, strokes, and brain tumors. The agents, containing isotopes of tellurium and tin, are superior to those currently in use in their ability to concentrate and be detected in target organs or tissues. One of the most important applications of nuclear medicine this year was achieved by Knapp, Kathleen Ambrose, Thomas Butler, and James Hoeschele in the field of heart imaging. These researchers were able to tag fatty acid compounds with tellurium, which not only concentrates in damaged tissue at different levels than in normal tissue but also combines chemically with the tissue, a metabolic trapping which gives clearer images using less radioactivity. Knapp and associates have also developed tin-labeled steroids, the first tissue-specific tin-labeled radiopharmaceuticals ever produced, which show pronounced adrenal uptake in experimental animals. Finally, a newer class of radiopharmaceuticals synthesized at ORNL are tellurium-labeled barbiturates, which could be used as noninvasive agents to detect and locate strokes, brain tumors, and other abnormalities characterized by changes in blood flow. These are at the stage of animal testing.

While the agents just discussed will make significant medical contributions in the future, an important development was achieved this year in supplying the isotope that is in greatest current demand for diagnosing heart disease. That isotope is 203Tl, which is used to produce 201Tl for heart scans. The ability of the isotope enrichment program, part of Chemical Technology, to respond to this need was enhanced by development of a different charge material, thallous sulfide, used in the calutron ion source. This was done by two groups led by Bill Bell and Joe Tracy. The thallous sulfide has vapor pressure characteristics which permit considerably better control of the ion source and, therefore, of beam current and focus. The result is that almost 3 kg of 203Tl enriched to greater than 95% was made available, which translates into doses sufficient for four to five million heart scans, a remarkable contribution to health-care delivery.


Improved Bioassays

Assessment of human health risks from inhalation of chemical substances has been limited to experiments in animals and to the study of exposed worker populations. Investigation of chemical effects on the critical target tissue, human respiratory epithelium, under controlled laboratory conditions has not been possible because methods did not exist to maintain human epithelial cells in the laboratory. Now, in an important advance, Margaret Terzaghi and Andre Klein-Szanto of the Biology Division have developed a method not only of maintaining normal epithelial cells but also of propagating them so that virtually unlimited quantities can be made available. Their technique consists of repopulating denuded and transplanted rodent tracheas with human bronchial cells taken from autopsy material. Once the human epithelium is established in vivo, the epithelial cells can again be isolated and used to reseed a new series of tracheas. This is intended to continuously expand the human cell population originally inoculated, a technique expected to receive widespread application in commercial testing laboratories.

Barbara Walton, insect toxicologist, examines an abnormal cricket in her lab. As an embryo, the cricket was exposed to a coal-derived chemical. Walton found that some crickets hatched from eggs in sand contaminated with toxic chemicals were born with three eyes or two heads.

Sodium Boiling in LMFBRs

I turn now to nuclear safety research, fusion, and coal liquefaction.

During normal operation, a liquid-metal fast breeder reactor operates at temperatures well below those necessary for sodium boiling, but during some hypothesized off-normal conditions, boiling might occur. From the predictions of one-dimensional boiling codes, it was thought that any boiling would lead to immediate dryout (removal of the liquid film from the fuel-pin surface), clad failure, and core damage. Hence, an early safety requirement was that no boiling be allowed in the core.

To investigate this premise, John Wantland and Bill Montgomery of the Engineering Technology Division conducted extensive testing of sodium boiling in the Thermal Hydraulic Out-of-Reactor Safety (THORS) facility, an engineering-scale sodium loop which uses electrically heated test bundles to simulate segments of LMFBR core subassemblies. They have recently completed boiling tests with a 61-pin bundle, which more closely approximates a reactor core subassembly of 217 pins. As in previous tests with a 19-pin bundle, dryout did not occur unless and until the boiling region covered the entire transverse area of the bundle. Then, due to the increased pressure drop caused by the vapor blanket across the bundle, the inlet flow decreased dramatically. Skip Dearing, Al Levin, and Simon Rose have reproduced these flow decreases analytically, using an ORNL-modified version of the SABRE subchannel code, and have achieved excellent agreement with the experiment. Extrapolating with this code to a full-size 217-pin subassembly for low-flow, forced-convection boiling, they show that a fairly long time of 5 to 10 s elapses between boiling initiation and dryout. These results are very important for LMFBR safety design and now are being used both at Westinghouse and Argonne National Laboratory.

This image of a rat heart and liver is particularly sharp because the tissue contains concentrations of fatty acid compounds tagged with radioactive tellurium, which combines chemically with the tissue. This metabolic trapping gives clearer images using less radioactivity.

Andre Klein-Szanto and Margaret Terzaghi work in their lab where they developed a method of maintaining and propagating human bronchial cells (from autopsy material) that can be grown on rodent tracheas for tests of the chemical effects of inhaled substances on the human respiratory epithelium. Previously, no method existed for maintaining human epithelial cells in the laboratory.

I have already mentioned a finding from the TMI accident that may profoundly influence regulatory guidelines. TMI also fueled a growing concern over the lack of reactor-core heat-transfer data applicable to various types of accidents whose severity depends in large part on how well the reactor's decay heat is transferred to the coolant. Heat-transfer data that allow prediction of accident consequences and help identify ways to reduce their severity are difficult to obtain, however, because of the typically severe pressures, temperatures, and flow rates involved. Further, the geometry of the reactor core, with bundles of closely spaced rods, makes precise measurements difficult. As a result, analysis often has had to employ correlations developed from data taken inside hollow tubes and then applied beyond the range of data on which they are based.

Doubt over the suitability of this practice led NRC to request that ORNL conduct experiments with the Thermal Hydraulic Test Facility to obtain the needed rod-bundle heat-transfer data. THTF, a full-scale model of a portion of a reactor core which, like THORS, uses highly instrumented electrically heated rods, was NRC's only facility capable of achieving the desired information based on the true geometry and conditions of reactor accidents. Early this year, Bill Craddick and Tom Anklam placed primary emphasis on fulfilling NRC's urgent request for data applicable to small-break loss-of-coolant accidents typical of TMI conditions. More than 63 million data points generated during 1980 will be among the most important resources in NRC's selection of appropriate models for analysis of small-break loss-of-coolant accidents and operational upsets. In meeting NRC's needs, Craddick and Anklam have also produced data of considerable interest to reactor vendors and academic researchers.

High Beta Studies on ISX-B

In fusion research, the main objective of our tokamak experimental program with the Impurity Study Experiment device (ISX-B) is to study the equilibrium, stability, and confinement characteristics of what are called high-beta plasmas. Beta is the ratio of the kinetic pressure of the hot plasma to the pressure exerted by the confining magnetic field. Thus, ultimately, it will be a key economic parameter for power-producing fusion reactors, which require a volume-average beta of 4 to 6%. With ISX-B, a group led by John Sheffield and Masanori Murakami has obtained high plasma pressures by using powerful neutral-beam injectors developed at ORNL. This year, Hal Haselton, Walt Gardner, and their colleagues have increased the injection capability to 3 MW at an energy of 45 kV by upgrading the neutral-beam system to accommodate two new 100-amp ion sources. With the injection of up to 2.5 MW into ISX-B, they have achieved beta values of 3%.
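For readers who want the definition in symbols: beta compares the plasma kinetic pressure, n*k*T summed over electrons and ions, with the magnetic pressure B²/(2μ0). The short sketch below evaluates it for assumed, merely ISX-B-scale parameters, not measured machine data.

    # Volume-average beta as defined in the text: plasma kinetic pressure
    # over magnetic pressure B^2 / (2 * mu0). Inputs are assumed,
    # illustrative plasma parameters, not measured ISX-B values.
    MU0 = 4e-7 * 3.141592653589793  # vacuum permeability, T*m/A
    EV = 1.602176634e-19            # joules per electron volt

    def beta(density_m3, temperature_ev, b_tesla):
        """Kinetic pressure (electrons + ions at equal n and T) over magnetic pressure."""
        kinetic_pressure = 2 * density_m3 * temperature_ev * EV  # Pa
        magnetic_pressure = b_tesla ** 2 / (2 * MU0)             # Pa
        return kinetic_pressure / magnetic_pressure

    # e.g., n = 5e19 m^-3, T = 700 eV, B = 1.0 T gives beta of a few percent.
    print(f"beta = {beta(5e19, 700, 1.0):.1%}")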

This is encouraging not only because values approaching those required in fusion reactors have been attained, but also because they exceed the limit that had earlier been predicted from ideal magnetohydrodynamic stability theory. However, the rate of increase in beta with increasing injected power appears to be reduced at the higher levels. This apparently reflects a degradation of the plasma energy confinement. In the near term, experiments will concentrate on trying to understand and overcome this apparent saturation of beta. Preliminary injection experiments in elongated plasmas are encouraging in this regard. Although the operational parameters were not optimized, the beta was as high as that seen in a circular plasma, while the energy confinement time was significantly better. And even were beta to be saturated, it apparently is high enough to be reactor grade.

This bundle of instrumented, electrically heated rods is used in the Thermal Hydraulic Test Facility to gather data on heat transfer during simulated reactor accident conditions, including those typical of small-break, loss-of-coolant accidents such as the one that occurred in March 1979 at Three Mile Island.

Electrician L. M. Black examines a rod assembly that has been used in the THORS facility for tests of sodium boiling in a simulated LMFBR environment.

Gyrotron Development

In the Elmo Bumpy Torus program, you have read much this year about plans for the next major phase, construction of the EBT proof-of-principle experiment to be built in Oak Ridge. Dick Colchin and his associates have also enjoyed an important development success with the microwave sources called "gyrotrons," which supply the plasma heating for EBT. Just as with the successful application of ohmic and neutral-beam heating to tokamaks, another pioneering ORNL effort has been in the application of microwaves to electron cyclotron heating. In EBT, not only is the steady-state plasma heated by microwaves, but also the confinement stability is provided by an intensely hot electron ring plasma created uniquely by this heating method.

The gyrotron for the EBT-S before installation.

Electron cyclotron heating is accomplished by coupling microwave power to plasma electrons at their gyro-resonance. In order to penetrate the dense plasma of fusion interest, very high microwave frequencies and correspondingly high magnetic fields are required. Because hundreds of kilowatts or perhaps even several megawatts of power are required, this method requires multikilowatt microwave tubes operating in the millimeter wavelength region of the spectrum. Since the power output of conventional microwave tube structures is severely limited in this range, a new technique was required.
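The gyro-resonance condition pins down the field-frequency tradeoff: the electron cyclotron frequency is f = eB/(2πm_e), roughly 28 GHz per tesla. A minimal sketch of the arithmetic, using only fundamental constants and the fundamental (first-harmonic) resonance:

    # Electron cyclotron resonance: f_ce = e * B / (2 * pi * m_e). This ties
    # the 28-GHz gyrotron frequency to the magnetic field needed at resonance.
    import math

    E_CHARGE = 1.602176634e-19     # electron charge, C
    M_ELECTRON = 9.1093837015e-31  # electron mass, kg

    def cyclotron_frequency_ghz(b_tesla):
        """Electron gyro-frequency in GHz for a given magnetic field."""
        return E_CHARGE * b_tesla / (2 * math.pi * M_ELECTRON) / 1e9

    def field_for_frequency(f_ghz):
        """Magnetic field (T) whose electron gyro-frequency equals f_ghz."""
        return f_ghz * 1e9 * 2 * math.pi * M_ELECTRON / E_CHARGE

    print(f"28 GHz resonance needs B = {field_for_frequency(28.0):.2f} T")  # ~1.0 T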

Our development program, begun in 1975, identified the gyrotron, a form of cyclotron resonance maser that uses fast-wave interactions, as the most promising device for high-power, short-wavelength applications. (Like the tokamak, this is a Russian invention.) Subsequently, we entered into a subcontract with Varian Associates to produce a 28-GHz tube with 200-kW capability. The effort culminated this year with the development of a gyrotron that delivers over 200 kW of steady-state power at efficiencies of about 50%. By the end of the year, 128 kW of power had been successfully put into EBT. Both plasma density and temperature are found to scale upward with increasing power, opening new plasma parameter regimes for research.

Coil Winding Test Experiment

We also have a substantial development success to report in the area of superconducting magnets. The Coil Winding Test Experiment magnet, developed under the guidance of Martin Lubell in the Fusion Energy Division, represents an impressive advance in the state of the art of superconductivity. It produces a peak field of 8.6 T in the working bore and is the largest superconducting magnet of this field strength made to date. During the initial test in September, it performed to full design current without training or degradation. Owing to its feature of radial access, a linear dimension of about 0.5 m at a working field of 8 T is available for tests of large conductors. In addition, the magnet may be mounted with its central axis either vertical or horizontal for flexibility in experiments. It will be used for testing large, high-current superconductors being developed for fusion machines. The magnet construction consists of both layer and pancake windings to test a design feature which will be used in the EBT-P superconducting magnets to help cancel unwanted field components. The CWTX magnet also is designed to accept a niobium-tin insert split coil which will increase the field capability to approximately 12 T. This combined system is to be tested early in 1981.

Physical Properties of Coal Liquids

This gyrotron on the ELMO Bumpy Torus (EBT-S) experiment is a source of microwaves for heating the electrons in the plasma. The hot electron rings created by microwave heating provide confinement stability to the plasma.

The last technical highlight comes from Bill Rodgers' fossil program in the Chemical Technology Division, where efforts are under way to provide the physical and rheological data for coal-derived liquids needed by the designers of large synfuels demonstration plants. A small-scale, continuous-flow coal liquefaction system, called the Coal Liquids Flow System, is equipped to measure physical properties at process conditions up to 31 MPa and 810 K. Operated by George Oswald, Doug Lee, and Randy Gibson, it has provided the best measurements to date of the dramatic changes that occur as the feed slurry, composed of powdered coal and process solvent, is heated to operating temperature. The observed, increasingly non-Newtonian flow characteristics have been modeled to allow much more accurate piping and heat exchanger designs. Measurements also define for the first time a temperature range of increased viscosity, known as the gelation region, which occurs as the slurry is heated through the temperature range of 590 to 670 K at 13.9 MPa. This is thought to be a region where swelling of the coal occurs. In addition to viscosity and density, thermal conductivity and heat capacity data soon will be available for coal-derived liquids and slurries under process conditions. Although similar data for pure materials and petroleum are readily available, until now none has existed for coal-derived liquids. And few other facilities have attempted to make these difficult measurements under process conditions.
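The text does not name the rheological model ORNL actually fitted; one standard description of such shear-dependent slurry behavior, shown here purely as an illustration with made-up coefficients, is the power-law fluid, whose apparent viscosity is mu = K * gamma_dot**(n - 1), shear-thinning when n < 1.

    # One common way to model non-Newtonian slurry flow (an assumption here,
    # not ORNL's published correlation) is the power-law fluid:
    # shear stress tau = K * gamma_dot**n, so apparent viscosity is
    # mu_app = K * gamma_dot**(n - 1); n < 1 means shear-thinning behavior.
    def apparent_viscosity(gamma_dot, K, n):
        """Apparent viscosity (Pa*s) of a power-law fluid at shear rate gamma_dot (1/s)."""
        return K * gamma_dot ** (n - 1)

    # Illustrative coefficients only: a shear-thinning slurry thins as flow speeds up.
    K, n = 2.0, 0.6
    for gamma_dot in (1.0, 10.0, 100.0):
        mu = apparent_viscosity(gamma_dot, K, n)
        print(f"gamma_dot = {gamma_dot:6.1f} 1/s -> mu = {mu:.3f} Pa*s")

Fitting K and n to measured data at each temperature is what lets designers size piping and heat exchangers for a fluid whose effective viscosity changes with flow rate.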

SUMMARY AND OUTLOOK

In summarizing the state of the laboratory, I think that our institutional health and well-being is excellent, as the quality of the work I have cited testifies. We continue to be in the right place at the right time doing the right things, and doing them with ingenuity, skill, and dedication. Among line-item projects, the new budget provides for two important initiatives. One is the finish of design but delay of construction on the long-awaited Energy Systems Research Laboratory, and the other is initial authorization for design and construction of the $17 million High Temperature Materials Laboratory, to be located in the parking area across the creek and south of the 4500S-4508 complex. In this latter initiative, the Laboratory again will be the custodian of an important national resource. HTML will provide the nation with a much needed laboratory designed for basic and applied research on materials at elevated temperatures. An advisory committee consisting of five university and two industry representatives already is developing plans for its operation as a user facility open to the technical community. About $3 million of the total cost will be used to acquire expensive and unique equipment needed in the specialized areas of high-temperature materials synthesis, preparation, and characterization.

Herb DeArmond, left, works with Donald James in making adjustments in the Coil Winding Test Experiment (CWTX) magnet, which produces a peak field of 8.6 T and is the largest superconducting magnet of this field strength made to date.

This has been a year when the Laboratory's evolution as a user center has advanced very rapidly. Within a three-month span we dedicated the National Environmental Research Park, the Holifield Heavy Ion Research Facility, and a key new instrument of the National Center for Small-Angle Scattering Research. Each is indicative of past and now anticipated leadership roles for the Laboratory.

Several other important new initiatives are on the horizon and have been discussed in some detail in the ORNL Institutional Plan, FY 1981-FY 1986, published in December, as well as in the current edition of the staff version, Trends and Balances. Several of these recently proposed initiatives are deserving of mention as we look ahead.


One is a program to be called Applied Science/Technology Base. An intensive examination of the R&D base by DOE and the laboratories over the past two years revealed some serious shortcomings, particularly in the applied science research needed to develop a healthy, competitive set of technology options. Looking at these needs in relation to conservation, fossil, fission, and fusion technologies, we believe that we can make significant contributions in areas such as multiphase flow, instrument development, separation science, corrosion, waste fixation, systems modeling, and, especially, development of materials for service in hostile environments.

Resource recovery from unconventional sources has become important in light of potential shortages of a number of strategic materials, especially metals critical to energy production or conservation. Initially our emphasis will be on recovering useful metals from the solid-waste residues of energy processes (e.g., coal combustion fly ash, coal conversion solids, mill tailings, and oil shale residues). We also expect to examine extraction processes for low-grade ores, recovery of fuel values, multiple-products recovery, and conservation. Dilute aqueous sources will also be investigated.

Initiatives in light-water-reactor safety and technology will address problems that affect plant availability and productivity as well as upgrading accident prevention technology (e.g., through improved control-room procedures and diagnostic techniques).

The area of biotechnology offers great promise for both near- and long-term production of fuels as well as abatement of pollution and resource recovery. Building on past strengths, we expect to focus on development of efficient processes for fuel production from biomass, development of efficient waste-treatment techniques, resource recovery, and measurement of biologically important constituents in complex mixtures.

Recently, we have been developing the concept of a utility-oriented coal refinery, applicable to either eastern or western coals, in which the pyrolysis process, either by itself or coupled with coal beneficiation and waste heat recovery, is used as a pretreatment for coal destined for use as a clean boiler fuel, with the bonus of significant by-product liquids and gases. Interest in this approach by EPRI and several utilities is strong and will lead this year both to a joint study of techno-economic issues and to laboratory exploration of combustion properties.

In another initiative this year, ORNL has assisted in the formation of the national Society for Risk Analysis and an attendant professional journal. As an extension of past work on site-specific, regional, and generic assessments of the risks associated with energy technologies, we see the need to develop and apply comprehensive principles of risk assessment that can gain both scientific and policy acceptance. Clearly, practical issues in risk assessment, such as risk apportionment, comparative risks, and risk-benefit tradeoffs, must receive more attention.

Artist's conception of the High Temperature Materials Laboratory.

Our planning in these and related areas has been stimulated greatly this year by a comprehensive energy policy study called, temporarily, the Oak Ridge National Energy Perspective (ONEP), now being prepared for publication. This study, led by the Office of Program Planning and Analysis, deals with both national and world energy issues and provides the strategic framework through which we can focus our future efforts more effectively. Its conclusions not only are an aid to planning but also are comforting in the optimism they project for the country's energy future if we act aggressively now to reduce dependence on imported oil.

In conclusion, let me express again to all of the ORNL staff my appreciation for their dedicated work and outstanding contributions. They are the source, each year, of what is truly an impressive catalog of accomplishments spanning a broad range of the sciences and the technologies and representing progress both on fundamental issues of importance and practical problems on which our energy future depends. To me, this record establishes ORNL as the premier institution of its type and one whose promise is as bright as its record.


It has been written that not everyone loves a river, but every river has someone to love it. Threaten to dam, channel, divert, dredge, or otherwise modify a stream or river, and people come out of the woodwork ready to defend it. Jerry Elwood's interest in and fascination for rivers and streams were fostered during his childhood in Montana, a state blessed with an abundance of pristine trout streams. Says he, "I loved being around rivers and 'cricks,' listening to their roar, feeling their power while I wade through rapids to reach a favorite fishing spot, and knowing that they are free to flow and do their work as nature intended." Elwood came to ORNL in 1968 upon earning his Ph.D. in aquatic ecology from the University of Minnesota. His principal research interests are the ecology of streams and rivers, nutrient cycling in aquatic ecosystems, and the behavior of environmental contaminants in aquatic food chains. He is leader of the Stream Studies Project in the aquatic ecology section of ORNL's Environmental Sciences Division and is one of the principal investigators on the material spiraling in stream ecosystems project.

Spiraling Down the Watershed
How Flowing-Water Ecosystems Cycle Nutrients

By JERRY ELWOOD

Ecologists see ecosystems as energy processors. All ecosystems require both energy inputs and an array of elements for their maintenance and growth. In aquatic systems such as lakes, oceans, and rivers, the rate at which energy is fixed by photosynthetic organisms and metabolized and stored by other, nonphotosynthetic organisms is ordinarily regulated by sunlight or the supply of available nutrients. Because nutrient supply is so crucial to ecosystem persistence and productivity, a substantial amount of ecological research has focused on the mechanisms, pathways, and rates of nutrient cycling in various ecosystems.

In an essay published forty years ago, Aldo Leopold, naturalist and conservationist, suggested that as nutrients cycle through ecosystems they move downhill. He expressed the view that all land represents a downhill flow of nutrients, and he stressed the importance of biological organisms and food chains in slowing the inexorable movement of nutrients from mountains to the sea. Leopold suggested that if this downhill flow were not slowed, plants and animals would be forced to follow their nutrient salts to the sea in order to survive.

To estimate the pathways and exchange fluxes of a nutrient through a stream ecosystem, a tracer of the nutrient is released to stream water. Webb Van Winkle, Pat Mulholland, Jerry Elwood, and Denis Newbold (left to right) are shown here collecting samples from Walker Branch following the release of a tracer.

Artificial streams are being used to measure the role of various ecosystem components in controlling the uptake, turnover, and transport of nutrients in streams. Denis Newbold, Jerry Elwood, and Paul Singley (left to right) sample one of these artificial streams during a tracer release. The streams are made in modules so that the length can be varied from 10 to 80 m.

As may be expected, the downhill flow of nutrients is more pronounced in streams and rivers than in other ecosystems because of the continuous, unidirectional flow of water that transports nutrients in dissolved and particulate forms. The importance of retarding this downhill movement is thus correspondingly greater for these flowing water ecosystems. One can argue that because slowing this downhill flow of nutrients is so important to the continued existence of the ecosystem, mechanisms for reducing it should be more apparent in streams and rivers than in ecosystems where the downhill flow is less. The questions that need answering for a clear understanding of nutrient retention in flowing water ecosystems are: How efficiently do streams and rivers use the nutrients that enter them from the surrounding watershed before these elements are lost through downstream transport, and what are the biological mechanisms that increase this efficiency and thereby retard the downstream flow of nutrients to the sea?

Limiting Nutrients

Under two successive three-year grants from the National Science Foundation's Ecosystem Studies Program, a research group in the Environmental Sciences Division is investigating the retention and cycling of nutrients in stream ecosystems. Members of the group working with me on this project are Denis Newbold, Bob O'Neill, Webb Van Winkle, Pat Mulholland, Bob Stark, Paul Singley, and Carole Hom. What we are trying to do is quantify the cycling of inorganic phosphorus and nitrogen, both of which enter streams via groundwater from the surrounding watershed, and also identify the major biotic and abiotic mechanisms controlling the retention and cycling of these nutrients. We chose phosphorus and nitrogen for study because one or the other of these elements is generally found to be in "limiting" supply in most aquatic ecosystems. A nutrient is limiting if an increase in the supply of that nutrient increases the rate at which energy is processed in the ecosystem.

Diagrams compare the cycling of phosphorus in a lake or pond with the spiraling of phosphorus in a stream; in each, the legend distinguishes PO4 in solution in the water from phosphorus associated with particulate materials (e.g., algae and bacteria).

Nutrient cycling is different in a standing water environment (e.g., lake or pond) from that in a flowing water environment (e.g., stream or river), as can be seen in the path traveled by a nutrient atom in passing through a cycle. For lakes, the cycle is viewed as occurring in place, with no displacement of the nutrient atom during its passage through the cycle. The continuous, unidirectional flow of water in streams, however, results in a downstream displacement of the nutrient as it passes through the cycle. The cycle in a stream, therefore, can be seen as a spiral.

Because of the importance of nutrient supply to the stability and productivity of an ecosystem, one would expect the available supply of a limiting nutrient in a stream to be used more efficiently than a nonlimiting nutrient. Biotic processes, thus, should play a major role in the retention and cycling of a limiting nutrient in streams. Further, biotic adaptations which enhance the utilization efficiency of a limiting nutrient should be comparatively more important to the ecosystem than those for nonlimiting nutrients, and the cycling of a limiting nutrient would be expected to be more sensitive to perturbations than that of a nonlimiting nutrient.

Measures of the efficiency with which ecosystems use nutrients are typically based on rates of nutrient cycling relative to the available supply. In general, a nutrient cycle is complete when a nutrient atom returns to a previously visited component of the ecosystem. For most aquatic and terrestrial ecosystems, it is generally assumed that the cycling occurs in place, that is, that there is no downhill displacement of the nutrient atom in completing the cycle. To apply the concept of nutrient cycling to a stream, however, where the continuous, unidirectional flow of water moves the nutrient atom downstream in its passage through the cycle, requires redefinition of our concept of cycling.

Spiraling

The concept must be redefined for flowing water ecosystems to incorporate the idea that recycling of nutrients is coupled with downstream transport. For example, when a phosphorus atom enters a stream in groundwater or is released from an organism growing on the stream bottom, it moves downstream in the dissolved inorganic form and becomes available either for biological uptake by a variety of microorganisms such as algae, bacteria, or fungi growing on the stream bottom, or for sorption onto stream sediments. Once taken up by a microorganism or sorbed by sediments, the phosphorus atom can (1) be released directly back to water, (2) be absorbed by an animal which feeds on microorganisms, or (3) remain associated with the microbial cell or sediment particle, where it is subject to further downstream transport by sloughing of algae or scouring and resuspension of sediments. The cycle is complete when the phosphorus atom returns to stream water in the dissolved inorganic form. Because nutrient cycling in streams involves some downstream transport before the cycle is completed, one can visualize the path traveled by a nutrient atom as a spiral instead of a closed cycle.

In preparation for measuring the 15N:14N ratio of organic nitrogen in organisms and sediments from Walker Branch collected after the addition of 15N-enriched ammonium sulfate to this stream, Paul Singley is distilling the nitrogen, which had previously been converted to ammonia. The ammonium borate resulting from this distillation is then concentrated in a solid crystalline form and introduced to a conversion unit attached to a mass spectrometer, where it is converted to nitrogen gas for the isotopic analysis.

In quantifying the cycling of a nutrient in a stream, it is thus necessary to consider not only the time required to complete the cycle but also the downstream distance traveled by the nutrient atom completing the cycle. We refer to this cycling distance of a nutrient atom as the spiraling length.

Snails have been found to be the principal grazers in the streams on Walker Branch Watershed. Here, Elwood and Webb Van Winkle remove dead snails from the artificial streams.

For a conservative nutrient (i.e., one without a gaseous phase) such as phosphorus, the spiraling length consists of the average travel distance of the nutrient in solution before it is taken up by organisms or sediments on the stream bottom, plus the average travel distance of the nutrient in the particulate phase before it is released back into solution in the original inorganic form. We refer to the distance a nutrient travels in solution before being taken up from stream water as the uptake length. The downstream travel distance of a nutrient associated with particulates before returning to solution is referred to as the turnover length. Spiraling length, uptake length, and turnover length are measures of the efficiency with which stream ecosystems use the nutrients supplied from the surrounding watershed. Uptake length is a measure of the stream's efficiency in removing nutrients from water, while the turnover length is a measure of the stream's ability to recycle nutrients from the particulate phase back to water, where they are again available for uptake, and to retard the downstream flow of nutrients associated with particles. The shorter the spiraling length, the more times a nutrient atom is recycled within a given reach of stream.

Phosphorus

Estimating the spiraling length of a nutrient in a stream requires using a tracer of the nutrient to measure the uptake rates from water and the return rate from particulates back to water. We have done this for phosphorus by releasing, over a one-hour period, a solution containing radioactive inorganic phosphorus (32PO4) to Walker Branch, a small woodland stream in the Oak Ridge National Environmental Research Park. We then measured the loss of the 32P from organisms and sediments on the stream bottom.

Based on a release of 32P to Walker Branch in July and continuous sampling of the stream over a two-month period after the tracer release, we estimated the spiraling length of inorganic phosphorus during this period to be 187 m. In other words, an atom of phosphorus which enters stream water as dissolved inorganic phosphate (PO4) travels an average distance of 187 m downstream in passing completely through one cycle. Of this total cycling distance, the dissolved phosphorus travels an average of 164 m before it is taken up from stream water. The phosphorus atom then travels 23 m downstream in association with predominantly fine particles before it returns to stream water. At this point, a new cycle of the phosphate atom begins.


Denis Newbold collects and simultaneously filters water from a small aquarium containing suspended sediments and stream water spiked with 32PO4 to measure the uptake rate of phosphorus onto the suspended particles. To simulate conditions similar to those in Walker Branch, the growth chamber is maintained at a constant temperature equal to that of the ambient stream water, and the water in the aquarium is stirred continuously to cause turbulent mixing similar to that in a natural stream.

Expressed in terms of a cycling time, as opposed to a cycling distance, a phosphorus atom takes an average of 16 d to complete one cycle in the stream in July. Of this total time required to complete a cycle, the atom, in the form of phosphate-phosphorus, spends an average of only 75 min in solution in stream water and more than 15 d in its particulate form. The travel distance of phosphorus in water (uptake length) is substantially greater than the average travel distance of phosphorus associated with particles (turnover length), despite the fact that a phosphorus atom is in solution an average of only 75 min, because the velocity with which phosphorus moves downstream in solution is far greater than the average downstream velocity of phosphorus associated with particles.
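Those figures can be combined into a quick consistency check. The sketch below is ours, not part of the original study; it uses only the July numbers quoted above, and it makes explicit why 75 min in solution still accounts for most of the travel distance: the dissolved phase moves roughly two thousand times faster than the particulate phase.

```python
# Back-of-the-envelope check of the July Walker Branch figures quoted in the
# text. All numbers come from the article; the variable names are ours.

uptake_length_m = 164.0              # travel in solution before uptake
solution_time_min = 75.0             # average residence time in solution

turnover_length_m = 23.0             # travel on particles before release
particle_time_min = 15 * 24 * 60.0   # roughly 15 d, expressed in minutes

spiraling_length_m = uptake_length_m + turnover_length_m   # 187 m total
water_velocity = uptake_length_m / solution_time_min       # ~2.2 m/min
particle_velocity = turnover_length_m / particle_time_min  # ~0.001 m/min

print(f"spiraling length: {spiraling_length_m:.0f} m")
print(f"dissolved-phase velocity:   {water_velocity:.2f} m/min")
print(f"particulate-phase velocity: {particle_velocity:.4f} m/min")
```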

Measurements of phosphorus spiraling in Walker Branch at other times of the year show that there is considerable seasonal variation in the length of the cycle. In the spring and summer, for example, the uptake length for phosphorus is over 100 m, ranging from 105 to 164 m. In contrast, during late autumn and winter, when the stream has large amounts of decomposing leaves from streamside vegetation, the uptake length of dissolved inorganic phosphorus is less than 10 m. These seasonal differences in the efficiency of the stream ecosystem's use of dissolved phosphate may be due either to differences in the microbial uptake of this nutrient onto the decomposing leaves in the stream or to differences in the adsorption of phosphorus, resulting from variation in the available surface area of substrates on the stream bottom.

To assess the relative importance of the biological uptake of phosphorus in Walker Branch and its adsorption onto stream sediments, we have conducted laboratory experiments using 32P and sterile and nonsterile sediments. Results of these experiments indicate that approximately 90% of the loss of soluble inorganic phosphorus from stream water is a result of biological uptake by bacteria, algae, and fungi associated with sediment particles, while the other 10% loss is attributed to physical or chemical adsorption. In similar experiments, we examined the effect of microorganisms on the loss of phosphorus from stream sediments. The rate of loss of phosphorus from nonsterile sediments was found to be greater than that for sterile sediments, indicating that microorganisms speed up the release of phosphorus from particulate materials back to stream water. These results suggest that phosphorus spiraling involves predominantly biological processes and that the variations in the uptake length are the result of differences in the biological demand for this nutrient. The implications are significant because they indicate that phosphorus concentrations in unpolluted streams may be controlled primarily by microorganisms growing on the stream bottom.

Nitrogen

To measure the spiraling length of nitrogen and to identify the mechanisms controlling the spiraling of this nutrient, we conducted similar tracer experiments in Walker Branch and in artificial streams. These experiments involved the use of compounds of nitrogen, in the form of nitrate and ammonia, labeled with 15N, a naturally occurring stable isotope.


Because 15N is already present in the environment, with a natural abundance of 0.365 at. %, enough of it must be added to the stream to measurably increase the natural 15N:14N ratio in water and in organisms that take up nitrogen from water. We can then estimate the rates of nitrogen uptake and loss in ecosystem components by measuring changes in their 15N:14N ratio using a mass spectrometer.
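The ratio-to-abundance arithmetic is simple enough to show directly. A minimal sketch follows; the measured ratio below is invented for illustration, and only the 0.365 at. % natural abundance comes from the text:

```python
# Convert a measured 15N:14N ratio to atom percent 15N, then express tracer
# enrichment as atom percent excess above natural abundance.

NATURAL_15N_AT_PERCENT = 0.365          # natural abundance cited in the text

def atom_percent_15n(ratio_15n_14n):
    """Atom % 15N from the 15N:14N ratio R: 100 * R / (1 + R)."""
    return 100.0 * ratio_15n_14n / (1.0 + ratio_15n_14n)

measured_ratio = 0.0075                 # hypothetical enriched sample
sample = atom_percent_15n(measured_ratio)
excess = sample - NATURAL_15N_AT_PERCENT

print(f"sample: {sample:.3f} at. % 15N ({excess:.3f} at. % excess)")
```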

Robert Stark, a technician on the project, is a candidate for the master's degree at the University of Tennessee. For his thesis, he is comparing the uptake of ammonia-nitrogen and nitrate-nitrogen in artificial streams. In addition, he is attempting to identify the major pathways taken in the uptake of these two forms of soluble inorganic nitrogen from stream water. He has demonstrated that the uptake of ammonia-nitrogen from stream water is predominantly a biological process. The uptake length for ammonia-nitrogen in an artificial stream that had been sterilized with chlorine to stop biological uptake by algae, bacteria, and fungi in the stream bottom, for example, was over 300 m. In contrast, the uptake length of ammonia-nitrogen in an identical stream channel that had not been sterilized was only 30 m. Thus, although some nonbiological uptake of ammonia-nitrogen occurs in streams, the predominant mechanism of uptake appears to be biological absorption by algae and other microorganisms associated with stream sediments.

Pathways and Mechanisms

By studying the pathways and mechanisms of nutrient spiraling in streams, we are attempting to identify the major mechanisms that enhance the conservation and cycling of essential elements which otherwise would be unavailable to the ecosystem, and also to understand the relationships between the biological structure of stream communities and ecosystem function as reflected in the spiraling of nutrients and the processing of energy. Conservation mechanisms include those that reduce the loss of available nutrients through downstream transport or those that reduce the storage of nutrients in particulate forms that are unavailable to organisms. In our experiments, we are isolating and controlling components of stream ecosystems in artificial streams in order to study their mechanisms and determine their role in nutrient spiraling. In one such experiment, Carole Hom, also a graduate student from the University of Tennessee, will determine the effects of snails on phosphorus spiraling and primary production rates of stream algae. Snails in Walker Branch are the predominant grazers on algae and bacteria growing on rocks and dead leaves on the stream bottom. Some ecologists hypothesize that by grazing on microbial populations, animals such as snails prevent nutrients from being immobilized in senescing algae and bacteria. In other words, animals may perpetuate nutrient cycling in the ecosystem by preventing or reducing the storage of nutrients in forms that are unavailable to other organisms in the system. The suggestion is that if this did not occur, microbial populations could substantially reduce the available supply of nutrients in an ecosystem and thereby reduce its productivity.

Another technique we are using to determine the role of various components of ecosystems in nutrient spiraling is to experimentally manipulate natural streams. Using this technique in a stream at the Coweeta Hydrological Laboratory in North Carolina, we measured the effect of aquatic invertebrates on phosphorus spiraling. We first treated one stream with a pesticide to remove all the aquatic insects. We then compared the spiraling length of phosphorus in the treated stream with that in an adjacent stream that had not been treated with the pesticide. Preliminary results from this experiment showed that removal of the aquatic insects reduced the downstream transport of particles in stream water. These results suggest that the presence of insects in a stream might increase the spiraling length of phosphorus by increasing the downstream transport of nutrients in the particulate phase.

Another integral part of our studies on nutrient spiraling involves the development and use of mathematical models for analyzing nutrient dynamics in streams. Denis Newbold and Bob O'Neill have developed a series of such models which we use both to analyze the tracer dynamics in streams and to examine the relationships between ecosystem structure and nutrient spiraling. Analysis of these interactions is complicated by the fact that physiological processes of stream organisms that could shorten the spiraling length of a nutrient are coupled with other processes that tend to increase the spiraling length. The net effect of these coupled processes on spiraling length, thus, is not intuitively obvious. Insights into their possible effects can, however, be obtained by modeling these processes. Results from these simulation models show, for example, that at reasonable densities the effect of aquatic animal species feeding on decomposing leaves would be to shorten the spiraling length of nutrients, whereas species which feed on other materials in the stream would either have no effect on spiraling length or would increase it. This suggests that the species composition and abundance of animal communities in streams may have an important influence on nutrient spiraling.
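The flavor of such models can be conveyed with a far simpler stochastic sketch than anything the project actually uses: follow one atom that alternates between a dissolved phase (velocity v_w, uptake rate k_u) and a particulate phase (velocity v_p, release rate k_r). Averaged over many atoms, the expected spiraling length is v_w/k_u + v_p/k_r, so any process that raises biological uptake shortens the spiral. The rates and velocities below are invented for illustration and are not the ORNL models' parameters:

```python
import random

def one_cycle_distance(v_w, k_u, v_p, k_r):
    """Downstream distance one atom travels in a single uptake-release cycle.
    Residence times in each phase are exponential with rates k_u and k_r."""
    time_dissolved = random.expovariate(k_u)     # minutes spent in solution
    time_particulate = random.expovariate(k_r)   # minutes spent on particles
    return v_w * time_dissolved + v_p * time_particulate

# Illustrative values only: water moves fast, particles barely move.
v_w, k_u = 2.0, 0.0125     # m/min and 1/min -> uptake length 160 m
v_p, k_r = 0.001, 4.6e-5   # m/min and 1/min -> turnover length ~22 m

trials = 100_000
mean_spiral = sum(one_cycle_distance(v_w, k_u, v_p, k_r)
                  for _ in range(trials)) / trials
print(f"simulated spiraling length: {mean_spiral:.0f} m "
      f"(analytic: {v_w / k_u + v_p / k_r:.0f} m)")
```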

The concept that stream ecosystems spiral nutrients supplied from the watershed has a number of applications in analyzing relationships between the biological structure and the metabolism and productivity of streams and rivers. For example, disturbances to streams that affect biological components could adversely affect the ability of streams to spiral nutrients. This being true, measurements of the spiraling length of essential nutrients could be used as an indicator of disturbance in natural systems. Another important aspect of spiraling in streams and rivers is the effect on the loading of nutrients to lakes, reservoirs, and estuaries farther downstream. For example, spiraling may stabilize the productivity of streams by damping the extreme variations in nutrient supplies associated with fluctuating inputs from the watershed. Efficient retention and recycling of nutrients when inputs are high ensures that a stable supply of nutrients is available to the ecosystem when inputs are low. By the same token, spiraling can also be beneficial when nutrient supplies are in excess. Eutrophication of many lakes and reservoirs, for example, is a direct result of increased nutrient inputs to streams and rivers that flow into these systems. Retarding the downstream loss of these nutrients would reduce the loading rate to lakes and reservoirs into which these streams and rivers flow.

We can see, then, that the ability of streams to retain nutrients and to dampen fluctuations in supplies associated with variations in nutrient inputs from the surrounding watershed, regardless of whether the input variations are natural or man-caused, has important implications for the productivity and environmental quality of the reservoirs, lakes, and estuaries into which these streams and rivers flow.

The Continuum

Aldo Leopold, in his perception of nutrient cycling through ecosystems during their downhill movement, revealed a fundamental concept that ecologists are only now beginning to fully appreciate. Considering all types of ecosystems, streams and rivers appear to best illustrate his ideas concerning both downhill flow of nutrients and the importance of organisms and food chains in retarding this flow.

A stream or river can now be seen to be a continuum, extending from its headwaters to the sea. Such ecosystems have no upstream or downstream boundaries between their source and mouth other than those arbitrarily defined by man. Losses of nutrients and organic matter through downstream transport from one point in the continuum are inputs to another part of the same ecosystem. Perturbations that modify the inputs from upstream, then, may have important implications for processes in downstream areas dependent on these upstream inputs. To manage these flowing water ecosystems and preserve them from human impacts will require an understanding of the spatially dependent linkages among the spiraling of nutrients, the processing of energy, and the biological structure of the system.

Forty years ago, Leopold foresaw what we now realize: the need to know the relationships between disruptions or modifications of ecosystem processes and components along the continuum and the propagation or damping of their effects farther downstream.



BOOKS

John von Neumann and Norbert Wiener: From Mathematics to the Technologies of Life and Death, by Steve J. Heims, MIT Press, Cambridge (1980). 414 pages, $19.95. Reviewed by Len Gray, Mathematics and Statistics Research Section, Computer Sciences Division.

John von Neumann and Norbert Wiener are unquestionably two of the most prominent mathematicians of the twentieth century. Von Neumann's collected research papers span six volumes, and the range of subjects investigated is truly astounding: logic, set theory, foundations of quantum mechanics, ergodic theory, rings of operators (now called von Neumann algebras), continuous geometry (invented by von Neumann), design of computers and automata theory, numerical analysis, game theory, and hydrodynamics. Wiener's interests were only slightly less broad: logic, integral equations, the theory of Brownian motion (Wiener integrals), harmonic and classical analysis, prediction and filtering theory, potential theory, and cybernetics (a subject initiated by Wiener). Almost as remarkable as their scientific careers was their emergence, to a certain degree, as public figures: Wiener by his essays (autobiographical and social commentaries) and von Neumann by his leadership in the field of atomic energy after World War II.

Although von Neumann and Wiener were similar in many respects, they differed significantly in their reactions to World War II and the atomic bomb experience and, in a vague sense, on the issue of a scientist's social responsibilities. This issue, as may be discerned from his choice of subtitle, is the author's primary concern. The two men made significant achievements during the war in both mathematical theory and its applications, but Wiener later refused to participate in any military-related projects. Von Neumann, however, became a key scientific advisor to the Department of Defense and other government agencies and eventually became an Atomic Energy Commission commissioner.

Heims begins this well-researched dual biography with excellent chapters on the childhood of each man. A nontechnical description of a small but significant piece of each man's mathematical work then follows: von Neumann's theory of games and his work on the foundations of quantum mechanics, and Wiener's formulation of Brownian motion. These chapters are less successful, and the section on Wiener's work is especially puzzling. Heims concentrates on Wiener's discovery that the Brownian paths are nowhere differentiable (a tangent does not exist anywhere along the path). This finding is certainly of mathematical interest but is by no means the most important aspect of this significant work. In addition to being the example of a continuous-parameter stochastic process, Wiener's measure on the Brownian motion curves (and the related Wiener integral) is intimately related to the Feynman integral, which was devised by Feynman to provide an alternate (variational) formulation of quantum mechanics. (Wiener's discovery is now a widely used tool in physics.) As this is another connection between Wiener and von Neumann, and since Heims's background is physics, it is very surprising that this was not the focus of the chapter.

The remainder of the book deals mainly with their lives outside of mathematics proper, especially the postwar period. This is a vast area (particularly for von Neumann) that could easily be the subject of a book on each man. In addition to trying to cover a large subject area, Heims is handicapped by his own prejudices (freely admitted in the preface), which tend to view Wiener as the good guy and von Neumann as the villain. Even a reader sympathetic to Heims's philosophical position is likely to be annoyed by the constant bias and, in some instances, totally unnecessary comments.

Having to shuffle back and forth between Wiener and von Neumann does not aid the flow of the narrative, and Heims's propensity for going off on philosophical tangents does not help matters. His style is not about to make him the Barbara Tuchman of the scientific world. Heims's idea, however, is an excellent one; the role of mathematics in modern society is unexplored territory, and he has chosen two extraordinary subjects. Von Neumann, a mathematical equivalent of Einstein, is virtually unknown to the general public. Unfortunately, the end result is a less-than-definitive study of mathematicians and their mathematics.

The following books in print have members of the ORNL staff as principal authors or editors:

Laser and Electron Beam Processing of Materials, edited by C. W. White and P. S. Peercy, Academic Press, New York (1980).

Nuclear Air Cleaning Handbook, by C. A. Burchsted, U.S. Department of Energy, 3rd edition (1980).

Maximum Likelihood Estimation in Small Samples, by L. R. Shenton and K. O. Bowman, Macmillan Publishing Company, New York (1977).

Selected Tables in Mathematical Statistics, Vol. IV: Dirichlet Distribution-Type 1, by Milton Sobel, V. R. R. Uppuluri, and K. Frankowski, American Mathematical Society, Providence, Rhode Island (1977).

Moving Boundary Problems: Proceedings of the Symposium and Workshop on Moving Boundary Problems, Gatlinburg, Tennessee, September 26-28, 1977, edited by D. G. Wilson, Alan D. Solomon, and Paul T. Boggs, Academic Press, New York (1978).


Carroll Johnson, now on leave for a year at the Naval Research Laboratory, where he is working in the field of artificial intelligence, is a crystallographer with ORNL's Chemistry Division. He came to ORNL in 1963, four years after earning his Ph.D. in biophysics from the Massachusetts Institute of Technology. In 1975 he took a leave of absence to study the applications of the artificial intelligence of computers to protein crystallographic problems at Stanford University and the University of California at San Diego. In 1977 Johnson served as president of the American Crystallographic Association.

Artificial Intelligence Is Coming
Applications at ORNL

By CARROLL JOHNSON

It is always interesting to observe the response of an audience introduced to the research field called artificial intelligence. The very term implies that a computer can be programmed to reason like a person. The concept may seem frightening, preposterous, or stimulating, depending upon the listener's viewpoint, and these reactions are often reflected in the comments from the audience.

The field of artificial intelligence, or AI, is an interesting amalgamation of two dissimilar disciplines, computer science and cognitive psychology. AI is concerned with understanding the logic of human reasoning and with writing computer programs which model human reasoning logic. Research in AI often leans toward one or the other of the two parent disciplines. The psychologist is concerned with discovering the mechanism of human reasoning processes, while the computer scientist is interested in understanding the logic of human problem solving and in implementing that logic through the most appropriate programming methods, without regard for psychological fidelity. My interest in AI follows the latter orientation, with a specific interest in the application of AI methods to physical science research problems.

Prominent contemporary research areas of AI include robotics, visual-scene understanding, speech understanding, natural-language text understanding, and knowledge engineering, to name just a few. We at ORNL are most interested in knowledge engineering, also called expert systems. Expert systems programs attempt to model the reasoning used by experts in such diverse activities as diagnosing medical disease, analyzing electronic circuits, and writing computer programs.

Expert systems are called rule-based systems because, in order to acquire and store the necessary knowledge and to be able to reason with it, expertise must be coded into situation-action rules (also called production rules) that take the following form:


Pedro Otaduy, Jim Mullins, and Dave Patek, from l., all of the Reactor Control Group in Instrumentation and Controls Division, confer at one of the AI meetings.

IF some set of conditions,
THEN perform some action or reach some conclusion.

To give an example from a mass spectrogram interpreter: IF high peaks are at atomic number/charge points 71, 43, and 86, and IF there is any peak at atomic number/charge point 58, THEN there must be an n-propyl-ketone-3 superstructure.
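In modern terms, a single production rule of this kind is only a few lines of code. The sketch below is ours; the peak-intensity representation and the 0.5 threshold are assumptions, not the interpreter's actual encoding:

```python
# A minimal rendering of the situation-action rule quoted above. A spectrum
# is represented as {atomic number/charge point: relative peak intensity}.

def high_peaks_at(spectrum, points, threshold=0.5):
    """True if the spectrum shows a strong peak at every listed point."""
    return all(spectrum.get(p, 0.0) > threshold for p in points)

def any_peak_at(spectrum, point):
    return spectrum.get(point, 0.0) > 0.0

def interpret(spectrum):
    conclusions = []
    # IF high peaks at points 71, 43, and 86, and IF any peak at point 58,
    # THEN there must be an n-propyl-ketone-3 superstructure.
    if high_peaks_at(spectrum, (71, 43, 86)) and any_peak_at(spectrum, 58):
        conclusions.append("n-propyl-ketone-3 superstructure")
    return conclusions

spectrum = {71: 0.9, 43: 0.8, 86: 0.7, 58: 0.1}   # illustrative data
print(interpret(spectrum))   # ['n-propyl-ketone-3 superstructure']
```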

From Stanford to ORNL

I got started in the field of AI during a year's sabbatical at Stanford University in 1975 and 1976, working with the Stanford Heuristic Programming Project. My role was that of resident crystallographer advisor for a National Science Foundation-funded project undertaken jointly by Stanford University and the University of California at San Diego. The project, CRYSALIS, involved the application of knowledge engineering AI techniques to protein crystallography, particularly for the interpretation of electron density maps. I served as "domain specialist," and my approach to problem solving was observed and modeled for input to the program. This is a common practice in expert systems projects. The role of domain expert is both an interesting and a frustrating experience. It entails, among other exercises, meticulous analysis of the often fuzzy reasoning used in solving day-to-day problems.

Following my return to ORNL, I continued to participate in the CRYSALIS project on protein density-map interpretation. My participation included interactive access to the Stanford SUMEX-AIM computer system over the TYMNET network. The concepts developed during the discussions at Stanford began to take shape here in a running computer program.

Other developments took place at Stanford as well during the intervening period. Experience with the Stanford program MYCIN, an expert system which aids in diagnosing bacterial infections, led to a generalized program called EMYCIN, which in principle could be applied to almost any consulting problem. Other generalized consulting program systems were developed at other AI centers (such as EXPERT at Rutgers University). The research trend was to determine how useful the new knowledge-engineering methods were in a variety of applications, and it was clear that we should be trying some of those methods on problems at ORNL.

University of Tennessee Professor Sara Jordan is employed halftime in the Computer Sciences Division. She is a pioneer in AI.

Karl Haeuslein, l., and Jonas Holdeman of Computer Applications in the Computer Sciences Division have supplied the necessary expertise in the computer science area.

I prepared a research prospectus memorandum outlining several possible knowledge-engineering applications, with a brief background on AI methods, and sent it to managers and researchers in several ORNL and UCC-ND divisions. The response showed that many were interested but that no one group could undertake the proposed project alone. It became clear that the best approach was to gather an interdisciplinary critical mass of individuals interested in AI for an unofficial discussion.

The Computer Sciences Division agreed to let Sara Jordan and Karl Haeuslein spend some time on the computer science aspects of the project. Further inquiry showed that the Analytical Chemistry Division was willing to let Bruce Clark and Michelle Buchanan work part time on a spectroscopy application of AI. The Chemistry Division then agreed to let me head the project, again on a part-time basis. John Allen, a physicist then consulting with the Industrial Safety and Applied Health Physics Division, was another of the founding members. Jonas Holdeman and Bob Meacham of the Computer Sciences Division and Ed Hagaman of the Chemistry Division joined the panel later. The AI Programmed Reasoning Methodology Panel met for the first time on October 2, 1979. Weekly two-hour meetings have been the schedule since then.

In the early stages of the study panel, we concentrated on developing familiarity with basic AI concepts and programs. In addition to studying published articles, we relied heavily on access to the Stanford SUMEX computer for firsthand familiarization with classic AI programs such as PARRY, a paranoid patient modeling program, and MYCIN, the bacterial infection diagnosis program mentioned earlier. We studied in detail a half dozen other expert systems on SUMEX. Much of the fundamental information concerning AI is available only as text files and programs on computers at AI centers or as technical reports issued by various AI computer science departments, so anyone starting research in the AI field without a good connection with an established AI center is at a serious disadvantage.

One of the initial projects undertaken was a tutorial lecture series for the ORNL staff. Sara Jordan and I presented the first few lectures. Professor Jordan did her Ph.D. dissertation in AI at the University of Wisconsin and has been teaching AI courses at the University of Tennessee since then. She works halftime at ORNL and was involved in the Russian language machine translation project here in the early seventies. During the first year, eleven AI lectures were sponsored by the panel, of which four were given by panel members and seven by outside lecturers. Most of the outside lectures were part of the consultant program begun in February 1980, when seed money funding became available.

Implementation

We found that several AI expert systems developed elsewhere could be implemented on the ORNL DEC System-10. The first one was the FORTRAN-based consulting system EXPERT, written at Rutgers University. EXPERT is a rule-based knowledge system designed for medical consulting. At Washington University in St. Louis, sets of situation-action production rules specific for medical specialties such as rheumatology have been written for this system. EXPERT uses the rule sets to form chains of reasoning during interactive dialog with the physician, and the system can explain to the physician how it came to any particular conclusion by displaying the rules applicable to the clinical symptoms of the patient. We found two ORNL applications of EXPERT to nonmedical problems.
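The explain-by-displaying-rules behavior is easy to picture with a toy forward-chainer that records which rules fire. The two rules below are invented for illustration and are not taken from the actual EXPERT rheumatology rule set:

```python
# A toy rule-based consultant: rules are (name, condition, conclusion) triples,
# and the trace of fired rules doubles as the "how did you conclude that?"
# explanation.

rules = [
    ("R1", lambda f: f.get("joint pain") and f.get("morning stiffness"),
     "suspect inflammatory arthritis"),
    ("R2", lambda f: f.get("suspect inflammatory arthritis")
                     and f.get("rheumatoid factor"),
     "consistent with rheumatoid arthritis"),
]

def consult(findings):
    trace = []
    changed = True
    while changed:                  # forward-chain until nothing new fires
        changed = False
        for name, condition, conclusion in rules:
            if conclusion not in findings and condition(findings):
                findings[conclusion] = True
                trace.append((name, conclusion))
                changed = True
    return trace

trace = consult({"joint pain": True, "morning stiffness": True,
                 "rheumatoid factor": True})
for name, conclusion in trace:      # display the chain of reasoning
    print(f"{name} fired -> {conclusion}")
```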

Another rule-based knowledge system operational at ORNL is the Carnegie-Mellon system OPS-5, which is implemented in MACLISP, the Massachusetts Institute of Technology dialect of the programming language LISP. A powerful programming language, OPS-5 is applicable to a broad range of AI problems. There are at least two applications of OPS-5 to problems at ORNL, and Bob Meacham can be credited with the successful implementation at ORNL of EXPERT, OPS-5, MACLISP, and other systems.

In a major program development, Karl Haeuslein is rewriting the Stanford INTERLISP-based EMYCIN system in Pascal. The Pascal version will permit the writing of consulting systems that run on mini- and microcomputers, thus making the system available to a much wider clientele. During the first year, we started four application projects in AI in the domains of spectroscopy, environmental management, nuclear fuel reprocessing, and programming assistance.

Steve Hurrell, l., and Davis Hurt, of the Engineering Technology Division, discuss applications of AI for materials control in fuel reprocessing with John Wachter, standing, who is with the Fuel Recycle Division.

Spectroscopy

The panel's main project is the spectroscopy rule-based system for the joint interpretation of infrared, nuclear magnetic resonance, and mass spectral data. The spectroscopy experts are Michelle Buchanan, Bruce Clark, Ed Hagaman, and John Allen, with Allen also coding the knowledge rules by using the system EXPERT. Currently, the rule set has about 230 rules and can identify functional groups on organic molecules containing up to 15 carbon atoms. Future plans involve implementing the spectroscopy interpretation system on a minicomputer which can be interfaced to separate IR, NMR, and mass spectral computer-controlled instruments.

The spectroscopy rule set will also be useful to check out the ORNL Pascal version of EMYCIN when that system becomes operational. In addition, there are plans for implementing the rule set on the LISP-based OPS-5 system for comparison purposes.

Spectroscopy expert Michelle Buchanan, center, has come from the Analytical Chemistry Division to help set up the spectroscopy rule set.

Spill Countermeasures

The second major application involves countermeasures for oil and hazardous chemical spills. This project is implemented in collaboration with Tom Oakes, Brian Kelly, and Chris Bird of the ORNL department of environmental management. An emergency response application such as this requires a considerably different approach from that used for the spectroscopy problem. Sara and I made a systematic analysis of the overall problem and wrote an 80-page manuscript describing the problem in detail. This extensive documentation enabled the spills problem to be used as a "mystery focus problem" at Rand Corporation's recent workshop on expert systems, where representatives of eight different expert systems worked on a common problem in a field in which none of the participants had previous experience. The systems represented were AGE, EMYCIN, and RLL from Stanford; KRL from Stanford Research Institute; EXPERT from Rutgers; HEARSAY III from Southern California; OPS-5 from Carnegie-Mellon; and ROSIE from Rand. A detailed documentation of the comparison will appear as a Rand report. Sara and I acted as the workshop's "mystery experts" on spills. The approaches used in solving the problem were very diverse, and several of the resulting systems could provide excellent starting points for an operational spills countermeasures system. The OPS-5 version started at the workshop will be the initial system implemented at ORNL.

Safeguards

The panel's third application is in the area of safeguard monitoring of the chemical processes involved in nuclear fuel reprocessing. The AI part of that project is being carried out by John McDermott and Charles Forgy from Carnegie-Mellon in collaboration with Davis Hurt, John Wachter, and their coworkers from the Oak Ridge Consolidated Fuel Reprocessing Facility. The AI project is funded by a grant from the Department of Defense's Defense Advanced Research Projects Agency (DARPA) to Carnegie-Mellon, so the panel's involvement was simply to bring the two groups together. A group at Carnegie-Mellon is currently analyzing a small subset of the entire problem, using data from a recent test run at the Barnwell reprocessing plant. The Carnegie-Mellon researchers will be spending a fair amount of time in Oak Ridge, and the panel expects to learn a great deal from watching the project develop.

JCL Advisor

The fourth application is in the programming assistance area, where we hope to see how useful a consulting AI system can be in what amounts to a tutorial application. Using the Rutgers EXPERT system on ORNL's DEC System-10, the application involves a rule set designed to aid International Business Machines users in setting up and debugging their job control language. If the system proves useful, other programming assistance aids such as a graphics program advisor and a magnetic tape advisor may result.

Other AI Applications

AI programming methods are becoming increasingly important in a wide variety of application areas. The Schlumberger Company, for example, has begun a major effort in the application of knowledge engineering to oil exploration. In addition, computer manufacturers such as IBM, Hewlett-Packard, and Texas Instruments are beginning to use AI in the computer-aided design of very large-scale integrated circuits.

The U.S. Navy is developing a Center for Applied Research in Artificial Intelligence at the Naval Research Laboratory in Washington, D.C., where I am currently a visiting scientist on temporary assignment from ORNL. One of several projects under way there is an automated message center, to be used with a strategic operation, that features machine interpretation and assessment of dispatches passing through it. The information derived will update a dynamic model of the overall situation, which in turn may update and reroute the messages. Also planned is work on an AI consultant system coupled with automatic test equipment to help technicians troubleshoot electronic equipment.

Another increasingly active area of AI research is robotics. Industrial applications of robots for tasks such as welding seem to be the only key to survival for distressed industries such as shipbuilding. There are currently over three times as many industrial robots in Japan as in the United States. The spectacular Japanese productivity is providing an industrial impetus, much like Sputnik in the fifties, which may give U.S. robotics research a long overdue boost. The current industrial robots do not incorporate particularly impressive AI reasoning, but that situation may change drastically in the near future. Some research in robotics is going on in the Instrumentation and Controls Division and at NRL. Robots are particularly useful for work in hazardous environments such as high-radiation areas (ORNL's interest) and deep-sea submarines (NRL's interest).

John Allen, a consultant at the Laboratory from ProPhysica, Inc., discusses a flow chart with CSD's Bob Meacham.

Looking Ahead

The ORNL effort in AI is an interdisciplinary exploratory project designed to evaluate the potential usefulness of AI methodology for problems at the Laboratory. A wide variety of projects was chosen to provide as broad an evaluation as possible. The preliminary results all seem very encouraging, and one cannot help but predict that eventually AI approaches will have considerable impact on computer programming at ORNL. The influence of AI on medical diagnosis, oil exploration, and a broad spectrum of other activities is beginning to emerge throughout the country. With AI techniques such as the use of heuristic reasoning and rule-based knowledge, programmers can approach unusual and complex problems which previously seemed intractable.



take a number

Dice Experiments Yielding Numbers Divisible by 11, 111, . . .

The number of dots that appears on the throw of a standard die can be 1, 2, 3, 4, 5, or 6. If 3 (dots) shows up, then its opposite side has 4 (dots); in other words, the total number of dots on the face showing and its opposite side is always equal to 7.

Suppose we throw two dice on a table, and 1 (dot) and 3 (dots) appear face up. Then write the number 1364, where 6 corresponds to the opposite face of 1 and 4 corresponds to the opposite face of 3. It is interesting to note that this number 1364 is divisible by 11: 1364 = 11 × 124. Subtract 7 from 124 to obtain 117; then divide 117 by 9 and obtain 13, which is the outcome of the original pair of dice. This is true whatever the outcome of the two dice.

The above result can be extended to three dice. Suppose we have the outcome 236; then form the number 236541, which is divisible by 111 to make 2131; subtract 7 to obtain 2124.


Note that 2124 ÷ 9 = 236, the original roll of the three dice.

These ideas can be extended to guess at the roll of n dice.
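The trick is place-value algebra. For dice showing a and b, the four-digit number is 1000a + 100b + 10(7 - a) + (7 - b) = 990a + 99b + 77 = 11(90a + 9b + 7); dividing by 11, subtracting 7, and dividing by 9 leaves 10a + b, the dice read as a two-digit number, and the same argument works for any number of dice. A short sketch (ours, not the Review's) checks every case:

```python
from itertools import product

def encode(dice):
    """Form the number: faces showing, followed by their opposite faces."""
    digits = list(dice) + [7 - d for d in dice]
    return int("".join(str(d) for d in digits))

def decode(n, num_dice):
    repunit = int("1" * num_dice)    # 11 for two dice, 111 for three, ...
    return (n // repunit - 7) // 9

for num_dice in (2, 3, 4):
    repunit = int("1" * num_dice)
    for dice in product(range(1, 7), repeat=num_dice):
        n = encode(dice)
        assert n % repunit == 0                     # divisible by 11, 111, ...
        assert decode(n, num_dice) == int("".join(map(str, dice)))
print("trick verified for 2, 3, and 4 dice")
```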

Only True for n = 4, 5, . . . .

10^10 is a large number, and 10^(10^10) is an extremely large number.

Suppose we wish to form the largest number with three twos; then it is easy to see that it is not 222, nor 2^(2^2) = 2^4 = 16, nor 22^2 = 484, but it is 2^22 = 4194304. Thus

2^22 > 22^2 > 222 > 2^(2^2).

Similarly, if we wish to form the largest number with three threes, then it is easy to see that it is not 333, nor 33^3, nor 3^(3^3), but it is 3^33. Thus

3^33 > 3^(3^3) > 33^3 > 333.

It is interesting to note that this induction stops here. If we have three fours, then

4^(4^4) > 4^44 > 44^4 > 444.

And for n ≥ 4, n^(n^n) is the largest number that can be formed with three n's.
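Exact integer arithmetic confirms the chains for each n. A short sketch (ours, not the Review's):

```python
def candidates(n):
    """The four ways to combine three copies of the digit n."""
    nn = int(f"{n}{n}")                     # two copies concatenated, e.g., 22
    return {
        f"{n}{n}{n}": int(f"{n}{n}{n}"),    # plain concatenation, e.g., 222
        f"{nn}^{n}": nn ** n,               # e.g., 22^2
        f"{n}^{nn}": n ** nn,               # e.g., 2^22
        f"{n}^({n}^{n})": n ** (n ** n),    # e.g., 2^(2^2)
    }

for n in range(2, 7):
    ranked = sorted(candidates(n).items(), key=lambda kv: kv[1], reverse=True)
    print(n, " > ".join(name for name, _ in ranked))
# The output shows n^(nn) largest for n = 2 and 3, and n^(n^n) largest
# from n = 4 onward.
```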


Emily Copenhaver is a policy analyst in the office of integrated assessments and policy analysis of ORNL's Health and Safety Research Division. She was recently assistant chairman of the Third ORNL Life Sciences Symposium on Health Risk Analysis. At ORNL since 1962, she has served as editor, as information specialist, and, for five years, as director of the Environmental Resource Center and Toxic Materials Information Center of ORNL's Information Center Complex. Phil Walsh is leader of the health effects and epidemiology group of the Health and Safety Research Division. He joined ORNL in 1976 as a research staff member of the Environmental Sciences Division. From 1968 to 1976 Walsh was radiation physicist at the National Institutes of Health's National Institute of Environmental Health Sciences and served as adjunct assistant professor at the University of North Carolina, from which he earned his master's and doctorate degrees. His training and interests include physics, health physics, industrial and air hygiene, public health, epidemiology, inhalation toxicology, and biomedical engineering. Here the authors rest between sessions at the Health Risk Analysis symposium in Gatlinburg.

HEALTH RISK ANALYSIS
Science, Politics, and Public Concern
By EMILY COPENHAVER and PHIL WALSH

Every day we face risks from the food we eat, the liquids we drink, the air we breathe, the cars we drive, the sports we play, and even the jobs we hold. From our morning meal of eggs with cholesterol, bacon with nitrites, and coffee (latest on the carcinogen-of-the-month list) to an after-dinner beer or cigarette, we are bombarded with risks to our health, making it difficult to judge which danger we dread the most.

How can we balance the risks and benefits of the drugs and chemicals we use daily? What are the risks to the health of people exposed to the toxic chemicals buried at Love Canal in Niagara Falls, New York? Is it possible to analyze the extent to which suspected carcinogens (substances producing or inciting cancer) actually contribute to the nation's number two cause of death? Which is more dangerous: a cup of coffee, a cigarette, a saccharin-sweetened diet drink, hair dye, emissions from diesel engines, or asbestos in consumer products?

"The search for an accurate way of measuring [health] risks is a risk in itself, a risk all of us Americans-indeed, all human beings-share." Ruth Clusen, Assistant Secretary of DOE for Environment, used these words at the Third ORNL Life Sciences Symposium on Health Risk Analysis to sum up the vital stake that scientists, politicians, and the public at large have in health risk analysis. Despite clear evidence that by most simple measures health is better now than it was 100 years ago and is still improving, the public has become increasingly sensitive to many dangers not considered or not in existence 100 years ago.

Richard Wilson of Harvard believes that the public is reacting to the greater attention now being given to risk and to the large number of risks we now consider regulating. Both Wilson and Chet Richmond, ORNL's associate director for biomedical and environmental sciences, believe that educating the public about risks is an important step toward making regulation of risk taking clearer and more effective, since risk decisions are ultimately made by the public. Indeed, it may even be said that more widespread understanding of the subject, with the resultant effect on public action, could actually work toward reducing risks. The public is vitally concerned, and the news media provide them with much of what they know about risks. If scientific facts regarding a risk are not clearly explained or put in the proper perspective, neither the public nor its elected representatives can be expected to understand and correctly use such data in making informed decisions about the risk.

Risk analysis for decision making must include scientific data collection, assessment and comparison, and a societal decision as to what risks are acceptable. Scientific issues should be separated from value judgment issues, and the resolution of issues should be left to the appropriate decision makers. Scientists must develop the needed data and logical risk methodologies to the highest degree possible, and the public and political decision makers must use these approaches to assist in making their decisions.

Legislative Approaches

Given that public sentiment and congressional action, among other forces, demand that some method be developed to measure or estimate risk in general and health risk in particular, the best approach to risk analysis has yet to be identified. Both Tom Moss, a member of the congressional staff, and Dick Bates, of Clement Associates, subdivide the wide range of approaches that have evolved from some 34 health-related laws into these major categories:

• risk as the only criterion (e.g., Delaney Clause),
• technological feasibility criteria (e.g., Clean Air Act), and
• costs and/or risks vs benefits (e.g., Toxic Substances Control Act).

The first category, in which risk is the only criterion considered, is often called the zero-risk approach. The Delaney Clause of the Food, Drug, and Cosmetics Act (1938) is the best-known example of this approach; the clause prohibits the deliberate use of any food additive shown to be carcinogenic. When the clause was added to the act in 1958, science could not determine "safe" levels of a carcinogen, and Congress expected the question of chemical carcinogens in food to arise rarely. Now that the number of chemicals thought to be positive carcinogens has risen from 10% of those tested in 1958 to 80-93% of those tested today, the Delaney Clause has become highly controversial. Though the Delaney Clause, or zero-risk approach, has been well publicized, only 7 of the 34 health-related laws employ this approach.

The second category, that of the technological feasibility criterion, or "best available technology," calls for the reduction of risk as much as possible via application of the best control systems technology can offer, such as scrubber systems for pollutants being emitted by power plants. The Clean Air Act is one of two laws using this approach.

The third category, which balances costs or risks against benefits, gives discretionary authority, in varying degrees, to the regulatory agencies so that they can perform risk analyses including criteria such as total exposure to the public, environmental effects, availability of substitutes, and economic consequences. This approach is the most common; 25, or 74%, of the laws specify some form of balancing risk and cost. While it represents the most logical framework, it requires a wider range of valid data and is subject to broad and often inconsistent interpretation by regulators and even by the judicial system in those cases where differing views have led to litigation. A good example of an ideal balanced approach is Wilson's organizational framework, with all analysis stopping if the risk is calculated to be very small, a circumstance referred to as "de minimis risk." Richmond points out that still at issue is how small is "very small": according to which group or person, and how do we establish qualitative or quantitative reference systems that can be used by others to make the "very small" decision or comparison?
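Wilson's stopping rule reduces to a simple tiered screen, sketched below in Python. The sketch is purely illustrative: the framework as described here names no actual cutoff, so the threshold and the example risk estimates are hypothetical.

DE_MINIMIS = 1e-6  # hypothetical "very small" added lifetime risk

def screen(estimated_risk):
    # Halt all further analysis below the de minimis cutoff;
    # otherwise pass the case on to full cost/risk/benefit balancing.
    if estimated_risk < DE_MINIMIS:
        return "de minimis: no further analysis"
    return "proceed to full risk-benefit analysis"

print(screen(3e-8))  # well below the cutoff
print(screen(2e-4))  # warrants full analysis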

While Congress has attempted to legislate the best and most sophisticated risk analysis methodologies available at the time the laws have been enacted, Fred Hoerger of Dow Chemical Company argues that this evolutionary approach to risk analysis, while consistent to some degree with the state of our health effects-testing technology, has deterred the development of comprehensive risk analysis by legislating very specific approaches into law. He classifies the congressional approaches by the following categories and notes their limitations:

• Yes/no classification of hazards: Decisions that sanction either outright ban or uncontrolled use. This approach is too simplistic, given today's knowledge of health effects.
• Presence or absence: Same as the first category except that it pertains to a hazardous substance contained in a product rather than to the entire product.
• Burden of proof: Difficult to prove the safety or degree of risk definitively in accordance with legal concepts such as burden of proof.
• Weight of the evidence: Examines the quality of available information but fails to integrate all data in making judgment.
• Criterial definitions: Sets standards based on a key area, but denies evaluation and balancing of all data.
• No-effect level plus safety factor: Uses a generic assessment of safety, with little case-by-case consideration and data collection.

Just as there are those who use and trust risk analysis, there are also those who say that risk analysis is not sufficiently developed to be used in decision making, that it merely appears to present a rational approach to regulation without identifying the underlying uncertainties that sometimes make it suspect. However imperfect the data and methodologies may be, they continue to be used because complex decisions must be made now. How then can we improve our data, methodologies, and their use in decision making?

More Data Needed

Much of the uncertainty in risk analysis stems from a lack of crucial data about the populations at risk and their exposures from all sources; about the effects of concurrent exposure to many different pollutants in many different media such as air, water, and food; and about the predictive capability of biological testing, such as animal, cellular, and subcellular bioassay, as well as human studies. Regulatory agencies, such as the Environmental Protection Agency, require information that can be immediately useful to decision-making authorities. Unfortunately, many of the experiments of the seventies have not been designed to develop the data needed for comprehensive risk assessment: much of the animal bioassay testing for carcinogenic risk indicates only a yes or no answer, not quantification of the risk, and exposure studies have focused on the number of people associated with a chemical or toxic agent, without comprehensive exposure profiles. These comprehensive exposure profiles require several types of data: ambient monitoring (e.g., air and water), exposure models to link specific uses or releases of a chemical to exposure, monitoring of exposed human tissues, and development of surrogate approaches, such as predicting the behavior of one chemical from the known behavior of a structurally related chemical.
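As an illustration of how an exposure model links a monitored ambient level to a dose, the short Python fragment below folds a measured air concentration into an average daily dose. Every number, and the simple intake formula itself, is an assumption chosen for illustration, not a method reported at the symposium.

c_air  = 0.002      # monitored ambient concentration, mg/m^3 (hypothetical)
inhale = 20.0       # assumed breathing rate, m^3/day
freq   = 350 / 365  # assumed fraction of days exposed per year
weight = 70.0       # assumed body weight, kg

daily_dose = c_air * inhale * freq / weight  # mg per kg body weight per day
print("average daily dose: %.5f mg/kg/day" % daily_dose)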

The health risk analysis symposium addressed these uncertainties by examining the scientific basis of human health risk estimates from biological effects data gleaned from epidemiological and clinical-laboratory studies, animal bioassays, cellular and subcellular tests or screening systems for toxicological effects, and dose-response models. Though the full review of methodologies and their applications, advantages, and limitations is available in the proceedings, published by Franklin Institute Press this spring, some of the appraisals of these methodologies are highlighted here.

Epidemiology, that is, the study of the incidence, distribution, and control of disease or pathogens in people, tends to take three forms: (1) concern with occupational groups in which workers may have experienced concentrated and long-term exposure to potentially hazardous substances, (2) case-control studies for investigating illnesses that occur with relatively low frequency, and (3) indirect studies that compare vital statistics data on groups rather than individuals, usually grouped by geography, ecology, or some other category. Epidemiologic methods identify moderate to high levels of risk reasonably well but are weak in detecting low levels of risk because such detection requires very large population groups. Much of the epidemiology of the 1970s has been directed at identifying whether evidence of cancer has increased in various defined human populations, suggesting a cause-and-effect relationship, but enough data to validate the estimate of risk is lacking. Ian Nisbet of Clement Associates expresses the need for more quantitative documentation of the circumstances in which people come into contact with chemicals, whether in their jobs, in food or other consumer products, or in the environment generally.
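That weakness at low risk levels can be made concrete with a standard epidemiological calculation. The Python sketch below computes a relative risk and its 95% confidence interval from hypothetical cohort counts, using the usual log-scale (Katz) interval; with only a handful of cases, the interval spans 1.0, so a modest excess cannot be distinguished from no effect.

import math

a, n1 = 15, 10000  # hypothetical exposed cohort: 15 cases in 10,000
c, n0 = 10, 10000  # hypothetical unexposed cohort: 10 cases in 10,000

rr = (a / n1) / (c / n0)
se = math.sqrt(1/a - 1/n1 + 1/c - 1/n0)  # SE of ln(RR), Katz method
lo = math.exp(math.log(rr) - 1.96 * se)
hi = math.exp(math.log(rr) + 1.96 * se)
print("relative risk %.2f, 95%% CI (%.2f, %.2f)" % (rr, lo, hi))
# Prints roughly (0.67, 3.34): the interval includes 1.0, so a much
# larger population would be needed to detect this 50% excess risk.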

The use of laboratory animal bioassays to estimate human risks provides at least a qualitative surrogate for humans, and lifetime exposures of mammalian test species are a generally accepted method of estimating human risk. Steve Gage of the EPA believes that methods available for estimating chronic toxic effects from short-term, in vivo exposures must be examined to learn if they can quantitatively predict long-term effects of exposure. However, Bates questions the ability of bioassays in experimental animals to provide sufficiently accurate data without the use of a prohibitively large number of animals, and states that attempts to devise general principles of dose response from these bioassays have been unsuccessful. Osterman-Golkar of Stockholm University points out that among the major difficulties in extrapolating data from animals to man are the qualitative and quantitative differences in how different species metabolize and respond to toxic agents.

Bates notes that cellular and molecular studies may be used to extend dose-response curves to lower levels because "the ability to detect changes in a few cells among hundreds of thousands or in a relatively few molecules from a large target population should permit dose-response studies of much greater sensitivity than is possible when looking for a few tumors in a few hundred animals at most. Pharmacokinetic studies should give better information on effective exposure of target cells than can be obtained from the dose administered."

Many in vitro testing systems are available, and tier testing schemes including cellular-level and higher-level organisms have been designed to rank chemical toxicants. However, no accepted methodology is available for extrapolating the results to determine human health risks.

Paul Ts'o of Johns Hopkins University has outlined the extensions or extrapolations of data from cellular and subcellular studies to quantitative human risk analysis, the most important being the extrapolation across the organizational barrier from molecules and cells to living, individual animals and from results obtained between different mammalian species (i.e., animals and humans). Ts'o says future studies should focus on investigating human cells, on comparative studies of the extrapolation across the organizational barrier in an appropriate mammalian model, and on the extrapolation from human cells to man as a whole.

Further study by Dick Albertini of the University of Vermont on mutation in human cells demonstrates the advantages of direct in vivo testing of human cells. It allows

• metabolic and pharmacokinetic realism,
• measurement of genotoxic effects of actual environmental mixtures of agents,
• detection of individual differences in genetic susceptibility to environmental agents, and
• mutation data for individuals that can be correlated with their health, revealing whether the test predicts human disease.

Throughout the discussion of these experimental testing schemes, we have addressed the relationship of exposure, or dose, to the response of the agent studied. There are also statistical procedures or mathematical models that relate the probability of a response to the dose rate. These models assume that each individual in the population has his own tolerance to the test substance, and the choice of the tolerance distribution is largely arbitrary. Using such models, Ian Munro of Health and Welfare, in Canada, has demonstrated that a virtually safe level of exposure may be defined by determining the acceptable amount of added risk above background exposures.

Munro, B. Altshuler of New York University, and others note that the state of the art is not advanced enough to allow the choice of any one model with confidence. The shape of the dose-response curve at moderate and high doses does not continue for the low-dose response, so Altshuler argues that the models do not reflect true behavior at low doses. Altshuler seeks to preselect the dose-response curve in the low-dose range on theoretical and mechanistic grounds, rather than extrapolating through use of a dose-response model whose shape is controlled by the available experimental data.

Kenny Crump of Science Research Systems, on the other hand, believes that a multistage model with a linearly extrapolated low-dose relationship can be used to estimate cancer risk from low doses using animal bioassay data. He notes that differences in extrapolated values in the range of an order of magnitude are not large when compared with the large uncertainty associated with the total process of estimating low-level human risks from animal data. However, when such estimates are used to set regulatory standards, the regulated industries may reasonably question the validity of basing regulations on data this uncertain.
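Crump's general approach can be sketched in a few lines of Python. The bioassay counts below are invented, and the code is a minimal illustration of a two-stage multistage fit with linear low-dose extrapolation, not a reconstruction of his actual procedure.

import numpy as np
from scipy.optimize import minimize

dose   = np.array([0.0, 1.0, 5.0, 25.0])  # hypothetical doses, mg/kg/day
tumors = np.array([2, 4, 14, 38])          # tumor-bearing animals per group
n      = np.array([50, 50, 50, 50])        # animals per dose group

def p(d, q):
    # Multistage model: P(d) = 1 - exp(-(q0 + q1*d + q2*d^2))
    return 1.0 - np.exp(-(q[0] + q[1]*d + q[2]*d**2))

def nll(q):
    # Binomial negative log-likelihood of the observed tumor counts
    pr = np.clip(p(dose, q), 1e-12, 1 - 1e-12)
    return -np.sum(tumors*np.log(pr) + (n - tumors)*np.log(1 - pr))

fit = minimize(nll, x0=[0.05, 0.01, 0.001], bounds=[(0, None)]*3)
q1 = fit.x[1]

# At low doses the extra risk over background is dominated by the
# linear term, so the extrapolated added risk at dose d is about q1*d,
# and the "virtually safe" dose for an added risk of 1e-6 is 1e-6/q1.
print("extra risk at 0.001 mg/kg/day: %.2e" % (q1 * 0.001))
print("dose for 1e-6 added risk: %.2e mg/kg/day" % (1e-6 / q1))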

Where do all these views and uncertainties leave us in applying health risk analysis in decision making? Obviously we cannot as yet give individuals a "hazards counter" that can be used for counting hazards as a calorie counter counts calories, so that one does not exceed his approved daily intake. We can still attempt to compare these health risks in an orderly fashion. Though risk analysis may not yet be without drawbacks, perhaps Fred Hoerger focuses on the basic issues:

We will make great progress in the next few years in sound and comprehensive risk assessments if we, as scientists, insist upon peer review of our experimental programs and if we are aggressive and comprehensive in our design of the data generation needed for a sound risk assessment. Perhaps more important, we must become active in the policy arenas and insist upon the time to get adequate data for sound risk assessment. It is very superficial to argue whether or not risk assessments are valid if we do not focus discussions upon the generation and quality of data needed for sound risk assessment.


Health Risk Analysis at ORNL

A growing realization of the political nature of policymaking and the need to communicate effectively has led to increased activity in risk analysis. Many organizations are beginning to examine what constitutes risk; others are identifying ongoing work that is risk related. At ORNL, an integrated health risk assessment program has been identified as a new initiative in the Institutional Plan for Biomedical and Environmental Sciences, even though ORNL has engaged in risk-related research and development for many years.

The mammalian genetics section of the Biology Division under Bill and Liane Russell was founded about 30 years ago to use mouse experiments to estimate human genetic risk from ionizing radiation. It is now also evaluating the genetic toxicology of environmental chemicals. This is only one example of how the Laboratory's Biology, Health and Safety Research, and Environmental Sciences divisions, in collaboration with many other divisions (Analytical Chemistry, Chemical Technology, Computer Sciences, and Information, to name a few), have collected basic data and developed and applied health and ecological risk assessment methodologies for many years. Over 200 assessments of energy technologies and various components thereof have been completed, and the Laboratory has expertise in almost every area of risk assessment.

A need to focus on potential human health problems was recognized and organizationally accommodated with the establishment of the Health and Safety Research Division. A component of HASRD's responsibility is to develop methodology and perform human health risk analyses. The health effects and epidemiology group under Phil Walsh in HASRD's health studies section is charged with two primary goals: (1) development of comprehensive health risk analysis methodologies for comparison of energy technologies on a common scale, and (2) development of a collaborative supporting research capability to test models in the analytical methodology and to collect timely data for specific assessments. Members of the group possess expertise in many areas covered at the symposium, including mathematical modeling, dosimetry, and epidemiology. Research in every area except epidemiology is conducted in collaboration with other divisions to focus a broad range of research on assessment methodology development. An example of this collaborative effort is the research on human cells being conducted by Guy Griffin of HASRD, Dave Novelli of the Biology Division, and Alan Solomon of the University of Tennessee Memorial Research Hospital. The diversity of disciplines represented in this group requires that their methodology development and testing be an integrated process.

In addition to the mammalian genetics work, the cellular and comparative mutagenesis programs under Jim Epler in the Biology Division are examining short-term assays and comparison of systems, the correlation of genetic damage in vitro and in vivo, and extrapolation of test data to man. Mike Fry's carcinogenesis toxicology programs are addressing the dose-response relationship in chemical carcinogenesis.

Bob Cumming in Biology has served as chairman for a group establishing an international society for risk analysis and serves as editor of its new quarterly journal, Risk Analysis, published by Plenum Press. Chet Richmond, Ram Uppuluri of the Computer Sciences Division, and George Flanagan of the Engineering Physics Division are members of the steering group. The first issue of Risk Analysis appears this spring.

Curtis Travis and Liz Etnier of HASRD recently co-chaired a session on health risks associated with energy technology at the American Association for the Advancement of Science annual meeting in Toronto. This session dealt with public perception of risks; risks specifically related to the nuclear, coal, and renewable energy fuel cycles; and risks of energy production on a global scale (e.g., CO2 and acid rain). The proceedings will be published in book form later this year.

In view of the growing emphasis on this work at ORNL, Richmond has appointed a task force to establish the ORNL strategic program plan for risk assessment. The task force will work to expand and develop risk-related activities across the entire Laboratory. Its members will consider the components of a risk analysis program, methodology development, the resources required, potential funding sources, and linkages to developing energy technologies. Members are HASRD's Herbert Inhaber, chairman; Conrad Chester, Energy Division; W. A. Coghlan, Metals and Ceramics Division; Doug Crawford, HASRD; Jim Epler; George Flanagan; Tim Ensminger, Information Division; Carl Gehrs, Environmental Sciences Division; Elwood Gift, UCC-ND Operations Analysis; Ruth Gove, Information Division; Harry Hoy, Energy Technology Division; Helen Pfuderer, Central Management; and Walsh.


information meeting highlights

This high-resolution transmission electron micrograph shows the lattice planes in three grains of silicon nitride and an amorphous region (no fringes) at the intersection of the three crystalline grains.

Metals and Ceramics, Nov. 12-13
Reactor Materials and Analytical Electron Microscopy

When an electron beam interacts with a solid material (such as phosphors on a television screen), an array of signals is produced. The only signal the TV viewer cares about is composed of photons of light that make up the picture. However, metallurgists studying various materials are interested in all the signals emitted in response to the impingement of an electron beam on a specimen, because these signals contain information about the composition and structure of that specimen.

To capture and analyze the majority of these signals, metallurgists are now turning to the relatively new, powerful techniques of analytical electron microscopy. These techniques complement conventional transmission electron microscopy, which provides structural information at resolutions of 0.2 nm (2 Å), by additionally providing compositional information at 10-nm spatial resolution.

Jim Bentley (task leader), Phil Sklad, Ed Kenik, and Alex Fisher of the Radiation Effects and Microstructural Analysis Group (REMAG) in ORNL's Metals and Ceramics Division have made significant strides in improving and applying AEM techniques to understand the structure and composition of various alloys and materials that are potential candidates for energy facilities, such as breeder reactors, fusion devices, and coal conversion plants. Their AEM studies have revealed information on the structure and composition of the minute microstructural features that influence an alloy's properties.

Precipitates extracted from a specimen of type 321 stainless steel are shown in this micrograph. The energy dispersive x-ray spectrum at right shows the composition of part of the precipitate arrowed in the micrograph, revealing it to be a complex phospho-arsenide.

One of the targets of AEM studies at ORNL is precipitates in alloys. When an engineer chooses a stainless steel for some structural component, what he asks for is not necessarily what he gets. For example, he may want a steel that contains 2% titanium because a deliberate addition of titanium strengthens steel. However, during the manufacturing process and subsequent thermal aging, some of the titanium may react with atmospheric nitrogen to form titanium nitride, an undesirable precipitate that degrades the alloy's performance. Undesirable precipitates may also be formed when stainless steels are irradiated in breeder or fusion reactor environments.

Recently, Arthur Rowcliffe and his associates in REMAG found as many as 16 different precipitates in irradiated type 316 stainless steel, which can alter the chemical and mechanical properties of the steel. Of these, some precipitates create voids that make the steel more susceptible to cracking and swelling, while others may reduce these effects. In fact, in 1975, Rowcliffe and his colleagues found that adding elements such as silicon and titanium in the right amounts to type 316 stainless steel can make the material resistant to swelling during neutron bombardment. Consequently, this modified steel is a candidate material for future breeder and fusion reactors.

An understanding of which metallurgical processes eliminate undesirable precipitates and which processes enhance the formation of beneficial precipitates such as titanium carbide could facilitate determining how to make new radiation-resistant alloys with potentially longer lifetimes in reactors. The AEM methods have been used to identify precipitates and to resolve the fine details of structural and compositional changes that occur in an alloy after irradiation. Because there is a correlation between each alloy's microstructural details and its properties, ORNL researchers, by analyzing the structure of alloys with different properties, may now be able to design an alloy with more favorable properties such as improved resistance to swelling and fracture.

The members of REMAG have also used their microscopic techniques to study ceramic materials such as silicon nitride, silicon carbide, and composites of titanium diboride and nickel.

Sklad has been working with Vic Tennery, also of the M&C Division, in studying titanium diboride, a potential cutting-tool material because of its hardness and resistance to wear. The appeal of titanium diboride as a cutting-tool material is that it could replace the currently used material consisting partly of cobalt, a metal of strategic importance which is imported from nations not always friendly to the United States. Titanium diboride bonded with nickel is under study at ORNL because it also is a potential candidate for letdown valves in coal liquefaction plants, where the structural components are subjected to the abrasive effects of solid particles in high-temperature liquids and gases. Sklad has been using AEM to examine the composition of the phases present and the interfaces between the titanium diboride and binder phases to determine what changes might be needed to strengthen the bonding and improve the properties.

To probe the structure and composition of alloys and ceramics, Bentley and his team are concentrating on quantitative analysis of three of the "signals" produced when the 120-keV electrons in a 10-nm-diam beam strike the target material. Although some electrons are backscattered, most of the electrons pass through the thin (~100-nm) specimen and either are elastically scattered into diffracted beams or undergo small energy losses in inelastic collisions. One important inelastic-scattering process involves inner-shell ionization of the target atoms, which then decay from their excited state by emitting Auger electrons or x rays. The spectra of x rays from the specimen are detected by energy dispersive x-ray spectroscopy and provide compositional data accurate to 2% when the x-ray intensities characteristic of each element are analyzed using an on-line minicomputer. To date, EDS has been the most often used technique.
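The article does not spell out the quantification algorithm, but the standard thin-film approach of this period is the Cliff-Lorimer ratio method, in which the concentration ratio of two elements is the measured intensity ratio scaled by a sensitivity factor. The Python sketch below uses hypothetical counts and an assumed k-factor, purely for illustration.

i_fe, i_cr = 9400.0, 2100.0  # hypothetical characteristic x-ray counts
k_fecr     = 0.95            # assumed Cliff-Lorimer sensitivity factor

ratio = k_fecr * (i_fe / i_cr)  # C_Fe / C_Cr = k * (I_Fe / I_Cr)
c_cr  = 1.0 / (1.0 + ratio)     # normalize the two concentrations to 100%
c_fe  = 1.0 - c_cr
print("Fe %.1f%%, Cr %.1f%%" % (100*c_fe, 100*c_cr))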

The second process, electron energy loss spectroscopy, is not quite as accurate as x-ray analysis, but it is the principal method for detecting elements lighter than sodium, such as carbon and oxygen, which play important roles in many precipitates and are major constituents of ceramic materials. The energy lost by electrons that have passed through a thin specimen is analyzed by a magnetic spectrometer (the lower the electron energy, the more the electron beam is deflected by the magnet). Compositional data are again available from intensities at the characteristic energies for inner-shell ionization and again require computer analysis. The EELS identifies carbon, for example, by detecting those electrons which, in penetrating the specimen, have lost an energy of 283 eV, the amount required to eject an electron from the K shell of a carbon atom.
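Identifying an element from its edge energy is essentially a table lookup, as the short Python sketch below suggests. The 283-eV carbon value comes from the article; the other K-edge energies are standard tabulated values, and the matching tolerance is an arbitrary choice.

K_EDGES = {"B": 188.0, "C": 283.0, "N": 401.0, "O": 532.0}  # eV

def identify(edge_ev, tol=5.0):
    # Return the element whose K edge lies within tol eV of the
    # measured energy loss, or "unknown" if none matches.
    for element, energy in K_EDGES.items():
        if abs(edge_ev - energy) <= tol:
            return element
    return "unknown"

print(identify(283.4))  # -> C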

The third microanalytical technique makes use of electrons diffracted by the specimen to form a convergent-beam electron-diffraction pattern, from which information on the specimen's crystalline structure and atomic arrangement is available. Various problems require the use of different microanalytical techniques, but the power of AEM lies in the use of many signals simultaneously from regions of the specimen with a mass of only 10^-19 g.

Working at magnifications of 4.8 x 10^5 to resolve the lattice planes in stainless steels, Bentley has also studied how precipitates fit into the crystalline structure of the steel and how misfits are accommodated by interfacial dislocations. A better understanding of the structure of boundaries between the matrix and precipitates can contribute to finding ways to design alloys with improved properties.

Kenik uses a high-voltage electron microscope to do in situ studies. For example, he may oxidize or strain a specimen while observing it with his 1-MeV instrument to see what happens to its structure. Kenik is also responsible for the SHaRE program, a cooperative effort with Oak Ridge Associated Universities whereby university researchers work with the analytical systems in the M&C Division in collaborative experiments with REMAG metallurgists.

Although Bentley and his colleagues are searching for the very small, their findings could well have a large influence on the types of materials that will be used in energy facilities of the future.-Carolyn Krause


awards and appointments

The East Tennessee Chapter of the Society for Technical Communication has announced the winners of this year's competition. Entries were solicited from areas where no competition is held and included, as always, those from nonmembers as well as members. Consequently, entries were submitted from more than half the states in the Union. As was done last year, the local competition was judged elsewhere, this year by the Miamisburg, Ohio, chapter. The following ORNL staff members were winners.

In the topical reports category, first prize, Engineering Technology Division, Larry Wyrick and Ann Ragan for the division's Long-Range Plan, 1980-85; second prize, Elizabeth Giddings and ORNL's Technical Publications Department for Workshop on the Global Effects of Carbon Dioxide from Fossil Fuels; and fourth prize, ORNL's Technical Publications Department, Program Planning and Analysis Office, and Colin West for ORNL's Institutional Plan, FY 1980-FY 1985.

In the annual reports category, first prize, Lamar Toomer of Oak Ridge Gaseous Diffusion Plant, Charles Carmichael, and Ginger Turpin for Uranium Enrichment 1979 Annual Report.

In bulletins and brochures, second prize, Charles D. Scott and Amy Harkey for Symposium Announcement and Call for Papers: Third Symposium on Biotechnology in Energy Production and Conservation; third prize, Sherry Hawthorne for Information Center Complex-Information Division; also third prize, Hawthorne and Alice Richardson for Atomic and Plasma Physics Research Program.

In journals, second prize, William Cottrell, the Nuclear Safety Information Center, DOE's Technical Information Center, Ragan, and Jane Parrott for Nuclear Safety.

In house organs, first prize, Barbara Lyon, Carolyn Krause, and Bill Clark for the ORNL Review, Spring 1980; also third prize for ORNL Review, Summer 1980.

In newsletters, first prize, Susan Buhl, for Lab News, August 1980; also second prize for Lab News, September 1980; another second prize, Toomer, Carmichael, W. R. McCauley, and Kim Sneed for United States Enrichment News, March 1980.


In journal articles and conference papers, third prize, L. P. Leach, G. D. McPherson, ORNL's Nuclear Safety Information Center staff, Ragan, and Parrott for "Results of the First Nuclear-Powered Loss-of-Coolant Experiments in the LOFT Facility" (from Nuclear Safety).

In periodic progress reports, first prize, ORNL Engineering Technology Division Reports Office and ORNL Graphic Arts Department for "Poster Sessions: Engineering Technology Division 6th Annual Information Meeting, September 17-19, 1980."

In handbooks and manuals, second prize, AnnaJo Shelton, Greg Whitt, and ORNL's Technical Publications Department for Guide to the Nuclear Standards Program, by the Nuclear Standards Management Center; third prize, ORNL's Nuclear Data Project for 1979 Recent References (Cumulation).

And, finally, in news articles, first prize went to Buhl for "Interest in Mt. St. Helens Erupts at ORNL" (from Lab News), and third prize to Carolyn Krause for "RIBS: A Gain for Solar" (also from Lab News).

James L. Scott has been elected a fellow of the American Society for Metals.

Bill Burch has been appointed director of the newly formed Fuel Recycle Division. He will continue as director of the Consolidated Fuel Reprocessing Program Technical Management Center, an office that reports directly to the Department of Energy.

Loucas G. Christophorou has been named a Corporate Research Fellow of Union Carbide Corporation.

Phil Perdue has been presented with the Distinguished Service Award of the East Tennessee Chapter of the Health Physics Society in recognition of his outstanding contributions to the technical advancement of health physics in the form of nuclear instrumentation development.

John Sheffield has been named associate director of the Fusion Energy Division.

C. Y. Wong has been named to the editorial board of the new magazine, Chinese Physics.

Guven Yalcintas is chairman of the Organ Dose in Nuclear Medicine Committee, part of the American Nuclear Society's Biology and Medical Technology Group.

Pedro Otaduy has been awarded one of this year's Eugene Wigner Scholarships. He will be employed in the Instrumentation and Controls Division.

Fred Mynatt is replacing Herb Hill as director of the Instrumentation and Controls Division, following Hill's acceptance of a position with Beckman Instruments, Inc., in Fullerton, California. Pete Lotts will assume Mynatt's former responsibilities as director of the Nuclear Regulatory Commission programs at the Laboratory. Tom Row will replace Lotts as director of Nuclear Waste programs.

Fred Hartman, Lew Keller, Warren Masker, Salil Niyogi, Audrey Stevens, and Dorothy Skinner have been elected fellows of the American Association for the Advancement of Science.

Dick Hahn is the new chairman-elect of the American Chemical Society's division of nuclear chemistry, and Joseph Peterson is its treasurer.

Newly elected fellows of the American Physical Society are Frank Plasil, K.T.R. Davies, and Nobuyoshi Wakabayashi.

Tsuneo Tamura has been named Fellow of the American Society of Agronomy and the Soil Science Society of America.
