"Applying Photonics to User Needs: The Application Challenge" -- Presentation to Adel Saleh, DARPA ATO
"Applying Photonics to User Needs: The Application Challenge"
Presentation to Adel Saleh, DARPA ATO
University of California, San Diego
La Jolla, CA
April 14, 2005
Dr. Larry Smarr
Director, California Institute for Telecommunications and Information Technology
Harry E. Gruber Professor,
Dept. of Computer Science and Engineering
Jacobs School of Engineering, UCSD
OptIPuter Inspiration -- Node of a 2009 PetaFLOPS Supercomputer -- a 5 Tb/s LAN!
[Block diagram: multiple VLIW/RISC cores, each 40 GFLOPS at 10 GHz with an 8 MB second-level cache and a 240 GB/s (24-byte-wide) path into a crossbar; the crossbar connects at 640 GB/s to highly interleaved DRAM (4-16 GB) and, with coherence maintained, to a multi-lambda optical network carrying 5 Terabits/s]
Updated From Steve Wallach, Supercomputing 2000 Keynote
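The per-core and LAN bandwidth figures in the node diagram are consistent with simple width-times-clock arithmetic; a minimal sketch, where the 24-byte width, 10 GHz clock, and 5 Tb/s LAN come from the slide and only the unit conversions are mine:

```python
# Bandwidth sanity check for the PetaFLOPS node diagram.
width_bytes = 24
clock_hz = 10e9
per_core_gbs = width_bytes * clock_hz / 1e9   # per-core path: 240 GB/s
lan_gbs = 5e12 / 8 / 1e9                      # 5 Tb/s LAN = 625 GB/s
print(f"per-core path: {per_core_gbs:.0f} GB/s, LAN: {lan_gbs:.0f} GB/s")
```

The 625 GB/s LAN figure is the same order as the 640 GB/s crossbar, which is the point of the slide: the network is as fast as the memory system.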
Four Classes of OptIPuter Application Drivers
• Browsing & Analysis of Multiple Large Remote Data Objects
• Telepresence, CineGrid, and Shared Virtual Reality
• Assimilating Data—Linking Supercomputers with Data Sets
• Interacting with Remote Scientific Instruments
Realizing the Dream: High Resolution Portals to Global Science Data
30 MPixel SunScreen Display Driven by a 20-node Sun Opteron Visualization Cluster
Source: Mark Ellisman, David Lee, Jason Leigh
150 Mpixel Microscopy Montage (Green: Purkinje Cells; Red: GFAP in the Glial Cells; Blue: DNA in Cell Nuclei)
Cumulative EOSDIS Archive Holdings--Adding Several TBs per Day
[Bar chart: cumulative TeraBytes (0 to 8,000) vs. calendar year, 2001-2014, stacked by instrument: MODIS-A, MODIS-T, V0 Holdings, MISR, ASTER, MOPITT, GMAO, AIRS-is, AMSR-E, OMI, TES, MLS, HIRDLS, and Other EOS]
Other EOS = ACRIMSAT, Meteor 3M, Midori II, ICESat, SORCE
Terra EOM Dec 2005; Aqua EOM May 2008; Aura EOM Jul 2010
NOTE: Data remains in the archive pending transition to LTA
Source: Glenn Iona, EOSDIS Element Evolution Technical Working Group January 6-7, 2005
Challenge: Average Throughput of NASA Data Products to the End User is Under 50 Megabits/s
Tested from GSFC-ICESAT, January 2005
http://ensight.eos.nasa.gov/Missions/icesat/index.shtml
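To see why sub-50 Mbps matters at these data volumes, consider the transfer time for a single large data product; the 50 Mbps measured rate and the multi-Gbps goal are from the slides, while the 1 TB example size is an illustrative assumption:

```python
# Transfer-time arithmetic behind the NASA throughput challenge.
def transfer_hours(size_bytes: float, rate_bits_per_s: float) -> float:
    """Hours needed to move size_bytes at a sustained bit rate."""
    return size_bytes * 8 / rate_bits_per_s / 3600

TB = 1e12  # a hypothetical 1 TB data product
print(f"1 TB at 50 Mbps: {transfer_hours(TB, 50e6):.1f} hours")
print(f"1 TB at 10 Gbps: {transfer_hours(TB, 10e9):.2f} hours")
```

At 50 Mbps a terabyte takes almost two days; at 10 Gbps it takes about 13 minutes, which is what turns remote archives into interactively browsable data.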
Interactive Retrieval and Hyperwall Display of Earth Sciences Images Using NLR
Earth science data sets created by GSFC's Scientific Visualization Studio were retrieved across the NLR in real time from OptIPuter servers in Chicago and San Diego and from GSFC servers in McLean, VA, and displayed at SC2004 in Pittsburgh.
Enables Scientists To Perform Coordinated Studies Of Multiple Remote-Sensing Datasets
http://esdcd.gsfc.nasa.gov/LNetphoto3.html
Source: Milt Halem & Randall Jones, NASA GSFC & Maxine Brown, UIC EVL
Eric Sokolowsky
The NIH Biomedical Informatics Research Network: Shared Federated Repositories of Image Data
National Partnership for Advanced Computational Infrastructure
Part of the UCSD CRBS Center for Research on Biological Structure
UCSD is IT and Telecomm Integration Center
Average File Transfer ~10-50 Mbps
NCRR BIRN Site Rack (two identical racks shown):
– Network Attached Storage, 1-10 TB
– Cisco 4006 Router
– Grid POP
– Network Stats
– GigE Net Probe
– UPS
– Encryption
Landsat7 Imagery, 100-Foot Resolution
Draped on elevation data
High Resolution Aerial Photography Generates Images With 10,000 Times More Data than Landsat7
Shane DeGross, Telesis
New USGS Aerial Imagery at 1-Foot Resolution
~10x10 Square Miles of 350 US Cities; 2.5 Billion Pixel Images per City!
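The per-city pixel count follows directly from the stated coverage and resolution; nothing here goes beyond the slide's own numbers and standard unit constants:

```python
# Checking the "2.5 Billion Pixel Images Per City" figure:
# a 10 x 10 mile area sampled at 1 foot per pixel.
FEET_PER_MILE = 5280
side_px = 10 * FEET_PER_MILE            # 52,800 pixels on a side
pixels = side_px ** 2                   # ~2.8 billion pixels
ratio_vs_landsat7 = (100 // 1) ** 2     # 100-ft vs 1-ft pixels: 10,000x denser
print(f"{pixels/1e9:.2f} Gpixels per city, {ratio_vs_landsat7:,}x Landsat7 density")
```

The result, ~2.79 Gpixels, matches the slide's rounded "2.5 billion," and the 10,000x factor matches the earlier claim about aerial photography versus Landsat7.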
Multi-Gigapixel Images are Available from Film Scanners Today
The Gigapxl Project
http://gigapxl.org
Balboa Park, San Diego
Large Images with Enormous Detail Require Interactive LambdaVision Systems
One Square Inch Shot From 100 Yards
The OptIPuter Project is Pursuing Obtaining Some of These Images for LambdaVision 100M Pixel Walls
http://gigapxl.org
Cosmic Simulator with a Billion Zone and Gigaparticle Resolution
• 512³ AMR or 1024³ Unigrid
• 8-64 Times Mass Resolution
• Can Simulate First Galaxies
• One Gigazone Run:
– Output ~10 TeraBytes
– "Snapshot" is 100 GB
– Must Visually Analyze
Source: Mike Norman, UCSD
SDSC Blue Horizon (2004)
1024³ Unigrid
To a LambdaGrid, a Supercomputer is Just Another High Performance Data Generator
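The "gigazone" figures above can be checked with back-of-envelope arithmetic; the zone count and 100 GB snapshot size are from the slide, while the bytes-per-zone reading is my inference rather than a stated design number:

```python
# Back-of-envelope figures for a 1024^3 unigrid "gigazone" run.
zones = 1024 ** 3                    # ~1.07 billion zones: a "gigazone"
snapshot_bytes = 100e9               # one 100 GB snapshot (from the slide)
per_zone = snapshot_bytes / zones    # ~93 bytes per zone, about a dozen doubles
print(f"{zones/1e9:.2f} Gzones, ~{per_zone:.0f} bytes per zone")
```

Roughly a dozen double-precision fields per zone is plausible for a cosmological hydrodynamics state vector, which is why each snapshot lands near 100 GB.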
First Beams: April 2007
Physics Runs: from Summer 2007
TOTEM
LHCb: B-physics
ALICE : HI
pp: √s = 14 TeV, L = 10³⁴ cm⁻² s⁻¹
27 km Tunnel in Switzerland & France
ATLAS
Large Hadron Collider (LHC) e-Science Driving Global Cyberinfrastructure
Source: Harvey Newman, Caltech
CMS
High Energy and Nuclear Physics A Terabit/s WAN by 2010
Year  Production            Experimental            Remarks
2001  0.155                 0.622-2.5               SONET/SDH
2002  0.622                 2.5                     SONET/SDH; DWDM; GigE Integ.
2003  2.5                   10                      DWDM; 1 + 10 GigE Integration
2005  10                    2-4 x 10                Switch; Provisioning
2007  2-4 x 10              ~10 x 10; 40 Gbps       1st Gen. Grids
2009  ~10 x 10 or 1-2 x 40  ~5 x 40 or ~20-50 x 10  40 Gbps Switching
2011  ~5 x 40 or ~20 x 10   ~25 x 40 or ~100 x 10   2nd Gen Grids; Terabit Networks
2013  ~Terabit              ~MultiTbps              ~Fill One Fiber
(Rates in Gbps)
Continuing the Trend: ~1000 Times Bandwidth Growth Per Decade; We are Rapidly Learning to Use Multi-Gbps Networks Dynamically
Source: Harvey Newman, Caltech
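The "~1000x per decade" claim can be read off the production column of the roadmap above; a quick check of the implied growth rate, using the table's 0.155 Gbps (2001) and ~1 Tbps (2013) endpoints:

```python
# Implied growth rate behind the "~1000x bandwidth growth per decade" trend.
annual = (1000 / 0.155) ** (1 / (2013 - 2001))  # ~2.1x per year
per_decade = annual ** 10                        # ~1500x per decade
print(f"{annual:.2f}x per year, ~{per_decade:.0f}x per decade")
```

Doubling roughly every year compounds to three orders of magnitude per decade, consistent with the slide's trend line.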
Four Classes of LambdaGrid Applications
• Browsing & Analysis of Multiple Large Remote Data Objects
• Telepresence, CineGrid, and Shared Virtual Reality
• Assimilating Data—Linking Supercomputers with Data Sets
• Interacting with Remote Scientific Instruments
Telepresence Using Uncompressed HDTV Streaming Over IP on Fiber Optics
JGN II Workshop, January 2005
Seattle: Prof. Smarr
Osaka: Prof. Aoyama
Goal—Upgrade Access Grid to HD Streams Over IP on Dedicated Lambdas
Access Grid Talk with 35 Locations on 5 Continents—SC Global Keynote Supercomputing 04
Calit2 Collaboration Rooms Testbed UCI to UCSD
In 2005 Calit2 will Link Its Two Buildings via CENIC-XD Dedicated Fiber over 75 Miles, Using the OptIPuter Architecture to Create a Distributed Collaboration Laboratory
UC Irvine UC San Diego
UCI VizClass
UCSD NCMIR
Source: Falko Kuester, UCI & Mark Ellisman, UCSD
OptIPuter Challenge is to Couple Cluster Endpoints to WAN DWDM Dedicated Photonic Channels
• Scalable Adaptive Graphics Environment (SAGE) Controls:
• 100 Megapixel Display
– 55 Panels
• 1/4 TeraFLOP
– Driven by 30-Node Cluster of 64-bit Dual Opterons
• 1/3 Terabit/sec I/O
– 30 x 10GE Interfaces
– Linked to OptIPuter
• 1/8 TB RAM
• 60 TB Disk
Source: Jason Leigh, Tom DeFanti, EVL@UIC, OptIPuter Co-PIs
NSF LambdaVision MRI@UIC
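The wall's pixel arithmetic is easy to verify; the 55-panel count and the "100 Megapixels" total are from the slide, while the 1600x1200 per-panel resolution is an assumption about the LCDs used:

```python
# Pixel arithmetic for the 55-panel LambdaVision tiled display.
panels = 55
per_panel = 1600 * 1200            # assumed panel resolution: 1.92 Mpixels
total = panels * per_panel         # 105,600,000 pixels, i.e. "100 Megapixels"
print(f"{total/1e6:.1f} Mpixels across {panels} panels")
```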
• Calit2 is Building a LambdaVision Wall in Each of the UCI and UCSD Buildings
• Each LCD can Handle a Full-Resolution HD Stream
Applying the OptIPuter to Digital Cinema The Calit2 CineGrid Project
• Connect a Global Community of Users and Researchers
– Engineering a Camera-to-Theatre Integrated System
– Create Digital CineGrid Production & Teaching Tools
– Engage Artists, Producers, Scientists, Educators
• Educational & Research Testbed Using OptIPuter Architecture
– Scaling to 4K SHD and Beyond!
– Distributed Computing, Storage, Visualization & Collaboration
– CAVEwave and Global Lambda Integrated Facility (GLIF)
– Support CineGrid Network Operations from Calit2
• Develop Partnerships with Industry and Universities, e.g.:
– USC School of Cinema-Television
– DCTF in Japan
– National School of Cinema in Italy
Source: Laurin Herr, Pacific-Interface
A High Definition Command Center as Imagined In 2007 In A HiPerCollab
Source: Jason Leigh, EVL, UIC
Augmented Reality
SuperHD Streaming Video
100-Megapixel Tiled Display
ENDfusion Project
Four Classes of LambdaGrid Applications
• Browsing & Analysis of Multiple Large Remote Data Objects
• Telepresence, CineGrid, and Shared Virtual Reality
• Assimilating Data—Linking Supercomputers with Data Sets
• Interacting with Remote Scientific Instruments
Increasing Accuracy in Hurricane Forecasts: Real Time Diagnostics at GSFC of Ensemble Runs on ARC Project Columbia
Operational Forecast: Resolution of National Weather Service
Higher Resolution Research Forecast: NASA Goddard Using Ames Altix
5.75 Day Forecast of Hurricane Isidore
Resolved Eye Wall; Intense Rain-Bands; 4x Resolution Improvement
Source: Bill Putman, Bob Atlas, GSFC
NLR will Remove the InterCenter Networking Bottleneck
Project Contacts: Ricky Rood, Bob Atlas, Horace Mitchell, GSFC; Chris Henze, ARC
Next Step: OptIPuter, NLR, and Starlight Enabling Coordinated Earth Observing Program (CEOP)
Note Current Throughput 15-45 Mbps: OptIPuter 2005 Goal is ~1-10 Gbps!
http://ensight.eos.nasa.gov/Organizations/ceop/index.shtml
Accessing 300 TB of Observational Data in Tokyo and 100 TB of Model Assimilation Data at MPI in Hamburg -- Analyzing Remote Data at These Sites Using GRaD-DODS and OptIPuter Technology Over the NLR and Starlight
Source: Milt Halem, NASA GSFC
SIO
Use OptIPuter to Couple Data Assimilation Models to Remote Data Sources and Analysis in Near Real Time
Regional Ocean Modeling System (ROMS) http://ourocean.jpl.nasa.gov/
Goal is Real Time Local Digital Ocean Models
Long Range HF Radar
Four Classes of LambdaGrid Applications
• Browsing & Analysis of Multiple Large Remote Data Objects
• Telepresence, CineGrid, and Shared Virtual Reality
• Assimilating Data—Linking Supercomputers with Data Sets
• Interacting with Remote Scientific Instruments
Brain Imaging Collaboration -- UCSD & Osaka Univ. Using Real-Time Instrument Steering and HDTV
Southern California OptIPuter -- Most Powerful Electron Microscope in the World -- Osaka, Japan
Source: Mark Ellisman, UCSD
UCSD HDTV
LOOKING (Laboratory for the Ocean Observatory Knowledge Integration Grid)
New OptIPuter Application Driver: Gigabit Fibers on the Ocean Floor
• LOOKING NSF ITR with PIs:
– John Orcutt & Larry Smarr, UCSD
– John Delaney & Ed Lazowska, UW
– Mark Abbott, OSU
• Collaborators at:
– MBARI, WHOI, NCSA, UIC, CalPoly, UVic, CANARIE, Microsoft, NEPTUNE-Canada
• Goal: Prototype Cyberinfrastructure for NSF ORION
www.neptune.washington.edu
LOOKING -- Integrate Instruments & Sensors (Real-Time Data Sources) Into a LambdaGrid Computing Environment With Web Services Interfaces
Pilot Project Components
LOOKING Builds on the Multi-Institutional SCCOOS Program, OptIPuter, and CENIC-XD
• SCCOOS is Integrating:
– Moorings
– Ships
– Autonomous Vehicles
– Satellite Remote Sensing
– Drifters
– Long Range HF Radar
– Near-Shore Waves/Currents (CDIP)
– COAMPS Wind Model
– Nested ROMS Models
– Data Assimilation and Modeling
– Data Systems
www.sccoos.org/
www.cocmp.org
Yellow—Initial LOOKING OptIPuter Backbone Over CENIC-XD
MARS New Gen Cable Observatory Testbed - Capturing Real-Time Basic Environmental Data
Tele-Operated Crawlers
Central Lander
MARS Installation Oct 2005 - Jan 2006
Source: Jim Bellingham, MBARI
September 26-30, 2005, University of California, San Diego
California Institute for Telecommunications and Information Technology
The Networking Double Header of the Century Will Be Driven by LambdaGrid Applications
iGrid 2005: The Global Lambda Integrated Facility
Maxine Brown, Tom DeFanti, Co-Organizers
www.startap.net/igrid2005/
http://sc05.supercomp.org
Goal – From Expedition to Cable Observatories with Streaming Stereo HDTV Robotic Cameras
Scenes from Aliens of the Deep, Directed by James Cameron & Steven Quale
http://disney.go.com/disneypictures/aliensofthedeep/alienseduguide.pdf
Proposed Experiment for iGrid 2005 – Remote Interactive HD Imaging of Deep Sea Vent
Source: John Delaney & Deborah Kelley, UWash
To Starlight, TRECC, and ACCESS
Global Architecture of a 2009 COTS PetaFLOPS System
[Diagram: 64 multi-die, multi-processor nodes (128 die per box, 4 CPUs per die) interconnected by an all-optical switch, with I/O out to the LAN/WAN; 10 meters of fiber = 50 nanosec delay]
Source: Steve Wallach, Supercomputing 2000 Keynote
Systems Become GRID Enabled
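The diagram's "10 meters = 50 nanosec delay" annotation follows from the speed of light in glass fiber; the refractive index of ~1.5 is a standard assumption, while the 10 m / 50 ns figure itself is from the slide:

```python
# Propagation-delay arithmetic for the all-optical switch diagram.
c = 3.0e8                       # speed of light in vacuum, m/s
v_fiber = c / 1.5               # ~2e8 m/s in fiber (assumed index ~1.5)
delay_ns = 10 / v_fiber * 1e9   # delay over 10 m of fiber, in ns
print(f"{delay_ns:.0f} ns over 10 m")
```

This is why physical machine-room layout matters at these speeds: every 10 meters of cable costs as much time as dozens of 10 GHz clock cycles.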