Summary of MetOp Data Flow and Latency
November 13, 2008
Selina M. Nauman
COPC Action Item 2007-2.13
• NESDIS will evaluate the costs needed to make non-NOAA satellite data streams (e.g., Jason-2, MetOp) available to the OPCs and/or the user community in a more timely manner.
• CSAB, 22 October 2008: Recommend closing with Selina Nauman's briefing to COPC. The briefing will include a comparison of NOAA's satellite latency with non-NOAA satellite latency.
COPC Action Item 2007-2.13 (cont.)
• NOAA analysis confirms that the link from Svalbard to the NOAA Gateway in Darmstadt drives the data latency
  – The issue arises before NOAA receives the data
• Two aspects of the issue need to be considered:
  – Latency requirement
    • Current MetOp and future Jason-2 data transfers are within established latency requirements
  – Concept of Operations between NOAA and foreign partners
    • Changing data distribution procedures to reduce latency requires modifying existing agreements and operational scenarios
IJPS System Overview
[Diagram: IJPS architecture showing the NOAA and MetOp satellites and their instruments, the Svalbard CDA, the EPS Overall Ground Segment (CGS and PCDA) in Darmstadt, and the NOAA Ground Segment in Suitland (Wallops CDA, Fairbanks CDA, SOCC, ESPC); the link legend distinguishes space-to-ground, national, international, and internal links.]
Determining METOP Data Latency
• A snapshot of one MetOp pass from October 22, 2008 (Julian Day 296) was analyzed
• The data processing and data flow times for key interfaces were recorded in an Excel spreadsheet:
  – Svalbard
  – Darmstadt
  – SOCC
  – ESPC
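
A minimal sketch (not part of the original briefing) of the kind of calculation the spreadsheet performs: express each interface event as minutes elapsed since the start of observation. The 08:37 UTC start is an assumption inferred from the orbit label S0837.E1018 in the backup table, and the clock times are back-computed from that table's elapsed minutes, so treat them as illustrative only.

# Convert logged wall-clock event times for one MetOp pass into
# minutes elapsed since the start of observation (illustrative values).
from datetime import datetime

events = {
    "Start of Observation":               datetime(2008, 10, 22, 8, 37, 0),
    "Acquisition of Signal (AOS)":        datetime(2008, 10, 22, 10, 9, 28),
    "CE Transmit from Darmstadt Started": datetime(2008, 10, 22, 10, 13, 5),
    "Loss of Signal (LOS)":               datetime(2008, 10, 22, 10, 24, 48),
}

t0 = events["Start of Observation"]
for name, t in events.items():
    minutes = (t - t0).total_seconds() / 60.0
    print(f"{name:40s} {minutes:7.2f} min")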
Current METOP Pipeline Processing Timeline
[Timeline for one pass: start of observation at minute 0; CDA Svalbard ingest between AOS at ~92 minutes and LOS at ~108 minutes; Darmstadt CE transmission starts at ~96 minutes; Darmstadt CE transmission and orbital file complete; ESPC granule processing follows; first granule received at the DAPE Gateway (~105 min); last granule received at SPP at ~184 minutes; data age will be ~85-115 minutes for all granules.]
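
A minimal sketch of the "data age" figure quoted on the timeline, assuming the granule labels encode UTC observation start times (S0837 = observed starting 08:37) and an 08:37 start of observation; the delivery minutes are taken from the backup-slide table. The helper function is hypothetical, not part of the briefing.

# Data age at delivery = minutes when the granule reached SPP
# minus the minutes at which the granule was observed.
def hhmm_to_minutes_since_start(hhmm: str, start_hhmm: str = "0837") -> float:
    h, m = int(hhmm[:2]), int(hhmm[2:])
    h0, m0 = int(start_hhmm[:2]), int(start_hhmm[2:])
    return (h - h0) * 60 + (m - m0)

granules = {
    # label: (observation start HHMM, minutes when received at SPP)
    "S0837.E0844": ("0837", 105.12),
    "S0845.E0852": ("0845", 111.62),
    "S1013.E1020": ("1013", 183.52),
}

for label, (obs_start, received) in granules.items():
    age = received - hhmm_to_minutes_since_start(obs_start)
    print(f"{label}: data age at SPP ~{age:.1f} min")

# Ages of roughly 87-105 minutes for these three granules are consistent
# with the ~85-115 minute range quoted on the timeline slide.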
POES AVHRR 1b Data Latency (30 Jun '08 - 09 Nov '08)
[Chart: latency in minutes (y-axis, 120-210) versus calendar week of 2008 (x-axis, weeks 27-45) for MetOp-2, NOAA-18 SV, NOAA-18 WI, NOAA-18 GC, NOAA-17 WI, and NOAA-17 GC; annotation: reship of one blind-orbit N-18 GAC delayed processing by 9+ hours on September 25.]
Data latency is the time difference between the start time of the GAC/FRAC orbit and the completion of the AVHRR level 1B processing.
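
A short worked example of this latency definition, using the single pass in the backup table and assuming that the plotted completion time corresponds to the 1B orbit file completing on 'diamond' (an interpretation, not stated in the briefing):

# Latency = completion of the AVHRR level 1B orbit - start of the GAC/FRAC orbit.
orbit_start_min = 0.0          # start of the GAC/FRAC orbit (reference epoch)
level1b_complete_min = 180.80  # 1b Orbit S0837.E1018 completed on 'diamond'
latency_min = level1b_complete_min - orbit_start_min
print(f"AVHRR 1B latency for this pass: ~{latency_min:.0f} minutes")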
Latency Summary
• Data flow within ESPC has very little impact on the overall data delay
• Data flow from Svalbard to the NOAA Gateway in Darmstadt drives the data latency
  – Initial Joint Polar System Coordination, November 18-20
    • EUMETSAT's planned upgrades will be discussed
BACKUP SLIDES
Metop GDS Data Latency
Event / Time Since Start of Observation (minutes)
Start of Observation 0.00
Acquisition of Signal (AOS) 92.47
First Command Time (FCT) 93.13
Communications Element (CE) Transmit from Darmstadt Started 96.08
ESPC AFEP Transmit to Diamond Started 96.80
OPUS 'diamond' Data Received Report 97.23
End of Observation 101.62
1b Granule S0837.E0844 Completed on 'diamond' 103.80
1b Granule S0837.E0844 Received @ SPP 105.12
Granule S0837.E0844 Sent to AFWA 105.73
Granule S0837.E0844 Sent to NAVO 106.45
Last Command Time (LCT) 106.97
Loss of Signal (LOS) 107.80
1b Granule S0845.E0852 Completed on 'diamond' 110.80
1b Granule S0845.E0852 Received @ SPP 111.62
Granule S0845.E0852 Sent to NAVO 112.13
Granule S0845.E0852 Sent to AFWA 112.25
1b Granule S1013.E1020 Completed on 'diamond' 180.80
1b Granule S1013.E1020 Received @ SPP 183.52
Granule S1013.E1020 Sent to AFWA 183.80
Granule S1013.E1020 Sent to NAVO 184.72
1b Orbit S0837.E1018 Completed on 'diamond' 180.80
1b Orbit S0837.E1018 FRAC Received @ SPP 183.35
Orbit S0837.E1018 MHS Received @ SPP 183.35
Orbit S0837.E1018 HIRS Received @ SPP 183.52
Orbit S0837.E1018 AMSU-A Received @ SPP 183.68
Orbit S0837.E1018 MHS Sent to FNMOC 183.83
Orbit S0837.E1018 GAC Received @ SPP 183.85
Orbit S0837.E1018 HIRS Sent to NAVO 184.15
Orbit S0837.E1018 HIRS Sent to FNMOC 184.30
Orbit S0837.E1018 AMSU-A Sent to FNMOC 184.68
Orbit S0837.E1018 GAC Sent to NAVO 190.08
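
A minimal sketch of reading segment-by-segment elapsed times out of the table above. The event values are copied from the table, but grouping them into pipeline segments is an interpretation for illustration, not part of the original spreadsheet.

# Elapsed minutes between successive interface events for this pass.
aos = 92.47                  # Acquisition of Signal at Svalbard
ce_transmit_start = 96.08    # CE transmit from Darmstadt started
afep_transmit_start = 96.80  # ESPC AFEP transmit to 'diamond' started
granule_at_spp = 105.12      # first 1b granule received at SPP
granule_to_afwa = 105.73     # first granule sent to AFWA

print(f"Wait for downlink (start of obs -> AOS):  {aos:6.2f} min")
print(f"Svalbard/Darmstadt (AOS -> CE transmit):  {ce_transmit_start - aos:6.2f} min")
print(f"Darmstadt -> ESPC (CE -> AFEP transmit):  {afep_transmit_start - ce_transmit_start:6.2f} min")
print(f"ESPC 1B processing (AFEP -> SPP receipt): {granule_at_spp - afep_transmit_start:6.2f} min")
print(f"Distribution (SPP -> sent to AFWA):       {granule_to_afwa - granule_at_spp:6.2f} min")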
MetOp Global Data Stream Data Flow
• Svalbard
  – Acquisition Of Signal
  – Loss Of Signal
• Darmstadt
  – Core Ground System
  – NOAA Gateway
  – Communications Element
• SOCC
  – Communications Element
• ESPC
  – Sky-x
  – Advanced Front End Processor
  – Diamond – primary level 1B processor (P570)
  – Shared Processing Gateway or Data Distribution Server
NJDATMS-U Connectivity (2/20/2008)
[Diagram: DATMS-U connectivity among NOAA/NESDIS (DAPE Gateway) & NAVICE (1 Gb Ethernet), NOAA/NWSTG & NCEP (74 Mb total), AFWA (64 Mb total), FNMOC (26 Mb total), NAVO (12 Mb total, via 100 Mb FE at NWS Monterey with pass-through to NOAA), and UCAR/CDAAC (1 Gb, FE); individual circuits of 6, 14, and 50 Mb and 100 Mb Fast Ethernet and TLS links are shown, along with a DATMS-U connection to the physical AP1000.]
Legend: CDAAC = COSMIC Data Analysis & Archive Center; FE = Fast Ethernet; Gb = Gigabits/sec; Mb = Megabits/sec; OC12 = connection backside to DATMS-U; TLS = Transport Layer Security link; UCAR = University Corp for Atmospheric Research.
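
A back-of-the-envelope sketch of how the circuit rates in the diagram translate into transfer time (transfer time = file size / link rate). The link rates are taken from the diagram, but the 200 MB product size is an assumption for illustration, not a figure from the briefing.

# Estimate transfer time in minutes for an assumed product size over
# the circuit rates shown in the DATMS-U connectivity diagram.
def transfer_minutes(size_megabytes: float, rate_megabits_per_s: float) -> float:
    return (size_megabytes * 8) / rate_megabits_per_s / 60.0

file_mb = 200.0  # assumed size of a full-orbit level 1B product, in megabytes
for rate in (6.0, 14.0, 50.0, 100.0):  # Mb/s circuits shown in the diagram
    print(f"{file_mb:.0f} MB over {rate:5.1f} Mb/s link: "
          f"~{transfer_minutes(file_mb, rate):.1f} min")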
ESPC PROCESSING (Jason-2 TM-NRT v2)