Transcript of Kerry Grant, Wael Ibrahim, Paula Smit, JPSS CGS Raytheon Intelligence, Information, and Services,...
- Slide 1
- Kerry Grant, Wael Ibrahim, Paula Smit, JPSS CGS Raytheon Intelligence, Information, and Services, Aurora, CO
- Kurt Brueske, JPSS CGS Raytheon Intelligence, Information, and Services, Omaha, NE

The first satellite in the JPSS constellation, known as the
Suomi National Polar-orbiting Partnership (S-NPP) satellite, was
launched on 28 October 2011. CGS is currently processing and
delivering Sensor Data Records (SDRs) and Environmental Data
Records (EDRs) for S-NPP and will continue through the lifetime of
the Joint Polar Satellite System programs. The EDRs for Suomi NPP
are currently undergoing an extensive Calibration and Validation
(Cal/Val) campaign. As Cal/Val changes migrate into the operational
system, long term monitoring activities will begin to track product
quality and stability. In conjunction with NOAA's Office of
Satellite and Product Operations (OSPO) and the NASA JPSS Ground
Project, Raytheon is supporting this effort through the development
and use of tools, techniques, and processes designed to detect
changes in product quality, identify root causes, and rapidly
implement changes to the operational system to bring suspect
products back into specification.

Figure 1 outlines the processes
in place to detect and analyze quality issues, and initiate the
change process. Maintaining a strong provenance to validated
science is critical as the system evolves in response to both
resolution of product quality issues and incorporation of new
science. Establishing this provenance requires rigorous testing, analysis, and comparison activities, the maintenance of controlled
test data sets and expected results, and the ability to update test
inputs and outputs based on evolving needs.

Figure 2 illustrates a data mining tool that allows investigators to find and select specific
scene conditions needed for a new test data set. It uses the VIIRS
Cloud Mask (VCM) Intermediate Product (IP) as a characterizing
gateway to identify and extract sub-orbit granules of interest
based on their geophysical characteristics. This greatly
reduces the time spent sorting through very high volume data sets
for the necessary test conditions.

Figure 3 outlines the steps in
maintaining the science provenance as the baseline evolves. This
process includes comparing outputs, ensuring changes in outputs are
the result of the new science and not unanticipated responses or errors, and selecting and verifying new test data sets.
Figure 1: Detecting and Analyzing Quality Issues. [Flowchart: A Discrepancy Report and/or Work Request, entered via the Algorithm Discrepancy Report (ADR) tool, goes to the DR Action Team (NASA, NOAA, Raytheon), which assesses urgency and then assigns, investigates, coordinates, and updates the DR. Analysis/investigation is based on data exploration/monitoring observations, report-based assessments, and requests for information, supported by the DQST Analysis System and the OAA QCV System. Dispositions include assignment to System Engineering, selection of an algorithm source when a new algorithm is required, and authorization to prepare a Configuration Change Request (CCR) for the Algorithm Engineering Review Board (AERB). CCR material generation assembles the change package materials needed for the CCR (test data, code changes, documentation change pages, identification of affected documents, TIM briefing charts, delivery letter and CCR memo) and delivers the package to NASA.]

Raytheon science data and engineering experts support Program Analysis efforts using the DQST Analysis System and the Operational Algorithm Assessment (OAA) Quality Characterization and Visualization (QCV) Tool Suite. Some problems require a change and need formal review.
In some cases, information gathering may be all that is needed.
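To make the Figure 1 workflow concrete, here is a minimal, illustrative Python sketch of a DR's lifecycle and dispositions. The state and disposition names are paraphrased from the flowchart labels; none of this represents actual program software, and the example DR identifier is hypothetical.

```python
from dataclasses import dataclass, field
from enum import Enum, auto

# States paraphrased from the Figure 1 flow:
# Assign -> Investigate -> Coordinate -> Update DR.
class DRState(Enum):
    ASSIGNED = auto()
    INVESTIGATING = auto()
    COORDINATING = auto()
    UPDATED = auto()

# Possible dispositions named in the flowchart.
class Disposition(Enum):
    INFORMATION_ONLY = auto()  # information gathering was all that was needed
    CCR_AUTHORIZED = auto()    # prepare a CCR for review by the AERB
    NEW_ALGORITHM = auto()     # new algorithm required: select algorithm source

@dataclass
class DiscrepancyReport:
    dr_id: str
    urgency: str               # assessed by the DR Action Team (NASA, NOAA, Raytheon)
    state: DRState = DRState.ASSIGNED
    history: list = field(default_factory=list)

    def advance(self, new_state: DRState) -> None:
        """Record the prior state and move the DR forward in the workflow."""
        self.history.append(self.state)
        self.state = new_state

# Example: a DR is worked through the flow and authorized for a CCR.
dr = DiscrepancyReport(dr_id="ADR-0001", urgency="routine")  # hypothetical ID
for s in (DRState.INVESTIGATING, DRState.COORDINATING, DRState.UPDATED):
    dr.advance(s)
disposition = Disposition.CCR_AUTHORIZED
```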
[Flowchart: Issue triage paths: Offline Support (issue needs more analysis to better characterize the problem); JPSS Ground Project (issue needs broader collaboration to resolve, e.g., CGS impact, user impacts); Sustainment (issue can be resolved using Factory support and is authorized/planned). Near real-time monitoring by Mission Operations draws on stored telemetry (STA) and data quality (DQ) reports, ad hoc data exchanges, problem reporting (WRS), informal collaboration or working groups, the Work Request Review Panel (WRRP), and the Anomaly Resolution Team (ART). Analysis inputs include weekly tag-ups, assigned WRs, PGE/ASF outputs, and custom tools; operations support and collaboration feed actions and status reports into the DQST Analysis System.]

Mission Operations Support staff provide near real-time monitoring during day-to-day operations. Data Quality Support Team (DQST) science data
experts collect information from OPS and GRAVITE for assessments
using the DQST Analysis System. The DQST collaborates closely with Mission Operations and the JPSS Mission Partners.

[Flowchart: The Algorithm ERB (AERB), chaired by the JPSS DPA Manager, receives change requests to CGS from project-led science teams, COAST, customers, Centrals, Direct Readout users, and the NASA SDS PEATEs. Discrepancies enter through the ADR Tool to the DR Action Team (DRAT); algorithm change packages are tested using GRAVITE-ADA and processed via CCR/JPSS, PCR/PCRB, ECR/O-CCB, and WR/WRRP.]

Discrepancies are reported
using the Algorithm DR Tool and reviewed by the DR Action Team (DRAT). Formal changes are proposed as algorithm change packages via Configuration Change Requests (CCRs), which are reviewed at the
AERB. Government Resource for Algorithm Verification, Independent
Test, and Evaluation (GRAVITE).

Figure 2: Targeted Data Mining. A tailored graphical user interface combining user-defined VIIRS Cloud Mask Quality Flag (QF) values, QF bundling logic, and geographic and solar/geophysical constraints enables quantitative analysis and identification of a best match from multiple-day, multi-orbit datasets for either cloud or downstream algorithm processing. [Figure 2 imagery legend: MATLAB; "conf clear" / "conf cloudy".]
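As an illustration of this kind of characterizing-gateway query, here is a minimal Python sketch. It assumes a hypothetical QF bit layout and plain numpy arrays for the granule fields; the actual VCM QF packing and file formats are defined by the JPSS program, so the bit positions, field names, and scoring below are placeholders, not the tool's real interface.

```python
import numpy as np

# Hypothetical decode of a VCM quality-flag byte: this sketch assumes the
# 4-level cloud confidence (0 = confidently clear ... 3 = confidently cloudy)
# sits in bits 2-3. The real packing is defined in the JPSS data format
# documents; the bit positions here are placeholders.
def cloud_confidence(qf: np.ndarray) -> np.ndarray:
    return (qf >> 2) & 0b11

def match_fraction(qf, lat, lon, sza, *, confidences, bbox, max_sza):
    """Fraction of pixels satisfying a bundled QF condition plus
    geographic (bounding box) and solar-zenith constraints."""
    lat_min, lat_max, lon_min, lon_max = bbox
    ok = (np.isin(cloud_confidence(qf), confidences)
          & (lat >= lat_min) & (lat <= lat_max)
          & (lon >= lon_min) & (lon <= lon_max)
          & (sza <= max_sza))
    return float(ok.mean())

def best_granule(granules, **criteria):
    """Rank sub-orbit granules from a multi-day, multi-orbit archive and
    return the best match for the requested scene conditions."""
    return max(granules, key=lambda g: match_fraction(
        g["qf"], g["lat"], g["lon"], g["sza"], **criteria))

# Example: find the granule richest in confidently cloudy daytime pixels
# over a mid-Pacific box (synthetic data, illustrative values only).
granules = [
    {"qf": np.random.randint(0, 256, (768, 3200), dtype=np.uint8),
     "lat": np.random.uniform(-30, 30, (768, 3200)),
     "lon": np.random.uniform(160, 200, (768, 3200)),
     "sza": np.random.uniform(0, 90, (768, 3200))}
    for _ in range(3)
]
pick = best_granule(granules, confidences=[3],
                    bbox=(-10, 10, 170, 190), max_sza=80.0)
```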
Figure 3: Analysis Framework. [Diagram: Test Data Vn -> Science Process Vn -> Output Results Vn; Test Data Vn+1 -> Science Process Vn+1 -> Output Results Vn+1. Update the test data set to include scene conditions necessary to test the new science; analyze the output to ensure changes reflect the expected behavior from the science change.]

The framework consists of two data categories, benchmark and experimental, and two analysis variation categories, principal and non-principal. Experimental data advances to benchmark data iteratively as the operational algorithm baseline evolves. Here, updating Test Data Vn to include the necessary new scene conditions produces the new experimental Test Data Vn+1 set. Analysis of the output results ensures that only variations due to known external variables are present (principal variations). Non-principal variations (e.g., errors in software code, requirements, or interfaces) are identified and removed through rigorous software or systems engineering processes. Once principal variations are confirmed, the Test Data Vn+1 set becomes the new benchmark; a minimal comparison sketch follows.
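Here is a minimal sketch of that benchmark-versus-experimental comparison, assuming the outputs are gridded numpy arrays and that an "expected change" mask derived from the science change is available; both are assumptions of this illustration, not program interfaces.

```python
import numpy as np

def classify_variations(benchmark, experimental, expected_change, tol=1e-6):
    """Compare Output Results Vn (benchmark) with Vn+1 (experimental).

    Differences inside the expected_change mask are principal variations
    (driven by known external variables, i.e., the new science); differences
    outside it are non-principal variations to be resolved through software
    or systems engineering before the experimental set can be promoted."""
    diff = np.abs(experimental - benchmark) > tol
    principal = diff & expected_change
    non_principal = diff & ~expected_change
    return {
        "changed": int(diff.sum()),
        "principal": int(principal.sum()),
        "non_principal": int(non_principal.sum()),
        # Promote Test Data Vn+1 to benchmark only if every variation
        # is a principal variation.
        "promote_to_benchmark": bool(not non_principal.any()),
    }

# Example with synthetic 2-D output grids (illustrative values only).
rng = np.random.default_rng(0)
benchmark = rng.normal(size=(100, 100))
expected_change = np.zeros((100, 100), dtype=bool)
expected_change[:50, :] = True          # region the new science should alter
experimental = benchmark + expected_change * 0.1
report = classify_variations(benchmark, experimental, expected_change)
assert report["promote_to_benchmark"]
```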