Transcript of the Application Communities Kick-off Meeting, Arlington, Virginia, July 7, 2006.
Slide 1: Application Communities Kick-off Meeting
Arlington, Virginia, July 7, 2006
Slide 2: Agenda
0800-0830  Continental Breakfast
0830-0900  Program Manager Welcome
0900-1000  SRI Presentation
1000-1030  Morning Break
1030-1130  SRI Presentation (continued)
1130-1200  Discussion
1200-1300  Lunch
1300-1400  MIT Presentation
1400-1430  Afternoon Break
1430-1530  MIT Presentation (continued)
1530-1600  Wrap-up discussions
1600       Side-bars
Slide 3: Application Communities Program Kickoff
Lee Badger, Information Processing Technology Office, Defense Advanced Research Projects Agency
July 7, 2006
Slide 4: The Problem
[Chart: Attack sophistication growing (source: CERT Coordination Center)]
[Chart: Scale/vulnerability growing; lines of code rising from roughly 20,000 to 55,000,000 between 1979 and 2002, across systems including Multics, SCOMP, Win3.1, Win95, WinNT, Win2K, RedHat6, RedHat7, Debian, Lisp, the Darwin kernel, and the Linux kernel]

Military Context
• DoD needs "better" COTS than adversaries have.
• Traditional wisdom: monocultures are weak, vulnerable due to their homogeneity, scale, and collaboration.
• [Chart: Monocultures and markets; market share (0% to 100%) for MS Office, Windows, Cisco, IE, and Apache]
• New idea: turn these "weaknesses" into advantages by leveraging homogeneity, scale, and collaboration for the defense.
Slide 5: The Military Context: DoD Cyber Foundation
[Diagram: the DoD cyber foundation stack; voice/data routing, operating systems, routers, email/IM, office productivity, firewalls, anti-virus, and special applications (COP, intelligence, tactical, imagery, ...) running as COTS applications over JWICS, SIPRNET, NIPRNET, and satellite links, connecting forward deployments with sustaining bases/headquarters]
• Same equipment at home and deployed
• Frequent updates
• Some systems just "show up"
• Fast fielding
• Vulnerable COTS
Slide 6: Technology Objective
Scale/homogeneity/collaboration for defense.
Transform many running copies of a (COTS) software program to behave as a self-aware Application Community that:
1. Collaboratively diagnoses problems (attacks/bugs/errors): xx% accurate problem identification, localization, and diagnosis in xx minutes.
2. Collaboratively responds while maintaining intended functions: generate effective patches/filters in xx minutes; prevent xx% of harmful patch/filter side effects.
3. Collaboratively generates a Situation Awareness Gauge: predict the likelihood and timing of problems with xx% accuracy.
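The slide states target capabilities rather than a design. As one minimal sketch of capability 1, each community member could report what its local monitor flagged, and the community could vote to localize the fault; all names and the quorum rule below are hypothetical illustrations, not the program's mechanism.

```python
from collections import Counter

def localize_fault(reports, quorum=0.5):
    """Pool anomaly reports from community members and vote on the
    most likely faulty component. Each report names the component a
    member's local monitor flagged, or None if nothing was seen."""
    votes = Counter(r for r in reports if r is not None)
    if not votes:
        return None
    component, count = votes.most_common(1)[0]
    # Require agreement from a quorum of the whole community before
    # acting, so a few noisy monitors cannot trigger a response.
    return component if count / len(reports) >= quorum else None

# 10,000 copies of one program: most members that saw the attack flag
# the same vulnerable function, a few monitors misfire, many saw nothing.
reports = ["parse_request"] * 6000 + ["log_write"] * 500 + [None] * 3500
print(localize_fault(reports))  # -> parse_request
```

With 500 disagreeing reports drowned out by 6,000 agreeing ones, the vote still localizes cleanly; the quorum threshold is the knob that trades diagnosis speed against false accusations.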
[Chart: Value of collaboration (0% to 100%) vs. active copies (1 to 1,000,000 and beyond); several possible curves, untapped utility, and an unknown threshold for diminishing returns]
Exploit scale for problem localization, mitigation, recovery, and learning.
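The slide deliberately leaves the shape of the collaboration-value curve open. One simple model that produces a saturating curve of this kind (my assumption, not the slide's): if each copy independently observes a given symptom with probability p, the chance that the community sees it at least once is 1 - (1 - p)^n, which rises steeply and then flattens, suggesting where diminishing returns might set in.

```python
def community_detection_prob(p_single, n_copies):
    """Probability that at least one of n independent copies observes
    an event that each copy sees with probability p_single:
        P = 1 - (1 - p_single) ** n
    An independence model like this is only one candidate for the
    'possible curves' on the slide; the true curve is unknown."""
    return 1.0 - (1.0 - p_single) ** n_copies

# A rare symptom (0.1% per copy) becomes near-certain at community scale.
for n in (1, 100, 1000, 10_000, 100_000):
    print(f"{n:>7} copies -> {community_detection_prob(0.001, n):.4f}")
```

Under this model, most of the gain is realized by a few thousand copies, which is one way the "unknown threshold for diminished returns" could arise.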
Collaboration: The Source of Traction
[Diagram: community members (browsers, operating systems, web servers, email, middleware) exchanging observations through a shared Observe-Orient-Decide-Act loop, with knowledge accumulating over time]
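The diagram's point is that many copies running a shared Observe-Orient-Decide-Act loop accumulate knowledge faster than any single copy can. A toy sketch of that pooling effect (the structure and names here are illustrative only):

```python
import random

random.seed(7)  # make the toy simulation reproducible

def run_ooda_rounds(n_copies, p_observe, rounds):
    """Toy OODA loop over a shared knowledge pool: each round, every
    copy Observes locally; the community Orients by pooling whatever
    any copy saw; Decide/Act are stubbed out. Shared knowledge only
    ever grows, which is the slide's source of traction."""
    shared = set()  # community-wide knowledge pool
    for fact in range(rounds):
        # Observe: each copy independently notices this round's fact.
        seen = any(random.random() < p_observe for _ in range(n_copies))
        # Orient: pool individual observations into shared knowledge.
        if seen:
            shared.add(fact)
        # Decide/Act: in a real system, choose and deploy a response.
    return shared

# One copy alone learns almost nothing at a 1% observation rate;
# 1,000 copies pooling observations learn nearly every fact.
print(len(run_ooda_rounds(1, 0.01, 100)))
print(len(run_ooda_rounds(1000, 0.01, 100)))
```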
Slide 7: Evaluation Protocol

Phase I (Technology Development, 18 months): July 7, 2006 start; Go/No-Go decision Dec. 6, 2007.
SRI metrics:
• Detect >80% of attacks, with <10% false alarms
• Recover entire community from attack in <30 minutes
• <30% average performance slowdown
• >60% accurate problem-effects prediction <15 minutes before arrival
MIT metrics:
• Detect 95% of code-injection attacks; recover from 60%
• Detect 50% of all other attacks and errors; recover from >30%
• <5% performance slowdown
Initial Red Team evaluation: Dec. 7, 2007 to Jan. 6, 2008.

Phase II (Maturation, Evaluation, Transition, 12 months): March 7, 2008 start; ends March 6, 2009. System maturation, flaw remediation, and commercialization, toward transition-ready self-defending COTS software.
More ambitious metrics:
• Detect >98% of attacks, with <1% false alarms
• Recover entire community from attack in <10 minutes
• <30% average performance slowdown
• >80% accurate problem-effects prediction <5 minutes before arrival
Final Red Team evaluation: Feb. 7, 2009 to March 6, 2009.
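The Phase I SRI thresholds are crisp enough to check mechanically. A hypothetical scoring helper for a red-team run might look like this (the function, its parameters, and the reading of "false alarms" as a fraction of raised alarms are my assumptions, not the program's evaluation harness):

```python
def meets_phase1_sri_metrics(detected, total_attacks, false_alarms,
                             alarm_events, recovery_minutes, slowdown):
    """Check one evaluation run against the Phase I SRI thresholds:
    >80% detection, <10% false alarms, community recovery in under
    30 minutes, and <30% average performance slowdown."""
    checks = {
        "detection": detected / total_attacks > 0.80,
        "false_alarms": false_alarms / alarm_events < 0.10,
        "recovery": recovery_minutes < 30,
        "slowdown": slowdown < 0.30,
    }
    return all(checks.values()), checks

ok, detail = meets_phase1_sri_metrics(
    detected=42, total_attacks=50,      # 84% detection
    false_alarms=3, alarm_events=45,    # ~6.7% of alarms are false
    recovery_minutes=22, slowdown=0.25)
print(ok)  # -> True: this run clears all four thresholds
```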
Slide 8: Phase 1 Schedule (2006-2008)
• AC Kickoff: July 7, 2006, Washington DC; 1-day meeting; present new projects
• Site visits by the PM and selected IET: Oct. 2006
• PI Meeting: Wednesday, Jan. 10, 2007, East Coast location; present progress reports
• Site visits by the PM and selected IET: April 2007
• PI Meeting: July 2007, West Coast location; present progress reports
• Red Team evaluations: Dec. 2007, on site
• PI Meeting: early Jan. 2008, East Coast location
• Site visits by the PM and selected IET: Oct. 2008
Slide 9: Role of the IET
• Provide technical feedback to performers at PI meetings
• Attend site visits for in-depth reviews
• Review performer self-assessment strategies
• Evaluate the validity of progress measures
• Evaluate how understandable progress measures are to SRS outsiders
IET members: Cordell Green (Kestrel), Hilarie Orman (Purple Streak), Alex Orso (Ga. Tech)