Forum Securitatis
Lars Westerdahl
5 December 2016
Access Control
Access control
Identification
Authentication
Authorization
Access decision
(Karp, Haury & Davis 2009)
Access Control
• Policy
• Model
• Mechanism
(Hu, Ferraiolo & Kuhn 2006)
Discretionary Access Control
Discretionary Access Control
[Diagram: DAC within AC]
Mandatory Access Control
Mandatory Access Control
[Diagram: DAC and MAC within AC]
Bell-LaPadula model
[Diagram: DAC and MAC within AC]
Role-Based Access Control
Role-Based Access Control
[Diagram: RBAC alongside DAC and MAC within AC]
Attribute-Based Access Control
Attribute-Based Access Control
[Diagram: ABAC alongside RBAC, DAC and MAC within AC]
Attribute-Based Access Control
[Diagram: ABAC alongside RBAC, DAC and MAC within AC]
Attribute-Based Encryption
KP-ABE / CP-ABE
Attribute-Based Encryption
References
• Karp, A.H., Haury, H. & Davis, M.H. (2009). From ABAC to ZBAC: The Evolution of Access Control Models. http://www.hpl.hp.com/techreports/2009/HPL-2009-30.pdf
• Hu, V.C., Ferraiolo, D.F. & Kuhn, D.R. (2006). Assessment of Access Control Systems. Gaithersburg: National Institute of Standards and Technology.
ICS – History and Security
Supervisory Control and Data Acquisition
In the simplest terms: Computer systems that control
physical processes, usually on an industrial (large) scale.
We use ICS as a generic term. (ICS, DCS, SCADA)
What are they?
History
• Punch cards 1725
• Automated control of a machine
• Humans reduced to support units
• Mass production 1850
• Electrification 1900
• Chain production 1900-
• RW-300 1959
• The first DCS
– Distributed control system
– Removed humans from production
Specifics of early ICSs
• Control logic implemented in physical relay ladders
• Reprogramming involved physically rebuilding the ”computer”
• Direct control on the machine level is physical:
• Relays, cam timers, drum sequencers
• Cool, but impractical.
Examples
• Nuclear and hydroelectric power plants and power distribution grids
• Drinking water plants and water supply network
• HVAC systems
• Paper mills
• Steel mills
• Rail and road traffic control systems
The summer of love
• MODICON 084
• MOdular DIgital CONtroller
• The first PLC
• Programmable logic controller
PLCs
• A programmable logic controller (PLC), or programmable controller, is a digital computer used for automation of typically industrial electromechanical processes (Wikipedia)
Ladder logic
Ladder logic continued
• These days PLCs may or may not use ladder logic
• Proprietary or non-proprietary programming languages
• Open or undisclosed internal representation of
logic
ICS
Specifics
• Controls a PHYSICAL PROCESS, important!
• Large areas, may need microwave (MW) links or SatCom
• LOTS of devices, often redundant
• Low latency, low bandwidth (1200 baud)
• Real-time data
Specifics
• Historically:
• Custom-designed hardware, software and systems
• Single vendor
• Proprietary protocols
• Now:
• Mixed systems, some custom parts, some COTS
• Usually a single major vendor
• Mixed protocols, often older ones run over IP
PLCs never die…
• Almost.
• The operating life of ICS technology routinely exceeds twenty years.
• State of the art 2014 next to state of the art 1984
• Quite a bit of difference
Then and now:
Trends
• Virtualization
• Relays -> Software
• Generalization
• Custom built hardware -> COTS
• Geographic distribution
• Plant wide networks -> Internet connected systems
• ICS to ECS (Embedded Control Systems)
• Turbine hall -> Suburban house
• Legacy remains
• Modbus -> Still Modbus but over IP over WiFi
• Integration
• Separate networks -> Connected networks
ICS Security
Safety and security
• Safety protects people from the system
• Security protects the system from people
• The traditional priority order is as follows
• Safety
• Uptime
• Cost
• Safety again
Example
This is Sheikh Abdolsamad
Mosque
Square Kufic?
But I digress..
It’s here:
What lies below..
• 100,000 square meters
• Built 8 meters underground
• Steel reinforced concrete roof 2.5 meters thick
• Covered by an additional 22 meters of packed
earth.
Uranium enrichment plant
• Extracts U-235 from UF6
• Uses centrifugal extraction
• The EU and the US were against this activity
Timeline
• 2002: <200 centrifuges
• 2003: Construction of the underground complex begins
• 2006: UN resolution 1696 demands that Iran cease all enrichment activity
• 2006: By the end of the year, approx. 3,000 centrifuges
• 2007: IAEA reports only 2/3 of the centrifuges running
Timeline (cont.)
• 2009: Project plagued by delays due to quality control; at least 1,000 centrifuges scrapped
• Iran's head of the nuclear program resigns
• 2010: IAEA: 3,772 centrifuges working, 5,084 still inactive
• Prof. Massoud Ali Mohammadi murdered
• Dr. Majid Shahriari murdered
• Chemical engineer Fereydoon Abbasi-Davani survives a murder attempt
• 2011: Darioush Rezaeinejad murdered
• 2012: Dr. Mostafa Ahmadi Roshan murdered
• 2015: Treaty signed
Stuxnet
• Jun 2010: VirusBlokAda discovers a new malware.
• 15 Jun: they post their findings.
• 15 Jun: a DDoS of unknown origin strikes mailing lists for SCADA security, causing many to miss the post.
• The malware is later named Stuxnet, and it is MAGNIFICENT! (and very dangerous)
Stuxnet timeline
• 0.500 November 3, 2005 C&C server registration
• 0.500 July 4, 2009 Infection stop date
• 1.001 June 22, 2009 Main binary compile timestamp
• 1.100 March 1, 2010 Main binary compile timestamp
• 1.101 April 14, 2010 Main binary compile timestamp
• 1.x June 24, 2012 Infection stop date
What does it do?
• It attacks SCADA-systems in a very sneaky
way.
Infection
• USB drive
• Primary infection is through a USB drive in a Windows system.
• The malware uses a vulnerability in Windows shortcuts to install itself without user interaction
• RPC exploit for Windows
• Used to spread throughout the Windows network
• User-mode and kernel-mode rootkits are used to hide on the targets.
• The worm updates from C&C servers (DE & MY) and via P2P
Infection continued
• Four zero-day exploits were used, as well as
CPLINK and Conficker exploits to accomplish
the attack.
• That is a pretty impressive attack.
Target
• All this was just preliminary
• The target was the server running WinCC
• It is used to update the actual PLCs in the ICS
• Stuxnet carried two genuine certificates, from JMicron and Realtek
• Using another zero-day and the certificates, Stuxnet uploaded malware to the ICS
UNLESS!
• Stuxnet had an encoded failsafe:
• Siemens S7-300 PLCs…
• …controlling variable-frequency drives…
• …from either Fararo Paya, or Vacon…
• …that run at 807–1210 Hz
• Curiouser and curiouser!
Payload
• A rootkit that hides the software, but also hides its activities
• It sometimes changes the control frequency as follows: up to 1402 Hz, down to 2 Hz, up to 1064 Hz
• If this happened to a centrifuge, it would most likely cause vibrations that would eventually destroy the ball bearings, or the structural integrity of the main body (the tube)
Notable differences
• Four zero-days
• Two known exploits
• Two rootkits
• C&C network
• P2P network
• Two stolen certificates
[Diagram: the office network and the SCADA network, with a rootkit on each side]
Most ICS networks lack security
• Focus is on safety
• Security is often nonexistent once you get network access.
• Some work has been done regarding whitelisting and IDSs
• Most traditional IT security measures have been unsuitable for ICSs
• Security depends on securing access points
Why?
[Diagram: PC vs. PLC]
Security through network separation
• Eroded by the need for patching modern OSs
• Eroded by the use of public data communication lines
• Eroded by the use of IP and other COTS methods
• Eroded by lack of training and awareness among personnel
Then and now:
Air gap?
[Chart: disclosed high-severity vulnerabilities per year. Yearly totals, 2004–2015: 4,578; 6,618; 9,379; 12,535; 15,356; 18,072; 20,166; 21,987; 23,749; 25,482; 27,396; 28,261.]
Overview of cyber intrusions
[Diagram: a scan finds a vulnerability in an IT system; an attack leads to an incident, with an impact on the system and on the service realized by the system; defenses counter this in the background.]
There is no “silver bullet”
• 99% of all alarms are false…
Hannes Holm, Signature Based Intrusion Detection for Zero-Day Attacks: (Not)
a Closed Chapter?, Hawaii International Conference on Systems Sciences,
2014
Remediation rates for unauthenticated vs. authenticated scans (50 samples per scanner):

Scanner         Unauth %  Unauth % (Linux)  Unauth % (Win)  Auth %  Auth % (Linux)  Auth % (Win)
AVDS               40            65               19          70         65              75
McAfee VM          22            22               22          53         22              83
Nessus             22             9               33          72         70              75
NeXpose            34            43               26          53         43              63
Patchlink scan      2             4                0          45         13              75
QualysGuard        38            48               30          87         83              92
SAINT              54            83               30          66        100              33
Hannes Holm, Performance of automated network vulnerability scanning at remediating
security issues, Computers and Security, 2012
There is no “silver bullet”
Hannes Holm, Performance of automated network vulnerability scanning at remediating
security issues, Computers and Security, 2012
There is no “silver bullet”
Overview of IT intrusions
[Diagram: scan → vulnerability → attack → abuse]
Jonsson, Erland, and Tomas Olovsson. "A quantitative model of the security intrusion process based on attacker behavior." IEEE Transactions on Software Engineering 23.4 (1997): 235-245.
Holm, Hannes, Matus Korman, and Mathias Ekstedt. "A Bayesian network model for likelihood estimations of acquirement of critical software vulnerabilities and exploits." Information and Software Technology 58 (2015): 304-318.
McQueen, Miles A., et al. "Time-to-compromise model for cyber risk reduction estimation." Quality of Protection. Springer US, 2006. 49-64.
Vulnerability timeline
[Timeline: software release, vulnerability discovered, vulnerability disclosed, exploit published, patch, detection signatures]
Arbaugh, William A., William L. Fithen, and John McHugh. "Windows of vulnerability: A case study analysis." Computer 33.12 (2000): 52-59.
Vulnerability timeline
[The same timeline, with the period before disclosure labeled "0-day" and the period after labeled "Known"]
Arbaugh, William A., William L. Fithen, and John McHugh. "Windows of vulnerability: A case study analysis." Computer 33.12 (2000): 52-59.
Windows of vulnerability
Arbaugh, William A., William L. Fithen, and John McHugh. "Windows of vulnerability:
A case study analysis." Computer 33.12 (2000): 52-59.
A 0-day attack on average requires 312 days to detect
Bilge, Leyla, and Tudor Dumitras. "Before we knew it: an empirical study
of zero-day attacks in the real world." Proceedings of the 2012 ACM
conference on Computer and communications security. ACM, 2012.
A patch is released on average 7 days after a vulnerability is disclosed
Holm, Hannes, Matus Korman,
and Mathias Ekstedt. "A bayesian
network model for likelihood
estimations of acquirement of
critical software vulnerabilities and
exploits." Information and Software
Technology 58 (2015): 304-318.
312 days / 7 days
Network traffic (PCAP) given 0.5 Gb bandwidth during 1 year: 1684.8 TB
Overview of IT intrusions
[Diagram: scan → vulnerability → attack → abuse]
Jonsson, Erland, and Tomas Olovsson. "A quantitative model of the security intrusion process based on attacker behavior." IEEE Transactions on Software Engineering 23.4 (1997): 235-245.
Holm, Hannes, Matus Korman, and Mathias Ekstedt. "A Bayesian network model for likelihood estimations of acquirement of critical software vulnerabilities and exploits." Information and Software Technology 58 (2015): 304-318.
McQueen, Miles A., et al. "Time-to-compromise model for cyber risk reduction estimation." Quality of Protection. Springer US, 2006. 49-64.
Common vulnerabilities and exploits
• Social engineering
• Denial of service
• Spoofing
• Code injection
• Buffer overflow
• Bypass access control
Common vulnerabilities and exploits
• Social engineering
• Denial of service
• Spoofing
• Code injection
• Buffer overflow
• Bypass access control
• Deceive an individual into complying with a malicious request
– E-mail (phishing)
– Telephone
– Rarely in person
• The threat actor exploits feelings (e.g., fear or excitement) and established trust relationships
• Common exploit vectors:
– Link to a web server
• Drive-by download
– Theft of credentials
– Malicious attachment
Holm, Hannes, Waldo Rocha Flores, and Göran Ericsson. "Cyber security for a Smart Grid-What
about phishing?." Innovative Smart Grid Technologies Europe (ISGT EUROPE), 2013 4th
IEEE/PES. IEEE, 2013.
Social engineering
Phishing experiment #1
Holm, Hannes, Waldo Rocha Flores, and Göran Ericsson. "Cyber security for a Smart Grid-What about phishing?."
Innovative Smart Grid Technologies Europe (ISGT EUROPE), 2013 4th IEEE/PES. IEEE, 2013.
Click link 8%
Run application 2%
Phishing experiment #2
Holm, Hannes, Waldo Rocha Flores, and Göran Ericsson. "Cyber security for a Smart Grid-What about phishing?."
Innovative Smart Grid Technologies Europe (ISGT EUROPE), 2013 4th IEEE/PES. IEEE, 2013.
Click link 30%
Run application 11%
Common vulnerabilities and exploits
• Social engineering
• Denial of service
• Spoofing
• Code injection
• Buffer overflow
• Bypass access control
• Send data that an application or system cannot
properly manage
• Methods:
– Send large amounts of data
– Send invalid data
Denial of Service
Common vulnerabilities and exploits
• Social engineering
• Denial of service
• Spoofing
• Code injection
• Buffer overflow
• Bypass access control
Spoofing
• Tricking an entity into believing that you are someone else
• Man-in-the-middle
• SMTP spoofing
• DNS spoofing
• ARP spoofing
DNS Spoofing
[Diagram: recursive lookup of www.foi.se — the user asks its designated DNS server, which queries the se DNS server and then the foi.se DNS server before returning the answer (steps 1–6).]
Domain                  IP
www.aftonbladet.se      193.14.90.203
www.stackoverflow.com   198.252.206.16
www.foi.se              150.227.7.71
DNS Spoofing
[Diagram: with the answer cached by the designated DNS server, the user gets the reply directly (steps 1–3).]
Domain                  IP
www.aftonbladet.se      193.14.90.203
www.stackoverflow.com   198.252.206.16
www.foi.se              150.227.7.71
DNS Spoofing
[Diagram: a poisoned designated DNS server answers every query with 120.237.19.52, sending the user to www.evilhamster.com instead of the requested site.]
Domain                  IP
www.aftonbladet.se      120.237.19.52
www.stackoverflow.com   120.237.19.52
www.foi.se              120.237.19.52
ARP Spoofing
[Diagram: hosts on a LAN, each with a correct ARP table entry.]
IP             MAC
192.168.10.1   00:19:5B:4C:2C:5A
192.168.10.2   00:40:8C:55:70:9C
192.168.10.3   00:14:22:52:6B:03
192.168.10.4   00:B0:55:86:BB:F7
192.168.10.5   00:22:6B:86:BB:E4
192.168.10.6   00:55:D0:86:03:40
ARP Spoofing
[Diagram: a threat agent at 192.168.10.7 claims every IP address, so each ARP entry now points to its MAC address.]
IP             MAC
192.168.10.1   08:19:55:6B:14:70
192.168.10.2   08:19:55:6B:14:70
192.168.10.3   08:19:55:6B:14:70
192.168.10.4   08:19:55:6B:14:70
192.168.10.5   08:19:55:6B:14:70
192.168.10.6   08:19:55:6B:14:70
192.168.10.7   08:19:55:6B:14:70
Defenses against spoofing
• SMTP spoofing
– Sign e-mails
• DNS spoofing
– DNSSEC
• ARP spoofing
– ARP defense modules, such as dynamic ARP inspection or IP–MAC binding in the DHCP server
– Static ARP tables
Common vulnerabilities and exploits
• Social engineering
• Denial of service
• Spoofing
• Code injection
• Buffer overflow
• Bypass access control
Code injection
• SQL injection
• File inclusion
• Command Injection
• Cross site scripting (XSS)
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(int argc, char **argv) {
    char cat[] = "cat ";
    char *command;
    size_t commandLength;

    if (argc < 2)
        return 1;

    commandLength = strlen(cat) + strlen(argv[1]) + 1;
    command = (char *) malloc(commandLength);
    strncpy(command, cat, commandLength);
    strncat(command, argv[1], commandLength - strlen(cat));
    /* Vulnerable: argv[1] goes to the shell unsanitized, so
       "file; rm -rf /" runs a second command. */
    system(command);
    return 0;
}
<?php
// Vulnerable: shell metacharacters in PING are interpreted by the shell
exec('ping -c1 ' . $_GET['PING'], $output);
foreach ($output as $line) {
    echo $line . '<br>';
}
?>
Code injection concerns more than
web applications!
Defenses against code injection
• Development
– Use libraries that have undergone security tests
– Code analysis
• Operation
– System tests
– Hardening
– Access control
<?php
include( $_GET['COLOR'] . '.php' );
?>

<?php
if ($_GET['COLOR'] == "red") {
    include('red.php');
}
elseif ($_GET['COLOR'] == "blue") {
    include('blue.php');
}
?>
Common vulnerabilities and exploits
• Social engineering
• Denial of service
• Spoofing
• Code injection
• Buffer overflow
• Bypass access control
Memory virtualization
[Diagram: each process (MS Word, Outlook, Paint) sees its own virtual address space from 0x00000000 to 0x7FFFFFFF, with kernel memory mapped above, up to 0xFFFFFFFF. Example stack contents, one 4-byte (32-bit) word per address: 0x000ff7ac ”HELO”, 0x000ff7a8 ”GREY”, 0x000ff7a4 ”KITTY”, 0x000ff7a0 0x4a3c5c9e.]
Buffer overflow (example.c)
#include <string.h>
void do_something(char *Buffer)
{
char MyVar[8];
strcpy(MyVar,Buffer);
}
int main (int argc, char **argv)
{
do_something(argv[1]);
}
user@system:~$./example test
example.c
#include <string.h>
void do_something(char *Buffer)
{
char MyVar[8];
strcpy(MyVar,Buffer);
}
int main (int argc, char **argv)
{
do_something(argv[1]);
}
Buffer overflow
0x00000000
…
Pointer to argv[1]
…
0xFFFFFFFF
Top address =
user@system:~$./example test
Top of
stack
0x000ff7ac
Address Value0x003e3cb2 ”test"
”test”
MS Word
Outlook
Paint
”test”
example.c
#include <string.h>
void do_something(char *Buffer)
{
char MyVar[8];
strcpy(MyVar,Buffer);
}
int main (int argc, char **argv)
{
do_something(argv[1]);
}
Buffer overflow
0x00000000
…
Pointer to argv[1]
…
0xFFFFFFFF
user@system:~$./example test
Top of stack
Top address
Buffer overflow (example.c)
#include <string.h>
void do_something(char *Buffer)
{
char MyVar[8];
strcpy(MyVar,Buffer);
}
int main (int argc, char **argv)
{
do_something(argv[1]);
}
0x00000000
…
Saved instruction pointer
Pointer to argv[1]
…
0xFFFFFFFF
Top of stack
Top address
user@system:~$./example test
”test”
Top+4 byte
Buffer overflow (example.c)
#include <string.h>
void do_something(char *Buffer)
{
char MyVar[8];
strcpy(MyVar,Buffer);
}
int main (int argc, char **argv)
{
do_something(argv[1]);
}
0x00000000
…
Saved frame address
Saved instruction pointer
Pointer to argv[1]
0xFFFFFFFF
user@system:~$./example test
”test”
Top of stack
Top+8 byte
Top address
Top+4 byte
Buffer overflow (example.c)
#include <string.h>
void do_something(char *Buffer)
{
char MyVar[8];
strcpy(MyVar,Buffer);
}
int main (int argc, char **argv)
{
do_something(argv[1]);
}
0x00000000
…
<Space for MyVar>
Saved frame address
Saved instruction pointer
Pointer to argv[1]
…
0xFFFFFFFF
Top of stack
Top address
Top+8 byte
Top+12 byte
user@system:~$./example test
”test”
Top+16 byte
Buffer overflow (example.c)
#include <string.h>
void do_something(char *Buffer)
{
char MyVar[8];
strcpy(MyVar,Buffer);
}
int main (int argc, char **argv)
{
do_something(argv[1]);
}
0x00000000
…
<Space for MyVar>
Saved frame address
Saved instruction pointer
Pointer to argv[1]
…
0xFFFFFFFF
Top of stack
user@system:~$./example test
test\n
”test”
Top address
Top+8 byte
Top+12 byte
Top+16 byte
strcpy()
Buffer overflow (example.c)
#include <string.h>
void do_something(char *Buffer)
{
char MyVar[8];
strcpy(MyVar,Buffer);
}
int main (int argc, char **argv)
{
do_something(argv[1]);
}
0x00000000
…
Saved frame address
Saved instruction pointer
Pointer to argv[1]
…
0xFFFFFFFF
Top of stack
user@system:~$./example test
”test”
Top+8 byte
Top address
Top+4 byte
Buffer overflow (example.c)
#include <string.h>
void do_something(char *Buffer)
{
char MyVar[8];
strcpy(MyVar,Buffer);
}
int main (int argc, char **argv)
{
do_something(argv[1]);
}
0x00000000
…
Saved instruction pointer
Pointer to argv[1]
…
0xFFFFFFFF
Top of stack
user@system:~$./example test
”test”
Top address
Top+4 byte
Buffer overflow (example.c)
#include <string.h>
void do_something(char *Buffer)
{
char MyVar[8];
strcpy(MyVar,Buffer);
}
int main (int argc, char **argv)
{
do_something(argv[1]);
}
0x00000000
…
<Space for MyVar>
Saved frame address
Saved instruction pointer
Pointer to argv[1]
…
0xFFFFFFFF
Top of stack
user@system:~$./example AAAAAAAABBBBCCCCDDDD
”AAAAAAAABBBBCCCCDDDD”
Top address
Top+8 byte
Top+12 byte
Top+16 byte
Buffer overflow (example.c)
#include <string.h>
void do_something(char *Buffer)
{
char MyVar[8];
strcpy(MyVar,Buffer);
}
int main (int argc, char **argv)
{
do_something(argv[1]);
}
0x00000000
…
<Space for MyVar>
Saved frame address
Saved instruction pointer
Pointer to argv[1]
…
0xFFFFFFFF
Top of stack
”AAAAAAAABBBBCCCCDDDD”
AAAA
AAAA
BBBB
CCCC
DDDD
user@system:~$./example AAAAAAAABBBBCCCCDDDD
Top address
Top+8 byte
Top+12 byte
Top+16 byte
strcpy()
Buffer overflow (example.c)
#include <string.h>
void do_something(char *Buffer)
{
char MyVar[8];
strcpy(MyVar,Buffer);
}
int main (int argc, char **argv)
{
do_something(argv[1]);
}
0x00000000
…
Saved instruction pointer
Pointer to argv[1]
…
0xFFFFFFFF
Top of stack
CCCC
”AAAAAAAABBBBCCCCDDDD”
DDDD
user@system:~$./example AAAAAAAABBBBCCCCDDDD
Top address
Top+4 byte
Top+16 byte
0x00000000
…
<Space for MyVar>
Saved frame address
Saved instruction pointer
Pointer to argv[1]
…
0xFFFFFFFF
Buffer overflow (example.c)
#include <string.h>
void do_something(char *Buffer)
{
char MyVar[8];
strcpy(MyVar,Buffer);
}
int main (int argc, char **argv)
{
do_something(argv[1]);
}
Top address
DDDD
user@system:~$./example AAAAAAAABBBB$TopaddressDDDD
AAAA
AAAA
BBBB
Top address
Top+8 byte
Top+12 byte
”AAAAAAAABBBB$TopaddressDDDD”
strcpy()
Top address
Top+8 byte
Top+12 byte
Top+16 byte
0x00000000
…
<Space for MyVar>
Saved frame address
Saved instruction pointer
Pointer to argv[1]
…
0xFFFFFFFF
Buffer overflow (example.c)
#include <string.h>
void do_something(char *Buffer)
{
char MyVar[8];
strcpy(MyVar,Buffer);
}
int main (int argc, char **argv)
{
do_something(argv[1]);
}
Top address
$code
user@system:~$./example AAAAAAAABBBB$Topaddress$code
AAAA
AAAA
BBBB
code= "\xdb\xc0\x31\xc9\xbf\x7c\x16\x70\xcc\xd9\x74\x24\xf4\xb1" . "\x1e\x58\x31\x78\x18\x83\xe8\xfc\x03\x78\x68\xf4\x85\x30" . "\x78\xbc\x65\xc9\x78\xb6\x23\xf5\xf3\xb4\xae\x7d\x02\xaa" . "\x3a\x32\x1c\xbf\x62\xed\x1d\x54\xd5\x66\x29\x21\xe7\x96" . "\x60\xf5\x71\xca\x06\x35\xf5\x14\xc7\x7c\xfb\x1b\x05\x6b" . "\xf0\x27\xdd\x48\xfd\x22\x38\x1b\xa2\xe8\xc3\xf7\x3b\x7a" . "\xcf\x4c\x4f\x23\xd3\x53\xa4\x57\xf7\xd8\x3b\x83\x8e\x83" . "\x1f\x57\x53\x64\x51\xa1\x33\xcd\xf5\xc6\xf5\xc1\x7e\x98" . "\xf5\xaa\xf1\x05\xa8\x26\x99\x3d\x3b\xc0\xd9\xfe\x51\x61" . "\xb6\x0e\x2f\x85\x19\x87\xb7\x78\x2f\x59\x90\x7b\xd7\x05" . "\x7f\xe8\x7b\xca"
strcpy()
Defenses against buffer overflow
• Development
– Use languages with garbage collection
– Use modern IDEs
– Use tested libraries
– Perform code analysis
• Operation
– System tests
– Hardening
– Operating system defenses: Data Execution Prevention (DEP), Address Space Layout Randomization (ASLR), stack cookies, SafeSEH
– Access control
example.c
#include <string.h>
void do_something(char *Buffer)
{
char MyVar[8];
strcpy(MyVar,Buffer);
}
int main (int argc, char **argv)
{
do_something(argv[1]);
}
example.c
#include <string.h>
void do_something(char *Buffer)
{
char MyVar[8];
strncpy(MyVar,Buffer,sizeof(MyVar));
}
int main (int argc, char **argv)
{
do_something(argv[1]);
}
Common vulnerabilities and exploits
• Social engineering
• Denial of service
• Spoofing
• Code injection
• Buffer overflow
• Bypass access control
Bypass access control
• Credentials that are saved in a vulnerable application, or are sent/stored in clear text or with weak encryption/low entropy
• Can be accomplished by most of the previously discussed attacks

SELECT Users.Username
FROM Users
WHERE Users.Username = 'Anders'
AND Users.Password = 'gottmos' OR 1=1

[Diagram: the query is sent to an SQL database]
General defenses
• Administration and technology
• Access control
• Proactive vs. reactive
• Detection vs. prevention
• Network vs. agents
• 0-day vs. known attacks
Questions?
Embedded systems
The old days were better, or at least more robust
Mikael Wedlin, mwe@foi.se
Summary
• More and more of our surroundings are controlled by software
• Tools can get new, unexpected features
• The software around us only becomes more advanced = more complicated and complex
• Newer development is not the same as more robust
• Internet of Things
• Often wireless
• Difficult to think outside the normal use
What is security?
Confidentiality
Availability Integrity
CIA vs AIC
Confidentiality
Availability Integrity
http://www.anniesinternetcafe.com
Healthcare
Healthcare
Another example of embedded systems
Are we without hope?
Attacking SCADA systems: a known example!
1982: Siberian Gas Pipeline Explosion.
”While the following cannot be fully confirmed, it has been reported that during the Cold War the CIA inserted malicious code into control system software leaked to the Soviet Union. The software, which controlled pumps, turbines, and valves on a Soviet gas pipeline, was programmed to malfunction after a set interval. The malfunction caused the control system to reset pump speeds and valve settings to produce pressures beyond the failure ratings of pipeline joints and welds, eventually causing an enormous explosion.”
This has more or less been verified by Cherkashin, V. & Feifer, G., Spy Handler: Memoir of a KGB Officer: The True Story of the Man Who Recruited Robert Hanssen and Aldrich Ames, Basic Books, 2005.
Trust
Fides est bona, sed custodia est melior
(”Trust is good, but control is better”)
2007 Pew Global Attitudes
Social trust
• Quality of government (QoG) gives good societies
• Lack of trust brings:
• Increased transaction costs
• Increased feeling of insecurity
• Reduced use
• IT security does not add any functionality
• IT security must be built in from the beginning
Trivial example
Tailored Access Operations
Glenn Greenwald: No place to hide,
2014
Last Home-PC offer
New home-PC offer
From owner-alla-lin@foi.se on behalf of Tommy Lodehed
Date: 19 April 2006 13:16:00  To: alla@foi.se  Cc:
Subject: New home-PC offer

Hi,
new home-PC offer as below.
For PC:
www.dellhempc.nu/view/foi
Click the link above to reach the order page.
NOTE! Once you have ordered, print the order confirmation and the loan agreement (which must be signed) and send them to Tommy Lodehed at Purchasing.
For Mac:
http://intranet.foi.se/upload/organisation/forskningsstod/enheter/ekonomi/inkop/HemMac-2006-FOI.pdf
Click the link above to open a PDF document.
Print and fill in the order form, which must be sent together with a signed loan agreement to Tommy Lodehed at Purchasing.
If you have any questions, contact me.
Regards,
Tommy
Tommy Lodehed
FOI Inköp / FOI Purchasing Office
Phone: 46 13 378117 / Fax: 46 13 378067
Email: tommy.lodehed@foi.se
www.dellhempc.nu/view/foi
More of the same contract
www.dellhempc.nu/view/forsvaret
Login at Dell
Personal identity number (personnummer), WITHOUT hyphen
So what?
• Could there be something on a home PC that should not be there?
• Do we take work home?
A new quote
"So Snowden returned to the NSA, this time as an employee of Dell Corporation, which collaborated with the agency."
Glenn Greenwald: No place to hide,
2014
Risks with increased security?