Testing Malicious Code Detection Tools


Transcript of Testing Malicious Code Detection Tools

Page 1: Testing Malicious Code Detection Tools

Testing Malicious Code Detection Tools

Mihai [email protected]

WiSA (http://www.cs.wisc.edu/wisa), University of Wisconsin, Madison
22 July 2003

Page 2: Testing Malicious Code Detection Tools


Problem

• Wrong focus: testing looks only at today's malware
  – What about tomorrow's malware?
• Efficacy: how does one compare the efficacy of several malware detection tools?
  – Lack of openness about implementations

Page 3: Testing Malicious Code Detection Tools


Our Solution

• Generate new malware test cases
  – Using obfuscation transformations
  – Based on existing malware instances

• Test detection capabilities across a wide range of malware types

Test Case Generation through Obfuscation

Page 4: Testing Malicious Code Detection Tools


Milestones

• Binary rewriting infrastructure
  – For IA-32/Windows
  – For Visual Basic
• Suite of obfuscations
• Comparison metrics
• Complete test suite

Page 5: Testing Malicious Code Detection Tools


Overview

• Goals
• State of the art
• Our approach
• Testing environment
• Evaluation
• Conclusions

Page 6: Testing Malicious Code Detection Tools


Testing Malware Detectors

Malware detection tool's goal: detect malicious code!

• Focus: executable code that replicates
  – Viruses, worms, trojans
  – Not buffer overflow attacks
  – Not spyware

• Code is mobile

Page 7: Testing Malicious Code Detection Tools


Testing Goals

1. Measure detection rate for existing malware (false negatives; see the rate sketch below)
2. Measure detection rate for benign software (false positives)
3. Measure detection rate of new malware (resilience to new malicious code)
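As an illustrative sketch of how the first two goals reduce to rate computations over raw scan outcomes, a minimal Python fragment follows; the dictionaries and names are assumptions made for this example, not part of any testing lab's tooling.

def detection_rates(flagged, is_malicious):
    """Summarize raw scan outcomes.
    flagged[name] is True when the tool reported the sample as malicious;
    is_malicious[name] is the ground truth. Assumes at least one malicious
    and one benign sample are present."""
    malicious = [n for n in flagged if is_malicious[n]]
    benign = [n for n in flagged if not is_malicious[n]]
    detected = sum(flagged[n] for n in malicious)
    false_alarms = sum(flagged[n] for n in benign)
    return {
        "detection_rate": detected / len(malicious),            # goal 1
        "false_negative_rate": 1 - detected / len(malicious),   # goal 1
        "false_positive_rate": false_alarms / len(benign),      # goal 2
    }

Goal 3 is measured the same way, but over new (here, obfuscated) variants of known malware rather than over the original samples.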

Page 8: Testing Malicious Code Detection Tools


State of the art

• Several testing labs
  – Commercial
    • Virus Bulletin
    • International Computer Security Association (ICSA)
    • West Coast Labs' CheckMark
  – Independent
    • University of Hamburg Virus Test Center (VTC)
    • University of Magdeburg

Page 9: Testing Malicious Code Detection Tools


Sample certification requirements: Virus Bulletin 100% Award

• Detect all In the Wild (ITW) viruses during on-demand and on-access tests
• Generate no false positives

ITW virus lists are maintained by the WildList Organization International.

Page 10: Testing Malicious Code Detection Tools


Testing Goals (revisited)

1. Measure detection rate for existing malware (false negatives): checked
2. Measure detection rate for benign software (false positives): checked
3. Measure detection rate of new malware (resilience to new malicious code): ???

Page 11: Testing Malicious Code Detection Tools


Testing against Future Malware

• First attempt: Andreas Marx, "Retrospective Testing"
  – Test 3- and 6-month-old virus scanners

  Malware Type                        Detection Rate
  Macro viruses                       74% - 94%
  Script viruses                      35% - 82%
  Win32 file viruses                  24% - 79%
  Win32 worms, trojans, backdoors      8% - 37%

Page 12: Testing Malicious Code Detection Tools


Testing against Future Malware

• We can learn from the past:
  – Often, old malicious code is slightly changed and re-launched
• Sobig e-mail worm:

                  Sobig.C                  Sobig.D               Sobig.E
  Update path     Geocities-hosted page    Hard-coded IPs        Hard-coded IPs
  Deactivation    June 8                   July 2                July 14
  Distribution    Mass e-mail, copy to shared drive (all three variants)
  "From"          [email protected]        [email protected]       [email protected]

Page 13: Testing Malicious Code Detection Tools


Overview

• Goals
• State of the art
• Our approach
• Testing environment
• Evaluation
• Conclusions

Page 14: Testing Malicious Code Detection Tools


Obfuscation = Test Case Generation

• Obfuscate current known malware to obtain new malware test cases

• Why?
  – Many viruses / worms / trojans reuse code from older malware
  – Simulate self-mutating malware
  – Measure the ability of anti-virus tools to detect malicious behavior, not just malicious code instances

Page 15: Testing Malicious Code Detection Tools


Test Case Generation

[Diagram: test case generation. A collection of current malware (V1, V2, ..., Vn), crossed with a set of obfuscation transformations (σ1, σ2, ..., σm; for example code reordering, garbage insertion, an interpreter, ...), yields an n x m matrix of new malware for testing, with entries V'i,j = σj(Vi).]
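As an illustrative sketch of this cross product (names, placeholder transformations, and the directory layout are assumptions made for this example, not the project's actual toolchain), the generation step can be expressed in a few lines of Python:

from pathlib import Path

def reorder_code(program: bytes) -> bytes:
    # Placeholder: a real transformation would reorder basic blocks while
    # preserving semantics; here the input is returned unchanged so the
    # sketch runs end to end.
    return program

def insert_garbage(program: bytes) -> bytes:
    # Placeholder for inserting semantic no-ops into the code.
    return program

OBFUSCATIONS = {"reorder": reorder_code, "garbage": insert_garbage}

def generate_test_cases(samples_dir: Path, out_dir: Path) -> None:
    """Apply every obfuscation to every known-malware sample: V'[i][j] = sigma_j(V_i)."""
    out_dir.mkdir(parents=True, exist_ok=True)
    for sample in sorted(samples_dir.iterdir()):
        original = sample.read_bytes()
        for name, sigma in OBFUSCATIONS.items():
            (out_dir / f"{sample.name}.{name}").write_bytes(sigma(original))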

Page 16: Testing Malicious Code Detection Tools


Test Case Generation

[Diagram: the same cross product (collection of current malware x set of obfuscation transformations = new malware for testing), implemented by a binary rewriting toolchain (Binary, IDA Pro, Connector, BREW) with stages Parse Binary, Build CFGs, Memory Analysis, Rewrite, and Generate Code, driven by the library of obfuscations and producing the Generated Binary.]

Page 17: Testing Malicious Code Detection Tools


Sample Obfuscations

• Change data:
  – New strings, new dates, new constants
  – Encode / encrypt constants
• Change control:
  – Insert garbage (see the sketch below)
  – Encode / encrypt code fragments
  – Reorder code
  – Add new features
  – ...
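A minimal sketch of the garbage-insertion idea: sprinkle semantic no-ops between existing instructions so the byte pattern changes while the behavior does not. The in-memory instruction-list representation and the particular no-ops are assumptions made for this illustration.

import random

# Semantic no-ops for IA-32; the register choices are illustrative.
GARBAGE = ["nop", "xchg eax, eax", "mov ebx, ebx", "lea esi, [esi]"]

def insert_garbage(instructions, density=0.3, seed=0):
    """Return a copy of the instruction list with junk instructions sprinkled in."""
    rng = random.Random(seed)
    out = []
    for insn in instructions:
        out.append(insn)
        if rng.random() < density:
            out.append(rng.choice(GARBAGE))
    return out

A scanner that keys on a contiguous byte sequence may already miss the sample after such a change, which is exactly what the test cases are meant to probe.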

Page 18: Testing Malicious Code Detection Tools


Sample Obfuscations: encode / encrypt constants

Before:
  Message = "Read this.."
  Message.Send

After:
  Message = Decode("13FQ...")
  Message.Send
  ...
  Sub Decode(...)
  ...
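A hedged sketch of this transformation as plain text rewriting (not the project's actual tool): hex-encode the chosen literal, replace it with a Decode(...) call, and append a matching decoder. The slide's "13FQ..." encoding is not specified, so plain hex is used as a stand-in, and the VBScript decoder is an illustrative counterpart.

ORIGINAL = 'Message = "Read this.."\nMessage.Send\n'

DECODER = (
    'Function Decode(s)\n'
    '  Dim i\n'
    '  For i = 1 To Len(s) Step 2\n'
    '    Decode = Decode & Chr(CLng("&H" & Mid(s, i, 2)))\n'
    '  Next\n'
    'End Function\n'
)

def hex_encode(text):
    return "".join(f"{ord(c):02X}" for c in text)

def encode_constant(script, literal):
    """Replace every occurrence of the quoted literal with Decode("<hex>")
    and append the decoder function to the script."""
    call = f'Decode("{hex_encode(literal)}")'
    return script.replace(f'"{literal}"', call) + "\n" + DECODER

print(encode_constant(ORIGINAL, "Read this.."))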

Page 19: Testing Malicious Code Detection Tools


Earlier Obfuscation Results

[Table: commercial anti-virus tools vs. morphed versions of known viruses (Chernobyl-1.4, f0sf0r0, Hare, z0mbie-6.b); every tool/virus entry reads "Not detected".]

Page 20: Testing Malicious Code Detection Tools


Ideal Testing Results

[Chart: virus A and its obfuscated variants σ1(A) through σ8(A), tested with Norton AV, Command AV, and McAfee AV; ideally every variant is detected by every tool.]

Page 21: Testing Malicious Code Detection Tools


Parametrized Obfuscation

Encoding data. Parameters (see the sketch below):
• Data to obfuscate
• Type of encoding / encryption
• Encryption key
• Location of obfuscation

Before:
  Message = "Read this.."
  Message.Send

After:
  Message = Decode("13FQ...")
  Message.Send
  ...
  Sub Decode(...)
  ...
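Continuing the earlier sketch under the same assumptions, the parameters listed above can be captured in a small structure so that each generated test case records exactly how it was produced; all names and interfaces here are illustrative, not the project's actual API.

from dataclasses import dataclass

@dataclass
class EncodeParams:
    target: str          # data to obfuscate: the string literal to rewrite
    scheme: str = "hex"  # type of encoding / encryption ("hex" or "xor")
    key: int = 0         # encryption key (used by the "xor" scheme)
    occurrence: int = 1  # location: which occurrence of the literal to rewrite

def encode(text, p):
    if p.scheme == "xor":
        return "".join(f"{ord(c) ^ p.key:02X}" for c in text)
    return "".join(f"{ord(c):02X}" for c in text)

def apply_encode(script, p):
    """Rewrite only the p.occurrence-th occurrence of the target literal.
    (Appending the Decode helper is omitted here; see the earlier sketch.)"""
    needle = f'"{p.target}"'
    parts = script.split(needle)
    if len(parts) <= p.occurrence:
        return script                      # requested occurrence not present
    replacement = f'Decode("{encode(p.target, p)}")'
    return needle.join(parts[:p.occurrence]) + replacement + needle.join(parts[p.occurrence:])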

Page 22: Testing Malicious Code Detection Tools


Testing Environment

• Simple and complex obfuscations applied to known (detected) malware
• Multiple malware detection tools:
  – McAfee
  – Norton
  – Sophos
  – Kaspersky

Page 23: Testing Malicious Code Detection Tools


Overview

• Goals
• State of the art
• Our approach
• Testing environment
• Evaluation
• Conclusions

Page 24: Testing Malicious Code Detection Tools


Evaluation

• Test against a set of malware detected in its original form by all tools
• Test using the same obfuscations and the same parameters
• Obfuscation hierarchy
  – From simplest to most complex

Page 25: Testing Malicious Code Detection Tools


Metrics

• Minimum obfuscation level (see the sketch below):
  For a given obfuscation σ with parameters (x1, ..., xk), what are the least values of each xi that generate a false negative?
• Minimal combination of obfuscations:
  What is the smallest set of obfuscations {σ1, ..., σk} that generates a false negative?
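A hedged sketch of the first metric, for the common case of a single integer parameter: scan obfuscated variants at increasing levels and report the least level at which the scanner misses the sample. The obfuscation and scanner interfaces are assumptions made for this example; since detection need not be monotone in the level, every level is tried.

def minimum_obfuscation_level(sample, obfuscate, scan, max_level=64):
    """Least level at which scan() misses a known-malicious sample (a false
    negative), or None if the sample is detected at every level tried.
    obfuscate(sample, level) -> bytes; scan(program) -> bool (True = flagged)."""
    assert scan(sample), "metric applies only to samples detected in original form"
    for level in range(1, max_level + 1):
        if not scan(obfuscate(sample, level)):
            return level
    return None

The minimal-combination metric can be approached the same way by enumerating sets of obfuscations in order of increasing size and stopping at the first set that produces a false negative.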

Page 26: Testing Malicious Code Detection Tools


Preliminary Results

[Chart: the Anna Kournikova worm, original and obfuscated with the Encode transformation at levels 1, 5, 9, 13, 17, 21, 25, and 29, scanned by Norton AV, Command AV, and McAfee AV.]

Page 27: Testing Malicious Code Detection Tools


Lessons Learned

• Malware spreads instantaneously
  – Arms race between malware distribution and malware detection tool updates
• In testing malware detection tools, one must use a virus writer's mindset
  – It is a game: one needs to think several moves ahead

Page 28: Testing Malicious Code Detection Tools


Future Work

• Explore more obfuscations
  – Depends on results of ongoing tests
• Future idea #1:
  A self-guided test tool that finds the minimal test cases for false negatives
• Future idea #2:
  Develop tests to check for detection of malicious behavior, not just code sequences

Page 29: Testing Malicious Code Detection Tools


Seeing Through the Obfuscations

[Diagram: smart virus scanner. The Annotator uses a pattern library to annotate the program to analyze; the Detector compares the annotated program against a malicious code blueprint and answers Yes/No.]

Page 30: Testing Malicious Code Detection Tools


Detection Example

Virus code (from the Chernobyl CIH 1.4 virus):

  push eax
  sidt [esp-02h]
  pop ebx
  add ebx, HookNo * 08h + 04h
  cli
  mov ebp, [ebx]
  mov bp, [ebx-04h]
  lea esi, MyHook - @1[ecx]
  push esi
  mov [ebx-04h], si
  shr esi, 16
  mov [ebx+02h], si
  pop esi

Virus automaton: a sequence of abstracted instructions (mov X, Y; mov X1, Z; lea A, B) with self-loops that skip irrelevant instructions; reaching the final state means "Virus found!"
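A much-simplified stand-in for this matching (not the paper's annotator/detector): instruction patterns that must occur in order, with everything else treated as an irrelevant instruction. The regular expressions and the in-memory instruction list are assumptions made for illustration.

import re

# Patterns loosely abstracting mov X, Y; mov X1, Z; lea A, B from the fragment above.
PATTERNS = [
    re.compile(r"mov\s+\w+,\s*\[\w+\]"),      # mov X, [Y]
    re.compile(r"mov\s+\w+,\s*\[\w+-04h\]"),  # mov X1, [Y-04h]
    re.compile(r"lea\s+\w+,"),                # lea A, ...
]

def matches_automaton(instructions):
    """Advance through PATTERNS in order; unrelated instructions are skipped
    (the automaton's self-loops)."""
    state = 0
    for insn in instructions:
        if state < len(PATTERNS) and PATTERNS[state].search(insn):
            state += 1
    return state == len(PATTERNS)   # accepting state: "Virus found!"

code = [
    "push eax", "sidt [esp-02h]", "pop ebx", "add ebx, HookNo * 08h + 04h", "cli",
    "mov ebp, [ebx]", "mov bp, [ebx-04h]", "lea esi, MyHook - @1[ecx]", "push esi",
]
print(matches_automaton(code))   # True: the three patterns appear in order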

Page 31: Testing Malicious Code Detection Tools


References

Mihai Christodorescu and Somesh Jha. "Static analysis of executables to detect malicious patterns." USENIX Security Symposium, August 2003, Washington, DC.

Andreas Marx. "Retrospective testing - how good heuristics really work." Virus Bulletin Conference, November 2002, New Orleans, LA.

Sarah Gordon. "Antivirus software testing for the year 2000 and beyond." 23rd NISSC Proceedings, October 16-19, 2000, Baltimore, MD.

Page 32: Testing Malicious Code Detection Tools


[Diagram: binary analysis and rewriting infrastructure, connecting a Binary, IDA Pro, the Connector, and clients such as CodeSurfer and BREW; component labels include Parse Binary, Build CFGs, Memory Analysis, Build Program Specification, Build SDG, Browse, Detect Malicious Code, Detect Buffer Overrun, Binary Code Rewriting, Rewrite, Generate Code, and Generated Binary.]