Page 1

Dependability benchmarking for transactional and web systems

Henrique Madeira
University of Coimbra, DEI-CISUC

Coimbra, Portugal

Page 2

Ingredients of a recipe to “bake” a dependability benchmark

• Measures

• Workload

• Faultload

• Procedure and rules (how to cook the thing)

Dependability benchmark specification:

• Document based only, or

• Document + programs, tools, ...

Page 3

Benchmark properties

• Representativeness

• Portability

• Repeatability

• Scalability

• Non-intrusiveness

• Easy to use

• Easy to understand

Page 4

Benchmark properties

• Representativeness

• Portability

• Repeatability

• Scalability

• Non-intrusiveness

• Easy to use

• Easy to understand

• A benchmark is always an abstraction of the real world! It’s an imperfect and incomplete view of the world.

In practice…

• Usefulness (to improve things)

• Agreement

Page 5

The very nature of a benchmark

• Compare components, systems, architectures, configurations, etc.

• Highly specific: applicable/valid for a very well defined domain.

• Contribute to improving computer systems, because you can compare alternative solutions.

• A real benchmark represents an agreement.

Page 6

Three examples of dependability benchmarks for transactional systems

1. DBench-OLTP [DSN 2003 + VLDB 2003]
   Dependability benchmark for OLTP systems (database centric)
   Provided as a document structured in clauses (like the TPC benchmarks)

2. Web-DB [SAFECOMP 2004]
   Dependability benchmark for web servers
   Provided as a set of ready-to-run programs and document-based rules

3. Security benchmark (first step) [DSN 2005]
   Security benchmark for database management systems
   Set of tests of database security mechanisms

Page 7

The DBench-OLTP Dependability Benchmark

[Architecture diagram — components: (SUB) System Under Benchmarking with the DBMS, the OS, and the Benchmark Target; (BMS) Benchmark Management System with the RTE and the FLE; connections carrying the Workload, the Faultload, and Control Data + results.]

Workload and setup adopted from the TPC-C performance benchmark

Page 8

Benchmarking Procedure

• Phase 1: Baseline performance measures (TPC-C measures)

• Phase 2: Performance measures in the presence of the faultload + dependability measures

[Timeline diagram: Phase 1 is followed by Phase 2, which is divided into testing slots (Slot 1, Slot 2, Slot 3, ..., Slot N). Each testing slot runs from (Start) to (End) and marks the steady state condition, fault activation, detection time, recovery start, and recovery end; the labelled intervals are steady state time, injection time, recovery time, keep time, data integrity testing, and the measurement interval.]

Page 9

Measures

• Baseline performance measures
  tpmC – number of transactions executed per minute
  $/tpmC – price per transaction

• Performance measures in the presence of the faultload
  Tf – number of transactions executed per minute (with faults)
  $/Tf – price per transaction (with faults)

• Dependability measures
  AvtS – availability from the server point-of-view
  AvtC – availability from the clients' point-of-view
  Ne – number of data integrity errors
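The slides do not spell out how these measures are derived from a benchmark run; the sketch below is one plausible derivation from per-slot records, with hypothetical field names (slot_seconds, server_down_s, client_down_s) that are not part of the DBench-OLTP specification.

    # Minimal sketch (not from the DBench-OLTP spec): deriving the measures
    # above from hypothetical per-slot run records.

    def tpm(committed_transactions, minutes):
        # Transactions executed per minute (tpmC in Phase 1, Tf in Phase 2).
        return committed_transactions / minutes

    def price_per_transaction(system_price, tpm_value):
        # $/tpmC or $/Tf: total system price divided by the throughput measure.
        return system_price / tpm_value

    def availability(slots, downtime_key):
        # AvtS / AvtC: percentage of time the service was available over all
        # injection slots; downtime_key selects server- or client-observed downtime.
        total = sum(s["slot_seconds"] for s in slots)
        down = sum(s[downtime_key] for s in slots)
        return 100.0 * (total - down) / total

    # Example with made-up numbers: two 20-minute injection slots.
    slots = [
        {"slot_seconds": 1200, "server_down_s": 90,  "client_down_s": 140},
        {"slot_seconds": 1200, "server_down_s": 300, "client_down_s": 360},
    ]
    print(availability(slots, "server_down_s"))   # AvtS
    print(availability(slots, "client_down_s"))   # AvtC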

Page 10

Faultload

• Operator faults: emulate database administrator mistakes

• Software faults: emulate software bugs in the operating system

• High-level hardware failures: emulate hardware component failures

Page 11

Examples of systems benchmarked

Systems A-I hardware: Intel Pentium III 800 MHz, 256 MB RAM, four 20 GB / 7200 rpm disks, Fast Ethernet
Systems J-K hardware: Intel Pentium IV 2 GHz, 512 MB RAM, four 20 GB / 7200 rpm disks, Fast Ethernet

System | Operating System | DBMS              | DBMS Config.
A      | Win2k Prof. SP3  | Oracle 8i (8.1.7) | Conf. 1
B      | Win2k Prof. SP3  | Oracle 9i (9.0.2) | Conf. 1
C      | WinXP Prof. SP1  | Oracle 8i (8.1.7) | Conf. 1
D      | WinXP Prof. SP1  | Oracle 9i (9.0.2) | Conf. 1
E      | Win2k Prof. SP3  | Oracle 8i (8.1.7) | Conf. 2
F      | Win2k Prof. SP3  | Oracle 9i (9.0.2) | Conf. 2
G      | SuSE Linux 7.3   | Oracle 8i (8.1.7) | Conf. 1
H      | SuSE Linux 7.3   | Oracle 9i (9.0.2) | Conf. 1
I      | RedHat Linux 7.3 | PostgreSQL 7.3    | -
J      | Win2k Prof. SP3  | Oracle 8i (8.1.7) | Conf. 1
K      | Win2k Prof. SP3  | Oracle 9i (9.0.2) | Conf. 1

Page 12

DBench-OLTP benchmarking results

[Bar charts for systems A-K: Baseline Performance (tpmC and $/tpmC), Performance With Faults (Tf and $/Tf), and Availability in % (AvtS – server, AvtC – clients).]

• Performance

Page 13

DBench-OLTP benchmarking results

[Same charts as the previous slide.]

• Performance

• Availability

Page 14

DBench-OLTP benchmarking results

[Same charts as the previous slide.]

• Performance

• Availability

• Price

Page 15

Using DBench-OLTP to obtain more specific results

[Line charts: Availability (AvtS) and Availability (AvtC), in %, per injection slot (1-97), comparing Win2k/Oracle 8i with Win2k/Oracle 9i.]

Availability variation during the benchmark run

Corresponds to about 32 hours of operation, during which the system was subjected to 97 faults. Each fault is injected in a 20-minute injection slot, and the system is rebooted between slots.

Page 16

DBench-OLTP benchmarking effort

Task                                 | # of days
TPC-C benchmark implementation       | 10 (with reuse of code)
DBench-OLTP benchmark implementation | 10 (first implementation)
Benchmarking process execution       | 3 (average per system)

Page 17

The WEB-DB Dependability Benchmark

[Architecture diagram — components: (SUB) System Under Benchmarking containing the web server (Benchmark Target) and the OS; (BMS) Benchmark Management System; SPECWeb client, fault injector, benchmark coordinator, and availability tester; connections carrying the Workload, the Faultload, and Control Data + results.]

Workload and setup adopted from the SPECWeb99 performance benchmark

Page 18

WEB-DB measures

Performance degradation measures:

SPECf : Main SPEC measure in the presence of the faultload

THRf : Throughput in the presence of the faultload (ops/s)

RTMf : Response time in the presence of the faultload (ms)

Dependability-related measures:

Availability: Percentage of time the server provides the expected service

Autonomy: Percentage of times the server recovered without human intervention (estimator of the self-healing abilities of the server)

Accuracy: Percentage of correct results yielded by the server
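No formulas are given on the slide; the sketch below shows one plausible way to compute the three dependability-related measures from per-fault records (the field names are hypothetical, not part of the WEB-DB specification).

    # Minimal sketch: WEB-DB style dependability measures from per-fault records.
    def webdb_dependability(records):
        total_time = sum(r["slot_seconds"] for r in records)
        up_time = sum(r["slot_seconds"] - r["downtime_seconds"] for r in records)
        recovered_alone = sum(1 for r in records if r["recovered_without_operator"])
        total_requests = sum(r["requests"] for r in records)
        correct = sum(r["correct_responses"] for r in records)
        return {
            "availability_pct": 100.0 * up_time / total_time,        # expected service provided
            "autonomy_pct": 100.0 * recovered_alone / len(records),  # recoveries without human help
            "accuracy_pct": 100.0 * correct / total_requests,        # correct results yielded
        }

    # Example with made-up numbers for two injected faults.
    print(webdb_dependability([
        {"slot_seconds": 600, "downtime_seconds": 30, "recovered_without_operator": True,
         "requests": 5000, "correct_responses": 4990},
        {"slot_seconds": 600, "downtime_seconds": 120, "recovered_without_operator": False,
         "requests": 4200, "correct_responses": 4100},
    ]))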

Page 19

WEB-DB faultloads

Network & hardware faults
  Connection loss (server sockets are closed)
  Network interface failures (disable + enable the interface)

Operator faults
  Unscheduled system reboot
  Abrupt server termination

Software faults
  Emulation of common programming errors
  Injected in the operating system (not in the web server)
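To make the operator faults concrete, the sketch below emulates an abrupt server termination by killing the web-server process; this is only an illustration, not the actual WEB-DB fault injector, and the process name "httpd" and the use of psutil are assumptions.

    # Illustrative sketch: emulate the "abrupt server termination" operator fault.
    import psutil

    def terminate_server_abruptly(process_name="httpd"):
        killed = 0
        for proc in psutil.process_iter(["name"]):
            if proc.info["name"] == process_name:
                proc.kill()      # hard kill: no chance for a clean shutdown
                killed += 1
        return killed

    # terminate_server_abruptly()  # would abruptly kill all 'httpd' processes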

Page 20

WEB-DB procedure

Benchmark procedure: 2 steps

Step 1: Determine baseline performance
  (SUB + benchmark tools running the workload without faults)
  Tune the workload for a SPEC conformance of 100%

Step 2: 3 runs
  Each run comprises all faults specified in the faultload

Benchmark results: the average of the 3 runs

[Timeline diagram: a baseline (BL) run followed by runs 1, 2, 3 and a results phase; in each run the workload executes against the web server (benchmark target), with SPECweb ramp-up and ramp-down times, idle periods, and faults injected into the OS, the network, and the web server.]
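A minimal sketch of this two-step procedure as a driver script, assuming hypothetical run_workload() and run_with_faultload() helpers that each return a dictionary of measures:

    # Sketch of the WEB-DB procedure: one baseline run, then three faultload runs
    # whose measures are averaged to produce the benchmark results.
    def run_webdb_benchmark(run_workload, run_with_faultload, n_runs=3):
        # Step 1: baseline run, workload tuned for 100% SPEC conformance.
        baseline = run_workload()

        # Step 2: n_runs runs, each injecting the complete faultload.
        runs = [run_with_faultload() for _ in range(n_runs)]

        # Benchmark results: the average of the faultload runs.
        averaged = {k: sum(r[k] for r in runs) / n_runs for k in runs[0]}
        return {"baseline": baseline, "with_faults": averaged}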

Page 21

Examples of systems benchmarked

Benchmark and compare the dependability of two common web servers:

  Apache web server
  Abyss web server

When running on: Win. 2000, Win. XP, Win. 2003

Page 22

Dependability results

[Bar charts: Availability, Accuracy, and Autonomy (%, roughly 89-99) for Apache and Abyss on Windows 2000, XP, and 2003.]

Page 23

Performance in the presence of faults

[Bar charts: SPECf, THRf, and RTMf for Apache and Abyss on Windows 2000, XP, and 2003.]

Baseline performance (2000, XP, 2003):
  Apache: 31, 26, 30
  Abyss: 28, 25, 24

Performance degradation (%) (2000, XP, 2003):
  Apache: 55.4, 30.7, 62.3
  Abyss: 63.2, 45.2, 46.3
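The slide does not state how the degradation percentages relate to the baseline values; a plausible reading is the relative drop from the baseline SPEC measure to SPECf, sketched below with the Apache / Windows 2000 numbers.

    # Assumed formula (not stated on the slide): relative performance drop.
    def degradation_pct(baseline, with_faults):
        return 100.0 * (baseline - with_faults) / baseline

    # With the Apache baseline of 31 (Windows 2000) and a reported degradation of
    # 55.4%, the implied SPECf would be about 31 * (1 - 0.554) = 13.8.
    print(degradation_pct(31, 13.8))   # ~55.5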

Page 24

Security benchmark for database management systems

[Diagram: client applications and web browsers connect over the network to web and application servers, which in turn access the DBMS over the network; the DBMS is labelled the key layer.]

Page 25

Security Attacks vs System Vulnerabilities

• Security attacks: Intentional attempts to access or destroy data

• System vulnerabilities: Hidden flaws in the system implementation

Features of the security mechanisms available

Configuration of the security mechanisms

Page 26

Approach for the evaluation of security in DBMS

• Characterization of DBMS security mechanisms

• Our approach:

1) Identification of data criticality levels

2) Definition of database security classes

3) Identification of security requirements for each class

4) Definition of security tests for two scenarios:
   - Compare different DBMS
   - Help DBAs assess security in real installations

Page 27

Database Security Classes

DB Security Class | Data Criticality Level | Required Security Mechanisms
Class 0           | None    | None
Class 1           | Level 1 | User authentication (internal or external)
Class 2           | Level 2 | User authentication; user privileges (system and object privileges)
Class 3           | Level 3 | User authentication; user privileges; encryption in the data communication
Class 4           | Level 4 | User authentication; user privileges; encryption in the data communication; encryption in the data storage
Class 5           | Level 5 | User authentication; user privileges; encryption in the data communication; encryption in the data storage; auditing
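One way to read this table programmatically (an illustration, not part of the benchmark specification): the security class of an installation is the highest consecutive level whose required mechanisms are all in place.

    # Illustrative mapping of the security-class table above.
    REQUIRED_MECHANISMS = {
        1: {"user authentication"},
        2: {"user authentication", "user privileges"},
        3: {"user authentication", "user privileges", "communication encryption"},
        4: {"user authentication", "user privileges", "communication encryption",
            "storage encryption"},
        5: {"user authentication", "user privileges", "communication encryption",
            "storage encryption", "auditing"},
    }

    def security_class(mechanisms_in_place):
        # Highest consecutive class whose required mechanisms are all present.
        cls = 0
        for level in range(1, 6):
            if REQUIRED_MECHANISMS[level] <= set(mechanisms_in_place):
                cls = level
            else:
                break
        return cls

    print(security_class(["user authentication", "user privileges"]))  # -> 2 (Class 2)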

Page 28

Requirements for DBMS Security Mechanisms

• Internal user authentication (username/password):

  1.1 (weight 10%): The system must provide internal user authentication by using usernames and passwords
  1.2 (weight 6%): The system must guarantee that, besides the DBA users, no other users can read/write to/from the table/file where the usernames and passwords are stored
  1.3 (weight 6%): The password must be encrypted during the communication between the client and the server during the authentication
  1.4 (weight 4%): The passwords must be encrypted in the table/file where they are stored

Page 29

Measures and Scenarios

• Measures provided:
  Security Class (SCL)
  Security Requirements Fulfillment (SRF)

• Potential scenarios:
  Compare different DBMS products
  Help DBAs assess security in real installations
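The slides do not give the SRF formula; a natural reading, sketched below, is the weight-normalised percentage of fulfilled requirements, using weights like those in the requirements table of the previous slide.

    # Assumed SRF computation: weighted fraction of fulfilled requirements.
    def srf(requirements):
        # requirements: list of (weight_percent, fulfilled) pairs.
        total = sum(w for w, _ in requirements)
        passed = sum(w for w, ok in requirements if ok)
        return 100.0 * passed / total

    # Example: the four internal-authentication requirements (weights 10, 6, 6, 4),
    # assuming requirement 1.4 is not fulfilled.
    print(srf([(10, True), (6, True), (6, True), (4, False)]))  # ~84.6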

Page 30

Comparing DBMS Security

• Set of tests to verify whether the DBMS fulfills the security requirements

• Development of a database scenario:

  Database model (tables)

Data criticality levels for each table

Database accounts corresponding to the several DB user profiles

System and object privileges for each account

[Diagram: test client connecting over the network to the DBMS under evaluation (Oracle? DB2? PostgreSQL?)]

System under development
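As an illustration of what one such test might look like (the connect_as() helper and the table name are hypothetical placeholders, not the actual test suite), requirement 1.2 can be checked by trying to read the password table from a non-DBA account:

    # Illustrative security test: a non-DBA account must not be able to read the
    # table/file where usernames and passwords are stored (requirement 1.2).
    def test_password_table_protected(connect_as, table="system_users"):
        conn = connect_as("regular_user")            # non-DBA database account
        try:
            cur = conn.cursor()
            cur.execute(f"SELECT * FROM {table}")    # should raise a permission error
            cur.fetchall()
            return "Not OK"                          # read succeeded: requirement not met
        except Exception:
            return "OK"                              # access denied as required
        finally:
            conn.close()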

Page 31

Database scenario

• Database model:

[Entity-relationship diagram: tables CUSTOMER, ORDER, ORDER_LINE, ITEM, ADDRESS, COUNTRY, AUTHOR, USER, and CREDIT_CARD, with primary and foreign keys; each table is assigned one of the data criticality levels 1-5.]

Page 32

Example: Comparative Analysis of Two DBMS

• Oracle 9i vs PostgreSQL 7.3

Security Mechanism                   | Req. # | Oracle 9i             | PostgreSQL
Internal user authentication         | ALL    | OK                    | OK
External user authentication         | ALL    | OK                    | OK
User privileges                      | 3.1    | OK                    | Not OK
                                     | 3.2    | OK                    | OK
                                     | 3.3    | OK                    | OK
                                     | 3.4    | OK                    | OK
                                     | 3.5    | OK                    | Not OK
Encryption in the data communication | 4.1    | OK                    | Not OK
                                     | 4.2    | Depends on the method | Not OK
Encryption in the data storage       | 5.1    | OK                    | Not OK
                                     | 5.2    | Not OK                | Not OK
                                     | 5.3    | Not OK                | Not OK
Auditing                             | 6.1    | OK                    | Not OK

Page 33

Results Summary

               | Oracle 9i (encryption RC4, AES, and DES) | Oracle 9i (encryption 3DES) | PostgreSQL 7.3
Security Class | Class 5                                  | Class 5                     | Class 1
SRF metric     | 96%                                      | 92%                         | 66%

• Oracle 9i does not fulfill all encryption requirements
  400% < performance degradation < 2700%

• PostgreSQL 7.3:
  Some manual configuration is required to achieve Class 1
  High SRF for a Class 1 DBMS
  Fulfills some Class 2 requirements

Page 34

Conclusions

• Comparing components, systems, architectures, and configurations is essential to improving computer systems
  Benchmarks are needed!

• Comparisons can be misleading
  Benchmarks must be carefully validated!

• Two ways of having real benchmarks:
  Industry agreement
  User community (tacit agreement)