
Post on 05-May-2018



Making Performance Testing Simple with Apache JMeter

Presented by: Paul Hope

∗ Current:
  ∗ Principal IT Consultant – Substantia Solutions
  ∗ Software Quality Engineer – Source Allies (Client: John Deere)

∗ Past job functions include:
  ∗ Developer (VB, VB.Net, C#, COBOL, Java)
  ∗ Test Automation Engineer/Architect
  ∗ HP Tools Administrator
  ∗ Test Data Management Engineer
  ∗ Certified HP Tools Trainer
  ∗ Certified Agile Scrum Master
  ∗ Certified Agile Scrum Product Owner
  ∗ Certified Six Sigma Green Belt

∗ Hobbies/Interests
  ∗ Software algorithms
  ∗ Data mining and analytics
  ∗ Fine arts / theatre

A Little About Me…

∗ Software Performance Testing:
  ∗ What is it?
  ∗ Why do we need it?
  ∗ When should we do it?
  ∗ How do we do it with JMeter?

What We Will Cover Today

Definition: Software performance testing is a Performance Engineering practice performed to determine how a system performs in terms of responsiveness and stability under a particular workload.

∗ It can serve to investigate, measure, validate, or verify other quality attributes of the system, such as scalability, reliability, and resource usage.

∗ It does not give a pass or fail result; instead, it is typically used to set the benchmarks and standards for the application, often formalized as Service Level Agreements (SLAs).

What is Software Performance Testing?

Types of Software Performance Testing

∗ Load Testing: A load test is usually conducted to understand the behavior of the system under a specific expected (normal) load. This load can be the expected concurrent number of users on the application performing a specific number of transactions within the set duration.

∗ Stress Testing: A stress test is done to evaluate the application's behavior beyond normal or peak load conditions. This kind of test is done to determine the system's robustness in terms of extreme load and helps application administrators to determine if the system will perform sufficiently if the current load goes well above the expected maximum.

∗ Spike Testing: Spike testing is done by suddenly increasing the load generated by a very large number of users, and observing the behavior of the system. The goal is to determine whether performance will suffer, the system will fail, or it will handle the dramatic change in load.

∗ Endurance Testing/Soak Testing: An endurance (soak) test is usually done to determine whether the system can sustain the continuous expected (normal) load. During endurance tests, memory utilization is monitored to detect potential leaks.

Types of Software Performance Testing - Definitions
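The four test types differ mainly in the shape of the load over time. A hypothetical sketch of those shapes in Python (the user counts and durations are illustrative choices, not values from the talk or JMeter defaults):

```python
# Illustrative load profiles: concurrent virtual users at minute t.
# All numbers below are made-up examples.

def load_test(t, users=100):
    """Constant expected (normal) load for the whole run."""
    return users

def stress_test(t, start=100, step=50):
    """Ramp up: add users every 10 minutes to push past peak load."""
    return start + step * (t // 10)

def spike_test(t, normal=100, spike=1000):
    """Sudden jump to a very large number of users for minutes 30-35."""
    return spike if 30 <= t < 35 else normal

def soak_test(t, users=100):
    """Same shape as a load test, but held for many hours (t in minutes)
    to expose memory leaks and other slow degradation."""
    return users

print([stress_test(t) for t in range(0, 60, 10)])  # [100, 150, 200, 250, 300, 350]
```

Plotting any of these functions against time reproduces the classic load/stress/spike/soak curves.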

∗ Key Performance Indicators (KPIs)
  ∗ Service-oriented indicators: availability and response time; they measure how well (or not) an application is providing a service to the end users.
  ∗ Efficiency-oriented indicators: throughput and capacity; they measure how well (or not) an application makes use of the hosting infrastructure.

Why do Software Performance Testing?

∗ Availability - The amount of time an application is available to the end user. Lack of availability is significant because many applications will have a substantial business cost for even a small outage.

∗ Response Time - The amount of time it takes for the application to respond to a user request.

Service-oriented Indicators

∗ Throughput - The rate at which application-oriented events occur. Example: the number of hits on a web page within a given period of time.

∗ Utilization - The percentage of the theoretical capacity of a resource that is being used. Example: network bandwidth and memory used on a web server farm when 1,000 visitors are active.

Efficiency-oriented Indicators
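As a concrete calculation of the two efficiency-oriented indicators above (all numbers here are hypothetical, chosen only to make the arithmetic visible):

```python
# Throughput: rate of application-oriented events.
hits = 18000                 # hypothetical page hits observed...
window_s = 600               # ...over a 10-minute window
throughput = hits / window_s # events per second
print(throughput)            # 30.0 hits/sec

# Utilization: percentage of theoretical capacity in use.
used_mbps = 250              # hypothetical bandwidth with 1,000 active visitors
capacity_mbps = 1000         # theoretical network capacity
utilization = 100 * used_mbps / capacity_mbps
print(f"{utilization:.0f}% utilized")  # 25% utilized
```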

∗ The context of the SUT requires it (e.g. banking, trading, auctions, mission-critical)

∗ After the SUT has passed Functional and User Acceptance Testing via traditional testing methods

∗ The perceived ROI is greater than the development cost

When to do Software Performance Testing?

∗ History:
  ∗ Stefano Mazzocchi of the Apache Software Foundation was the original developer of JMeter v1.0. He wrote it primarily to test the performance of Apache JServ (a project since superseded by Apache Tomcat) in 1998.
  ∗ In May of 2016, JMeter v3.0 was released.

Hello, JMeter!

Why JMeter?

∗ Ability to performance test many different server/protocol types:
  ∗ Web - HTTP, HTTPS
  ∗ SOAP / REST
  ∗ FTP
  ∗ Database via JDBC
  ∗ LDAP
  ∗ Message-oriented middleware (MOM) via JMS
  ∗ Mail - SMTP(S), POP3(S), and IMAP(S)
  ∗ Native commands or shell scripts
  ∗ TCP

What Protocols Are Supported?

∗ A minimal test will consist of the Test Plan, a Thread Group, and one or more Samplers.
∗ Test Plan
  ∗ A test plan describes a series of steps JMeter will execute when run.
∗ Thread Group
  ∗ Thread group elements are the beginning points of any test plan. All controllers and samplers must be under a thread group.

∗ Sampler: JMeter Proxy Recorder
  ∗ The JMeter Proxy Recorder allows JMeter to watch and record user actions while they browse a web application with a standard browser. JMeter creates test sample objects and stores them directly in the Test Plan as the user browses.

JMeter - Let’s Dive In!
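Conceptually, a Thread Group runs each virtual user on its own thread, and each thread executes its samplers in order. A rough Python analogy (the step names and the user count of 5 are invented for illustration; real samplers would issue HTTP requests):

```python
import threading
import time

def virtual_user(user_id, results):
    """One thread = one virtual user executing its samplers in order."""
    for step in ("open_home_page", "log_in", "view_account"):
        start = time.perf_counter()
        time.sleep(0.01)  # stand-in for a real HTTP sampler call
        elapsed_ms = (time.perf_counter() - start) * 1000
        results.append((user_id, step, elapsed_ms))

results = []
# "Number of Threads (users)" = 5 in Thread Group terms
threads = [threading.Thread(target=virtual_user, args=(i, results))
           for i in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(len(results))  # 15 samples: 5 users x 3 samplers each
```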

∗ Configure a Proxy Server in your Browser
  ∗ A proxy server is a server that acts as an intermediary for requests from clients seeking resources from other servers.
  ∗ Routing the browser through JMeter's proxy lets JMeter observe the user's traffic and record it into the test plan.

Configuring the Browser Proxy Server

Configuring the JMeter Proxy Recorder

Configuring the Thread Group

Data-driving Tests

Data-driving Tests Cont’d
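In JMeter, data-driving is typically done with a CSV Data Set Config, which hands each iteration the next row of a data file and exposes the columns as `${variables}`. The idea, sketched in Python with a hypothetical users.csv (file name, columns, and values are all made up):

```python
import csv
import io

# Stand-in for a users.csv file fed to a CSV Data Set Config.
users_csv = io.StringIO("username,password\nalice,s3cret\nbob,hunter2\n")

rows = list(csv.DictReader(users_csv))
for i in range(4):                # 4 iterations over 2 rows of data
    row = rows[i % len(rows)]     # "Recycle on EOF? = True" behaviour
    # each iteration substitutes ${username} / ${password} into the request
    print(f"POST /login username={row['username']}")
```

The modulo wrap-around mirrors JMeter looping back to the first row when the file runs out and recycling is enabled.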

Think Time

∗ Simulate the time users naturally spend reading, typing, or pausing.
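JMeter adds think time with timer elements such as the Gaussian Random Timer. The effect can be sketched in Python as follows (the mean and deviation values are arbitrary examples, not JMeter defaults):

```python
import random
import time

def think_time(mean_s=3.0, deviation_s=1.0):
    """Pause like a user reading or typing: a normally distributed
    delay around the mean, clamped at zero. Values are arbitrary."""
    delay = max(0.0, random.gauss(mean_s, deviation_s))
    time.sleep(delay)
    return delay

# Tiny values so the demo runs quickly.
random.seed(42)
print(f"paused {think_time(mean_s=0.01, deviation_s=0.005):.3f}s")
```

Without think time, every virtual user hammers the server back-to-back, which overstates the load a real user population would generate.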

Assertions / Checkpoints
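A JMeter Response Assertion marks a sample as failed when the response does not match the expected status or content. The equivalent check, sketched in Python with made-up response data (`expected_code` and `must_contain` are hypothetical parameters for illustration):

```python
def assert_response(status_code, body, expected_code=200, must_contain="Welcome"):
    """Mimic a Response Assertion: check status code and response text."""
    errors = []
    if status_code != expected_code:
        errors.append(f"expected HTTP {expected_code}, got {status_code}")
    if must_contain not in body:
        errors.append(f"body missing {must_contain!r}")
    return errors  # empty list = sample passes

print(assert_response(200, "Welcome, alice!"))        # []
print(assert_response(500, "Internal Server Error"))  # two failures
```

Failed assertions are what feed the Error % column in the result listeners.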

Interpreting the Test Results

∗ # Samples - The number of requests sent to the server
∗ Average - Arithmetic mean of all response times (sum of all times / count)
∗ Min - Minimum response time (ms)
∗ Max - Maximum response time (ms)
∗ Error % - Percentage of requests that failed
∗ Throughput - Measured in requests per second/minute/hour. The time unit is chosen so that the displayed rate is at least 1.0. When throughput is saved to a CSV file, it is always expressed in requests/second, i.e. 30.0 requests/minute is saved as 0.5.
∗ Kb/sec - The throughput measured in kilobytes per second
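The columns above can be reproduced from raw samples. A sketch with invented response times and an assumed 2-second run duration:

```python
# (elapsed_ms, success) per request -- invented sample data.
samples = [(120, True), (80, True), (450, False), (200, True), (150, True)]
total_s = 2.0                 # assumed wall-clock duration of the run

count = len(samples)
times = [ms for ms, _ in samples]
average = sum(times) / count
error_pct = 100 * sum(1 for _, ok in samples if not ok) / count
throughput = count / total_s  # always requests/second in CSV output

print(count, average, min(times), max(times))  # 5 200.0 80 450
print(f"{error_pct:.1f}% errors, {throughput} req/sec")  # 20.0% errors, 2.5 req/sec

# The display rule: pick a time unit so the shown rate is at least 1.0.
rate = 0.5                    # requests/second (as stored in CSV)
print(f"{rate * 60} requests/minute")  # 30.0 requests/minute
```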


Q&A

Paul Hope Substantia Solutions, LLC

Email: Paul.Hope@substantiasolutions.com LinkedIn: https://www.linkedin.com/in/paul-hope

Contact Information

Thank you!