Solving the Automation Puzzle - how to select the right automation framework for you

Posted on 23-Jan-2017



Solving the Automation Puzzle @bendet_ori

About me – @bendet_ori
5.5 years in HPE Software in various managerial QA roles

Michael’s father | Naomi’s husband

Today – Inbound PM for UFT & a new cloud offering

For small talk later:
– Largest Pharma Customer in Europe
– Critical Defect in a Diaper

State of Automation – the poll!


To Business!

Assumption #1 – Vendors only use their own tools

Assumption #2 – Evil corporates hate Open Source

Assumption #3 – Everybody is doing automated testing

My Automation Journey

Project #1
AUT: Analytic Platform
Automation: Internal Standalone Tool

Analytic Platform for IT Executives (v 1.0)

– AUT technology stack:
  – Glassfish Server
  – Flex + GWT
  – SAP BODS for ETLs
  – SAP BOE for BI
  – MSSQL

– Automation:
  – People doing automation: 0.5 / 5
  – Focusing mainly on APIs
  – Java Beans (EJBs)
  – Internal tool called FIST
  – ROI: LOL

3.5 hours to install


An extendable Java class
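FIST is an internal HPE tool, so nothing about it is public. Purely as an illustration of the pattern named on this slide – a test written as an extendable Java class – a minimal, hypothetical sketch might look like the one below. Every class and method name is invented and is not the FIST API.

    // Hypothetical sketch only – not the FIST API. It illustrates the general
    // pattern of a test suite built around an extendable Java base class.
    public abstract class BaseApiTest {

        // Template method: shared setup, error handling and reporting live here.
        public final void execute() {
            setUp();
            try {
                runTest();                       // each concrete test supplies only this part
                report("PASSED");
            } catch (AssertionError e) {
                report("FAILED: " + e.getMessage());
            } finally {
                tearDown();
            }
        }

        protected abstract void runTest();       // the only method a tester must write

        protected void setUp()    { /* e.g. look up the EJB / open an API session */ }
        protected void tearDown() { /* release the session */ }

        private void report(String status) {
            System.out.println(getClass().getSimpleName() + ": " + status);
        }
    }

    // A concrete test just extends the base class:
    class CreateDashboardTest extends BaseApiTest {
        @Override
        protected void runTest() {
            // call the exposed API method and assert on the result
        }
    }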


Why?


Lessons Learned – Standalone tool developed internally

– Cons:
  – Nobody knew about the tool; it wasn’t cool, there was no buzz around it
  – External tool: unable to get DEV to cooperate with automation (or even install the tool)
  – Manually triggered (not part of the CI process)
  – Almost every new test required changes by DEV (exposing new API methods)
  – No direct access to source code

– Pros:
  – Small investment
  – QTTV
  – Easily extendable
  – Stability

Analytic Platform for IT Executives (v 1.0 → 2.0)

– AUT technology stack:
  – Glassfish Server
  – Flex + GWT
  – SAP BODS for ETLs
  – SAP BOE for BI
  – MSSQL

– Automation:
  – People doing automation: 4 / 10, with a dedicated developer to assist
  – Focusing mainly on APIs → automation installation, E2E flows
  – Java Beans (EJBs) → REST APIs
  – Internal tool called FIST → internally built framework: REST client, Selenium, Flex Monkey
  – Automatic deployment solution: internally developed tool called Slick
  – ROI: 3 MD each sprint, 1 MM per release
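The internally built framework itself isn’t public, so this is only a sketch of the idea it implies: a thin, shared REST client that every test reuses instead of talking to EJBs directly. The class name, base URL and endpoint below are invented.

    // Illustrative sketch only – a thin reusable REST client of the kind an
    // internally built test framework would share across all tests.
    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class AutRestClient {
        private final HttpClient http = HttpClient.newHttpClient();
        private final String baseUrl;

        public AutRestClient(String baseUrl) {
            this.baseUrl = baseUrl;
        }

        // One reusable call; tests read as intent instead of HTTP plumbing.
        public int post(String path, String jsonBody) throws Exception {
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create(baseUrl + path))
                    .header("Content-Type", "application/json")
                    .POST(HttpRequest.BodyPublishers.ofString(jsonBody))
                    .build();
            return http.send(request, HttpResponse.BodyHandlers.ofString()).statusCode();
        }
    }

    // Usage in a test (endpoint invented):
    //   int status = new AutRestClient("http://aut.example.local").post("/api/reports", "{\"name\":\"Q1 KPIs\"}");
    //   assert status == 201;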


Saved over 10K MH (man-hours)

Why?

– High demand for automation coverage

– Developers selected the automation framework

– Wanted to work together with developers

– Invested in automated installation as well


Lessons Learned – Internally built automation framework (inside the IDE)

– Cons:
  – Required more coding skills
  – Large effort to get things started
  – Harder for less-technical testers to use
  – UI automation stability

– Pros:
  – Harnessed developers into the automation effort
  – Developers re-used testing assets for their own benefit
  – Testers have access to the source code
  – Part of the CI process

Project #2
AUT: Performance Testing tool
Automation: Open Source

Performance Testing tool

– AUT technology stack:
  – nodeJS
  – AngularJS
  – Internal SQLite

– Automation:
  – People doing automation: 1 / 4
  – System tests leveraging developers’ assets
  – Focus on API testing, sanity-level UI testing
  – Application modeling for less-technical testers
  – Protractor, Mocha, Jasmine
  – ROI: 1.5 MD / sprint

Protractor – without the framework vs. with the framework

REST API – without the framework vs. with the framework

UI: test (created by a non-technical engineer)

Why?

– Best available choice for the technology stack

– 1 automation engineer working on the framework, others re-using the assets


Lessons Learned – Leveraging DEV assets and extending them into our own framework (+ modeling)

– Cons:
  – Modeling takes time
  – The technical engineer becomes the bottleneck
  – UI automation stability

– Pros:
  – Uses existing developers’ assets
  – Extends automation coverage using non-technical engineers
  – Modeling eases test maintenance

Project #3
AUT: Firefox plugin
Automation: Commercial Tool (LeanFT)

TruClient? TruClient is a tool for recording Web-based applications. It is used inside LoadRunner for performance testing at the browser level.

1. TruClient Sidebar
2. TruClient Toolbox
3. Firefox browser
4. Application Browser Window
5. TruClient Sidebar Status Pane

LeanFT = UFT’s younger & cooler brother

TruClient

– AUT technology stack:
  – Firefox Plugin
  – Pure Web
  – WPF
  – Windows app

– Automation:
  – People doing automation: 2 / 6
  – Did not have an automation suite, as they could not find a tool with automation abilities for the full flow, including all three technologies
  – The AUT supports the three main browsers and needs an automation tool that can identify and test the objects in all the supported browsers
  – Selected LeanFT as the tool

The Automation Suite – LeanFT Application Model

Containing the full AUT hierarchy, the App Model displays a modular view of all the objects implemented in the tested application.
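As a rough, hand-written analogy of the modular-objects idea (shown here with plain Selenium to stay tool-neutral rather than with the LeanFT-generated classes), one class per screen exposes its objects by name so tests never repeat locators. All names and locators below are invented.

    // Hand-written analogy of an application model: the screen's objects are
    // exposed by name so that tests contain no locators. Names are invented.
    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.WebElement;

    public class LoginScreen {
        private final WebDriver driver;

        public LoginScreen(WebDriver driver) {
            this.driver = driver;
        }

        public WebElement userName()    { return driver.findElement(By.id("user")); }
        public WebElement password()    { return driver.findElement(By.id("password")); }
        public WebElement loginButton() { return driver.findElement(By.cssSelector("button[type='submit']")); }

        // Less technical testers can now write one readable line: login("ori", "secret")
        public void login(String user, String pass) {
            userName().sendKeys(user);
            password().sendKeys(pass);
            loginButton().click();
        }
    }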

The Automation Suite – LeanFT test

- Everything is written in the IDE; Dev have access to the tests (and used them for sanity)
- Test code is completely reusable across the whole test suite
- The test is authored once and can be run on all three browsers (see the sketch below)
- In the test setup, the TruClient launcher – written in WPF – is started, and during the test both web and standard Windows technologies are exercised
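As a sketch of how “authored once, run on all three browsers” can be structured, here is a parameterized JUnit 4 test. The helpers startTruClientLauncher() and openBrowser() are hypothetical stand-ins for the LeanFT SDK calls, not real LeanFT API names.

    // Sketch only: one test body, executed once per browser. The helpers at the
    // bottom are hypothetical placeholders for the LeanFT SDK calls.
    import java.util.Arrays;
    import java.util.Collection;

    import org.junit.Test;
    import org.junit.runner.RunWith;
    import org.junit.runners.Parameterized;
    import org.junit.runners.Parameterized.Parameters;

    @RunWith(Parameterized.class)
    public class TruClientSanityTest {

        @Parameters(name = "{0}")
        public static Collection<Object[]> browsers() {
            return Arrays.asList(new Object[][] { {"Chrome"}, {"Firefox"}, {"Internet Explorer"} });
        }

        private final String browser;

        public TruClientSanityTest(String browser) {
            this.browser = browser;
        }

        @Test
        public void recordAndReplaySimpleScript() {
            startTruClientLauncher();      // WPF launcher started in the test setup
            openBrowser(browser);          // same test body, different browser each run
            // ... drive the plugin's web UI and assert on the result ...
        }

        // --- hypothetical helpers wrapping the desktop/web automation SDK ---
        private void startTruClientLauncher() { /* start the WPF-based launcher */ }
        private void openBrowser(String name) { /* launch or attach to the given browser */ }
    }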

Why?

– Cross Technology support (Desktop & Web)

– Script once – run on all browsers

– Re-use, share testing assets with Dev

Want to buy a new automation framework for 1 shekel??


The Automation Council


The Guidelines

Rule #1 – Accessible in the developers’ workspace
Rule #2 – Cross-browser / cross-technology support
Rule #3 – Ability to easily model the UI
Rule #4 – Full support for REST API testing
Rule #5 – DB Layer
Rule #6 – Ability to combine UI/API/DB in one flow (see the sketch after this list)
Rule #7 – Messaging parser (JSON, XML, etc.)
Rule #8 – Parameterization of tests
Rule #9 – CI/CD compliant
Rule #10 – Modularity to allow re-use by less technical engineers
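To make rules #4, #5, #6 and #8 concrete, here is a minimal sketch of one parameterized test that drives the API, checks the DB layer, and finishes with a UI assertion. The endpoint, SQL, JDBC URL and locator are all invented for the example.

    // Hypothetical sketch of rules #4, #5, #6 and #8 in one flow:
    // REST call -> DB check -> UI assertion, parameterized over test data.
    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.util.Arrays;
    import java.util.Collection;

    import org.junit.Assert;
    import org.junit.Test;
    import org.junit.runner.RunWith;
    import org.junit.runners.Parameterized;
    import org.junit.runners.Parameterized.Parameters;
    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.chrome.ChromeDriver;

    @RunWith(Parameterized.class)
    public class CreateUserFlowTest {

        @Parameters(name = "{0}")                               // rule #8: parameterization
        public static Collection<Object[]> users() {
            return Arrays.asList(new Object[][] { {"alice"}, {"bob"} });
        }

        private final String userName;

        public CreateUserFlowTest(String userName) {
            this.userName = userName;
        }

        @Test
        public void userCreatedViaApiIsPersistedAndVisible() throws Exception {
            // Rule #4 – API: create the user through REST
            HttpResponse<String> rsp = HttpClient.newHttpClient().send(
                    HttpRequest.newBuilder()
                            .uri(URI.create("http://aut.example.local/api/users"))
                            .header("Content-Type", "application/json")
                            .POST(HttpRequest.BodyPublishers.ofString("{\"name\":\"" + userName + "\"}"))
                            .build(),
                    HttpResponse.BodyHandlers.ofString());
            Assert.assertEquals(201, rsp.statusCode());

            // Rule #5 – DB layer: verify the row was persisted
            try (Connection db = DriverManager.getConnection("jdbc:sqlite:aut.db");
                 PreparedStatement q = db.prepareStatement("SELECT COUNT(*) FROM users WHERE name = ?")) {
                q.setString(1, userName);
                try (ResultSet rs = q.executeQuery()) {
                    rs.next();
                    Assert.assertEquals(1, rs.getInt(1));
                }
            }

            // Rule #6 – UI: the same flow ends with a UI check
            WebDriver driver = new ChromeDriver();
            try {
                driver.get("http://aut.example.local/users");
                Assert.assertTrue(driver.findElement(By.id("user-list")).getText().contains(userName));
            } finally {
                driver.quit();
            }
        }
    }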


                            FIST   Selenium built FWK   LeanFT   UFT
IDE                          X             √              √       X
Cross Browsers/Technology    X         Mobile/Web         √      √√√
Model the UI                 X             X*             √       √
REST API testing             √             X*             X*      √
DB Layer                     √             X*             X*      √
Combine UI/API/DB            X             X*             X*      √
Parsers                      √             X*             X*      √
Parameterization             √             √              √       √
CI/CD                        X             √              √       √
Modularity for less-tech     X             X              √       √
Cross Platforms              X             √              X       X
Lightweight                  X             √              √       X

* = can be added by user

Summary

Assumption #1 – Vendors only use their own tools
It’s not about the tool

Assumption #2 – Evil corporates hate Open Source
We love Open Source!!

Assumption #3 – Everybody is doing automated testing
Everybody is using automated testing

Take Home Message


“It’s never about the tool, it’s about finding the right tool for the right project”

Thank You! Contact me: @bendet_ori

r2d2@hpe.com