
Keeping up with the changes
Automating UAT

Damian Sweeney

Learning Environments

Can we just get a quick indication of who we've got here:

Support staff/help desk

Sysadmins/DBAs

Developers

Academics

Who we are

LMS experts team of 10

Central unit within University Services

Academic/admin staff-facing

Firstly, some context about our institution and Blackboard setup

Our setup

Recently upgraded from Apr '14 to Oct '14

Internally hosted

Managed hosting

Our LMS environments

Changes we care about

Bb upgrades and patches

New building blocks

Third party tools (Turnitin, Learning Objects, etc.)

(Network/infrastructure changes)

Why we do UAT

Confidence going into planned changes

Evaluate new features

Existing features continue to work

Verify known issues and find new ones

To find problems before our users do

To say 'No'

What we check

Configuration

Building Blocks: settings and versions

Tool availability

Data integrity

Grade centre data

Content

Permissions

Functionality

Time to juggle

Evolution of our UAT

2006: One pager

2008: Several documents on what to test

2010: Several documents on how to test

2011: Single document including config check

2015: Single document plus automation

Previous manual process

Sample subjects with lots of content

Complexity and depth of material are good

Simple testing steps are good

Previous manual process

Document what's there

Spot the difference

Previous manual process

100 pages of documentation with 1000 tests

Manual UAT issues

Takes a long time

Testing fatigue amongst staff

Frequent changes required to the documentation

We've just moved office, so had to cull a lot

Automation options

Selenium IDE for Firefox

Can record browser actions

Great for demonstrating a proof of concept

Anyone can run it

Our UAT is more of a documentation exercise than a testing exercise, so the testing paradigm didn't help

Automation demo

Selenium IDE for Firefox

SANDPIT environment

Fake staff user check

Automation options

Selenium IDE for Firefox disadvantages

Has limited flow control

Difficult to update per environment

Can't write to file

It's not primarily about testing

Automation options

Python-selenium + PhantomJS (headless)

PhantomJS works well for our sysadmin script

Doesn't support multiple browser windows

Python-selenium + Firefox

Works well for our fake staff script

Slightly trickier to schedule as a GUI is required
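As a rough illustration of the two options above (a sketch only; the environment URL and window size are placeholders, not our actual script), python-selenium lets the same check run headless under PhantomJS or in a visible Firefox window:

# Sketch: swap between headless PhantomJS (sysadmin script) and Firefox
# (fake staff script). URL and window size are illustrative placeholders.
from selenium import webdriver

def make_driver(headless=True):
    if headless:
        # PhantomJS schedules easily (no GUI) but only drives one window
        driver = webdriver.PhantomJS()
    else:
        # Firefox copes with multiple browser windows but needs a GUI session
        driver = webdriver.Firefox()
    driver.set_window_size(1280, 1024)
    return driver

driver = make_driver(headless=True)
driver.get("https://sandpit.lms.example.edu.au")  # placeholder environment URL
print(driver.title)
driver.quit()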

Automated process

Scripts run nightly on every environment (sketched below):

Write current settings to a plain text file

Compare with a reference configuration

Output an HTML page of differences

Output a screenshot if there's an error

Aim was to have visibility across the team

Anyone can access the output

It is all readable, but requires interpretation
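A minimal sketch of that nightly shape (not the production script; gather_settings(), the URL and the file names are assumptions): dump current settings to plain text, diff against the reference, write an HTML report anyone can open, and screenshot on error.

# Sketch of the nightly run. gather_settings(), the URL and the file names
# are illustrative assumptions, not the production script.
import difflib
from selenium import webdriver

def gather_settings(driver):
    # Placeholder: in practice this walks admin pages and records building
    # block versions, tool availability, etc., one setting per line.
    driver.get("https://sandpit.lms.example.edu.au")  # placeholder URL
    return ["page_title: " + driver.title]

driver = webdriver.PhantomJS()
try:
    current = gather_settings(driver)
    with open("current_settings.txt", "w") as out:
        out.write("\n".join(current) + "\n")

    with open("reference_settings.txt") as ref:
        reference = ref.read().splitlines()

    # HTML table of differences that anyone on the team can open
    report = difflib.HtmlDiff().make_file(reference, current,
                                          "reference", "current")
    with open("uat_diff.html", "w") as out:
        out.write(report)
except Exception:
    # Screenshot shows what the page looked like when something went wrong
    driver.save_screenshot("uat_error.png")
    raise
finally:
    driver.quit()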

Automated process

Can be run ad hoc on a single environment

Check that we're ready for Manual UAT

Manual UAT is now < 70 pages with 400 tests

Automation advantages

Quicker

Checking much more regularly

Automation advantages

Reduces human error, especially in long boring lists

Manual testing now focusses on functionality, not config or data

More juggling

Automation advantages

Checking things we didn't use to

Finding changes/outages we would have otherwise missed

Easy comparisons with previous states and between environments

Privileges for all institutional roles

One of our TEST environments regularly fails to connect to

Automation headaches (coding)

Blackboard loading content

Implicit waits vs explicit waits vs sleep() (see the sketch after this list)

Accommodating different environments' configurations

Table ordering differs between users

Heisenbugs
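To illustrate the waiting problem above, a hedged sketch of the three approaches in python-selenium; the URL and element id are placeholders, not real Blackboard locators.

# Three ways to cope with Blackboard loading content asynchronously.
# The URL and element id are placeholders, not real Blackboard locators.
import time
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver = webdriver.Firefox()
driver.get("https://sandpit.lms.example.edu.au")  # placeholder URL

# 1. Implicit wait: every find_element call polls for up to 10 seconds
driver.implicitly_wait(10)

# 2. Explicit wait: poll for one specific condition, fail after a timeout
element = WebDriverWait(driver, 10).until(
    EC.presence_of_element_located((By.ID, "courseMenuPalette"))  # placeholder id
)

# 3. sleep(): simplest, but wastes time when fast and still fails when slow
time.sleep(5)

driver.quit()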

Automation headaches (admin)

Waiting for config changes to settle (false positives)

Needles in haystacks when comparing environments

Reduces visibility/knowledge across the team

Automation demo

Python + selenium + Firefox

SANDPIT environment

Fake staff user check

YouTube and Flickr example (test existing content and add/remove new content)
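A hedged sketch of what that fake staff check might look like; every URL, credential and element id below is a placeholder rather than a real Blackboard selector.

# Sketch of a fake staff check: log in, open a sample subject and confirm an
# existing YouTube embed is still there. All URLs, credentials and element
# ids are placeholders, not real Blackboard locators.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Firefox()
driver.get("https://sandpit.lms.example.edu.au")                 # placeholder
driver.find_element(By.ID, "user_id").send_keys("fake.staff")    # placeholder
driver.find_element(By.ID, "password").send_keys("not-a-real-password")
driver.find_element(By.ID, "entry-login").click()

driver.get("https://sandpit.lms.example.edu.au/sample-subject")  # placeholder
frames = driver.find_elements(By.TAG_NAME, "iframe")
assert any("youtube" in (f.get_attribute("src") or "") for f in frames), \
    "expected the existing YouTube mashup to still be embedded"
driver.quit()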

Updating documentation

Run the script again and use a comparison tool (e.g. Notepad++ with the Compare plugin)

git for tracking changes to the reference config

Updating documentation

Complete manual UAT every 6 weeks

Preserve pre-upgrade reference configuration

How did it go?

Relatively smooth upgrade (internally hosted)

Several steps were much faster

Returned the environment to the user community a day earlier than planned

Our preproduction environment failed close to the upgrade, so the automated configuration record was the only definitive reference for configuration

What's next?

Student data integrity checks

Broaden scope of sysadmin checks to include more tools and settings

Thanks

Thanks to the LE team for supporting the new process

The Application Support team for testing

pip install -U selenium

http://phantomjs.org

Questions

[email protected]

Want the script?

Email [email protected]

Request the auto-UAT script in the subject line

Tell us how it goes