October 2010 Monthly Meeting Summary
Experience Report: Test Automation in an Agile Environment - presentation by Len Vaz
Sub-topics may include
* Breaking down the app you are going to test
* Test Environment - VMs, Tools, Languages, etc.
* Resources - People, Hardware, Software, etc
* Automation Coding - IDE, Frameworks, Coding; Examples (Python)
* Lessons learned
Presenter Bio: Len is QA Director at local startup company Invincea; previous stints in his
15-year IT career were at MicroStrategy, AOL, and Verisign, where he has done CM, development,
and QA. His passion is designing & implementing QA automation systems using open-source
automation tools. He has worked extensively with Watir and currently works with Python and AutoIt.
Took place on: Wed. October 13 2010 6:30 PM
- Len's presentation on Test Automation in an Agile Environment is available as a 2.2 MB pptx file
- Writing the automation code is the easy part
- Automating InstallShield - InstallShield supports 'response files', which can be used to drive silent, automated installs
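As a sketch of driving such a silent install from Python (the language used for the rest of their automation): the paths and file names here are hypothetical examples, while `/s` (silent), `/f1` (response-file path), and `/f2` (log-file path) are standard InstallShield setup.exe switches.

```python
import subprocess

def build_silent_install_cmd(setup_exe, response_file, log_file):
    # Assemble the command line for a silent InstallShield install driven
    # by a previously recorded response (.iss) file.
    return [setup_exe, "/s", "/f1" + response_file, "/f2" + log_file]

def run_silent_install(setup_exe, response_file, log_file):
    # Launch the installer with no UI; the outcome lands in the log file.
    cmd = build_silent_install_cmd(setup_exe, response_file, log_file)
    return subprocess.call(cmd)
```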
- 'Image for Windows' - $50 tool for windows imaging
- If the hardware is identical among multiple machines you can have the same image install to each or all machines
- You cannot run a VM nested inside another VM in VMware
- There was discussion re a useful DB schema for results logging
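As a rough sketch of what such a results-logging schema might look like (the tables and columns here are assumptions, not the schema discussed), using sqlite3 from the Python standard library:

```python
import sqlite3

SCHEMA = """
CREATE TABLE IF NOT EXISTS runs (
    run_id     INTEGER PRIMARY KEY,
    build      TEXT NOT NULL,   -- build identifier the run covered
    started_at TEXT NOT NULL    -- ISO-8601 timestamp
);
CREATE TABLE IF NOT EXISTS results (
    result_id  INTEGER PRIMARY KEY,
    run_id     INTEGER NOT NULL REFERENCES runs(run_id),
    suite      TEXT NOT NULL,   -- test suite name
    test_name  TEXT NOT NULL,
    status     TEXT NOT NULL,   -- e.g. 'pass', 'fail', 'not_automated'
    message    TEXT             -- failure detail, if any
);
"""

def open_results_db(path=":memory:"):
    # One row per automation run, one row per individual test result.
    conn = sqlite3.connect(path)
    conn.executescript(SCHEMA)
    return conn

def log_result(conn, run_id, suite, test_name, status, message=None):
    conn.execute(
        "INSERT INTO results (run_id, suite, test_name, status, message) "
        "VALUES (?, ?, ?, ?, ?)",
        (run_id, suite, test_name, status, message),
    )
    conn.commit()
```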
- Standards besides coding standards might include: test cases/suites organization and naming and file system organization
- Mostly used Python with PyUnit, plus AutoIt
- It was recommended that the dev team's tools be used for automation development; among other benefits, this helps get buy-in from dev and makes it more likely they will use the automation
tool/framework when needed - in their case this meant Subversion, PyUnit, and Eclipse/PyDev; tests could be run from the IDE or the command line
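A minimal PyUnit (unittest) example of the kind of test file this setup supports; the test names and the system under test are illustrative stand-ins, not from the presentation. The same file runs inside Eclipse/PyDev or from the command line with `python -m unittest`:

```python
import unittest

def authenticate(user, password):
    # Stand-in for the real system under test.
    return (user, password) == ("alice", "secret")

class LoginTests(unittest.TestCase):
    def test_valid_credentials_accepted(self):
        self.assertTrue(authenticate("alice", "secret"))

    def test_bad_password_rejected(self):
        self.assertFalse(authenticate("alice", "wrong"))
```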
- Reporting tools included Google Visualization (non-gadgetized, to keep it simple)
- They had one-week sprints; this meant the automation engineer had to be at every sprint meeting - just reading the user stories was not enough
- Automation was run on each build (not each checkin). Builds took about 40 minutes and automation runs took on the order of 5 hours
- Useful approach to automated test organization and planning/prioritizing: create a skeleton entry for each new test - even if not yet automated - containing
a manual test case name, a description, and a verification-approach description. These skeleton tests were logged even before the automation for a test had been coded,
so the automation results reports clearly showed what still needed to be automated, and it was easier to plan and prioritize future automation work.
There was also enough information for someone to either code the automated test case or run the test manually.
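The skeleton idea above can be sketched in PyUnit: each planned test carries its manual test case name, description, and verification approach in its docstring, and unautomated tests are skipped so they still appear in the results report as remaining work. All names and wording below are hypothetical, not from the presentation.

```python
import unittest

class InstallSmokeTests(unittest.TestCase):
    def test_silent_install_completes(self):
        """Manual test: Silent install.
        Description: run setup with a response file on a clean VM.
        Verification: installer exit code is 0 and the app launches."""
        # Skeleton only: shows up as 'skipped' in reports until automated.
        self.skipTest("not automated yet - run manually per docstring")

    def test_uninstall_removes_files(self):
        """Manual test: Uninstall cleanup.
        Description: uninstall after a default install.
        Verification: install directory no longer exists."""
        self.skipTest("not automated yet - run manually per docstring")
```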