Taking Your Automated UI Software Testing Effort from the Greenhouse to Deeply Rooted Within Your Organization

Automating user-interface testing can leverage your existing quality assurance team to cover more code in less time. However, building a good in-house automation suite is harder than it seems.

Following are five tips on how to avoid the headaches and grow your automation assets the right way.

1. Have Your Development Team Build a Trellis

Automation software maps the user interface of the target software. If the target software is well described with metadata, it is easy to map. Otherwise, the automation software has to “guess.” The brittleness of your automation code is directly related to how often the code has to “guess.”

[Image: Trellis in Salzburg. Source: Andrew Bossi – Wikipedia]

Here is an example:

  1. Cara has coded a webpage with a [Submit] button. The button has no additional descriptive metadata.
  2. Viktor’s automation code assumes that there is a button on the form with text inside it that says “Submit.”
  3. The form is translated into German for a release and Viktor’s code breaks.
  4. Cara puts in metadata (name="submit_btn"); Viktor’s code uses that as the criterion to map the button, and the automation code is now language-independent.

If a new project that will be automated is on the horizon, advocate for metadata “hooks” as part of the software requirements. If you are automating an existing project, submit bugs or feature requests asking that crucial controls have metadata added to them.
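To make this concrete, here is a minimal sketch in Python using Selenium WebDriver (one common UI automation library, not necessarily the tool your team uses). The URL and element names are assumptions for illustration; the point is the contrast between guessing from visible text and mapping by a metadata hook.

```python
# A minimal sketch with Selenium WebDriver; the URL and element names
# ("submit_btn") are hypothetical, mirroring the example above.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
driver.get("https://example.com/order")  # hypothetical page

# Brittle: guesses based on visible text; breaks when the form
# is translated into German.
submit = driver.find_element(By.XPATH, "//button[text()='Submit']")

# Robust: maps the control by its metadata hook, language-independent.
submit = driver.find_element(By.NAME, "submit_btn")
submit.click()

driver.quit()
```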

2. Prune Your Supporting Tests

Before writing any code, realize that the spreadsheets and step-by-step documents targeted for automation are like building blueprints: the quality of the blueprint directly affects the quality of the code. If, while reading through a test, you find yourself asking, “What exactly is being tested here?”, that test should not be automated as written. Unclear tests produce poor automation code, much like a bad blueprint produces a rickety house. It is simple to update wording and expected-result text early in the process; it is much harder to modify subroutines and database checks later.

Automation truly shines when a simple task is run many times across different environments. That is why it is critical to write the candidate test documentation as plainly and simply as possible. This style of writing produces automated code that is clear, modular, and effective.
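As a rough illustration, here is what a simple, environment-driven test might look like in Python with pytest and Selenium. All URLs and element names are hypothetical; note how the test reads almost like the step-by-step document it was grown from.

```python
# A minimal sketch of one simple task run unchanged on several
# environments; all URLs and element names are assumptions.
import pytest
from selenium import webdriver
from selenium.webdriver.common.by import By

ENVIRONMENTS = [
    "https://staging-us.example.com",
    "https://staging-eu.example.com",
]

@pytest.mark.parametrize("base_url", ENVIRONMENTS)
def test_order_confirmation_shown(base_url):
    """Clear intent: placing an order shows a confirmation message."""
    driver = webdriver.Chrome()
    try:
        driver.get(base_url + "/order")
        driver.find_element(By.NAME, "product_search").send_keys("princess phone")
        driver.find_element(By.NAME, "submit_btn").click()
        message = driver.find_element(By.NAME, "confirmation").text
        assert "confirmed" in message.lower()
    finally:
        driver.quit()
```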

3. Care & Feeding of the Automation Code

Automation code should be treated just like any other code in the organization. If your team handles automation code as something that will only be used a few times, that will become a self-fulfilling prophecy. If code is tucked away in someone’s “My Documents” folder instead of being checked into a version control system, don’t be surprised when it disappears forever after that person leaves, or when recovering it requires an archeological excavation of a decommissioned machine.

Automation code needs coding standards to remain manageable. Code reviews are recommended and code documentation should be required. If your code is an incoherent mess, people in your organization may not treat your budding automation effort with the respect it needs to grow and stay funded.
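For example, a coding standard might require that shared helpers be documented and free of ad-hoc sleeps. The sketch below (Python with Selenium again; the helper name and behavior are my own, not from the post) shows the kind of small, reviewable building block that keeps a suite coherent.

```python
# A minimal sketch of a documented, reviewable helper a coding
# standard might require; the name and behavior are illustrative.
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

def click_when_ready(driver, name, timeout=10):
    """Click the control whose metadata name is `name`.

    Waits up to `timeout` seconds for the control to be clickable,
    so callers do not scatter ad-hoc sleeps through their tests.
    """
    button = WebDriverWait(driver, timeout).until(
        EC.element_to_be_clickable((By.NAME, name))
    )
    button.click()
```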

4. Grow Stronger Automation Code through Time

Typically, a testing project plays out something like this:

  1. Bill creates his first test: ordering a princess phone from an e-commerce site. His code assumes that a specific user is already registered on the site. The completed code reports problems on 3 of the 8 platforms. The QA manager investigates the three platforms and finds that the e-commerce deployments were not broken; those 3 platforms simply did not have the specific user needed to run the test.
  2. Bill creates another test a week later. In this one he has to register a new user on the site.
  3. Bill revisits the ordering test and puts in some of the registration code to make the ordering test independent of any specific user.
  4. Bill runs his first test again. This time, 7 out of 8 platforms work. The last one reports an ugly, confusing error message. He eventually traces through the code and finds that the test fails because the princess phone is not in stock on that platform. Bill will have to revisit this test later to check the in-stock status of the product first.

The first lines of code written are rarely the final word on how the suite will look at the end of a project. As the automated testing project grows and matures, the code becomes more dependable as new “tools” are created to deal with different edge cases and environments. There will be a frustrating period early in the process when either no deliverables are coming through the pipeline, or the work that is submitted seems to fail all the time. This “germination” period is a normal part of the process and should be accounted for in the planning.
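Here is a sketch of where Bill’s ordering test might end up once the registration and stock-check code have been extracted into reusable tools. Every helper named here is hypothetical; the shape of the test is what matters.

```python
# A minimal sketch of the ordering test after it has grown its own
# "tools"; every helper named here (start_browser, ensure_user_registered,
# ensure_in_stock, place_order, order_confirmed) is hypothetical and
# stands in for code Bill extracted from earlier tests.

def test_order_princess_phone(env):
    driver = start_browser(env)                        # hypothetical helper
    try:
        user = ensure_user_registered(driver, env)     # fixes the 3-platform failure
        ensure_in_stock(driver, env, "princess phone") # fixes the out-of-stock failure
        place_order(driver, user, "princess phone")
        assert order_confirmed(driver)                 # the actual check under test
    finally:
        driver.quit()
```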

5. Planting on Unstable Ground

If a target software product is rapidly evolving, with frequent changes, realize that it may not be the right “climate” for automation. Every time automation code has to be rewritten because the project changed, the time savings and value of automation decline. If automation is necessary regardless of the climate, concentrate first on the stable regression-testing areas, and test the unstable new features manually until they become suitable for automation.
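One way to enforce that split, if you happen to use pytest, is to tag tests by stability and run only the stable suite in the pipeline. The marker names below are assumptions, not a standard.

```python
# A minimal sketch using pytest markers to separate stable regression
# tests from tests of still-changing features; marker names are my own.
import pytest

@pytest.mark.regression           # stable area: automate, run every build
def test_checkout_total():
    ...

@pytest.mark.unstable_feature     # volatile area: keep manual until it settles
def test_new_recommendation_widget():
    ...

# Run only the stable suite with:  pytest -m regression
# (Register custom markers in pytest.ini to avoid warnings.)
```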

In closing, automation offers many benefits such as potential time and cost savings. In order to maximize the value of automation, consider the tips listed above. What has been your experience automating software? What tools have you used? What would you do differently if you had to do it again? If you have additional tips, please share below.

Chris Hasbrouck is a Consultant II at SDLC Partners, a leading provider of business and technology solutions. Please feel free to contact Chris at chasbrouck@sdlcpartners.com with any questions on this blog post or to further discuss software testing.