Quality Assurance Automation: Three Ways It Saves the Day
Posted by Neil Eddy on February 13, 2017
Web development has never been more advanced and fast-moving than it is today. Between agile development, complex e-commerce sites, heavily customized role-based content, and the continual drive to better engage end users, we've created a serious need for more advanced development strategies. Often, one casualty of this rapid advancement is quality assurance testing, and when that process fails, it can have major consequences for your site and its bottom line.
The Problem We've Created
As the latest development practices roll out, teams find themselves at a growing disadvantage when it comes to testing. Every scenario an end user may land in becomes a potential point of failure with each new deployment. With deployments now occurring at least every other week, the hours required to manually check every facet of a complex build add up quickly.
This is where QA Automation comes in. It will save you in more ways than one.
#1. It's a Massive Time-Saver
Quality assurance automation drastically reduces the amount of time required to test.
An example: Not all that long ago we ran into a scenario on a high-traffic e-commerce site containing four unique roles, five product types, numerous tax and shipping variables, four different shopping carts, and three payment methods, with different results for each physical location. Testing every possible regression as part of each weekly deployment became an unmanageable task.
Multiply that by the eight devices we were testing against, and "impossible" was the word that came to mind. Faced with over a million variations, each of which could affect the client's bottom line, we needed a solution, and fast!
We turned to Selenium, building a massive suite that validated the basics of every scenario. By using a page object model, we were able to do so with a reasonably minimal effort. This allowed us, through the use of continuous integration, to run through all of the above scenarios every time a developer pushed to the development environment. In addition to the time savings, this process allowed us to catch hundreds of thousands of regressions without a person ever laying eyes on the product. This brings me to my next point…
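To make the page object model concrete, here is a minimal sketch in Python. The class names and locators are hypothetical, not from the actual suite described above; the point is that each page class owns the locators and actions for one page, so tests read as business steps and a markup change only needs fixing in one place.

```python
# Hypothetical page objects for a checkout flow. Each class wraps one page's
# locators and actions; tests never touch raw selectors directly.

class CheckoutPage:
    """Represents the checkout confirmation page."""

    TAX_FIELD = ("css selector", "#order-tax")  # assumed selector

    def __init__(self, driver):
        self.driver = driver

    def tax_amount(self):
        """Read the displayed tax and return it as a float."""
        text = self.driver.find_element(*self.TAX_FIELD).text
        return float(text.lstrip("$"))


class CartPage:
    """Represents the shopping cart page."""

    CHECKOUT_BUTTON = ("css selector", "#checkout")  # assumed selector

    def __init__(self, driver):
        self.driver = driver

    def proceed_to_checkout(self):
        """Click through to checkout and hand back the next page object."""
        self.driver.find_element(*self.CHECKOUT_BUTTON).click()
        return CheckoutPage(self.driver)
```

In a CI run, a test would instantiate `CartPage` with a real Selenium WebDriver and chain page objects through the scenario, one test per role/product/cart combination.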
#2. It's Vastly Superior to the Human Alternative
Quality assurance automation is hands-down the best solution for catching regressions and bugs.
No matter how good we are as human testers, there are always unforeseen ramifications to our actions. The larger and more complex the website, the more side effects there will be. Ideally and traditionally, with every new site deployment, a QA tester would rigorously go through every page and piece of functionality on the site. However, as the number of moving parts increases, this method becomes a distinct roadblock to moving in a rapid and agile manner.
By using QA automation, we can cover all of that ground programmatically and in detail, giving us far broader coverage of the site, while manually deep-testing only a few facets of complicated functionality.
I cannot stress enough how amazing it is to catch a minor detail through automated testing. For example, let's say a tax is suddenly being calculated incorrectly due to some small change in a certain product type. It's one of those bugs that manual testing is unlikely to catch: the tax system worked correctly when it was deployed three weeks ago, and nobody directly changed anything to do with taxes or prices. Since a number still appears in the tax field, it would easily be overlooked if not for automated review.
While it may seem understandable that such a regression would slip past manual testing, catching the error immediately can be a real lifesaver. A client discovering that they've charged customers 20% too little tax for an entire month can really strain a relationship.
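The tax example above comes down to one idea: an automated check shouldn't just confirm that *a* number appears in the tax field, it should recompute the expected value and compare. A minimal sketch, with hypothetical function names and a made-up tax rate:

```python
# Sketch of a tax-regression check: recompute the expected tax and compare it
# against the value displayed on the page, instead of just checking that the
# field is non-empty. Names and the 8.25% rate are illustrative assumptions.

from decimal import Decimal, ROUND_HALF_UP


def expected_tax(subtotal, rate):
    """Compute the tax a given subtotal should produce, rounded to cents."""
    return (Decimal(subtotal) * Decimal(rate)).quantize(
        Decimal("0.01"), rounding=ROUND_HALF_UP)


def check_tax(displayed, subtotal, rate):
    """Fail loudly if the displayed tax drifts from the recomputed value."""
    want = expected_tax(subtotal, rate)
    got = Decimal(displayed.lstrip("$"))
    assert got == want, f"tax regression: displayed {got}, expected {want}"
```

A check like this catches the scenario in the story: a tax that silently drops 20% still produces a plausible-looking number, but it no longer matches the recomputed value, so the suite fails the build instead of a client finding it a month later.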
#3. Nothing Gets Left Behind
Quality assurance automation ensures excellent documentation, so you can always know what's really going on.
At the rapid clip technology moves, it happens on occasion that some piece that was supposed to be deployed goes undocumented, and as a result something minor breaks in a deployment. At that point, you may find yourself scrambling to determine what was missed.
With continuous integration and test tools, we can now run pixel-by-pixel visual comparisons and compare the DOM, the CSS, and the source, all without lifting a finger. As a result, we can see exactly where the break lies and determine the cause of any problem much more rapidly. Plus, with a three-environment setup (development, staging, and production), we can do all of that, and make the appropriate corrections, before any issue becomes client-facing.
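The source-comparison half of that workflow can be sketched in a few lines. This is a simplified illustration using Python's standard `difflib`, not the actual tooling described above; a real pipeline would fetch rendered pages from each environment before diffing them.

```python
# Simplified sketch: diff two page-source snapshots (e.g. staging vs.
# production) to pinpoint exactly which lines changed in a deployment.
# A real pipeline would capture these snapshots from live environments.

import difflib


def source_diff(baseline, candidate):
    """Return a unified diff between two page-source snapshots."""
    return list(difflib.unified_diff(
        baseline.splitlines(),
        candidate.splitlines(),
        fromfile="staging",
        tofile="production",
        lineterm="",
    ))
```

An empty diff means the deployment changed nothing in that page's source; a non-empty one points straight at the lines that broke, which is what turns "scrambling to determine what was missed" into a quick lookup.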
A Savior on Countless Occasions…but Always Evolving
In the end, our team does not stand still on QA automation; we are always looking for more tools to add to our process and continuously improve it. We continue to grow and adapt to test regressions, page loads, analytics, and security, among many other things. In our business, we need to keep adding to our toolset, and even build our own tools when necessary.
It helps us rest easier at night knowing such a powerful set of tools has got our back. While automation will never replace a set of eyes on a page, you better believe it is that set of eyes’ best friend.
Neil Eddy is the Quality Assurance Automation Engineer at BlueModus. He ensures that all work completed by our technologists continually meets the rigorous standards our customers have come to expect. As the gatekeeper for each new project deployment, Neil is the mastermind behind our component and unit testing, validation of integrations and associated data transfers, as well as all device and browser compliance.