In an agile business, automation has become an essential practice. Companies increasingly rely on automated testing tools to boost efficiency and productivity, and automation is frequently evangelized as the cure-all for software quality woes. Automated tests do deliver real benefits: reusability, repeatability and broader coverage, along with savings in effort, time and cost compared to manual execution. However, automation is not the answer to everything. While it can improve many aspects of software development, it has limitations that developers and quality assurance teams should be aware of from the start. Understanding these limitations of automation testing will help us devise an efficient and effective automation strategy.
Limitations of Automation Testing
Need for scripting and programming skills
Automation testing requires coding and programming skills, so the people writing the tests need enough technical depth to produce robust automation code. This is a major limitation of automation testing, and it pushes companies to hire testers with substantial coding skills. Organizations are increasingly looking for experts who understand how to work with automated processes. If a tester lacks proficiency in this area, it can lead to the failure of a project and an inability to maintain scripts for future releases.
Writing test automation scripts is not an easy task. You need testers who are experienced at it; otherwise things can go badly wrong, and you will end up spending more money and time than if the testing had been done manually.
Once written, a test script is never truly finished. In agile development, application code undergoes repeated modifications as features are updated or changed. The automation code must be kept in sync: if it is not updated accordingly, it can produce faulty results and ultimately leave the product with undetected vulnerabilities and defects.
Note: Undertaking early automation with the intent of saving time/effort can result in more time being expended in the maintenance of the automation scripts. This is especially true if there are frequent changes to the features/functions being tested.
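One common way to contain this maintenance cost is to centralize element locators so that a UI change means editing one map rather than every script. A minimal sketch in Python; the page names and locators below are hypothetical, and a real suite would feed these into a Selenium-style driver.

```python
# Centralizing locators limits the maintenance blast radius: when the UI
# changes, only this map is edited, not every test script that uses it.
# The keys and locator values here are hypothetical examples.

LOCATORS = {
    "login.username": ("id", "username"),
    "login.password": ("id", "password"),
    "login.submit":   ("css selector", "button[type='submit']"),
}

def locator(key):
    """Look up a locator by logical name; fail loudly on unknown keys
    so a renamed page element surfaces as one clear error."""
    try:
        return LOCATORS[key]
    except KeyError:
        raise KeyError(f"No locator registered for '{key}'")
```

When the submit button's markup changes, only the `"login.submit"` entry is updated, and every test that calls `locator("login.submit")` picks up the fix.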
Training & Standards
Automation tools entail a high upfront investment in tool selection, license costs and training. Designing a solid automation script requires a good understanding of software development principles, which automation testers often lack because organizations rarely train them in development practices. As a result, everyone follows their own standards while developing scripts.
As an automation tester, you have probably heard at some point in your career: “Automation should be like pressing Enter and getting final test results at the end of the day without any human intervention.” Boss! It’s automation, not auto-magic. Maintenance places practical limits around what we automate: it is not feasible to automate everything, because we must maintain everything we automate. We need to be prudent about which tests we keep. Given technical, resource, time and budget constraints, the extent of automation achievable may not meet stakeholder expectations. Nor can automation fully replace manual testing, which is another misplaced expectation in some contexts.
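Deciding which tests are worth keeping automated can be made less arbitrary with a rough prioritization heuristic. The sketch below is one illustrative way to score candidates; the formula and the numbers are assumptions for demonstration, not a standard.

```python
def automation_score(runs_per_release, manual_minutes, stability):
    """Rough prioritization score: tests that run often, are costly to
    execute manually, and cover stable features score highest.
    stability ranges from 0.0 (changes every sprint) to 1.0 (frozen)."""
    return runs_per_release * manual_minutes * stability

# Hypothetical candidates for automation.
candidates = {
    "smoke suite":      automation_score(50, 30, 0.9),   # frequent, stable
    "new beta feature": automation_score(5, 20, 0.2),    # still churning
    "quarterly report": automation_score(1, 120, 0.8),   # rare execution
}
best = max(candidates, key=candidates.get)
```

Under these assumed inputs, the frequently-run, stable smoke suite scores far above a feature that is still changing every sprint, matching the intuition that churn makes automation expensive to maintain.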
Testing tools are not as intelligent as human beings; they have only the limited intelligence that was programmed into them. An automated test can only follow the rules defined for it, which means it is only as accurate as those rules. Ask it, for example, how the look and feel of the application comes across, and it has no answer.
Automation Tools fragmentation
There is no single tool that satisfies all testing requirements such as UI, database, functional, security and performance testing. Some tools specialize in functional testing (Selenium, HPE UFT), some in performance (HPE LoadRunner, JMeter) and others in security. Additionally, automated tools tend to be limited to a specific technology platform, i.e. tools that test web content, PDF content, SAP applications, desktop applications, etc. Tool selection therefore involves due diligence to assess compatibility with the applications being tested and the fit within the existing IT landscape. A single tool may not work best with all applications, and organizations may end up procuring more than one tool for automation, which adds to the challenges around tool cost and governance.
Incompatible Legacy Code
Another limitation of automation testing is that testers often come across legacy code that was never written with automated testing tools in mind. I remember dedicating one particular release to making our legacy code automation-friendly, i.e. adding specific element properties. Applying an automated testing tool to legacy code is a cumbersome, laborious task that entails many complexities.
Unidentified element properties
The greatest danger with automated accessibility tools is the assumption that they can replace human involvement in improving accessibility. Use accessibility tools, but remember they are only one part of the toolkit. Relying on them as the sole indicator of accessibility compliance gives a false sense of security; they may portray a site or app as fully accessible to people with disabilities when in fact problems exist. Evaluating website accessibility is an art, not a science, and it cannot be reduced to running a site through an automated tool. Questions such as whether the logical meaning of the content is preserved, whether the tab order is logical, whether keyboard access works and whether text alternatives are meaningful all require human judgment. Therefore you cannot stop at automated testing.
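A tiny example makes the gap concrete: a tool can mechanically detect that an image has no alt attribute at all, but it cannot judge whether the alt text that is present actually describes the image. A minimal sketch using Python's standard-library HTML parser; the markup is a made-up example.

```python
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    """Flags <img> tags that have no alt attribute. This is the kind of
    check a tool can do; judging whether existing alt text is meaningful
    still requires a human."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        attr_map = dict(attrs)
        if tag == "img" and "alt" not in attr_map:
            self.missing.append(attr_map.get("src", "<no src>"))

# Hypothetical page fragment: one image with alt text, one without.
html = '<img src="logo.png" alt="Company logo"><img src="chart.png">'
checker = AltChecker()
checker.feed(html)
```

The checker flags only `chart.png`. Whether "Company logo" is an adequate description of the first image, or whether the chart's eventual alt text conveys its data, is exactly the judgment no automated rule can make.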
UX Testing | Technical limitations
As soon as we move into areas where subjective qualities are being measured, automation breaks down. Possibly one of the biggest limitations of automation testing is that it cannot think like a human, which means user experience tests must be done manually. Appearance, ease of use and consistency are not qualities that an automated test script can accurately evaluate.
Certain tasks can only be performed practically with automation tools, such as load, endurance and scalability testing that simulates hundreds of users. Conversely, a few tasks cannot be automated at all, such as image CAPTCHAs, which exist precisely as a security measure: an image with distorted letters that can be identified only by the human eye.
Ad-hoc testing is a commonly used term for software testing performed without planning and documentation. It is done to learn more about the product through random exploration, and its main purpose is to find important defects quickly. Automation scripts won’t do that for you.
Return on Investment
Ascertain whether a business case for automation exists, i.e. whether the license and training costs would deliver business benefits in terms of the value delivered. The extent of automation that is possible and the number of times the automated test suite will be executed are key considerations here. If the procured tool requires niche skill sets, there will be additional costs to train resources. A significant ROI can be achieved by enlisting a team of developers, your software development partner and testers to choose tools that can be used across many applications, ensuring recurring use of the procured tools.
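The execution count is central because automation is a fixed cost amortized over runs. A back-of-the-envelope break-even sketch; the formula and the cost figures below are illustrative assumptions, not a standard ROI model.

```python
import math

def automation_break_even(tool_cost, scripting_cost, maint_per_run,
                          manual_cost_per_run):
    """Number of executions after which automating a test becomes
    cheaper than running it manually. Returns None if each automated
    run saves nothing, i.e. automation never pays off."""
    saving_per_run = manual_cost_per_run - maint_per_run
    if saving_per_run <= 0:
        return None
    fixed = tool_cost + scripting_cost
    return math.ceil(fixed / saving_per_run)

# Hypothetical figures: $2000 tool license, $500 to script the test,
# $10 upkeep per automated run vs $60 for a manual run.
runs = automation_break_even(2000, 500, 10, 60)
```

With these assumed numbers, automation only pays for itself after 50 executions; a test that will run a handful of times before the feature changes has no business case.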
While test automation does have some limitations, it can still be valuable for an agile organization to adopt in its software development processes. By understanding these limitations of automation testing, QA teams can better plan and leverage automated testing effectively. What are some limitations of automation testing you faced in your project? And how did you overcome them? Would love to hear your experience on this…