By Kate Mackinder
August 11, 2009 05:15 PM EDT
Agile testing can be defined as follows: "Testing practice for projects using agile technologies, treating development as the customer of testing and emphasizing a test-first design philosophy. In agile development, testing is integrated throughout the lifecycle, testing the software throughout its development."*
Agile is a methodology that is seeing increasingly widespread adoption, and it is easy to understand why, especially if you consider the developer and user points of view.
Users: Don't want to spend ages being quizzed in detail about the exact requirements and processes for the whole system, and then have to review a large specification, which they know could come back to haunt them.
Developers: Don't want to have to follow a tight specification, without any expression of their own imagination and creative talents, especially if they can see a better way.
Yet for the QA professional an Agile approach can cause discomfort. In an ideal world they would have a 'finished' product to verify against a finished specification; being asked to validate a moving target against a changing backdrop is counterintuitive. It makes the use of technology and automation much more difficult, and it requires a new approach to testing, just as it does for the users and the developers.
All the agile approaches have (at least) one characteristic in common: they impact the role of the QA professional. This in itself is not a bad thing when the outcome is a step change for the better. However, when decisions are made on the basis of an invalid paradigm, change is not always synonymous with progress. When a new paradigm is proposed for software development, by software developers, it is no surprise that it is developer-centric. Abraham Maslow said that "He that is good with a hammer tends to think everything is a nail." The responsibility of the QA profession is not to bury its head and pretend that agile development will go away; it is to engage in the discussion and ensure that someone with a hammer is not pounding on a screw!
With the emergence of Test Driven Development (TDD)**, some suggest the role of QA is now questionable, citing TDD as the key to testing. What matters most, though, is that QA is directly involved in the agile scrums all the way through, as an integral part of the team, designing the tests at the same time as the requirements and solutions evolve.
QA teams need to know the real impact of an Agile methodology; countless myths are circulating in the industry. Here is our reply to ten of our favorites:
Ten QA Myths Regarding Agile Testing
A number of comments and statements regarding TDD and the QA function are beginning to appear in articles on the internet by so-called specialists that are, at best, misguided. This article responds to some of these myths and highlights challenges that QA teams will need to address in order to be successful in an increasingly agile world.
Myth One: "You only need to unit test - TDD testing is sufficient"
For the vast majority of commercial developments this simply isn't true. Even strong proponents of agile development recognize the need for their armory to include a range of testing techniques. For example, Scott W. Ambler has a list of 21 different testing techniques as part of his FLOOT (Full Life Cycle Object-Oriented Testing) methodology, including white box, black box, regression testing, stress testing and UAT. (http://www.ambysoft.com/essays/floot.html)
TDD programmers rely on these tests to verify their code is correct. If the requirements (test cases) are specified incorrectly, then you'll build robust code that fails to meet the objective.
Therefore, most agile projects include investigative testing efforts (black-box) which support rather than compete with white-box testing. Good investigative testing will reveal problems that the developer missed before they get too far downstream.
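To make the distinction concrete, here is a minimal sketch (the discount rule, names and figures are entirely hypothetical) of a TDD unit test that passes while encoding a misread requirement, and the investigative boundary check that exposes it:

```python
# Hypothetical example: the business asked for a 10% discount on orders
# OVER $100, but the developer read the requirement as "100 or more".

def apply_discount(order_total):
    """Apply a 10% discount to qualifying orders."""
    if order_total >= 100:          # bug: requirement said strictly over 100
        return round(order_total * 0.9, 2)
    return order_total

def test_discount_applied():
    # White-box TDD test written from the same misreading -- it passes,
    # so TDD alone reports success.
    assert apply_discount(100) == 90.0

def test_boundary_investigated():
    # Investigative black-box test derived independently from the stated
    # requirement ("over $100") probes the boundary and fails,
    # exposing the misreading.
    assert apply_discount(100) == 100
```

Both tests are "correct" against their author's reading of the requirement; only the test derived independently from the stated requirement catches the discrepancy.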
Myth Two: "You can reuse unit tests to build a regression test suite"
Some TDD proponents argue that conventional downstream testing is unnecessary because every line of application code has a corresponding test case; they suggest that by reassembling unit tests, everything from User Acceptance Tests to Regression Tests can be performed.
Although this sounds feasible, it isn't realistic, and here's why:
The granularity and objectives of white-box unit tests developed in TDD differ from those of downstream black-box testing.
While a unit test is designed to prove that the code does what is expected, the aim of regression testing is to ensure that no unexpected effects result from the application code being changed. These two objectives are not synonymous: checking that an attribute has a valid date format, for example, is not the same as checking that, for a given input, the field contains the expected date.
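A minimal sketch of that date example makes the difference plain (the function and dates are hypothetical):

```python
from datetime import datetime, timedelta

def next_billing_date(signup: str) -> str:
    """Return the billing date 30 days after signup, as YYYY-MM-DD."""
    start = datetime.strptime(signup, "%Y-%m-%d").date()
    return (start + timedelta(days=30)).isoformat()

def test_valid_date_format():
    # Unit-test style check: the attribute has a valid date format.
    datetime.strptime(next_billing_date("2009-08-11"), "%Y-%m-%d")

def test_expected_date_value():
    # Regression-style check: for a given input, the field contains the
    # expected date. This would catch a change that silently shifted the
    # billing period while still returning a perfectly valid date.
    assert next_billing_date("2009-08-11") == "2009-09-10"
```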
Myth Three: "We no longer need testers, or automation tools"
Professional testers fulfill a different and equally valid role from their development colleagues.
It is widely recognized that traditional test automation tools have failed to live up to the hype of the vendors. Original Software's roots are as a provider of products that improve the productivity of the database testing environment. It was because there were no adequate tools for user interface testing that we extended our solutions into this market sector; our heritage is aligned to effective testing rather than screen-scraping automation. When we developed TestDrive, we did it with the benefit of twenty years' hindsight of missed opportunities from the traditional vendors.
Often, TDD projects have at least as much test code as application code and, therefore, are themselves software applications. This test code needs to be maintained for the life of the target application.
Myth Four: "Unit tests remove the need for manual testing"
Manual testing is a repetitive task: it's expensive, boring and error-prone. While TDD can reduce the amount of manual effort needed for functional testing, it does not remove the need for further black-box testing (manual and automated).
When the tester's process is automatically captured and their keystrokes and mouse-clicks documented, the tester has more time for the interesting, value-add activities, such as testing complex scenarios that would be difficult or impossible to justify automating. Though manual testing is a time-consuming (and therefore expensive) way to find errors, the costs of not finding them are often much higher.
Myth Five: "User Acceptance Testing is no longer necessary"
Within agile development, acceptance testing is often positioned as working with the customer to resolve "incorrect requirements", rather than correcting functionality that does not map to what the user needs. When the user initially defines their requirements, they do it based on their initial expectations. When they see the system "in the flesh" they will invariably come up with different, or additional, requirements. While agile methods might reduce the frequency of this happening, it is unreasonable to expect the approach to eliminate it entirely, so there should be no expectation that user acceptance testing will be obviated. This is especially true when it comes to the business user signing off approval of the user interface, since they may have envisaged something different from the developer, which brings us nicely to myth six...
Myth Six: "Automation is impossible"
Automation in the early stages of an agile project is usually very tough, but as the system grows and evolves, some aspects settle and it becomes appropriate to deploy automation that can cope with change, using approaches such as self-healing scripts.
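To illustrate the self-healing idea, here is a minimal sketch; the locator API mirrors Selenium-style find_element calls, but the helper and the locators themselves are our own hypothetical example, not a prescribed implementation:

```python
# A self-healing lookup tries an ordered list of locators, so a cosmetic
# UI change (e.g. an id rename) degrades gracefully instead of breaking
# the script outright.

def find_element_self_healing(driver, locators):
    """Try each (strategy, value) locator in turn; report any healing."""
    for strategy, value in locators:
        try:
            element = driver.find_element(strategy, value)
            if (strategy, value) != locators[0]:
                print(f"healed: fell back to {strategy}={value}")
            return element
        except Exception:
            continue  # this locator no longer matches; try the next one
    raise LookupError(f"no locator matched: {locators}")

# Usage: primary id first, then fallbacks that survive an id rename.
# submit = find_element_self_healing(driver, [
#     ("id", "submit-btn"),
#     ("name", "submit"),
#     ("xpath", "//button[text()='Submit']"),
# ])
```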
To begin with, almost all testing from a user and QA perspective will be manual, but this testing effort and design can be beneficial later if captured and reused.
Knowing the right time to automate is tough, so using technology that proactively supports early manual testing but provides a path to evolve this into automation is key.
Myth Seven: "Developers have adequate testing skills"
If testing were easy, everybody would do it and we'd deliver perfect code every time.
An independent testing team serves as an objective third-party, able to see "the big picture", to validate the functionality and quality of the deliverable. While developers tend towards proving the system functions as required, a good tester will be detached enough to ask "what happens if...?" When you include business user testing as well, you are more likely to have a system that is fit for purpose.
Finally, and I'm sure this point will be disputed, most developers don't actually want to spend much time first writing tests and then developing code to prove the tests work. Using the collaborative process described below, the developer gets ample assistance in describing the functional tests and can focus on writing lean, accurate and robust code.
Myth Eight: "The unit tests form 100% of your design specification"
Whichever development method you use, before you develop code you have to think about the requirements. While TDD "done well" may identify a fair percentage of the design specification, there are still concerns about gaps between the unit tests, and there are other equally viable approaches. The argument that you need TDD to prove the requirements are captured accurately isn't substantiated by history.
Defining test cases to check the accuracy and succinctness of the requirements is nothing new. The V-model, for example, is a valid approach to understanding testing requirements and, by implication, functional requirements. Like TDD, the V-model and most other models fall down when the practitioner's approach is rigid while development processes are fluid. Software engineering is not like mechanical engineering, and trying to force conformity is wasted effort. Whichever approach you choose, correct thinking challenges each user requirement by asking "how would I test that?" The important factor here is that test cases are challenged before they are committed to code; otherwise you'll be doing a lot of recoding and calling it "refactoring".
When the requirements are refined through collaboration, the developer receives a more robust specification that is less likely to change because it has been openly appraised from several people's perspectives.
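For example, a requirement can be challenged and pinned down as executable examples before any production code exists. Here is a minimal sketch using pytest; the withdrawal rule and all the names are hypothetical:

```python
import pytest

# Requirement agreed in review: a withdrawal must not exceed the
# available balance. The boundary rows below are exactly the cases that
# asking "how would I test that?" forces the team to decide up front.
@pytest.mark.parametrize("balance, amount, allowed", [
    (100.00, 50.00, True),    # comfortably within balance
    (100.00, 100.00, True),   # boundary: exactly the balance
    (100.00, 100.01, False),  # boundary: one cent over
    (0.00, 0.01, False),      # empty account
])
def test_withdrawal_rule(balance, amount, allowed):
    assert can_withdraw(balance, amount) is allowed

def can_withdraw(balance: float, amount: float) -> bool:
    """Production code, written after the examples above were agreed."""
    return amount <= balance
```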
Myth Nine: "TDD is applicable on every project"
As the size of the project increases, so does the time required to run the tests. This can be addressed by partitioning the project, the tests, or both; either route results in tests that run at different frequencies depending upon their relevance to the current coding effort. This approach introduces the need for test planning and execution management (a minimal tagging sketch follows the list below). To achieve this efficiently, in addition to white-box testing, you need to be thinking about:
Integration Testing - "Which tests do I need to run to ensure the new code works seamlessly with the surrounding code?"
System Testing - "Does the functionality supported by the new code dovetail with functionality elsewhere in this system, or in other systems within the process flow?"
Regression testing - "How often do I need to run a regression test to ensure there are no unforeseen impacts of the new code?" Automated regression testing provides a safety net that can affirm agile development techniques.
User Acceptance Testing - "While TDD (in collaboration with business users) should ensure that a specific function performs correctly, is the cumulative impact of changes still acceptable to the business users?"
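One lightweight way to manage these different frequencies is to tag tests and select them at run time. A minimal sketch using pytest markers follows; the marker names and schedule are our own convention, not a prescribed standard:

```python
import pytest

@pytest.mark.smoke
def test_login_happy_path():
    ...  # fast check, run on every commit

@pytest.mark.regression
def test_full_order_lifecycle():
    ...  # slower end-to-end check, run nightly

# Per commit:  pytest -m smoke
# Nightly:     pytest -m regression
#
# Markers are declared once in pytest.ini, e.g.:
#   [pytest]
#   markers =
#       smoke: fast checks run on every commit
#       regression: slower suite run nightly
```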
However, in today's environment it is unacceptable to think of these testing stages as discrete serial activities; often they will need to be run in parallel as a new 'code drop' arrives. As the size of the project team increases (along with the testing effort), it is no longer acceptable for the tests to be considered "self-documenting". Whenever the number of participants increases, the project's risks become exposed to its members' different interpretations of the requirements: the definition of what constitutes "success" can be misconstrued.
As the size of the project increases, so does the amount of test code that needs to be developed. Any test code developed needs to be supported for the life of the application, effectively doubling the ongoing maintenance effort.
As the testing workload increases, the project needs to add test automation to its armory to minimize the human intervention and elapsed time for each of these testing cycles.
Myth Ten: "Developers and Testers are like oil and water"
Since the dawn of time there has been a "them and us" tension between developers and testers. This is usually healthy: when working correctly, it is a symbiotic relationship that benefits both groups and results in a higher-quality deliverable for the customer.
The discussion should be focused on:
- Software delivery that meets business objectives (fit for purpose, on time and on budget), not who owns which part of the process.
- What can testers contribute to the requirements gathering phase to ensure they are involved in the TDD process?
- How can testers maximize reuse of the assets created during the development phase?
- Is there a role for the "traditional tester" in TDD, or should they (like the developers) be acquiring new skills to enable them to adapt to the new paradigm?
- How can developers and testers find mutual benefit in exploiting new advances in software development and testing tools?
Just as improvements in developers' software tools and methods have enabled a shift in development approaches, next-generation technology for test automation is similarly reframing the opportunities for testers to automate earlier in the delivery cycle without incurring the heavy burden of script maintenance so often associated with traditional automation tools.
Agile projects are in fact an excellent opportunity for QA to take leadership of the agile processes; who else is better placed to bridge the gap between users and developers, understanding what is required, how it can be achieved, and how it can be assured prior to deployment? QA should have a vested interest in both "the how" and "the result", as well as continuing to ensure that the whole evolving system meets the business objectives and is fit for purpose. But it requires QA professionals to be fluid and agile themselves, discarding previous paradigms and focusing on techniques to optimize a new approach to testing.
* Definition taken from The Glossary of Software Testing Terms from Original Software (www.origsoft.com)
**Defined on Wikipedia as "A software development technique consisting of short iterations where new test cases covering the desired improvement or new functionality are written first, then the production code necessary to pass the tests is implemented, and finally the software is refactored to accommodate changes."