To head off software problems before they start, developers and Quality Assurance (QA) teams need a way to anticipate, identify, and manage issues that might arise. When problems occur with a new release, QA teams get a lot of questions, many of which are tied more to the development process than to QA itself.
Ten questions that QA teams ask every day can expose key issues that show up in most new software releases:
- What requirements initiated the new software release? Be sure requirements are understood and accepted and that new features are really necessary.
- What additional features were added to the release and who requested these new features? Avoid scope creep and understand how new features not defined in the initial requirements may have been added to a release.
- Were the new features reported to QA and were test cases created? QA is too often the last to know about any new feature. Test cases against new features should be the first set of cases to be executed.
- How did the changes affect the overall application and was the impact added to the time line? Development teams often do not know the answer since analysis may be difficult and time consuming. The application build process should provide this level of impact information, and QA can use it to target particular test cases at crunch time.
- Did the development team initiate any unexpected changes? Developers often like to sneak in a few modifications to source code while they are in there. If a module must be updated, fixing old problems and cleaning up code is not unusual, but can be problematic.
- Who approved the changes to be promoted to the new release and do these changes match the approved requirements? The project manager should be aware of the level of system testing completed before the application moved to QA, and be confident the development team is truly finished and all requirements have been met.
- Where is the approved source code managed and stored, and are all third-party libraries also managed and stored? Getting all the pieces together is a must. If baselined source code can’t be found, it’s difficult to fix a release.
- How did the application build process verify matching source to executables, and is all source code used in the build coming from the approved and managed source code repository? Just because source code is stored in a central repository does not mean the executables moving through the QA environment were created from that source code.
- Based on the impact of the source code changes, what test cases were selected as critical? Discover what test cases must be executed prior to a production release. Knowing this will allow QA teams to refine the testing process so that any affected code is tested before moving to production.
- Who approved the release to production? Finally, understand who was ultimately responsible for approving the QA release. Development and QA teams should be on the same page regarding readiness for release.
These are reasonable questions. Now if only getting the answers were easier. With solid process tools, the answers can be found with a little digging. Questions 1 and 2 can be answered using a requirements gathering tool. Question 3 needs information from both a requirements tool and a testing tool. Question 4 can be answered using a commercial build management tool that provides impact analysis information. To answer Questions 5-8, a combination of tools must be used, including requirements, configuration management, and build management tools. Question 9 may require knowledge from both an impact analysis build management feature and a test tool. And finally, Question 10 involves a release management tool.
ALF lands to help QA
Some challenges remain, including:
- No single software team can quickly get answers to most of these questions with just one tool – cooperation is required between people and tools
- If teams have them, the necessary life-cycle tools seldom share data or even a common vocabulary
- Expertise in all of these tools is required to retrieve data needed to report on the most basic of development metrics
The Eclipse Application Lifecycle Framework (ALF) project can help. The ALF project, which is in the implementation phase with teams working on vocabulary and service flow definitions, will allow tools that support the application life-cycle process to communicate with one another using Web services transactions via a common communication framework.
ALF defines a common vocabulary for application life-cycle tools. This alone will improve the interaction between tools. Tools will communicate via Simple Object Access Protocol (SOAP) transactions over the ALF framework.
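As a rough sketch of what one of these SOAP transactions might look like, the following builds a minimal envelope for a hypothetical life-cycle event. The element names and the `urn:example:alf-event` namespace are illustrative assumptions, not taken from the ALF specification:

```python
# Sketch of a SOAP envelope for a hypothetical ALF life-cycle event.
# Element and namespace names are illustrative only; the real
# vocabulary is defined by the Eclipse ALF project.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
ALF_NS = "urn:example:alf-event"  # hypothetical namespace

def build_event(event_type: str, source_tool: str, payload: dict) -> str:
    """Serialize a life-cycle event as a SOAP message."""
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    event = ET.SubElement(body, f"{{{ALF_NS}}}Event")
    ET.SubElement(event, f"{{{ALF_NS}}}Type").text = event_type
    ET.SubElement(event, f"{{{ALF_NS}}}Source").text = source_tool
    for key, value in payload.items():
        ET.SubElement(event, f"{{{ALF_NS}}}{key}").text = value
    return ET.tostring(envelope, encoding="unicode")

# A build tool announcing a completed build to other life-cycle tools:
message = build_event("BuildComplete", "build-manager",
                      {"Release": "2.4", "Status": "success"})
```

The point of the shared envelope and vocabulary is that any tool listening on the framework can parse the event without tool-specific adapters.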
ALF also defines service flows. An example of a service flow would be Build/Deploy/Test. For example, to answer Question 9, “Based on the impact of the source code changes, what test cases were selected as critical?”, the build management tool could send a list of the impacted executables to the test tool. The test tool could then determine which test cases must be retested prior to production release based on the list provided by the build management tool impact analysis function.
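The selection step on the test-tool side can be sketched in a few lines. The coverage mapping from executables to test cases is hypothetical sample data; in practice it would come from the test management tool's own records:

```python
# Sketch: a test tool selecting critical test cases from the list of
# impacted executables sent by a build management tool. The coverage
# map below is hypothetical sample data.
def select_critical_tests(impacted, coverage_map):
    """Return every test case that covers an impacted executable."""
    selected = set()
    for executable in impacted:
        selected.update(coverage_map.get(executable, []))
    return sorted(selected)

coverage_map = {
    "billing.exe": ["TC-101", "TC-102"],
    "reports.exe": ["TC-102", "TC-210"],
}
impacted = ["billing.exe"]  # received from build impact analysis
critical = select_critical_tests(impacted, coverage_map)
print(critical)  # ['TC-101', 'TC-102']
```

Because the list arrives over the framework rather than by email or spreadsheet, the selection can happen automatically as soon as a build completes.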
Teams using ALF might make better use of requirements. Often QA does not have a complete picture of the requirements of a new release. To answer Question 3, “Were the new features reported to QA and were test cases created?”, the ALF service flow Requirements Management to Test Management will allow requirement tools to pass requirement information to testing tools. This aids QA in better defining new test cases for new requirements. Getting this information early as soon as the requirement is defined will give QA an immediate notification that new test cases are required.
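On the receiving end, a test tool could react to such a notification by immediately creating a placeholder test case for QA to flesh out. The field names here are assumptions for illustration; ALF's actual vocabulary may differ:

```python
# Sketch: a test tool handling a new-requirement notification from a
# requirements tool. Field names are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class TestCaseStub:
    requirement_id: str
    title: str
    status: str = "needs-definition"

def on_requirement_created(notification: dict, backlog: list) -> None:
    """Create a placeholder test case as soon as a requirement arrives."""
    backlog.append(TestCaseStub(notification["id"], notification["title"]))

backlog = []
on_requirement_created({"id": "REQ-42", "title": "Export to CSV"}, backlog)
# backlog now holds one stub flagged for QA to define real test cases
```

Even a stub like this gives QA the early notification the article describes: the moment a requirement is defined, a visible to-do item exists on the testing side.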
ALF is independent, but communicates
Developers don’t need to use the Eclipse IDE for teams to take advantage of the Eclipse ALF project. ALF is not a development tool, but a framework for life-cycle tools.
Life-cycle tools include requirements gathering, issue tracking, version control, configuration management, build management, testing, and deployment. No single team uses all of these tools; organizations divide the responsibilities among diverse teams, each using the tools for its part of the job. These teams must work together, and tools built to ALF specifications will have the potential to communicate directly as well.
Actively pursuing these types of questions and sharing the information found across the organization helps improve software quality. As vendors roll out tools integrated with ALF, organizations will benefit from implementing modern life-cycle tools with better information gathering and sharing capability.
More information: www.eclipse.org/alf/
Tracy Ragan has served as CEO of Catalyst Systems Corporation since it was founded in 1995. She is an industry leader specializing in version management tools, testing tools, IDEs, and deployment tools. Tracy has been an active member of the Eclipse Foundation since 2002. As CEO of Catalyst, she has worked with large financial, insurance, and embedded systems companies on defining a software life-cycle process that fully incorporates testing. Tracy can be reached at email@example.com.