Thursday, February 21, 2013

A Testing Process

Testing Process Example

Continuing the series of articles on Testing and TMMi implementation, today I will write about an example testing process.


The testing process is the set of activities that control and support the testing model, organizing all the roles, inputs, outputs and techniques that support it.

The testing process is the global process that any delivery will follow in the testing factory. It is summarized in the following diagram:

The main roles identified are:
  • Project Manager: the person responsible for managing the project; among other functions, this person will indicate the acceptable risk tolerance, the services to include in testing, the acceptance level, the quality rules to apply, the schedule and the costs.
  • Testing Team: the technical team that will handle all test cases, define the test scope (in agreement with the project manager), report defects (blocking defects and normal defects) and generate the testing reports.
  • Testing Manager: collaborates on the test plan definition together with the testing team and project management. Governs the testing process, the testing techniques, the templates and content of the testing reports, capacity management, availability management...
  • Development Team: the group of people building the delivery (software and/or documentation). They have to update the configuration management system with the alpha/beta versions and provide support for blocking defects.
What are Blocking and Normal defects?
  • Defect: any deviation between what is expected or documented and what is delivered.
  • Blocking Defect: a defect that blocks the execution of testing services and needs to be solved as soon as possible in order to continue with testing activities. A response time needs to be agreed to make this flexible model possible.
  • Normal Defect: any other defect; these should be fixed in upcoming releases and are reported at the moment they are detected.
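The two defect categories can be sketched as a tiny data model. This is a minimal, hypothetical sketch: the `Defect` class, its fields and the `triage` helper are my own illustrative names, not part of any real defect-tracking tool; the agreed response time for blocking defects is passed in as a parameter.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta
from typing import Optional

@dataclass
class Defect:
    summary: str
    blocking: bool = False                 # blocking defects halt testing execution
    reported_at: datetime = field(default_factory=datetime.now)
    resolved_at: Optional[datetime] = None

    def response_deadline(self, agreed_response: timedelta) -> Optional[datetime]:
        """Blocking defects must be answered within the agreed response time."""
        return self.reported_at + agreed_response if self.blocking else None

def triage(defects):
    """Split a defect list: blocking ones need immediate attention, normal ones wait for the next release."""
    blocking = [d for d in defects if d.blocking]
    normal = [d for d in defects if not d.blocking]
    return blocking, normal
```

A blocking defect such as "build fails" would then carry a deadline, while a cosmetic defect simply joins the queue for the next release.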

Description of the activities:
  • 1.- Request: the project manager asks the testing team or provider to test a delivery, providing the necessary input, for example: delivery scope, delivery dates or detected risks.
  • 2.- Delivery Study: the testing team studies the request together with the context information about the project (e.g. previous deliveries, previous testing reports, the test plan...).
  • 3.- Testing Plan: based on the summary made by the testing team, testing management analyses the request and formalizes a proposal of services, acceptance level, quality evidence, risk mitigation..., planning the testing services and budgeting the delivery.
  • 4.- Approve: the project manager has to validate the test plan scope, budget and schedule, in accordance with their responsibilities.
  • 5.- Waiting for Delivery: once the test plan is accepted, the development team announces when the delivery is ready for testing. In parallel, the testing team and testing management work to ensure platform availability, test case design and updates, and all the preparatory tasks that can be performed without the delivery itself.
  • 6.- Testing Execution: execution of all the testing services included in the scope of the delivery. If the testing process is blocked by a defect (e.g. the libraries needed to build are not available, the build process fails for any reason, the installation process fails), a blocking defect is registered to establish a collaboration channel between the testing team and the development team.
  • 7.- Blocking Defect Resolution: the development team has to solve this kind of defect as soon as possible. New instructions or new updates in the configuration management system may be needed.
  • 8.- Report Generation: once the services are finished, a report is generated to summarize the results and give recommendations to the project manager and/or the development team.
  • 9.- Revision: test management audits and reviews the test execution to ensure alignment with the quality policy.
  • 10.- Report Submission: the reports are distributed to the stakeholders for evaluation, in order to decide what to do with the delivery (ask for a new delivery, accept the delivery, or accept it and promote it to production).
  • 11.- Invoicing: the testing manager invoices all finished testing deliveries at the end of the agreed invoicing period.
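The delivery workflow above can be sketched as a simple state machine. The state names mirror the activities, and the transition table is my own reading of the process (for instance, blocking defect resolution loops back to testing execution); it is an illustration, not a formal specification of the process.

```python
# Allowed transitions between workflow states; each key lists its legal successors.
TRANSITIONS = {
    "request": ["delivery_study"],
    "delivery_study": ["testing_plan"],
    "testing_plan": ["approve"],
    "approve": ["waiting_for_delivery"],
    "waiting_for_delivery": ["testing_execution"],
    "testing_execution": ["block_defect_resolution", "report_generation"],
    "block_defect_resolution": ["testing_execution"],  # back to testing once unblocked
    "report_generation": ["revision"],
    "revision": ["report_submission"],
    "report_submission": ["invoicing"],
    "invoicing": [],                                   # terminal state
}

def advance(state: str, next_state: str) -> str:
    """Move to next_state only if the workflow allows that transition."""
    if next_state not in TRANSITIONS.get(state, []):
        raise ValueError(f"illegal transition {state} -> {next_state}")
    return next_state
```

Encoding the process this way makes the loop between testing execution and blocking defect resolution explicit, and rejects shortcuts such as jumping from the request straight to invoicing.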
This is just an outline; to complete the process, new activities, inputs and outputs should be included, but I think it is a good starting point.

The relation between this testing process and the TMMi process areas could be summarized in the following matrix (with many caveats: it is just an initial approximation):









Thursday, February 14, 2013

Testing Services, overview

Testing Services


I would like to summarize and describe a set of basic IT testing services that should be offered to any customer. In the context of TMMi implementation, there are several process areas whose goals (specific goals and specific practices) focus on defining a testing policy and increasing testing capabilities.

In this context, a testing service catalogue should be the basis for any testing organization.

The main TMMi process areas that should be involved here are:

  • Test Policy and Strategy
  • Test Design and Execution
  • Test Organization
  • Non-functional Testing
The list is not exhaustive, but from my point of view, it is a good starting point.

Early Testing: a set of services focused on the software engineering process that aims at early defect detection. Early Testing is based on increasing system quality by detecting defects early in the application lifecycle, decreasing both their impact and their correction cost. The earlier a defect is detected, the cheaper its solution. A quality assurance plan starts at the beginning of the project lifecycle and is applied throughout the project's life.
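The cost argument above can be illustrated with a toy calculation. The multipliers below are the often-cited textbook rule of thumb that fixing a defect gets more expensive the later it is found; they are illustrative figures, not data from any particular project.

```python
# Illustrative cost multipliers per lifecycle phase (textbook rule of thumb,
# not measured data): a defect caught in requirements costs 1 unit to fix,
# the same defect caught in production costs ~100 units.
COST_MULTIPLIER = {
    "requirements": 1,
    "design": 5,
    "coding": 10,
    "testing": 20,
    "production": 100,
}

def relative_fix_cost(phase_found: str, base_cost: float = 1.0) -> float:
    """Estimated fix cost given the phase where the defect was found."""
    return base_cost * COST_MULTIPLIER[phase_found]
```

Under these assumptions, a defect that would cost 1 unit to fix during requirements review costs 100 units once it reaches production, which is the whole economic case for Early Testing services.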




The mission of Early Testing services is to ensure that the technical and functional documentation is complete and conforms to end-user requirements. A basic catalogue of services should be:

  • Requirement Testing: testing the requirement specification, aimed at reviewing requirements, which should comply with:
    • Concision: written simply, clearly and understandably for all stakeholders
    • Completeness: requirements are written with enough information to be fully defined
    • Consistency: there are no contradictions between different requirements
    • Concreteness: requirements are not ambiguous and have only one interpretation
    • Verifiability: their fulfilment can be checked in the final product
  • Analysis Testing: activities applied to the analysis phase, aimed at reviewing how the analysis techniques have been applied and verifying the traceability between requirements and the analysis specification.
  • Design Testing: activities applied to the design phase, aimed at reviewing how the design techniques have been applied and verifying the traceability between the requirements, analysis and design specifications.
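Part of a Requirement Testing pass can even be automated. The sketch below is a deliberately crude illustration: it scans requirement texts for wording that usually breaks concreteness or verifiability (vague qualifiers, open-ended lists). The keyword list and function names are my own example, not a standard checklist.

```python
import re

# Example vague terms that make a requirement hard to verify; extend per project.
VAGUE_TERMS = re.compile(
    r"\b(fast|user-friendly|easy|etc\.?|as appropriate|if possible)\b",
    re.IGNORECASE,
)

def review_requirement(text: str) -> list:
    """Return the vague terms found in a single requirement text."""
    return [m.group(0) for m in VAGUE_TERMS.finditer(text)]

def review_all(requirements: dict) -> dict:
    """Map requirement id -> flagged terms, keeping only requirements with findings."""
    findings = {rid: review_requirement(t) for rid, t in requirements.items()}
    return {rid: f for rid, f in findings.items() if f}
```

A requirement like "The system shall respond within 2 seconds" passes, while "The UI must be fast and user-friendly" gets flagged, prompting the reviewer to ask for measurable criteria.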
Software Testing: services that apply to a software delivery. Software delivery testing is organized as a wide set of testing services focused on project risk management (this will be covered in another blog entry). The workflow of services could look like this:



  • Build and Static Code Review: source code verification, in order to measure compliance with coding rules and code quality. The build process ensures that the organization is able to build the application from the source code delivered by the development team. This guarantees independence from the development supplier and business continuity.
  • Setup Testing: is it possible to deploy the application with the documentation and sources that have been delivered?
  • Functional Testing: focused on the system's functional behaviour, according to the test plan and the requirement specification.
  • Usability Testing: tests system usability, ease of use, consistent behaviour... if the organization has usability rules, they should be reviewed.
  • Accessibility Testing: if there are any accessibility requirements, this service reviews their fulfilment.
  • Regression Testing: functional testing of previously delivered functionality, which should remain unchanged. Regression testing is usually based on automated programs aimed at decreasing testing cost.
  • Performance Testing: stress, capacity and continuous load tests in order to predict system behaviour under real conditions.
  • Security Testing: tests the security vulnerabilities of the system implementation (see the OWASP recommendations).
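The essence of automated regression testing is comparing each new delivery against recorded baseline behaviour. The sketch below illustrates that idea only; `feature_under_test` and the baseline table are placeholders standing in for a real system call and real recorded results.

```python
def feature_under_test(x: int) -> int:
    """Placeholder for the delivered functionality (here: a trivial doubling)."""
    return x * 2

# Expected outputs recorded from a previously accepted delivery.
BASELINE = {1: 2, 5: 10, 7: 14}

def run_regression(baseline: dict, func) -> list:
    """Return the inputs whose behaviour changed since the baseline."""
    return [inp for inp, expected in baseline.items() if func(inp) != expected]
```

Run against the unchanged function the suite reports nothing; if a new delivery silently changes one case, the regression run names exactly which input regressed, which is what keeps the cost of re-testing every delivery low.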


Wednesday, January 2, 2013

TMMi Overview

TMMi Overview


As a software testing professional, I have had to organize and manage a testing office for a public administration for the last three years. From the very beginning I focused on reading several documents and getting acquainted with testing techniques and services; then I moved on to test-oriented methods and to managing and communicating with the stakeholders in my organization.

As a horizontal service, we have to be able to adapt to different requirements and different approaches to delivery processes, quality criteria and summary reporting models.

After these years I concluded that we have to focus on several vectors of interest:
  • Technology capabilities: the testing office has to be able to master and specialize in all IT aspects, in order to diagnose software characteristics such as functionality, security, usability, accessibility...
  • Planning and Management: we have to be able to plan every delivery and to manage what we are doing (effort, defects, delays...): every aspect related to the testing process as a whole.
  • Stakeholder relations: knowing which roles are involved in a delivery and what has to be done, at each moment, by whom.
  • Methodology base: the project and every customer of the testing office have to be normalized, to the appropriate degree, in order to know the main lifecycle steps and to normalize every main aspect (architectural normalization, coding, exception management, integration capabilities...).
  • Acceptance model: we have to develop an acceptance model that decides, in an objective way, which deliveries are to be accepted and which require a proposal for a new delivery, in order to reach the acceptance quality level.
To answer all these issues, the Test Maturity Model Integration (TMMi) is a good approach; there are other approaches, but I will focus on TMMi because, for me, it is the broadest and most useful of them all.

TMMi presents a staged architecture to improve testing capabilities. Like CMMi, it is organized into maturity levels, process areas, generic goals, generic practices, specific goals and specific practices.




As a brief summary I will present the main content; in later posts I will try to show how we have implemented it in our organization.

The maturity levels proposed by TMMi are:
  • Level 1. Initial: no testing processes are defined or applied in the organization. Testing may happen, but there is no systematic approach; it could be considered debugging rather than testing.
  • Level 2. Managed: the process areas defined in this level are:
    • Test Policy and Strategy: establish a test policy (test objectives, goals and strategy); based on the test policy, a test strategy is defined. The strategy defines generic product risks and how to mitigate them. Test Performance Indicators (TPIs) are also derived from the test policy.
    • Test Planning: defines a test approach for performing and managing testing activities: determining which requirements will be tested, to what degree, and in which delivery. A schedule is given for every delivery.
    • Test Monitoring and Control: controls the test progress in order to detect any deviation, and the product quality in order to apply whatever exit criteria should apply.
    • Test Design and Execution: test specification (inputs, preconditions, execution, results, postconditions), test design techniques, execution of tests, and defect reporting and collaboration towards closure.
    • Test Environment: the test environment consists of hardware requirements, data sets and everything needed for testing purposes, with the goal of being independent and ensuring the reliability of testing results.
  • Level 3. Defined: 
    • Test Organization: the purpose is to identify and organize a group of people responsible for testing activities.
    • Test Training Program: develop a training program for the testing group in order to improve its knowledge.
    • Test Lifecycle and Integration: integration of the development lifecycle and the testing lifecycle.
    • Non-functional Testing: this process area aims at improving testing capabilities for non-functional testing, in order to systematize performance, security or installability testing.
    • Peer Reviews: verification of work products between different stakeholders, focused on product and defect understanding.
  • Level 4. Measured
    • Test Measurement: the purpose is to identify, collect, analyze and apply measurements to ensure the effectiveness and efficiency of test processes.
    • Product Quality Evaluation: objective measurement of product quality with a quantitative indicator.
    • Advanced Peer Reviews: adds to the Peer Reviews process area from level 3 an early product quality measurement to enhance the test strategy and test design prior to test execution.
  • Level 5. Optimization
    • Defect Prevention: identify and analyze common causes of defects across all software development in the organization, and define actions to prevent similar defects from occurring in the future.
    • Quality Control: statistical management and control of the test process; predicting product quality.
    • Test Process Optimization: continuous test process improvement.
To be exhaustive, I will also summarize the generic goals and practices:
  • GG2. Institutionalize a Managed Process
    • GP 2.01. Establish an organizational policy
    • GP 2.02. Plan the process
    • GP 2.03. Provide Resources
    • GP 2.04. Assign Responsibilities
    • GP 2.05. Train People
    • GP 2.06. Manage Configurations
    • GP 2.07. Identify and Involve Relevant Stakeholders
    • GP 2.08. Monitor and Control the Process
    • GP 2.09. Objectively Evaluate Adherence
    • GP 2.10. Review Status with Higher Level Management
  • GG3. Institutionalize a Defined Process
    • Establish a Defined Process
    • Collect Improvement Information