
Test Manager Cheat Sheet

A test manager shoulders the responsibility of contributing to the QA operations in an organization. They work closely with development teams, QA teams, and project managers to coordinate testing efforts and ensure that the software is thoroughly tested before it is released to users. Being well-versed with the various processes and concepts in the QA world is a must for every test manager. Hence we have created an easy-to-digest compilation that will come in handy if you are looking to brush up on your QA vocabulary.

Software development lifecycle (SDLC)

Principles of testing

Testing shows the presence of defects: Testing can unearth defects but cannot guarantee a defect-free product
Exhaustive testing is impossible: It is impractical to test every combination of inputs and conditions. Instead, focus on critical areas and workflows crucial for the business
Early testing: Fixing defects early in the cycle is easier and less costly
Defect clustering: A small number of modules or components typically contain the majority of defects. Focusing testing efforts on these areas can yield significant defect detection
Pesticide paradox: If the same tests are repeated over time, they become less effective at finding new defects
Testing is context-dependent: The testing approach and techniques should be tailored to the specific context of the project, including its goals, requirements, and constraints
Absence-of-errors fallacy: The absence of errors in testing doesn’t guarantee that the software is defect-free; some defects may remain undetected

Why do we test?

We test the product to:

  • Identify defects in the product
  • Make sure that the product meets customer requirements
  • Confirm that the product is well suited for its intended purpose

QA v/s QC

Quality Assurance (QA)
  • It is a preventive measure, focusing on the process of ensuring quality
  • Evaluates the effectiveness of methodologies, workflows, and processes within the organization for the STLC
  • Proactive in terms of setting guidelines before the product is developed
  • Involves documenting guidelines and standards, and even training other teams to follow them

Quality Control (QC)
  • Focuses on the product to ensure its quality
  • Ensures that the product is bug-free and ready for the end user
  • Reactive in the sense that testing starts after the product is developed

How much testing is enough?

Software testing lifecycle (STLC)

Requirement Analysis
  • Requirements documented during SDLC are analyzed for better understanding
  • The portion of testing that can be handled through automation is also decided here
Test Planning
  • This is high-level planning of the testing activities and the resources needed for them
  • A comprehensive test plan outlines the testing scope, objectives, resources, timelines, roles and responsibilities, and strategies
  • Should factor in risk and outline ways to mitigate or manage it
Test Design
  • Focuses on test case design, test data design, test environment design, and test procedure design
  • Test scenarios are identified here
  • The aim remains to provide maximum test coverage
Test Development
  • This is the actual creation of test cases
  • These may be manual or automated test cases
Test Execution
  • Test cases are run in the test environment
  • Results are documented for report generation
  • Any identified bugs are logged
Test Closure
  • Test reports are compiled to show execution results, time taken, issues identified, and test coverage

Levels of testing

Unit tests
  • Focus on individual units of code like functions and classes
Integration tests
  • Focus on the interactions between different modules or systems
End-to-end tests
  • Testing is done from the customer’s perspective rather than a developer’s
  • Usually UI-based, and hence takes longer to run
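As an illustration of the first level, a unit test exercises a single function in isolation. The function and test names below are hypothetical, using pytest-style test functions:

```python
# Hypothetical unit under test: a simple discount calculator
def apply_discount(price, percent):
    """Return price reduced by percent, rounded to 2 decimal places."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

# Unit tests check one unit of code in isolation, with no UI or
# external systems involved, which is why they run fast
def test_typical_discount():
    assert apply_discount(200.0, 10) == 180.0

def test_no_discount():
    assert apply_discount(99.99, 0) == 99.99
```

Integration and end-to-end tests sit above this: they would exercise this logic together with, say, a pricing service or the checkout UI.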

Types of testing

Functional testing

  • System testing
  • Acceptance testing
  • Smoke testing
  • Regression testing
  • UI testing
  • UX testing
  • Cross-platform testing
  • Usability testing

Non-functional testing

  • Performance testing
  • Load testing
  • Stress testing
  • Security testing
  • Accessibility testing
  • Globalization testing

Static testing

  • These techniques rely on reviewing or analyzing software artifacts, such as requirements, design documents, and code, to identify defects, inconsistencies, and potential issues
  • Common techniques include
    • Code reviews
    • Design reviews
    • Peer reviews
    • Walkthroughs

Dynamic testing

  • These techniques assess how the software functions during runtime and identify defects through actual execution
  • Common techniques include
    • System testing
    • Acceptance testing
    • Performance testing
    • Security testing
    • Usability testing
    • Load testing
    • Regression testing

Test management phases

Risk Analysis
  • Every project involves some form of risk
  • These should be analyzed and a mitigation plan should be made
Test Estimation
  • Estimating how long the testing activity will take
  • Results in better planning, execution and monitoring of tasks
Test Planning
  • High-level planning of the entire testing process
Test Organization
  • Assigning roles and responsibilities to testers in the team based on their skill level and competency
Test Monitoring and Control
  • Important for ensuring that the project is on track
  • Each testing activity needs to happen within its designated time, and if it does not, appropriate measures should be taken
Issue Management
  • Identified issues must be tracked till resolution
Test Report and Evaluation
  • A report of the testing activities needs to be provided to show the test coverage and if exit criteria were met

Test Strategy

  • It is a strategic overview of how testing will be conducted and sets the direction for the testing efforts
  • Test strategy talks about types of testing that will be performed, tools to be used for the same, along with the scope of testing
  • It may be part of a test plan or a stand-alone document
  • Differs from a test plan, which is more comprehensive and also covers aspects like test cases to be used, testing schedule, milestones, resource allocation, test environment, and the defect reporting process
  • Types of test strategies include
    • Analytical
    • Model-based
    • Methodical
    • Compliance-based
    • Regression-averse
    • Consultative
    • Reactive

Dealing with Risks

Risks can occur in:

  • The project
    • Risks that can impact project progress
  • The product
    • Risks that lead to the product not satisfying the requirements, customers, or stakeholders

Failures can occur due to:

  • Defects in the system
  • Environmental conditions
  • Malicious activities

Steps to manage risks

Risk mitigation strategies

These strategies aim to address risks before they negatively impact the testing process and project outcomes. They are as follows:

  • Risk avoidance: Bypassing the activities that cause the risk; usually reserved for high-stakes risks that demand such measures
  • Risk acceptance: Low-stakes risks that are acceptable and are usually forecasted during the planning stage are allowed to exist
  • Risk reduction or control: Making changes in the plan to avoid or minimize the risk and its consequences
  • Risk transfer: Delegating the risk handling to another party

Test Reporting

Types of test reports

Test incident report
  • Summarizes the defects found during testing. It includes details about each defect, such as its unique id, severity, status, steps to reproduce, and the affected component
Test summary report
  • Overall outcome of the testing activities to see if the product is ready for release
  • Comprises details like tests run, deviations from the test plan, and issues encountered
Test cycle report
  • If tests are run multiple times in a phase, or if test cases are run selectively during test cycles, this report tracks those results
Traceability matrix
  • Links test cases to requirements, ensuring that each requirement has corresponding test cases and vice versa
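A traceability matrix can be kept as simple as a mapping from requirements to the test cases that cover them. A minimal sketch, with hypothetical requirement and test case IDs:

```python
# Traceability matrix: requirement ID -> linked test case IDs
# (all IDs below are hypothetical)
traceability = {
    "REQ-001": ["TC-101", "TC-102"],
    "REQ-002": ["TC-103"],
    "REQ-003": [],  # no coverage yet
}

def uncovered_requirements(matrix):
    """Return requirements that have no linked test cases."""
    return [req for req, cases in matrix.items() if not cases]

print(uncovered_requirements(traceability))  # -> ['REQ-003']
```

Checking the matrix both ways (requirements without tests, and tests without requirements) is what makes it useful as a coverage safeguard.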

Entry and exit criteria

  • Entry Criteria: These are conditions that must be fulfilled before testing activities can commence
  • Exit Criteria: These are the conditions that need to be met before testing can be considered complete and the software can progress to the next phase or be released
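In practice, exit criteria often boil down to a checklist evaluated before sign-off. A minimal sketch, where the criteria themselves are hypothetical examples:

```python
# Hypothetical exit criteria evaluated before release sign-off
exit_criteria = {
    "all planned tests executed": True,
    "pass rate >= 95%": True,
    "no open critical defects": False,
    "test summary report published": True,
}

# Collect any criteria that are not yet satisfied
unmet = [name for name, met in exit_criteria.items() if not met]
ready_for_release = not unmet

print(ready_for_release, unmet)  # -> False ['no open critical defects']
```

Entry criteria can be handled the same way at the start of a phase (e.g. build deployed, test data ready, environment stable).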

Pointers to create a good test report

Responsibilities of a test manager

Test Metrics

Defect Density
  • Number of defects found in a specific component or area of the software, normalized by the size of that component
  • Defect Density = Number of Defects / Size of Component (e.g., lines of code, function points)
Defect Removal Efficiency (DRE)
  • Measures the share of total defects that were caught internally before release
  • DRE = (Total Defects Found – Total Defects Found in Production) / Total Defects Found
Test Coverage
  • Measures the extent to which the code or requirements are exercised by test cases
Defect Age
  • Measures how long a defect remains unresolved
Priority
  • The importance of resolving the issue with respect to the customer
Severity
  • The seriousness of the issue in terms of the functionality of the product
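The Defect Density and DRE formulas above can be computed directly. A minimal sketch; the numbers are illustrative:

```python
def defect_density(defects, kloc):
    """Defect Density = defects / size of component (here, KLOC)."""
    return defects / kloc

def dre(total_found, found_in_production):
    """DRE = (total defects found - defects found in production) / total found."""
    return (total_found - found_in_production) / total_found

# e.g. 30 defects in a 12 KLOC component; 2 of the 30 escaped to production
print(defect_density(30, 12))   # -> 2.5 defects per KLOC
print(round(dre(30, 2), 3))     # -> 0.933, i.e. ~93% caught internally
```

A DRE close to 1.0 means nearly all defects were removed before release, which is the trend a test manager typically tracks across cycles.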

Tips for choosing tools

Be it test management, automation testing, or issue tracking, using the right frameworks and tools will ensure that your activities are effective.

When choosing these tools, consider:

  • Ease of integration with existing systems
  • Costs in terms of license and additional skilled manpower
  • Ease of learning and onboarding
  • Suitability for the task (whether it offers the required functionalities)

Streamline automation testing using testRigor

In the hunt for a suitable automation testing tool, a test manager comes across many choices. Automation testing can seem like a double-edged sword with most tools in the market. Luckily, that is not the case with testRigor. This powerful test automation tool uses AI to make test creation, execution, and maintenance ultra-smooth and easy.

When it comes to writing automation scripts, you need not worry about your team learning a new coding language or hiring skilled experts to do the job. Not just manual testers, but anyone from your team can write test cases using this tool in plain English statements. testRigor leverages AI to convert these statements into executable steps. Not only that, but it also makes interacting with UI elements very easy. With the latest generative AI feature, testRigor can create fully functional test cases with just a scenario description.

When the testing volume increases and test suites tend to become heavy, most test tools start performing poorly. With the help of AI, testRigor gives you speedy test executions and reliable test runs. Moreover, your test maintenance efforts are reduced to a bare minimum as testRigor does most of the heavy lifting. This means that you can quickly complete your test cycles and generate accurate test reports.

Along with supporting end-to-end testing across web, mobile, and desktop applications, this tool can integrate with other tools and platforms that offer services like test case management, issue and requirement tracking, and device farms.

Like the companies that have migrated to testRigor and found significant improvements in their quality processes, you too can speak to testRigor’s sales team or try out the tool for yourself to see it in action.
