Talk · Rex Black, Inc.

Five trends affecting software testing. What the AI era is doing to the QA profession.

AI-generated code, platform engineering, expanded software liability, supply-chain risk, and continuous quality are the five macro-forces reshaping software testing right now. This talk names each, explains the threat to the legacy testing profession, and maps the specific skills and practices the next generation of test engineers has to build. It's the 2026 successor to a trend-forecasting talk first given in the early 2000s — the five calls made then (globalization, test automation, commoditization, compliance, certification) all played out. These five are the next round.

Trends covered: 5
Horizon: 3–5 years
Archival deck: 2005 original
Format: Keynote / briefing

Modern 2026 deck. The original 2005 forecast is preserved as an archival PDF — see the related links below.


Abstract

The profession reshapes itself every generation. This is that generation.

In the early 2000s Rex Black, Inc. called five trends that would reshape software testing: globalization of software development, early-phase test automation, IT/high-tech commoditization, compliance and regulatory pressure, and education/certification. All five played out. Offshore QA teams became standard; TDD and CI became table stakes; software got cheap and good enough, and the customer expectation shifted from "wows" to "works"; HIPAA and SOX grew into GDPR and EU AI Act; ISTQB became the industry's baseline credential. Calling a trend well is hard; calling five in a row is why this deck gets cited two decades later.

This talk is the successor — the five we're calling now. Not guessing; reading the forcing functions already in motion. (1) AI-generated software changes the economics of what "a unit of work" means, and forces testing to become AI-assisted too or lose relevance. (2) Platform engineering and internal developer platforms turn the test environment into a first-class platform concern. (3) Regulation and product liability have moved from checkbox compliance to a continuous obligation with legal teeth. (4) The software supply chain — open-source dependencies, third-party APIs, model providers — is now the largest unaudited risk surface in most products. (5) Continuous quality via observability replaces the old "test before ship, measure after" divide with a continuous signal loop that testers own end-to-end.

Each section names the trend, explains the threat to the legacy profession, describes what a competent testing function looks like on the other side, and points at canonical resources for building the skill. The original 2005 deck is linked as an archival PDF at the bottom of the page — useful reading before or after, mostly because watching the prior round play out is the best reason to take this round seriously.

Learn to sail into the wind. Disruptions create opportunities for those quick enough to seize them. Consider how the major trends will affect testing and plan your career moves accordingly.

Rex Black, Inc.

Outline

What the talk covers, in order.

01

Trend 1 — AI-generated software, and AI-assisted testing

Engineering teams are shipping AI-drafted code at several multiples of historical velocity. If test capacity does not scale proportionally, the bottleneck simply moves — from "not enough developers" to "not enough testers." The test profession has to meet AI-generated software with AI-assisted testing: LLM-based test generation, agent-based exploratory testing, model-evaluation harnesses for the AI components themselves, and behavioral testing against prompt-conditioned or fine-tuned systems. Testers who can use these tools, review their output critically, and calibrate them against real quality risks will be in high demand. Testers who refuse to engage with the category will face the same fate that manual-only testers faced in the outsourcing era two decades ago.

  • Threat: volume of code to test outpaces human test capacity; testing becomes the bottleneck or the blind spot.
  • What to learn: LLM-based test generation and critique, agent-driven exploration, model-eval frameworks, prompt/evaluator testing, AI behavior regression.
  • Resource: the ISTQB AI Testing syllabus, model-eval literature from providers (OpenAI evals, Anthropic's capability evals), published AI-incident postmortems.
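The model-evaluation harnesses mentioned above reduce, at their core, to a simple loop: run a suite of prompt/expected-answer cases against a model, grade each output, and summarize. Here is a minimal sketch of that loop in Python, with a stubbed model standing in for a real LLM client (the `model` callable and the canned answers are assumptions for illustration; any provider SDK slots into that position):

```python
# Minimal model-eval harness sketch: run a suite of eval cases against a
# model callable and score exact-match accuracy. The stub model below
# stands in for a real LLM client (hypothetical; swap in any provider SDK).

from dataclasses import dataclass
from typing import Callable

@dataclass
class EvalCase:
    prompt: str
    expected: str  # reference answer for exact-match grading

def run_evals(model: Callable[[str], str], cases: list[EvalCase]) -> dict:
    """Run every case, grade by exact match, and return a summary."""
    failures = []
    for case in cases:
        output = model(case.prompt).strip()
        if output != case.expected:
            failures.append((case.prompt, case.expected, output))
    total = len(cases)
    passed = total - len(failures)
    return {"total": total, "passed": passed,
            "accuracy": passed / total if total else 0.0,
            "failures": failures}

if __name__ == "__main__":
    # Stub model: canned answers; replace with a real client call.
    canned = {"2+2?": "4", "Capital of France?": "Paris"}
    stub_model = lambda prompt: canned.get(prompt, "unknown")

    suite = [EvalCase("2+2?", "4"),
             EvalCase("Capital of France?", "Paris"),
             EvalCase("Largest planet?", "Jupiter")]
    report = run_evals(stub_model, suite)
    print(f"{report['passed']}/{report['total']} passed")  # 2/3 passed
```

Real harnesses add graded (non-exact) scoring, rubric-based LLM judges, and regression tracking across model versions, but the structure — cases in, scored report out — is the same.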
02

Trend 2 — Platform engineering and ephemeral test environments

Internal Developer Platforms (IDPs) are absorbing what used to be ad-hoc infra work. Engineers now self-serve dev, test, and staging environments through a platform layer; infrastructure is code; the whole stack is described in Terraform/Pulumi/Crossplane/K8s. This changes where test environment management lives — the discipline of managing complex test environments, long an ad-hoc affair, is now a platform discipline. Testers who own their environment model, version it in git, and wire it into the platform team's CI/CD tooling get production-grade test infrastructure. Testers who still ask the ops team for a VM lose ground.

  • Threat: test environment management is being absorbed by platform engineering; QA teams that don't participate get what's left over.
  • What to learn: infra-as-code (Terraform/Pulumi), ephemeral preview environments, K8s test fixtures, CI/CD pipeline design, environment observability.
  • Resource: the CNCF platform-engineering landscape, the Team Topologies model, and our own managing-complex-test-environments talk (which still maps cleanly to platform-era tooling).
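"Owning the environment model and versioning it in git" means treating the environment as a declarative spec with a guaranteed-teardown lifecycle. Here is a sketch of that pattern, with a stubbed platform client in place of a real Terraform/Pulumi invocation or platform API (the `Platform`, `provision`, and `destroy` names are assumptions for illustration, not any real tool's API):

```python
# Sketch of an ephemeral test environment modeled as code: a declarative
# spec plus a context manager that provisions on entry and destroys on
# exit. Provisioning is stubbed here; in practice it would shell out to
# Terraform/Pulumi or call a platform API (all names are hypothetical).

from contextlib import contextmanager
from dataclasses import dataclass

@dataclass(frozen=True)
class EnvSpec:
    name: str
    services: tuple[str, ...]   # e.g. ("api", "db", "cache")
    ttl_minutes: int = 60       # auto-expiry guard for cost control

class Platform:
    """Stub platform client; records which environments are live."""
    def __init__(self):
        self.live: dict[str, EnvSpec] = {}
    def provision(self, spec: EnvSpec) -> str:
        env_id = f"{spec.name}-preview"
        self.live[env_id] = spec
        return env_id
    def destroy(self, env_id: str) -> None:
        self.live.pop(env_id, None)

@contextmanager
def ephemeral_env(platform: Platform, spec: EnvSpec):
    """Guarantee teardown even when a test inside the block fails."""
    env_id = platform.provision(spec)
    try:
        yield env_id
    finally:
        platform.destroy(env_id)

if __name__ == "__main__":
    plat = Platform()
    with ephemeral_env(plat, EnvSpec("pr-1234", ("api", "db"))) as env:
        assert env in plat.live    # environment exists for the test run
    assert not plat.live           # and is torn down afterwards
```

Because the spec is a plain data structure, it lives in git next to the tests, gets code-reviewed like any other change, and plugs directly into a pytest fixture or CI job.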
03

Trend 3 — Regulation, product liability, and AI governance

What began with SOX and HIPAA has compounded. GDPR, state-level US privacy laws, the EU AI Act, SEC cyber-incident disclosure rules, sectoral rules in healthcare (IEC 62304) and automotive (ISO 26262) and financial services — all of these treat software testing evidence as a legal artifact, not an internal hygiene practice. For AI-enabled products, governance adds a whole second stack: model documentation, evaluation disclosures, bias testing, model cards and system cards, post-deployment monitoring. Testing organizations that can produce audit-ready traceability (requirement → test → evidence → outcome) become risk-management infrastructure for the whole company. Those that can't become a source of legal exposure.

  • Threat: testing evidence is now discoverable. A poor testing record is a legal liability, not just a quality problem.
  • What to learn: requirements traceability, audit-grade test documentation, security-testing fundamentals (OWASP ASVS), AI-system evaluation standards, privacy-testing practice.
  • Resource: the EU AI Act published text, NIST AI Risk Management Framework, the OWASP ASVS, ISO 29119 for test-process alignment.
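The requirement → test → evidence → outcome chain is concretely a traceability matrix: every requirement mapped to the tests covering it, each test linked to its evidence, with gaps surfaced explicitly. Here is a minimal sketch of building one (the record shapes are illustrative, not any standard's schema):

```python
# Sketch of audit-ready traceability: link each requirement to the tests
# that cover it (with recorded outcome and an evidence pointer), then
# flag coverage gaps and failing requirements. Field names are
# illustrative, not a standard schema.

from dataclasses import dataclass

@dataclass
class TestRecord:
    test_id: str
    requirement_id: str
    outcome: str      # "pass" | "fail"
    evidence: str     # link/path to logs, screenshots, reports

def trace(requirements: list[str], records: list[TestRecord]) -> dict:
    """Build the requirement -> tests matrix; report gaps and failures."""
    matrix: dict[str, list[TestRecord]] = {req: [] for req in requirements}
    for rec in records:
        matrix.setdefault(rec.requirement_id, []).append(rec)
    uncovered = [req for req in requirements if not matrix[req]]
    failing = sorted({rec.requirement_id for rec in records
                      if rec.outcome == "fail"})
    return {"matrix": matrix, "uncovered": uncovered, "failing": failing}

if __name__ == "__main__":
    reqs = ["REQ-1", "REQ-2", "REQ-3"]
    recs = [TestRecord("T-10", "REQ-1", "pass", "ci/run-88/t10.log"),
            TestRecord("T-11", "REQ-2", "fail", "ci/run-88/t11.log")]
    report = trace(reqs, recs)
    print("uncovered:", report["uncovered"])  # uncovered: ['REQ-3']
    print("failing:", report["failing"])      # failing: ['REQ-2']
```

In practice the inputs come from the requirements tracker and the CI results store, and the output feeds the audit package — but the structural point stands: traceability is a join, and a join can be automated and re-run on every build.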
04

Trend 4 — The software supply chain as the largest unaudited risk

Every shipping product is now mostly open-source and third-party code — frameworks, libraries, cloud services, external APIs, model providers. The SolarWinds, Log4Shell, and xz-utils incidents showed what happens when that surface goes untested. Software Bills of Materials (SBOMs), SCA/SAST/DAST integration, API-contract testing, dependency-update risk analysis, third-party model monitoring — these are no longer developer-only concerns; they are testing concerns. The testing organization that owns the software supply-chain test program becomes indispensable. The one that still defines scope as "our code only" misses most of the product's actual attack and failure surface.

  • Threat: the product's risk surface is now bigger than the code the team wrote. Testing scope has to expand or it becomes irrelevant.
  • What to learn: SBOM tooling (Syft, Dependency-Track), SCA/SAST integration, API-contract testing (Pact, OpenAPI validation), dependency-risk scoring, third-party model evaluation.
  • Resource: CISA SBOM guidance, the SLSA supply-chain framework, OWASP's software supply-chain security top 10.
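At its simplest, putting an SBOM to work in a test program means cross-checking the component list against an advisory feed on every build. Here is a sketch of that check against a CycloneDX-style component list (the advisory data is illustrative, not a real feed; real pipelines would consume OSV or a tool like Dependency-Track):

```python
# Sketch: cross-check an SBOM component list against a known-vulnerable
# set. The SBOM shape loosely follows CycloneDX JSON ("components" with
# "name"/"version"); the advisory mapping is illustrative sample data,
# not a real vulnerability feed.

import json

def vulnerable_components(sbom_json: str,
                          advisories: dict[str, set[str]]) -> list[str]:
    """Return 'name@version' for every component with a known advisory."""
    sbom = json.loads(sbom_json)
    hits = []
    for comp in sbom.get("components", []):
        name, version = comp.get("name"), comp.get("version")
        if version in advisories.get(name, set()):
            hits.append(f"{name}@{version}")
    return hits

if __name__ == "__main__":
    sbom = json.dumps({"components": [
        {"name": "log4j-core", "version": "2.14.1"},
        {"name": "requests",   "version": "2.31.0"},
    ]})
    # Illustrative advisory data echoing the Log4Shell incident.
    advisories = {"log4j-core": {"2.14.0", "2.14.1"}}
    print(vulnerable_components(sbom, advisories))  # ['log4j-core@2.14.1']
```

Wired into CI as a failing gate, this turns "we depend on a vulnerable library" from a post-incident discovery into a pre-merge test result — which is exactly the scope expansion the trend demands.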
05

Trend 5 — Continuous quality via observability and SLOs

The old line — "testers test before ship, ops measures after" — does not survive modern deployment cadences. Feature flags, progressive rollouts, canary deploys, real-user monitoring, and SLO/error-budget discipline mean the biggest test signal in a mature organization comes from production. Testing organizations that operate only in pre-production miss the majority of the quality signal. The next-generation test engineer reads production telemetry, owns synthetic monitoring, defines and defends SLOs jointly with SRE, and closes the loop from "bug in production" back to "test added pre-production" in hours, not releases. This is where continuous quality lives.

  • Threat: pre-production-only testing is insufficient; production telemetry is where the real quality signal is.
  • What to learn: SLO design, error-budget accounting, synthetic monitoring, RUM (real-user monitoring), chaos testing, progressive-delivery tooling.
  • Resource: Google's SRE book (the SLO chapters), the OpenTelemetry project, the Continuous Delivery Foundation's progressive-delivery material.
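Error-budget accounting, the discipline that ties all of this together, is simple arithmetic: an SLO target implies an allowed number of failures over the window, and the ratio of observed failures to that allowance tells you how much budget is spent. A minimal sketch, with the freeze-at-100% policy as an assumed (and deliberately simplistic) release gate:

```python
# Sketch of error-budget accounting from an SLO: given a target (e.g.
# 99.9% success over a 30-day window) and observed request counts,
# compute how much of the budget has burned. The freeze-releases policy
# is an illustrative gate, not a universal rule.

def error_budget(slo_target: float, total: int, failed: int) -> dict:
    """slo_target is a fraction, e.g. 0.999 for 99.9%."""
    allowed_failures = (1.0 - slo_target) * total   # budget, in requests
    burn = failed / allowed_failures if allowed_failures else float("inf")
    return {
        "allowed_failures": allowed_failures,
        "budget_burned": burn,              # 1.0 == budget fully spent
        "freeze_releases": burn >= 1.0,     # simple policy gate
    }

if __name__ == "__main__":
    # A 99.9% SLO over 1,000,000 requests allows 1,000 failures.
    status = error_budget(0.999, 1_000_000, 750)
    print(round(status["budget_burned"], 2))  # 0.75 — 75% of budget spent
```

The point of the calculation is the conversation it forces: at 75% burn mid-window, the joint tester/SRE decision is whether to slow rollouts now or spend the rest of the budget on a risky release — a pre-production-only testing function never even sees that decision.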
06

How the five trends compound

The five trends are not independent. AI-generated code (1) raises the test volume, which forces automation into the platform-engineered environment (2), which produces audit-ready evidence for regulators (3), including evidence covering the third-party model dependency (4), with production telemetry (5) feeding the closed-loop quality system. The test-engineering role that survives the next cycle is the one that can operate across all five. The one that specializes narrowly in "write test cases against a spec" is the one that gets outsourced or AI-assisted out of the industry.

07

What any tester can do now — a skills ladder

A three-to-five year development plan that covers the five trends without requiring a career restart. Year one: pick up AI-assisted testing tools and a solid LLM-evals foundation. Year two: go deep on infra-as-code and CI/CD; commit the test-environment model to git. Year three: add audit-grade traceability and one compliance specialty (AI governance, medical, or financial). Year four: add supply-chain testing (SBOM, SCA integration). Year five: take ownership of production quality signal — SLOs, error budgets, synthetic monitoring. Five years, five trends, one professional you.

Key takeaways

5 things to remember.

01

The profession renews every generation.

The five trends called in 2005 all played out. The five we're calling now will too — on a 3-to-5 year horizon. Build skills that match the next round, not the last one.

02

AI-generated software changes the denominator.

If dev velocity rises 3–10× and test capacity stays the same, the bottleneck moves to testing — or the quality drops. AI-assisted testing is not optional in the medium term; it's the only way the math works.

03

Testing evidence is now a legal artifact.

GDPR, EU AI Act, SEC cyber rules, sectoral compliance — testing records are discoverable and auditable. Traceability (requirement → test → evidence) is now risk management, not just hygiene.

04

The product isn't just your code.

Supply chain risk — open-source dependencies, third-party APIs, model providers — is the largest unaudited surface most products have. Testing scope has to expand accordingly or it stops being useful.

05

Production is the biggest test signal.

SLOs, error budgets, synthetic monitoring, progressive rollouts. Pre-production-only testing leaves most of the quality signal on the floor.

Closing

Two closing points. First, the five trends are disruptive but not apocalyptic. The testing profession survived the outsourcing wave, the CI/CD wave, and the compliance wave. It will survive this round too — for practitioners who engage with it. The ones who retreat into a narrow "run the test plan" definition of the role are the ones who disappear.

Second, the best reason to take this round seriously is the last round. The archival 2005 deck linked below named five trends back then. Every one of them played out. That's the track record behind the calls here. Look at it, think about it, and build accordingly.

More for this audience

Articles, guides, and case studies tagged for the same readers.

Want this talk delivered in-house?

Rex Black, Inc. delivers every talk on this site as a live workshop, a keynote, or a conference session. Tailored to your stack, your team, and your timeline.