In classic Waterfall (plan-driven) projects, you are more likely to see BRD → PRD/FRD/SRS → design → build → test.
In Agile, you are more likely to see vision / roadmap / epics / user stories / acceptance criteria, though some teams still keep a PRD for larger features or releases.
A useful rule of thumb: keep at least one minimum artifact per phase.
Minimum set for a small Waterfall project
| Phase | Minimum Artifact |
|---|---|
| Planning | Business Case / Charter, Project Plan |
| Requirements | BRD, SRS/FRD, RTM |
| Design | HLD, LLD |
| Build | Source Code, Build/Config Docs |
| Testing | Test Plan, Test Cases, Defect Log, UAT Sign-off |
| Deployment | Deployment Plan, Release Notes, Rollback Plan |
| Maintenance | User Manual, Support Manual, Change Log |
The sequence follows the “Why → What → How → Execute” progression:
BRD (Business Requirements Document) - The “Why”: This captures the business goals, target audience, ROI, and high-level problems the software will solve. It gets alignment from stakeholders and executives.
PRD (Product Requirements Document) - The “What”: Product Managers translate the BRD into specific product features. This defines user journeys, UI/UX requirements, and functional specifications without worrying about the underlying code.
SDD (System/Software Design Document) - The High-Level “How”: Architects and Lead Engineers take the PRD and design the overall system architecture. This includes database schemas, cloud infrastructure, API contracts, and system integrations.
TDD (Technical Design Document) - The Detailed “How”: Senior Developers break the SDD down into specific component designs. This covers data structures, algorithms, class diagrams, and specific libraries to be used.
Code: The actual implementation, which should now be incredibly straightforward because all the hard thinking and problem-solving was done in the previous steps.
Traditional documentation is optimized for human alignment: building shared understanding over time. AI, on the other hand, requires context optimization.
AI agents act like incredibly fast, highly capable junior developers who have amnesia: they need explicit boundaries, structured inputs, and exact architectural constraints for every task.
The document strategy you should adopt when driving an AI coding agent:
AI struggles with 50-page PDFs. If you feed an entire SDD into an agent, it will lose focus or hallucinate. Your strategy must shift to bite-sized, isolated Markdown files.
Feature-Specific PRDs: Instead of one massive PRD, break down user journeys into single-feature Markdown files (e.g., feature-webhook-integration.md).
Targeted Context: When asking the AI to build a specific REST endpoint, you only feed it the PRD for that feature, the specific database schema it touches, and the global rules.
Before writing any feature code, you need a living document (often a .cursorrules file or a global system prompt) that defines the non-negotiable architectural boundaries. This replaces a lot of the boilerplate found in a traditional SDD.
It should explicitly state:
Tech Stack: “Use Java 21, Spring Boot 3, and PostgreSQL.”
Architectural Patterns: “We are using a microservices architecture. Implement the Transactional Outbox pattern for all database writes.”
Infrastructure: “Target deployment is GCP; optimize for containerized Kubernetes environments.”
Coding Standards: “Always use constructor injection. Write unit tests using JUnit 5 and Mockito.”
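Pulled together, those four categories might look like this in a global rules file. This is a hypothetical sketch; the file name and section layout are illustrative, while the stack details come straight from the bullets above:

```markdown
# global-architecture-rules.md (or .cursorrules)

## Tech Stack
- Java 21, Spring Boot 3, PostgreSQL. Do not introduce other frameworks or databases.

## Architectural Patterns
- Microservices architecture; each service owns its own data.
- Implement the Transactional Outbox pattern for all database writes.

## Infrastructure
- Target deployment is GCP; optimize for containerized Kubernetes environments.

## Coding Standards
- Always use constructor injection; never field injection.
- Write unit tests using JUnit 5 and Mockito.
```

Because this file is prepended to every task, it earns its keep by being short: every line is a hard constraint, not background reading.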
AI agents parse structured data much better than prose. Instead of writing paragraphs explaining how a system should work, your technical documentation should use formats the AI natively understands:
Database Schemas: Provide exact DDL (SQL) or ORM models instead of entity-relationship descriptions.
API Contracts: Use OpenAPI/Swagger specifications or clear JSON structures for request/response payloads.
Logic & Flows: Use Mermaid.js diagrams directly in your Markdown. AI agents can read Mermaid syntax to understand complex logic, like chunk conflict detection pipelines or data retrieval workflows, perfectly.
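For instance, a retry-and-dead-letter flow that would take a paragraph of prose can be expressed as a small Mermaid flowchart the agent parses directly (the flow shown here is an illustrative example, not from the source system):

```mermaid
flowchart TD
    A[Receive webhook event] --> B{Signature valid?}
    B -- no --> R[Reject with 401]
    B -- yes --> C[Persist event via Transactional Outbox]
    C --> D{Publish succeeded?}
    D -- yes --> E[Ack]
    D -- no --> F[Retry up to 3 times]
    F --> G{Still failing?}
    G -- yes --> H[Move to dead-letter queue]
    G -- no --> E
```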
With an AI agent, the document flow looks more like this:
Lightweight PRD (Markdown): Define the exact acceptance criteria and edge cases for a specific feature.
Architecture Decision Record (ADR): A brief document noting why a certain technology or pattern was chosen, keeping the AI aligned with the broader system architecture.
The Implementation Prompt: This becomes the new “TDD.” It is a highly specific prompt combining the lightweight PRD, the specific API contract, and the global rules, ending with a direct command to the AI.
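A minimal sketch of such an implementation prompt, assembled from those three pieces (the file names and the webhook feature are hypothetical):

```markdown
## Context
- Global rules: docs/global-architecture-rules.md (pasted below)
- Feature PRD: docs/features/feature-webhook-integration.md (pasted below)
- API contract: the POST /v1/webhooks operation from docs/api-contracts.yaml
- DB schema: the webhook_events table from docs/database-schema.sql

## Task
Implement the POST /v1/webhooks endpoint exactly as specified in the
contract. Validate the signature header, persist the event using the
Transactional Outbox pattern, and return 202 Accepted.

## Constraints
- Touch only the webhook module; do not modify shared code.
- Include JUnit 5 + Mockito unit tests for the success and
  invalid-signature paths.
```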
The AI Stack
docs/global-architecture-rules.md (Tech stack, cloud provider, patterns)
docs/database-schema.sql (Ground truth for data)
docs/api-contracts.yaml (Ground truth for communication)
docs/features/feature-X.md (The isolated PRD for the current task)
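As an illustration, docs/database-schema.sql holds exact DDL rather than entity-relationship prose. The tables below are a hypothetical sketch for the same illustrative webhook feature, assuming PostgreSQL:

```sql
-- Ground truth for data: the agent reads this file verbatim.
CREATE TABLE webhook_events (
    id           UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    source       TEXT        NOT NULL,
    payload      JSONB       NOT NULL,
    received_at  TIMESTAMPTZ NOT NULL DEFAULT now(),
    processed    BOOLEAN     NOT NULL DEFAULT FALSE
);

-- Outbox table supporting the Transactional Outbox pattern.
CREATE TABLE outbox_messages (
    id            UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    aggregate_id  UUID        NOT NULL,
    event_type    TEXT        NOT NULL,
    payload       JSONB       NOT NULL,
    created_at    TIMESTAMPTZ NOT NULL DEFAULT now(),
    published_at  TIMESTAMPTZ
);
```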
Ideation & Architecture (AI as a sounding board): lightweight Architecture Decision Records (ADRs) and Mermaid.js diagrams are generated in minutes, rather than massive SDD files.
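A lightweight ADR can be a few lines of Markdown. A hedged sketch, with an illustrative decision, in the common Status/Context/Decision/Consequences shape:

```markdown
# ADR-007: Use the Transactional Outbox pattern for event publishing

## Status
Accepted

## Context
Services must publish domain events reliably. Dual writes to the
database and the message broker can lose events on partial failure.

## Decision
Write events to an outbox_messages table in the same transaction as the
business data; a relay process publishes them and marks them as sent.

## Consequences
- At-least-once delivery; consumers must be idempotent.
- One extra table and a relay to operate, but no distributed transactions.
```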
Context Engineering (the new “design” phase): lock down the exact database schema and API contracts first, so every subsequent prompt builds on the same ground truth.
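Locking down the contract first might mean committing an OpenAPI fragment before any code is generated. A hypothetical excerpt for the illustrative webhook endpoint:

```yaml
# docs/api-contracts.yaml (excerpt)
openapi: 3.0.3
info:
  title: Webhook Service
  version: 1.0.0
paths:
  /v1/webhooks:
    post:
      summary: Ingest an inbound webhook event
      requestBody:
        required: true
        content:
          application/json:
            schema:
              type: object
              required: [source, payload]
              properties:
                source:
                  type: string
                payload:
                  type: object
      responses:
        "202":
          description: Event accepted for asynchronous processing
        "401":
          description: Invalid signature
```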
Implementation: Prompt-Driven Development
Micro-Prompting: feed the AI a highly scoped, single-feature Markdown file.
Asynchronous Orchestration: review PRs, trigger terminal commands, or interact with coding agents while work continues in parallel.
The human acts as the auditor, ensuring the right business logic is being tested, rather than just chasing code coverage percentages.
Velocity skyrockets, but technical debt can accumulate just as fast if the AI is left unchecked.
The focus shifts to strict code review processes, robust CI/CD pipelines, and ensuring the team is skilled at “context engineering” (writing good prompts and maintaining clean documentation) rather than just writing syntax.