Dynamic Analysis (DAST)
debt(d7/e5/b5/t7)
Closest to 'only careful code review or runtime testing' (d7). The absence of dynamic analysis in a CI pipeline is not flagged by compilers, default linters, or standard SAST tools. The detection_hints note the code pattern is 'only static analysis in CI with no runtime testing' — a reviewer or architect must notice the gap. Tools like OWASP ZAP and Xdebug are listed but they are the solution, not detectors of their own absence. A team can ship for a long time without realising runtime vulnerabilities exist.
Closest to 'touches multiple files / significant refactor in one component' (e5). The quick_fix mentions running Xdebug coverage and deploying OWASP ZAP against staging — this requires setting up new tooling, configuring staging environments to be production-like, integrating DAST into CI pipelines, and potentially reworking test suites to cover untested paths. It is more than a single-line swap but does not require an architectural rewrite.
Closest to 'persistent productivity tax' (b5). Incorporating dynamic analysis affects multiple work streams: CI/CD pipeline configuration, staging environment maintenance, security review processes, and developer workflows for interpreting DAST results. The applies_to covers both web and cli contexts, giving it moderate reach. It is not a single-component concern but also does not define the entire system's shape.
Closest to 'serious trap (contradicts how a similar concept works elsewhere)' (t7). The misconception field explicitly states the trap: developers treat static and dynamic analysis as interchangeable, assuming that passing static analysis means the application is secure. This is a well-documented but serious cognitive error — static analysis tools feel comprehensive and are deeply integrated into CI, leading developers to incorrectly conclude runtime vulnerabilities are also covered. The common_mistakes reinforce this: teams rely solely on static analysis and miss authentication bypasses and injection flaws that only manifest at runtime.
Also Known As
DAST, Dynamic Application Security Testing, black-box security testing
TL;DR
Test the running application, not just the source code: tools like OWASP ZAP and Burp Suite probe a live app for XSS, SQL injection, and misconfigurations that static analysis cannot see. Pair DAST with SAST and manual penetration testing.
Explanation
Dynamic Application Security Testing (DAST) attacks a live application from the outside, simulating real attack traffic without access to source code. Tools like OWASP ZAP and Burp Suite automate scanning for XSS, SQL injection, CSRF, open redirects, and misconfigurations. DAST finds vulnerabilities that static analysis misses (runtime configuration, third-party component behaviour) but has blind spots (complex business logic, authenticated workflows with state). Use DAST in combination with SAST (static) and manual penetration testing for comprehensive coverage.
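The runtime-only gap DAST fills can be sketched in a few lines. This is a hypothetical query builder (names invented for illustration): it satisfies every type check, yet is injectable, and the "probe" below simulates the kind of payload a scanner like ZAP sends against the running app. A taint-aware SAST tool might flag the concatenation, but default linters and type checkers will not.

```php
<?php
// Hypothetical example: string in, string out, so type-based static
// analysis sees nothing wrong. The flaw only shows up when the code
// runs against attacker-shaped input.
function buildUserQuery(string $username): string {
    return "SELECT * FROM users WHERE name = '" . $username . "'";
}

// Simulated DAST probe: the classic tautology payload.
$query = buildUserQuery("' OR '1'='1");

// The payload breaks out of the quoting, turning the WHERE clause
// into a condition that matches every row.
echo $query, "\n";
// SELECT * FROM users WHERE name = '' OR '1'='1'
```

The same class of bug is why DAST runs against a deployed instance: the exploitability depends on what the code does with real input at runtime, not on its declared types.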
Common Misconception
That static and dynamic analysis are interchangeable: because static analysis is deeply integrated into CI and feels comprehensive, teams assume a clean SAST run means the application is secure. Static analysis never executes the code, so runtime-only flaws pass it untouched.
Why It Matters
The absence of dynamic analysis is invisible: no compiler, linter, or SAST tool flags it, so a team can ship for a long time while authentication bypasses, injection flaws, and misconfigurations that only manifest at runtime go untested.
Common Mistakes
- Relying solely on static analysis and missing runtime-only vulnerabilities.
- Running dynamic analysis only in development; production-like load and data are needed to surface real issues.
- Not using fuzzing for input-handling code — fuzzers find edge cases that manual testing misses.
- Ignoring Xdebug's coverage mode for identifying untested code paths.
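The fuzzing point above can be sketched without any framework. This is a minimal hand-rolled harness against a hypothetical parser (both names invented for illustration); real fuzzers such as coverage-guided ones are far smarter, but the shape is the same: generate many random inputs, treat documented rejections as fine, and count everything else as a bug.

```php
<?php
// Hypothetical input-handling function under test.
function parsePort(string $raw): int {
    if ($raw === '' || !ctype_digit($raw)) {
        throw new InvalidArgumentException("not a number: $raw");
    }
    $port = (int) $raw;
    if ($port < 1 || $port > 65535) {
        throw new OutOfRangeException("out of range: $port");
    }
    return $port;
}

mt_srand(42); // fixed seed so the demo is reproducible
$unexpected = 0;
for ($i = 0; $i < 1000; $i++) {
    // Random printable string, length 0..8.
    $input = '';
    for ($j = 0, $len = mt_rand(0, 8); $j < $len; $j++) {
        $input .= chr(mt_rand(32, 126));
    }
    try {
        parsePort($input);
    } catch (InvalidArgumentException | OutOfRangeException $e) {
        // Documented rejection: expected, not a bug.
    } catch (Throwable $e) {
        $unexpected++; // A crash manual testing would likely have missed.
    }
}
echo "unexpected failures: $unexpected\n";
// unexpected failures: 0
```

A nonzero count here is exactly the kind of runtime-only defect that never shows up in a static analysis report.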
Code Examples
// Static analysis passes — dynamic analysis catches the bug:
function divide(int $a, int $b): float {
    return $a / $b; // Static analysis: types look fine
    // Dynamic analysis with input (1, 0): DivisionByZeroError at runtime
}
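The gap can be closed with a plain runtime check that actually exercises the edge case (assuming PHP 8, where division by zero throws rather than warning; the function is repeated so the snippet runs standalone):

```php
<?php
declare(strict_types=1);

function divide(int $a, int $b): float {
    return $a / $b;
}

// Dynamic check: execute the edge case instead of trusting the types.
try {
    divide(1, 0);
    echo "no error\n";
} catch (DivisionByZeroError $e) {
    echo "caught: {$e->getMessage()}\n";
    // PHP 8 prints: caught: Division by zero
}
```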
// Dynamic analysis — finds bugs by running the code
// Xdebug — code coverage + step debugging
$ XDEBUG_MODE=coverage vendor/bin/phpunit --coverage-html=coverage/
// OWASP ZAP — automated web vulnerability scanner
$ docker run -t owasp/zap2docker-stable zap-baseline.py -t https://staging.yourapp.com
// Infection — mutation testing (tests the quality of your tests)
$ vendor/bin/infection --threads=4
// Creates mutations (e.g. changes == to !=) — your tests should catch them
// MSI (Mutation Score Indicator) > 80% = good test quality
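What Infection automates can be illustrated by hand with a hypothetical function and one mutant (the == to != flip mentioned above): a good test input is one on which the original and the mutant disagree, which is what it means to "kill" the mutant.

```php
<?php
// Original implementation and a hand-made mutant of it.
function isZero(int $n): bool { return $n == 0; }       // original
function isZeroMutant(int $n): bool { return $n != 0; } // mutant: == flipped to !=

// A test input kills the mutant when original and mutant disagree on it.
$killed = isZero(0) !== isZeroMutant(0);
echo $killed ? "mutant killed\n" : "mutant survived\n";
// mutant killed
```

A surviving mutant means no test distinguishes the mutated behaviour from the real one, i.e. that behaviour is effectively untested.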
// Blackfire — performance profiling
$ blackfire run php script.php
// Peak memory usage (a rough stand-in for Valgrind-style memory checks in PHP):
$ php -d memory_limit=-1 -r 'require "script.php"; echo memory_get_peak_usage(true), " bytes\n";'