| Status | Evidence |
|--------|----------|
| Emerging | Proven in #745, #771 |
**Portable:** This investigation methodology could become a user-facing capability. When Piper helps users track and manage issues, the cascade audit pattern could guide them to related problems — "You fixed this bug; would you like me to check for similar issues in this category?" This preserves the discipline's value while automating the systematic audit prompts.
Bug fixing typically follows a reactive cycle: find the bug, fix the bug, move on. This leaves adjacent issues undiscovered until they surface independently — often at worse times.
The Cascade Investigation pattern addresses this by treating every bug fix as a trigger for category-wide audit, turning a single fix into a systematic sweep that surfaces adjacent problems while context is fresh.
Core Concept: When fixing a bug, audit the entire category before moving to the next task.
The key insight: a single bug is rarely alone. The conditions that created it likely created siblings. Fixing one without auditing the category is leaving known unknowns on the table.
What distinguishes this from related patterns:
```
            Bug Fix
               ↓
┌─────────────────────────┐
│     Category Audit      │ → "What else in this category
│  "Is this a pattern?"   │    might have the same issue?"
└─────────────────────────┘
               ↓
┌─────────────────────────┐
│   Adjacent Discovery    │ → File or fix each finding
│    (0-N new issues)     │
└─────────────────────────┘
               ↓ (for each discovery)
┌─────────────────────────┐
│   Recursive Category    │ → Does THIS finding suggest
│   Audit (if warranted)  │    a broader category?
└─────────────────────────┘
               ↓
  Resolution + Evidence Table
```
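The flow above is a worklist algorithm: each audited category can enqueue further categories discovered along the way. A minimal sketch, assuming hypothetical callbacks (`category_of` maps a finding to its category, `audit_category` runs whatever search the team uses):

```python
from collections import deque

def run_cascade(original_bug, category_of, audit_category):
    """Drive a cascade investigation as a worklist.

    original_bug   -- the fixed bug that triggers the cascade
    category_of    -- maps a finding to its category (or None)
    audit_category -- returns new findings for a category
    """
    evidence = [(0, original_bug, None)]            # depth-0 row: the original fix
    seen = set()                                    # categories already audited
    queue = deque([(1, category_of(original_bug))])
    while queue:
        depth, category = queue.popleft()
        if category is None or category in seen:
            continue                                # never re-audit a category
        seen.add(category)
        for finding in audit_category(category):
            evidence.append((depth, finding, category))
            queue.append((depth + 1, category_of(finding)))  # recursive audit
    return evidence
```

The `seen` set is what keeps the recursion bounded: a finding can point back to an already-audited category without looping forever.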
Each cascade investigation should produce an evidence table documenting depth:
| Depth | Discovery | Category Audited | Issues Found | Action |
|-------|-----------|-----------------|--------------|--------|
| 0 | Original bug | — | 1 | Fixed |
| 1 | Category audit | [category] | N | Fixed/Filed |
| 2 | Adjacent category | [category] | N | Fixed/Filed |
| ... | ... | ... | ... | ... |
| **Total** | | | **X issues** | |
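The table can be generated rather than hand-maintained. A small sketch (layout assumed from the template above; the row tuple shape is an illustration, not an established convention):

```python
def render_evidence_table(rows):
    """Render cascade rows as the markdown evidence table above.

    rows -- list of (depth, discovery, category, issues_found, action)
    """
    lines = [
        "| Depth | Discovery | Category Audited | Issues Found | Action |",
        "|-------|-----------|------------------|--------------|--------|",
    ]
    total = 0
    for depth, discovery, category, issues, action in rows:
        total += issues
        lines.append(f"| {depth} | {discovery} | {category or '—'} | {issues} | {action} |")
    lines.append(f"| **Total** | | | **{total} issues** | |")
    return "\n".join(lines)
```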
1. Fix the original bug.
2. PAUSE — don't move to the next task.
3. Ask: "What category does this bug belong to?"
   - Examples: timezone handling, user scoping, auth flow, setup UX
4. Audit: search the codebase for the same pattern in that category.
   - grep for similar code patterns
   - Check related files/modules
   - Review tests for the same assumptions
5. For each discovery:
   a. Fix it if quick (<5 min); otherwise file it as a tracked issue.
   b. Ask: "Does this suggest a DIFFERENT category to audit?"
   c. If yes, repeat from step 3 for the new category.
6. Document the cascade in an evidence table.
7. Report total discoveries to the PM.
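Step 4's codebase search can be as simple as a regex sweep. A minimal sketch (the example pattern and file glob are hypothetical, not Piper's actual layout):

```python
import re
from pathlib import Path

def audit_pattern(root, pattern):
    """Return (path, line_no, line) for every match of `pattern` under `root`."""
    rx = re.compile(pattern)
    hits = []
    for path in sorted(Path(root).rglob("*.py")):
        for no, line in enumerate(path.read_text(errors="ignore").splitlines(), 1):
            if rx.search(line):
                hits.append((str(path), no, line.strip()))
    return hits

# Example category audit: hardcoded user IDs (hypothetical pattern)
# hits = audit_pattern("src", r"user_id\s*=\s*\d+")
```

Each hit becomes a row for step 5: fix it on the spot or file it as a tracked issue.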
Trigger: #745 — Hardcoded user_id in todo service
| Depth | Discovery | Category | Issues | Action |
|---|---|---|---|---|
| 0 | #745 hardcoded user_id | — | 1 | Fixed |
| 1 | Audit: other hardcoded user_ids | User scoping | 6 | Fixed (#746-#751) |
| 2 | Timezone warnings surfaced | Datetime handling | 7 | Fixed (#752-#758) |
| 3 | Multi-tenancy audit gaps | Auth context | 2 | Fixed (#758-#759) |
| **Total** | | | **16 issues** | All resolved same day |
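The bug class behind #745 — a hardcoded identifier where the caller's identity should flow through — looks roughly like this (a sketch with hypothetical names, not Piper's actual code):

```python
# Before (the bug class): every caller sees user 1's todos.
def list_todos_buggy(db):
    return [t for t in db if t["user_id"] == 1]   # hardcoded user_id

# After: the caller's identity is an explicit, required parameter.
def list_todos(db, user_id):
    return [t for t in db if t["user_id"] == user_id]
```

The depth-1 audit is then a search for every other place the "before" shape appears.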
Trigger: #757 — File scoring uses wrong timezone comparison
| Depth | Discovery | Category | Issues | Action |
|---|---|---|---|---|
| 0 | #757 file scoring timezone | — | 1 | Fixed |
| 1 | Audit: datetime usage | Datetime imports | 3 | Fixed (#768, #769, #770) |
| 2 | Schema drift: 73 columns | DB schema | 1 | Migration #771 |
| 3 | Schema validator false positive | Tooling | 1 | Filed #773 |
| **Total** | | | **6 issues** | 5 fixed, 1 tracking |
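The category behind #757 — mixing naive and timezone-aware datetimes — typically surfaces in comparisons. A minimal sketch of the fix shape (hypothetical function; assumes naive timestamps were stored as UTC):

```python
from datetime import datetime, timezone

def is_recent(modified_at, window_seconds=3600):
    """Score a timestamp as recent. Comparing a naive `modified_at` against
    an aware `now` raises TypeError; normalizing to UTC removes the bug class."""
    if modified_at.tzinfo is None:
        modified_at = modified_at.replace(tzinfo=timezone.utc)  # assumed UTC storage
    now = datetime.now(timezone.utc)
    return (now - modified_at).total_seconds() <= window_seconds
```

The depth-1 audit then greps for naive constructors (`datetime.utcnow()`, bare `datetime.now()`) across datetime imports.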
Trigger: #734 — Calendar token leaked across users
| Depth | Discovery | Category | Issues | Action |
|---|---|---|---|---|
| 0 | #734 cross-user token leak | — | 1 | Fixed |
| 1 | Audit: user scoping in repos | Repository pattern | 12 | Fixed (9-phase plan) |
| 2 | Auth context in routes | Route auth | 3 | Fixed |
| **Total** | | | **16 issues** | 94 tests added |
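The fix class behind #734 — scoping every repository accessor to the authenticated user — can be sketched as follows (a hypothetical in-memory repository, not Piper's actual implementation):

```python
class CalendarTokenRepo:
    """Every accessor takes user_id; there is no unscoped lookup to misuse."""

    def __init__(self):
        self._tokens = {}          # user_id -> token

    def save(self, user_id, token):
        self._tokens[user_id] = token

    def get(self, user_id):
        # Keyed by user_id: one user's token can never be returned for another.
        return self._tokens.get(user_id)
```

The design choice is that scoping lives in the repository's signature, not in each caller: the depth-1 audit then checks every repository for an accessor that omits `user_id`.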
docs/internal/development/reports/pattern-sweep-2.0-results-2026-02-03.md

Pattern documented: February 5, 2026. Approved in Pattern Sweep 2.0 (#777) — CIO response February 4, 2026.