Mirror of https://github.com/samiyev/puaros.git (synced 2025-12-28 07:16:53 +05:00)

Compare commits (13 commits)

| SHA1 |
|------|
| 7fea9a8fdb |
| b5f54fc3f8 |
| 8a2c6fdc0e |
| 2479bde9a8 |
| f6bb65f2f1 |
| 8916ce9eab |
| 24f54d4b57 |
| d038f90bd2 |
| e79874e420 |
| 1663d191ee |
| 7b4cb60f13 |
| 33d763c41b |
| 3cd97c6197 |

@@ -5,6 +5,80 @@ All notable changes to @samiyev/guardian will be documented in this file.

The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [0.7.6] - 2025-11-25

### Changed

- ♻️ **Refactored CLI module** - improved maintainability and separation of concerns:
    - Split 484-line `cli/index.ts` into focused modules
    - Created `cli/groupers/ViolationGrouper.ts` for severity grouping and filtering (29 lines)
    - Created `cli/formatters/OutputFormatter.ts` for violation formatting (190 lines)
    - Created `cli/formatters/StatisticsFormatter.ts` for metrics and summary (58 lines)
    - Reduced `cli/index.ts` from 484 to 260 lines (46% reduction)
    - All 345 tests pass, CLI output identical to before
    - No breaking changes

## [0.7.5] - 2025-11-25

### Changed

- ♻️ **Refactored AnalyzeProject use-case** - improved maintainability and testability:
    - Split 615-line God Use-Case into focused pipeline components
    - Created `FileCollectionStep.ts` for file scanning and basic parsing (66 lines)
    - Created `ParsingStep.ts` for AST parsing and dependency graph construction (51 lines)
    - Created `DetectionPipeline.ts` for running all 7 detectors (371 lines)
    - Created `ResultAggregator.ts` for building the response DTO (81 lines)
    - Reduced `AnalyzeProject.ts` from 615 to 245 lines (60% reduction)
    - All 345 tests pass, no breaking changes
    - Improved separation of concerns and single responsibility
    - Easier to test and modify individual pipeline steps

### Added

- 🤖 **AI Agent Instructions in CLI help** - dedicated section for AI coding assistants:
    - Step-by-step workflow: scan → fix → verify → expand scope
    - Recommended commands for each step (`--only-critical --limit 5`)
    - Output format description for easy parsing
    - Priority order guidance (CRITICAL → HIGH → MEDIUM → LOW)
    - Helps Claude, Copilot, Cursor, and other AI agents immediately take action

Run `guardian --help` to see the new "AI AGENT INSTRUCTIONS" section.

## [0.7.4] - 2025-11-25

### Fixed

- 🐛 **TypeScript-aware hardcode detection** - dramatically reduces false positives by 35.7%:
    - Ignore strings in TypeScript union types (`type Status = 'active' | 'inactive'`)
    - Ignore strings in interface property types (`interface { mode: 'development' | 'production' }`)
    - Ignore strings in type assertions (`as 'read' | 'write'`)
    - Ignore strings in typeof checks (`typeof x === 'string'`)
    - Ignore strings in Symbol() calls for DI tokens (`Symbol('LOGGER')`)
    - Ignore strings in dynamic import() calls (`import('../../module.js')`)
    - Exclude tokens.ts/tokens.js files completely (DI container files)
    - Tested on real-world TypeScript project: 985 → 633 issues (352 false positives eliminated)
- ✅ **Added 13 new tests** for TypeScript type context filtering

## [0.7.3] - 2025-11-25

### Fixed

- 🐛 **False positive: repository importing its own aggregate:**
    - Added `isInternalBoundedContextImport()` method to detect internal imports
    - Imports like `../aggregates/Entity` from `repositories/Repo` are now allowed
    - This correctly allows `ICodeProjectRepository` to import `CodeProject` from the same bounded context
    - Cross-aggregate imports (with multiple `../..`) are still detected as violations
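
Illustrative example of the new behavior (the import paths here are hypothetical and shown only to clarify the rule):

```typescript
// repositories/ICodeProjectRepository.ts (inside the code-analysis bounded context)
import { CodeProject } from '../aggregates/CodeProject'        // ✅ now allowed: same bounded context
import { Invoice } from '../../../billing/aggregates/Invoice'  // ❌ still flagged: crosses into another aggregate
```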

## [0.7.2] - 2025-11-25

### Fixed

- 🐛 **False positive: `errors` folder detected as aggregate:**
    - Added `errors` and `exceptions` to `DDD_FOLDER_NAMES` constants
    - Added to `nonAggregateFolderNames` — these folders are no longer detected as aggregates
    - Added to `allowedFolderNames` — imports from `errors`/`exceptions` folders are allowed across aggregates
    - Fixes issue where `domain/code-analysis/errors/` was incorrectly identified as a separate aggregate named "errors"

## [0.7.1] - 2025-11-25

### Fixed

895 packages/guardian/COMPARISON.md (new file)
@@ -0,0 +1,895 @@

# Guardian vs Competitors: Comprehensive Comparison 🔍

**Last Updated:** 2025-01-24

This document provides an in-depth comparison of Guardian against major competitors in the static analysis and architecture enforcement space.

---

## 🎯 TL;DR - When to Use Each Tool

| Your Need | Recommended Tool | Why |
|-----------|------------------|-----|
| **TypeScript + AI coding + DDD** | ✅ **Guardian** | Only tool built for AI-assisted DDD development |
| **Multi-language + Security** | SonarQube | 35+ languages, deep security scanning |
| **Dependency visualization** | dependency-cruiser + Guardian | Best visualization + architecture rules |
| **Java architecture** | ArchUnit | Java-specific with unit test integration |
| **TypeScript complexity metrics** | FTA + Guardian | Fast metrics + architecture enforcement |
| **Python architecture** | import-linter + Guardian (future) | Python layer enforcement |

---

## 📊 Feature Comparison Matrix

### Core Capabilities

| Feature | Guardian | SonarQube | dependency-cruiser | ArchUnit | FTA | ESLint |
|---------|----------|-----------|--------------------|----------|-----|--------|
| **Languages** | JS/TS | 35+ | JS/TS/Vue | Java | TS/JS | JS/TS |
| **Setup Complexity** | ⚡ Simple | 🐌 Complex | ⚡ Simple | ⚙️ Medium | ⚡ Simple | ⚡ Simple |
| **Price** | 🆓 Free | 💰 Freemium | 🆓 Free | 🆓 Free | 🆓 Free | 🆓 Free |
| **GitHub Stars** | - | - | 6.2k | 3.1k | - | 24k+ |

### Detection Capabilities

| Feature | Guardian | SonarQube | dependency-cruiser | ArchUnit | FTA | ESLint |
|---------|----------|-----------|--------------------|----------|-----|--------|
| **Hardcode Detection** | ✅✅ (with AI tips) | ⚠️ (secrets only) | ❌ | ❌ | ❌ | ❌ |
| **Circular Dependencies** | ✅ | ✅ | ✅✅ (visual) | ✅ | ❌ | ✅ |
| **Architecture Layers** | ✅✅ (DDD/Clean) | ⚠️ (generic) | ✅ (via rules) | ✅✅ | ❌ | ⚠️ |
| **Framework Leak** | ✅✅ UNIQUE | ❌ | ⚠️ (via rules) | ⚠️ | ❌ | ❌ |
| **Entity Exposure** | ✅✅ UNIQUE | ❌ | ❌ | ❌ | ❌ | ❌ |
| **Naming Conventions** | ✅ (DDD-specific) | ✅ (generic) | ❌ | ✅ | ❌ | ✅ |
| **Repository Pattern** | ✅✅ UNIQUE | ❌ | ❌ | ⚠️ | ❌ | ❌ |
| **Dependency Direction** | ✅✅ | ❌ | ✅ (via rules) | ✅ | ❌ | ❌ |
| **Security (SAST)** | ❌ | ✅✅ | ❌ | ❌ | ❌ | ⚠️ |
| **Dependency Risks (SCA)** | ❌ | ✅✅ | ❌ | ❌ | ❌ | ❌ |
| **Complexity Metrics** | ❌ | ✅ | ❌ | ❌ | ✅✅ | ⚠️ |
| **Code Duplication** | ❌ | ✅✅ | ❌ | ❌ | ❌ | ❌ |

### Developer Experience

| Feature | Guardian | SonarQube | dependency-cruiser | ArchUnit | FTA | ESLint |
|---------|----------|-----------|--------------------|----------|-----|--------|
| **CLI** | ✅ | ✅ | ✅ | ❌ (lib) | ✅ | ✅ |
| **Configuration** | ✅ (v0.6+) | ✅✅ | ✅ | ✅ | ⚠️ | ✅✅ |
| **Visualization** | ✅ (v0.7+) | ✅✅ (dashboard) | ✅✅ (graphs) | ❌ | ⚠️ | ❌ |
| **Auto-Fix** | ✅✅ (v0.9+) UNIQUE | ❌ | ❌ | ❌ | ❌ | ✅ |
| **AI Workflow** | ✅✅ UNIQUE | ❌ | ❌ | ❌ | ❌ | ❌ |
| **CI/CD Integration** | ✅ (v0.8+) | ✅✅ | ✅ | ✅ | ⚠️ | ✅✅ |
| **IDE Extensions** | 🔜 (v1.0+) | ✅ | ❌ | ❌ | ⚠️ | ✅✅ |
| **Metrics Dashboard** | ✅ (v0.10+) | ✅✅ | ⚠️ | ❌ | ✅ | ❌ |

**Legend:**

- ✅✅ = Excellent support
- ✅ = Good support
- ⚠️ = Limited/partial support
- ❌ = Not available
- 🔜 = Planned/Coming soon

---

## 🔥 Guardian's Unique Advantages

Guardian has **7 unique features** that no competitor offers:

### 1. ✨ Hardcode Detection with AI Suggestions

**Guardian:**

```typescript
// Detected:
app.listen(3000)

// Suggestion:
💡 Extract to: DEFAULT_PORT
📁 Location: infrastructure/config/constants.ts
🤖 AI Prompt: "Extract port 3000 to DEFAULT_PORT constant in config"
```

**Competitors:**

- SonarQube: Only detects hardcoded secrets (API keys), not magic numbers
- Others: No hardcode detection at all

### 2. 🔌 Framework Leak Detection

**Guardian:**

```typescript
// domain/entities/User.ts
import { Request } from 'express' // ❌ VIOLATION!

// Detected: Framework leak in domain layer
// Suggestion: Use dependency injection via interfaces
```

**Competitors:**

- ArchUnit: Can check via custom rules (not built-in)
- Others: Not available
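
To make the suggestion above concrete, here is a minimal sketch of the dependency-injection fix: the domain owns an abstraction and only the infrastructure layer imports Express. File paths and names are illustrative, not part of Guardian's output.

```typescript
// Illustrative sketch, several files shown in one block

// domain/services/IRequestContext.ts - the domain owns the abstraction
export interface IRequestContext {
    readonly userId: string
    readonly locale: string
}

// domain/entities/User.ts - no framework imports needed
export class User {
    constructor(private readonly id: string, private name: string) {}

    rename(newName: string, ctx: IRequestContext): void {
        if (ctx.userId !== this.id) throw new Error('Forbidden')
        this.name = newName
    }
}

// infrastructure/http/ExpressRequestContext.ts - only this layer imports Express
import { Request } from 'express'

export class ExpressRequestContext implements IRequestContext {
    constructor(private readonly req: Request) {}

    get userId(): string { return this.req.header('x-user-id') ?? '' }
    get locale(): string { return this.req.header('accept-language') ?? 'en' }
}
```
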
### 3. 🎭 Entity Exposure Detection

**Guardian:**

```typescript
// ❌ Bad: Domain entity exposed
async getUser(): Promise<User> { }

// ✅ Good: Use DTOs
async getUser(): Promise<UserDto> { }

// Guardian detects this automatically!
```

**Competitors:**

- None have this built-in
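
A minimal sketch of the DTO mapping that the "good" version implies; the entity and repository shapes below are hypothetical stand-ins, not Guardian types.

```typescript
// Hypothetical stand-ins for illustration only
interface User { id: string; name: string; passwordHash: string }
interface IUserRepository { findById(id: string): Promise<User> }

// application/dtos/UserDto.ts - the only shape allowed to cross the boundary
export interface UserDto {
    id: string
    name: string
}

// application/use-cases/GetUser.ts - maps entity to DTO at the boundary
export class GetUser {
    constructor(private readonly users: IUserRepository) {}

    async execute(id: string): Promise<UserDto> {
        const user = await this.users.findById(id) // the entity never leaves the use case
        return { id: user.id, name: user.name }    // internals such as passwordHash stay hidden
    }
}
```
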
### 4. 📚 Repository Pattern Validation

**Guardian:**

```typescript
// Detects ORM types in domain interfaces:
interface IUserRepository {
    findOne(query: { where: ... }) // ❌ Prisma-specific!
}

// Detects concrete repos in use cases:
constructor(private prisma: PrismaClient) {} // ❌ VIOLATION!
```

**Competitors:**

- None validate repository pattern
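
For contrast, a sketch of the shape Guardian is checking for: an ORM-free interface in the domain, a Prisma-specific implementation in infrastructure, and use cases that depend only on the interface. It assumes a Prisma schema with a matching `user` model; names are illustrative.

```typescript
// Illustrative sketch, several files shown in one block

// domain/repositories/IUserRepository.ts - domain-level contract, no ORM types
export interface User { id: string; email: string }

export interface IUserRepository {
    findById(id: string): Promise<User | null>
    save(user: User): Promise<void>
}

// infrastructure/repositories/PrismaUserRepository.ts - ORM details stay here
import { PrismaClient } from '@prisma/client'

export class PrismaUserRepository implements IUserRepository {
    constructor(private readonly prisma: PrismaClient) {}

    async findById(id: string): Promise<User | null> {
        return this.prisma.user.findUnique({ where: { id } })
    }

    async save(user: User): Promise<void> {
        await this.prisma.user.upsert({ where: { id: user.id }, update: user, create: user })
    }
}

// application/use-cases/RegisterUser.ts - depends on the interface, not on PrismaClient
export class RegisterUser {
    constructor(private readonly users: IUserRepository) {}
}
```
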
### 5. 🤖 AI-First Workflow

**Guardian:**

```bash
# Generate AI-friendly fix prompt
guardian check ./src --format ai-prompt > fix.txt

# Feed to Claude/GPT:
"Fix these Guardian violations: $(cat fix.txt)"

# AI fixes → Run Guardian again → Ship it!
```

**Competitors:**

- Generic output, not optimized for AI assistants

### 6. 🛠️ Auto-Fix for Architecture (v0.9+)

**Guardian:**

```bash
# Automatically extract hardcodes to constants
guardian fix ./src --auto

# Rename files to match conventions
guardian fix naming ./src --auto

# Interactive mode
guardian fix ./src --interactive
```

**Competitors:**

- ESLint has `--fix` but only for syntax
- None fix architecture violations

### 7. 🎯 DDD Pattern Detection (30+)

**Guardian Roadmap:**

- Aggregate boundaries
- Anemic domain model
- Domain events
- Value Object immutability
- CQRS violations
- Saga pattern
- Ubiquitous language
- And 23+ more DDD patterns!

**Competitors:**

- Generic architecture checks only
- No DDD-specific patterns

---

## 📈 Detailed Tool Comparisons

## vs SonarQube

### When SonarQube Wins

✅ **Multi-language projects**

```
Java + Python + TypeScript → Use SonarQube
TypeScript only → Consider Guardian
```

✅ **Security-critical applications**

```
SonarQube: SAST, SCA, OWASP Top 10, CVE detection
Guardian: Architecture only (security coming later)
```

✅ **Large enterprise with compliance**

```
SonarQube: Compliance reports, audit trails, enterprise support
Guardian: Lightweight, developer-focused
```

✅ **Existing SonarQube investment**

```
Already using SonarQube? Add Guardian for DDD-specific checks
```

### When Guardian Wins

✅ **TypeScript + AI coding workflow**

```typescript
// AI generates code → Guardian checks → AI fixes → Ship
// 10x faster than manual review
```

✅ **Clean Architecture / DDD enforcement**

```typescript
// Guardian understands DDD out-of-the-box
// SonarQube requires custom rules
```

✅ **Fast setup (< 5 minutes)**

```bash
npm install -g @samiyev/guardian
guardian check ./src
# Done! (vs hours of SonarQube setup)
```

✅ **Hardcode detection with context**

```typescript
// Guardian knows the difference between:
const port = 3000     // ❌ Should be constant
const increment = 1   // ✅ Allowed (semantic)
```

### Side-by-Side Example

**Scenario:** Detect hardcoded port in Express app

```typescript
// src/server.ts
app.listen(3000)
```

**SonarQube:**

```
❌ No violation (not a secret)
```

**Guardian:**

```
✅ Hardcode detected:
   Type: magic-number
   Value: 3000
   💡 Suggested: DEFAULT_PORT
   📁 Location: infrastructure/config/constants.ts
   🤖 AI Fix: "Extract 3000 to DEFAULT_PORT constant"
```

---

## vs dependency-cruiser

### When dependency-cruiser Wins

✅ **Visualization priority**

```bash
# Best-in-class dependency graphs
depcruise src --output-type dot | dot -T svg > graph.svg
```

✅ **Custom dependency rules**

```javascript
// Highly flexible rule system
forbidden: [
  {
    from: { path: '^src/domain' },
    to: { path: '^src/infrastructure' }
  }
]
```

✅ **Multi-framework support**

```
JS, TS, Vue, Svelte, JSX, CoffeeScript
```

### When Guardian Wins

✅ **DDD/Clean Architecture out-of-the-box**

```typescript
// Guardian knows these patterns:
// - Domain/Application/Infrastructure layers
// - Entity exposure
// - Repository pattern
// - Framework leaks

// dependency-cruiser: Write custom rules for each
```

✅ **Hardcode detection**

```typescript
// Guardian finds:
setTimeout(() => {}, 5000)  // Magic number
const url = "http://..."    // Magic string

// dependency-cruiser: Doesn't check this
```

✅ **AI workflow integration**

```bash
guardian check ./src --format ai-prompt
# Optimized for Claude/GPT

depcruise src
# Generic output
```

### Complementary Usage

**Best approach:** Use both!

```bash
# Guardian for architecture + hardcode
guardian check ./src

# dependency-cruiser for visualization
depcruise src --output-type svg > architecture.svg
```

**Coming in Guardian v0.7.0:**

```bash
# Guardian will have built-in visualization!
guardian visualize ./src --output architecture.svg
```

---

## vs ArchUnit (Java)

### When ArchUnit Wins

✅ **Java projects**

```java
// ArchUnit is built for Java
@ArchTest
void domainShouldNotDependOnInfrastructure(JavaClasses classes) {
    noClasses().that().resideInPackage("..domain..")
        .should().dependOnClassesThat().resideInPackage("..infrastructure..")
        .check(classes);
}
```

✅ **Test-based architecture validation**

```java
// Architecture rules = unit tests
// Runs in your CI with other tests
```

✅ **Mature Java ecosystem**

```
Spring Boot, Hibernate, JPA patterns
Built-in rules for layered/onion architecture
```

### When Guardian Wins

✅ **TypeScript/JavaScript projects**

```typescript
// Guardian is built for TypeScript
// ArchUnit doesn't support TS
```

✅ **AI coding workflow**

```bash
# Guardian → AI → Fix → Ship
# ArchUnit is test-based (slower feedback)
```

✅ **Zero-config DDD**

```bash
guardian check ./src
# Works immediately with DDD structure

# ArchUnit requires writing tests for each rule
```

### Philosophical Difference

**ArchUnit:**

```java
// Architecture = Tests
// You write explicit tests for each rule
```

**Guardian:**

```bash
# Architecture = Linter
# Pre-configured DDD rules out-of-the-box
```

---

## vs FTA (Fast TypeScript Analyzer)

### When FTA Wins

✅ **Complexity metrics focus**

```bash
# FTA provides:
# - Cyclomatic complexity
# - Halstead metrics
# - Line counts
# - Technical debt estimation
```

✅ **Performance (Rust-based)**

```
FTA: 1600 files/second
Guardian: ~500 files/second (Node.js)
```

✅ **Simplicity**

```bash
# FTA does one thing well: metrics
fta src/
```

### When Guardian Wins

✅ **Architecture enforcement**

```typescript
// Guardian checks:
// - Layer violations
// - Framework leaks
// - Circular dependencies
// - Repository pattern

// FTA: Only measures complexity, no architecture checks
```

✅ **Hardcode detection**

```typescript
// Guardian finds magic numbers/strings
// FTA doesn't check this
```

✅ **AI workflow**

```bash
# Guardian provides actionable suggestions
# FTA provides metrics only
```

### Complementary Usage

**Best approach:** Use both!

```bash
# Guardian for architecture
guardian check ./src

# FTA for complexity metrics
fta src/ --threshold complexity:15
```

**Coming in Guardian v0.10.0:**

```bash
# Guardian will include complexity metrics!
guardian metrics ./src --include-complexity
```

---

## vs ESLint + Plugins

### When ESLint Wins

✅ **General code quality**

```javascript
// Best for:
// - Code style
// - Common bugs
// - TypeScript errors
// - React/Vue specific rules
```

✅ **Huge ecosystem**

```bash
# 10,000+ plugins
eslint-plugin-react
eslint-plugin-vue
eslint-plugin-security
# ...and many more
```

✅ **Auto-fix for syntax**

```bash
eslint --fix
# Fixes semicolons, quotes, formatting, etc.
```

### When Guardian Wins

✅ **Architecture enforcement**

```typescript
// ESLint doesn't understand:
// - Clean Architecture layers
// - DDD patterns
// - Framework leaks
// - Entity exposure

// Guardian does!
```

✅ **Hardcode detection with context**

```typescript
// ESLint plugins check patterns
// Guardian understands semantic context
```

✅ **AI workflow integration**

```bash
# Guardian optimized for AI assistants
# ESLint generic output
```

### Complementary Usage

**Best approach:** Use both!

```bash
# ESLint for code quality
eslint src/

# Guardian for architecture
guardian check ./src
```

**Many teams run both in CI:**

```yaml
# .github/workflows/quality.yml
- name: ESLint
  run: npm run lint

- name: Guardian
  run: guardian check ./src --fail-on error
```

---

## vs import-linter (Python)

### When import-linter Wins

✅ **Python projects**

```ini
# .importlinter
[importlinter]
root_package = myproject

[importlinter:contract:1]
name = Layers contract
type = layers
layers =
    myproject.domain
    myproject.application
    myproject.infrastructure
```

✅ **Mature Python ecosystem**

```python
# Django, Flask, FastAPI integration
```

### When Guardian Wins

✅ **TypeScript/JavaScript**

```typescript
// Guardian is for TS/JS
// import-linter is Python-only
```

✅ **More than import checking**

```typescript
// Guardian checks:
// - Hardcode
// - Entity exposure
// - Repository pattern
// - Framework leaks

// import-linter: Only imports
```

### Future Integration

**Guardian v2.0+ (Planned):**

```bash
# Multi-language support coming
guardian check ./python-src --language python
guardian check ./ts-src --language typescript
```

---

## 💰 Cost Comparison

| Tool | Free Tier | Paid Plans | Enterprise |
|------|-----------|------------|------------|
| **Guardian** | ✅ MIT License (100% free) | - | - |
| **SonarQube** | ✅ Community Edition | Developer: $150/yr | Custom pricing |
| **dependency-cruiser** | ✅ MIT License | - | - |
| **ArchUnit** | ✅ Apache 2.0 | - | - |
| **FTA** | ✅ Open Source | - | - |
| **ESLint** | ✅ MIT License | - | - |

**Guardian will always be free and open-source (MIT License)**

---

## 🚀 Setup Time Comparison

| Tool | Setup Time | Configuration Required |
|------|------------|------------------------|
| **Guardian** | ⚡ 2 minutes | ❌ Zero-config (DDD) |
| **SonarQube** | 🐌 2-4 hours | ✅ Extensive setup |
| **dependency-cruiser** | ⚡ 5 minutes | ⚠️ Rules configuration |
| **ArchUnit** | ⚙️ 30 minutes | ✅ Write test rules |
| **FTA** | ⚡ 1 minute | ❌ Zero-config |
| **ESLint** | ⚡ 10 minutes | ⚠️ Plugin configuration |

**Guardian Setup:**

```bash
# 1. Install (30 seconds)
npm install -g @samiyev/guardian

# 2. Run (90 seconds)
cd your-project
guardian check ./src

# Done! 🎉
```

---

## 📊 Real-World Performance

### Analysis Speed (1000 TypeScript files)

| Tool | Time | Notes |
|------|------|-------|
| **FTA** | ~0.6s | ⚡ Fastest (Rust) |
| **Guardian** | ~2s | Fast (Node.js, tree-sitter) |
| **dependency-cruiser** | ~3s | Fast |
| **ESLint** | ~5s | Depends on rules |
| **SonarQube** | ~15s | Slower (comprehensive) |

### Memory Usage

| Tool | RAM | Notes |
|------|-----|-------|
| **Guardian** | ~150MB | Efficient |
| **FTA** | ~50MB | Minimal (Rust) |
| **dependency-cruiser** | ~200MB | Moderate |
| **ESLint** | ~300MB | Varies by plugins |
| **SonarQube** | ~2GB | Heavy (server) |

---

## 🎯 Use Case Recommendations

### Scenario 1: TypeScript Startup Using AI Coding

**Best Stack:**

```bash
✅ Guardian (architecture + hardcode)
✅ ESLint (code quality)
✅ Prettier (formatting)
```

**Why:**

- Fast setup
- AI workflow integration
- Zero-config DDD
- Catches AI mistakes (hardcode)

### Scenario 2: Enterprise Multi-Language

**Best Stack:**

```bash
✅ SonarQube (security + multi-language)
✅ Guardian (TypeScript DDD specialization)
✅ ArchUnit (Java architecture)
```

**Why:**

- Comprehensive coverage
- Security scanning
- Language-specific depth

### Scenario 3: Clean Architecture Refactoring

**Best Stack:**

```bash
✅ Guardian (architecture enforcement)
✅ dependency-cruiser (visualization)
✅ Guardian v0.9+ (auto-fix)
```

**Why:**

- Visualize current state
- Detect violations
- Auto-fix issues

### Scenario 4: Python + TypeScript Monorepo

**Best Stack:**

```bash
✅ Guardian (TypeScript)
✅ import-linter (Python)
✅ SonarQube (security, both languages)
```

**Why:**

- Language-specific depth
- Unified security scanning

---

## 🏆 Winner by Category

| Category | Winner | Runner-up |
|----------|--------|-----------|
| **TypeScript Architecture** | 🥇 Guardian | dependency-cruiser |
| **Multi-Language** | 🥇 SonarQube | - |
| **Visualization** | 🥇 dependency-cruiser | SonarQube |
| **AI Workflow** | 🥇 Guardian | - (no competitor) |
| **Security** | 🥇 SonarQube | - |
| **Hardcode Detection** | 🥇 Guardian | - (no competitor) |
| **DDD Patterns** | 🥇 Guardian | ArchUnit (Java) |
| **Auto-Fix** | 🥇 ESLint (syntax) | Guardian v0.9+ (architecture) |
| **Complexity Metrics** | 🥇 FTA | SonarQube |
| **Setup Speed** | 🥇 FTA | Guardian |

---

## 🔮 Future Roadmap Comparison

### Guardian v1.0.0 (Q4 2026)

- ✅ Configuration & presets (v0.6)
- ✅ Visualization (v0.7)
- ✅ CI/CD kit (v0.8)
- ✅ Auto-fix (v0.9) **UNIQUE!**
- ✅ Metrics dashboard (v0.10)
- ✅ 30+ DDD patterns (v0.11-v0.32)
- ✅ VS Code extension
- ✅ JetBrains plugin

### Competitors

- **SonarQube**: Incremental improvements, AI-powered fixes (experimental)
- **dependency-cruiser**: Stable, no major changes planned
- **ArchUnit**: Java focus, incremental improvements
- **FTA**: Adding more metrics
- **ESLint**: Flat config, performance improvements

**Guardian's Advantage:** Only tool actively expanding DDD/architecture detection

---

## 💡 Migration Guides

### From SonarQube to Guardian

**When to migrate:**

- TypeScript-only project
- Want faster iteration
- Need DDD-specific checks
- Don't need multi-language/security

**How to migrate:**

```bash
# Keep SonarQube for security
# Add Guardian for architecture
npm install -g @samiyev/guardian
guardian check ./src

# CI/CD: Run both
# SonarQube (security) → Guardian (architecture)
```

### From ESLint-only to ESLint + Guardian

**Why add Guardian:**

```typescript
// ESLint checks syntax
// Guardian checks architecture
```

**How to add:**

```bash
# Keep ESLint
npm run lint

# Add Guardian
guardian check ./src

# Both in CI:
npm run lint && guardian check ./src
```

### From dependency-cruiser to Guardian

**Why migrate:**

- Need more than circular deps
- Want hardcode detection
- Need DDD patterns
- Want auto-fix (v0.9+)

**How to migrate:**

```bash
# Replace:
depcruise src --config .dependency-cruiser.js

# With:
guardian check ./src

# Or keep both:
# dependency-cruiser → visualization
# Guardian → architecture + hardcode
```

---

## 📚 Additional Resources

### Guardian

- [GitHub Repository](https://github.com/samiyev/puaros)
- [Documentation](https://puaros.ailabs.uz)
- [npm Package](https://www.npmjs.com/package/@samiyev/guardian)

### Competitors

- [SonarQube](https://www.sonarsource.com/products/sonarqube/)
- [dependency-cruiser](https://github.com/sverweij/dependency-cruiser)
- [ArchUnit](https://www.archunit.org/)
- [FTA](https://ftaproject.dev/)
- [import-linter](https://import-linter.readthedocs.io/)

---

## 🤝 Community & Support

| Tool | Community | Support |
|------|-----------|---------|
| **Guardian** | GitHub Issues | Community (planned: Discord) |
| **SonarQube** | Community Forum | Commercial support available |
| **dependency-cruiser** | GitHub Issues | Community |
| **ArchUnit** | GitHub Issues | Community |
| **ESLint** | Discord, Twitter | Community |

---

**Guardian's Position in the Market:**

> **"The AI-First Architecture Guardian for TypeScript teams practicing DDD/Clean Architecture"**

**Guardian is NOT:**

- ❌ A replacement for SonarQube's security scanning
- ❌ A replacement for ESLint's code quality checks
- ❌ A multi-language tool (yet)

**Guardian IS:**

- ✅ The best tool for TypeScript DDD/Clean Architecture
- ✅ The only tool optimized for AI-assisted coding
- ✅ The only tool with intelligent hardcode detection
- ✅ The only tool with auto-fix for architecture (v0.9+)

---

**Questions? Feedback?**

- 📧 Email: fozilbek.samiyev@gmail.com
- 🐙 GitHub: https://github.com/samiyev/puaros/issues
- 🌐 Website: https://puaros.ailabs.uz

323 packages/guardian/COMPETITIVE_ANALYSIS_SUMMARY.md (new file)
@@ -0,0 +1,323 @@

# Competitive Analysis & Roadmap - Summary

**Date:** 2025-01-24
**Prepared for:** Puaros Guardian
**Documents Created:**

1. ROADMAP_NEW.md - Updated roadmap with reprioritized features
2. COMPARISON.md - Comprehensive competitor comparison
3. docs/v0.6.0-CONFIGURATION-SPEC.md - Configuration feature specification

---

## 🎯 Executive Summary

Guardian has **5 unique features** that no competitor offers, positioning it as the **only tool built for AI-assisted DDD/Clean Architecture development**. However, to achieve enterprise adoption, we need to first match competitors' baseline features (configuration, visualization, CI/CD, metrics).

### Current Position (v0.5.1)

**Strengths:**

- ✅ Hardcode detection with AI suggestions (UNIQUE)
- ✅ Framework leak detection (UNIQUE)
- ✅ Entity exposure detection (UNIQUE)
- ✅ Repository pattern validation (UNIQUE)
- ✅ DDD-specific naming conventions (UNIQUE)

**Gaps:**

- ❌ No configuration file support
- ❌ No visualization/graphs
- ❌ No ready-to-use CI/CD templates
- ❌ No metrics/quality score
- ❌ No auto-fix capabilities

---

## 📊 Competitive Landscape

### Main Competitors

| Tool | Strength | Weakness | Market Position |
|------|----------|----------|-----------------|
| **SonarQube** | Multi-language + Security | Complex setup, expensive | Enterprise leader |
| **dependency-cruiser** | Best visualization | No hardcode/DDD | Dependency specialist |
| **ArchUnit** | Java architecture | Java-only | Java ecosystem |
| **FTA** | Fast metrics (Rust) | No architecture checks | Metrics tool |
| **ESLint** | Huge ecosystem | No architecture | Code quality standard |

### Guardian's Unique Position

> **"The AI-First Architecture Guardian for TypeScript teams practicing DDD/Clean Architecture"**

**Market Gap Filled:**

- No tool optimizes for AI-assisted coding workflow
- No tool deeply understands DDD patterns (except ArchUnit for Java)
- No tool combines hardcode detection + architecture enforcement

---

## 🚀 Strategic Roadmap

### Phase 1: Market Parity (v0.6-v0.10) - Q1-Q2 2026

**Goal:** Match competitors' baseline features

| Version | Feature | Why Critical | Competitor |
|---------|---------|--------------|------------|
| v0.6.0 | Configuration & Presets | All competitors have this | ESLint, SonarQube |
| v0.7.0 | Visualization | dependency-cruiser's main advantage | dependency-cruiser |
| v0.8.0 | CI/CD Integration Kit | Enterprise requirement | SonarQube |
| v0.9.0 | **Auto-Fix (UNIQUE!)** | Game-changer, no one has this | None |
| v0.10.0 | Metrics & Quality Score | Enterprise adoption | SonarQube |

**After v0.10.0:** Guardian competes with SonarQube/dependency-cruiser on features

### Phase 2: DDD Specialization (v0.11-v0.32) - Q3-Q4 2026

**Goal:** Deepen DDD/Clean Architecture expertise

30+ DDD pattern detectors:

- Aggregate boundaries
- Anemic domain model
- Domain events
- Value Object immutability
- CQRS validation
- Saga pattern
- Anti-Corruption Layer
- Ubiquitous Language
- And 22+ more...

**After Phase 2:** Guardian = THE tool for DDD/Clean Architecture

### Phase 3: Enterprise Ecosystem (v1.0+) - Q4 2026+

**Goal:** Full enterprise platform

- VS Code extension
- JetBrains plugin
- Web dashboard
- Team analytics
- Multi-language support (Python, C#, Java)

---

## 🔥 Critical Changes to Current Roadmap

### Old Roadmap Issues

❌ **v0.6.0 was "Aggregate Boundaries"** → Too early for DDD-specific features
❌ **v0.12.0 was "Configuration"** → Way too late! Critical feature postponed
❌ **Missing:** Visualization, CI/CD, Auto-fix, Metrics
❌ **Too many consecutive DDD features** → Need market parity first

### New Roadmap Priorities

✅ **v0.6.0 = Configuration (MOVED UP)** → Critical for adoption
✅ **v0.7.0 = Visualization (NEW)** → Compete with dependency-cruiser
✅ **v0.8.0 = CI/CD Kit (NEW)** → Enterprise requirement
✅ **v0.9.0 = Auto-Fix (NEW, UNIQUE!)** → Game-changing differentiator
✅ **v0.10.0 = Metrics (NEW)** → Compete with SonarQube
✅ **v0.11+ = DDD Features** → After market parity

---

## 💡 Key Recommendations

### Immediate Actions (Next 2 Weeks)

1. **Review & Approve New Roadmap**
    - Read ROADMAP_NEW.md
    - Approve priority changes
    - Create GitHub milestones

2. **Start v0.6.0 Configuration**
    - Read v0.6.0-CONFIGURATION-SPEC.md
    - Create implementation tasks
    - Start Phase 1 development

3. **Update Documentation**
    - Update main README.md with comparison table
    - Add "Guardian vs Competitors" section
    - Link to COMPARISON.md

### Next 3 Months (Q1 2026)

4. **Complete v0.6.0 (Configuration)**
    - 8-week timeline
    - Beta test with community
    - Stable release

5. **Start v0.7.0 (Visualization)**
    - Design graph system
    - Choose visualization library
    - Prototype SVG/Mermaid output

6. **Marketing & Positioning**
    - Create comparison blog post
    - Submit to Product Hunt
    - Share on Reddit/HackerNews

### Next 6 Months (Q1-Q2 2026)

7. **Complete Market Parity (v0.6-v0.10)**
    - Configuration ✅
    - Visualization ✅
    - CI/CD Integration ✅
    - Auto-Fix ✅ (UNIQUE!)
    - Metrics ✅

8. **Community Growth**
    - 1000+ GitHub stars
    - 100+ weekly npm installs
    - 10+ enterprise adopters

---

## 📈 Success Metrics

### v0.10.0 (Market Parity Achieved) - June 2026

**Feature Parity:**

- ✅ Configuration support (compete with ESLint)
- ✅ Visualization (compete with dependency-cruiser)
- ✅ CI/CD integration (compete with SonarQube)
- ✅ Auto-fix (UNIQUE! Game-changer)
- ✅ Metrics dashboard (compete with SonarQube)

**Adoption Metrics:**

- 1,000+ GitHub stars
- 100+ weekly npm installs
- 50+ projects with guardian.config.js
- 10+ enterprise teams

### v1.0.0 (Enterprise Ready) - December 2026

**Feature Completeness:**

- ✅ All baseline features
- ✅ 30+ DDD pattern detectors
- ✅ IDE extensions (VS Code, JetBrains)
- ✅ Web dashboard
- ✅ Team analytics

**Market Position:**

- #1 tool for TypeScript DDD/Clean Architecture
- Top 3 in static analysis for TypeScript
- Known in enterprise as "the AI code reviewer"

---

## 🎯 Positioning Strategy

### Target Segments

1. **Primary:** TypeScript developers using AI coding assistants (GitHub Copilot, Cursor, Windsurf, Claude, ChatGPT, Cline)
2. **Secondary:** Teams implementing DDD/Clean Architecture
3. **Tertiary:** Startups/scale-ups needing fast quality enforcement

### Messaging

**Tagline:** "The AI-First Architecture Guardian"

**Key Messages:**

- "Catches the #1 AI mistake: hardcoded values everywhere"
- "Enforces Clean Architecture that AI often ignores"
- "Closes the AI feedback loop for cleaner code"
- "The only tool with auto-fix for architecture" (v0.9+)

### Differentiation

**Guardian ≠ SonarQube:** We're specialized for TypeScript DDD, not multi-language security
**Guardian ≠ dependency-cruiser:** We detect patterns, not just dependencies
**Guardian ≠ ESLint:** We enforce architecture, not syntax

**Guardian = ESLint for architecture + AI code reviewer**

---

## 📚 Document Guide

### ROADMAP_NEW.md

**Purpose:** Complete technical roadmap with reprioritized features
**Audience:** Development team, contributors
**Key Sections:**

- Current state analysis
- Phase 1: Market Parity (v0.6-v0.10)
- Phase 2: DDD Specialization (v0.11-v0.32)
- Phase 3: Enterprise Ecosystem (v1.0+)

### COMPARISON.md

**Purpose:** Marketing-focused comparison with all competitors
**Audience:** Users, potential adopters, marketing
**Key Sections:**

- Feature comparison matrix
- Detailed tool comparisons
- When to use each tool
- Use case recommendations
- Winner by category

### v0.6.0-CONFIGURATION-SPEC.md

**Purpose:** Technical specification for Configuration feature
**Audience:** Development team
**Key Sections:**

- Configuration file format
- Preset system design
- Rule configuration
- Implementation plan (8 weeks)
- Testing strategy
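
For orientation, a purely hypothetical sketch of what a Guardian configuration file could look like; the actual file format, preset names, and rule identifiers are defined in the specification above and are not reproduced here.

```typescript
// guardian.config.ts - hypothetical example, not the final format
export default {
    preset: 'ddd',                    // assumed preset name
    rules: {
        'hardcode': 'error',          // assumed rule identifiers
        'framework-leak': 'error',
        'entity-exposure': 'warn',
        'repository-pattern': 'error',
    },
    ignore: ['**/*.spec.ts', 'dist/**'],
}
```
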
---

## 🎬 Next Steps

### Week 1-2: Planning & Kickoff

- [ ] Review all three documents
- [ ] Approve new roadmap priorities
- [ ] Create GitHub milestones for v0.6.0-v0.10.0
- [ ] Create implementation issues for v0.6.0
- [ ] Update main README.md with comparison table

### Week 3-10: v0.6.0 Development

- [ ] Phase 1: Core Configuration (Week 3-4)
- [ ] Phase 2: Rule Configuration (Week 4-5)
- [ ] Phase 3: Preset System (Week 5-6)
- [ ] Phase 4: Ignore Patterns (Week 6-7)
- [ ] Phase 5: CLI Integration (Week 7-8)
- [ ] Phase 6: Documentation (Week 8-9)
- [ ] Phase 7: Beta & Release (Week 9-10)

### Post-v0.6.0

- [ ] Start v0.7.0 (Visualization) planning
- [ ] Marketing push (blog, Product Hunt, etc.)
- [ ] Community feedback gathering

---

## ❓ Questions?

**For technical questions:**

- Email: fozilbek.samiyev@gmail.com
- GitHub Issues: https://github.com/samiyev/puaros/issues

**For strategic decisions:**

- Review sessions: Schedule with team
- Roadmap adjustments: Create GitHub discussion

---

## 📝 Changelog

**2025-01-24:** Initial competitive analysis and roadmap revision

- Created comprehensive competitor comparison
- Reprioritized roadmap (Configuration moved to v0.6.0)
- Added market parity phase (v0.6-v0.10)
- Designed v0.6.0 Configuration specification

---

**Status:** ✅ Analysis complete, ready for implementation

**Confidence Level:** HIGH - Analysis based on thorough competitor research and market positioning

@@ -2,9 +2,9 @@

This document outlines the current features and future plans for @puaros/guardian.

## Current Version: 0.6.0 ✅ RELEASED
## Current Version: 0.7.5 ✅ RELEASED

**Released:** 2025-11-24
**Released:** 2025-11-25

### Features Included in 0.1.0

@@ -301,7 +301,223 @@ class Order {

---

### Version 0.8.0 - Anemic Domain Model Detection 🩺
### Version 0.7.5 - Refactor AnalyzeProject Use-Case 🔧 ✅ RELEASED

**Released:** 2025-11-25
**Priority:** HIGH
**Scope:** Single session (~128K tokens)

Split `AnalyzeProject.ts` (615 lines) into focused pipeline components.

**Problem:**

- God Use-Case with 615 lines
- Mixing: file scanning, parsing, detection, aggregation
- Hard to test and modify individual steps

**Solution:**

```
application/use-cases/
├── AnalyzeProject.ts              # Orchestrator (245 lines)
├── pipeline/
│   ├── FileCollectionStep.ts      # File scanning (66 lines)
│   ├── ParsingStep.ts             # AST + dependency graph (51 lines)
│   ├── DetectionPipeline.ts       # All 7 detectors (371 lines)
│   └── ResultAggregator.ts        # Build response DTO (81 lines)
```

**Deliverables:**

- ✅ Extract 4 pipeline components
- ✅ Reduce `AnalyzeProject.ts` from 615 to 245 lines (60% reduction)
- ✅ All 345 tests pass, no breaking changes
- ✅ Publish to npm
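
A simplified sketch of how the refactored orchestrator can wire these steps together; the method names and DTO shapes below are illustrative assumptions, not copied from the actual source.

```typescript
// Illustrative shapes standing in for the real pipeline types
interface SourceFileInfo { path: string }
interface DependencyGraph {}
interface Violation { severity: string; message: string }
interface AnalyzeProjectRequest { projectPath: string }
interface AnalyzeProjectResponse { violations: Violation[] }

interface FileCollectionStep { run(projectPath: string): Promise<SourceFileInfo[]> }
interface ParsingStep { run(files: SourceFileInfo[]): Promise<DependencyGraph> }
interface DetectionPipeline { run(files: SourceFileInfo[], graph: DependencyGraph): Promise<Violation[]> }
interface ResultAggregator { build(files: SourceFileInfo[], violations: Violation[]): AnalyzeProjectResponse }

export class AnalyzeProject {
    constructor(
        private readonly fileCollection: FileCollectionStep,
        private readonly parsing: ParsingStep,
        private readonly detection: DetectionPipeline,
        private readonly aggregator: ResultAggregator,
    ) {}

    async execute(request: AnalyzeProjectRequest): Promise<AnalyzeProjectResponse> {
        const files = await this.fileCollection.run(request.projectPath) // FileCollectionStep: scan + basic parsing
        const graph = await this.parsing.run(files)                      // ParsingStep: AST + dependency graph
        const violations = await this.detection.run(files, graph)        // DetectionPipeline: all 7 detectors
        return this.aggregator.build(files, violations)                  // ResultAggregator: response DTO
    }
}
```
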
---

### Version 0.7.6 - Refactor CLI Module 🔧 ✅ RELEASED

**Released:** 2025-11-25
**Priority:** MEDIUM
**Scope:** Single session (~128K tokens)

Split `cli/index.ts` (484 lines) into focused formatters.

**Problem:**

- CLI file has 484 lines
- Mixing: command setup, formatting, grouping, statistics

**Solution:**

```
cli/
├── index.ts                       # Commands only (260 lines)
├── formatters/
│   ├── OutputFormatter.ts         # Violation formatting (190 lines)
│   └── StatisticsFormatter.ts     # Metrics & summary (58 lines)
├── groupers/
│   └── ViolationGrouper.ts        # Sorting & grouping (29 lines)
```

**Deliverables:**

- ✅ Extract formatters and groupers
- ✅ Reduce `cli/index.ts` from 484 to 260 lines (46% reduction)
- ✅ CLI output identical to before
- ✅ All 345 tests pass, no breaking changes
- [ ] Publish to npm

---

### Version 0.7.7 - Improve Test Coverage 🧪

**Priority:** MEDIUM
**Scope:** Single session (~128K tokens)

Increase coverage for under-tested domain files.

**Current State:**

| File | Coverage |
|------|----------|
| SourceFile.ts | 46% |
| ProjectPath.ts | 50% |
| ValueObject.ts | 25% |
| RepositoryViolation.ts | 58% |

**Deliverables:**

- [ ] SourceFile.ts → 80%+
- [ ] ProjectPath.ts → 80%+
- [ ] ValueObject.ts → 80%+
- [ ] RepositoryViolation.ts → 80%+
- [ ] Publish to npm
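
As an illustration of the kind of unit test that lifts coverage on the value-object base class, here is a hedged sketch; the `ValueObject` stand-in, the `Email` example, and the use of vitest are assumptions, not the project's actual code or test runner.

```typescript
// value-object.spec.ts - illustrative only; the real ValueObject API may differ
import { describe, it, expect } from 'vitest'

// Minimal stand-in for the base class under test
abstract class ValueObject<T> {
    constructor(protected readonly props: T) {}
    equals(other?: ValueObject<T>): boolean {
        if (!other) return false
        return JSON.stringify(this.props) === JSON.stringify(other.props)
    }
}

class Email extends ValueObject<{ value: string }> {
    static create(value: string): Email {
        if (!value.includes('@')) throw new Error('Invalid email')
        return new Email({ value })
    }
}

describe('ValueObject', () => {
    it('treats two value objects with the same props as equal', () => {
        expect(Email.create('a@b.c').equals(Email.create('a@b.c'))).toBe(true)
    })

    it('treats different props and missing values as not equal', () => {
        expect(Email.create('a@b.c').equals(Email.create('x@y.z'))).toBe(false)
        expect(Email.create('a@b.c').equals(undefined)).toBe(false)
    })

    it('rejects invalid input in the factory', () => {
        expect(() => Email.create('not-an-email')).toThrow('Invalid email')
    })
})
```
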
---

### Version 0.7.8 - Add E2E Tests 🧪

**Priority:** MEDIUM
**Scope:** Single session (~128K tokens)

Add integration tests for full pipeline and CLI.

**Deliverables:**

- [ ] E2E test: `AnalyzeProject` full pipeline
- [ ] CLI smoke test (spawn process, check output)
- [ ] Test `examples/good-architecture/` → 0 violations
- [ ] Test `examples/bad/` → specific violations
- [ ] Test JSON output format
- [ ] Publish to npm

---

### Version 0.7.9 - Refactor Large Detectors 🔧 (Optional)

**Priority:** LOW
**Scope:** Single session (~128K tokens)

Refactor largest detectors to reduce complexity.

**Targets:**

| Detector | Lines | Complexity |
|----------|-------|------------|
| RepositoryPatternDetector | 479 | 35 |
| HardcodeDetector | 459 | 41 |
| AggregateBoundaryDetector | 381 | 47 |

**Deliverables:**

- [ ] Extract regex patterns into strategies
- [ ] Reduce cyclomatic complexity < 25
- [ ] Publish to npm
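
A sketch of what "extract regex patterns into strategies" could look like; the class and pattern names are illustrative, not the actual detector internals.

```typescript
// Illustrative strategy extraction, not the real HardcodeDetector code
interface HardcodeMatch { value: string; index: number; kind: string }

interface HardcodePattern {
    readonly kind: string
    findAll(line: string): HardcodeMatch[]
}

class MagicNumberPattern implements HardcodePattern {
    readonly kind = 'magic-number'
    private readonly regex = /\b\d{2,}\b/g

    findAll(line: string): HardcodeMatch[] {
        return [...line.matchAll(this.regex)].map((m) => ({
            value: m[0], index: m.index ?? 0, kind: this.kind,
        }))
    }
}

class MagicStringPattern implements HardcodePattern {
    readonly kind = 'magic-string'
    private readonly regex = /(["'`])(?:\\.|(?!\1).){4,}\1/g

    findAll(line: string): HardcodeMatch[] {
        return [...line.matchAll(this.regex)].map((m) => ({
            value: m[0], index: m.index ?? 0, kind: this.kind,
        }))
    }
}

// The scanner iterates strategies instead of one large method,
// which keeps each branch small and individually testable.
class HardcodeScanner {
    constructor(
        private readonly patterns: HardcodePattern[] = [new MagicNumberPattern(), new MagicStringPattern()],
    ) {}

    scan(line: string): HardcodeMatch[] {
        return this.patterns.flatMap((p) => p.findAll(line))
    }
}
```
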
---

### Version 0.8.0 - Secret Detection 🔐

**Target:** Q1 2025
**Priority:** CRITICAL

Detect hardcoded secrets (API keys, tokens, credentials) using industry-standard Secretlint library.

**🎯 SecretDetector - NEW standalone detector:**

```typescript
// ❌ CRITICAL: Hardcoded AWS credentials
const AWS_KEY = "AKIA1234567890ABCDEF" // VIOLATION!
const AWS_SECRET = "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY" // VIOLATION!

// ❌ CRITICAL: Hardcoded GitHub token
const GITHUB_TOKEN = "ghp_1234567890abcdefghijklmnopqrstuv" // VIOLATION!

// ❌ CRITICAL: SSH Private Key in code
const privateKey = `-----BEGIN RSA PRIVATE KEY-----
MIIEpAIBAAKCAQEA...` // VIOLATION!

// ❌ CRITICAL: NPM token
//registry.npmjs.org/:_authToken=npm_abc123xyz // VIOLATION!

// ✅ GOOD: Use environment variables
const AWS_KEY = process.env.AWS_ACCESS_KEY_ID
const AWS_SECRET = process.env.AWS_SECRET_ACCESS_KEY
const GITHUB_TOKEN = process.env.GITHUB_TOKEN
```

**Planned Features:**

- ✅ **SecretDetector** - Standalone detector (separate from HardcodeDetector)
- ✅ **Secretlint Integration** - Industry-standard library (@secretlint/node)
- ✅ **350+ Secret Patterns** - AWS, GitHub, NPM, SSH, GCP, Slack, Basic Auth, etc.
- ✅ **CRITICAL Severity** - All secret violations marked as critical
- ✅ **Smart Suggestions** - Context-aware remediation per secret type
- ✅ **Clean Architecture** - New ISecretDetector interface, SecretViolation value object
- ✅ **CLI Integration** - New "🔐 Secrets" section in output
- ✅ **Parallel Execution** - Runs alongside existing detectors

**Secret Types Detected:**

- AWS Access Keys & Secret Keys
- GitHub Tokens (ghp_, github_pat_, gho_, etc.)
- NPM tokens in .npmrc and code
- SSH Private Keys
- GCP Service Account credentials
- Slack tokens (xoxb-, xoxp-, etc.)
- Basic Auth credentials
- JWT tokens
- Private encryption keys

**Architecture:**

```typescript
// New domain layer
interface ISecretDetector {
    detectAll(code: string, filePath: string): Promise<SecretViolation[]>
}

class SecretViolation {
    file: string
    line: number
    secretType: string // AWS, GitHub, NPM, etc.
    message: string
    severity: "critical"
    suggestion: string // Context-aware guidance
}

// New infrastructure implementation
class SecretDetector implements ISecretDetector {
    // Uses @secretlint/node internally
}
```
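
To picture the "Parallel Execution" point above, here is a hypothetical sketch of how the pipeline could fan out to all detectors, including the new SecretDetector; the wiring is an assumption, not the actual DetectionPipeline code.

```typescript
// Hypothetical wiring inside the detection pipeline; shapes are illustrative
interface Violation { severity: string; message: string; file: string; line: number }
interface IDetector { detectAll(code: string, filePath: string): Promise<Violation[]> }

async function runDetectors(
    detectors: IDetector[],   // existing detectors plus the new SecretDetector
    code: string,
    filePath: string,
): Promise<Violation[]> {
    // All detectors run concurrently on the same file contents
    const results = await Promise.all(detectors.map((d) => d.detectAll(code, filePath)))
    return results.flat()
}
```
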
**Why Secretlint?**
|
||||||
|
- ✅ Actively maintained (updates weekly)
|
||||||
|
- ✅ TypeScript native
|
||||||
|
- ✅ Pluggable architecture
|
||||||
|
- ✅ Low false positives
|
||||||
|
- ✅ Industry standard
|
||||||
|
|
||||||
|
**Why NOT custom implementation?**
|
||||||
|
- ❌ No good npm library for magic numbers/strings
|
||||||
|
- ❌ Our HardcodeDetector is better than existing solutions
|
||||||
|
- ✅ Secretlint is perfect for secrets (don't reinvent the wheel)
|
||||||
|
- ✅ Two focused detectors better than one bloated detector
|
||||||
|
|
||||||
|
**Impact:**
|
||||||
|
Guardian will now catch critical security issues BEFORE they reach production, complementing existing magic number/string detection.
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
### Version 0.9.0 - Anemic Domain Model Detection 🩺
|
||||||
**Target:** Q2 2026
|
**Target:** Q2 2026
|
||||||
**Priority:** MEDIUM
|
**Priority:** MEDIUM
|
||||||
|
|
||||||
@@ -342,7 +558,7 @@ class Order {
---
### Version 0.8.0 - Domain Event Usage Validation 📢
### Version 0.10.0 - Domain Event Usage Validation 📢
**Target:** Q2 2026
**Priority:** MEDIUM

@@ -381,7 +597,7 @@ class Order {
---
### Version 0.9.0 - Value Object Immutability Check 🔐
### Version 0.11.0 - Value Object Immutability Check 🔐
**Target:** Q2 2026
**Priority:** MEDIUM

@@ -424,7 +640,7 @@ class Email {
---
### Version 0.10.0 - Use Case Single Responsibility 🎯
### Version 0.12.0 - Use Case Single Responsibility 🎯
**Target:** Q2 2026
**Priority:** LOW

@@ -461,7 +677,7 @@ class SendWelcomeEmail {
---
### Version 0.11.0 - Interface Segregation Validation 🔌
### Version 0.13.0 - Interface Segregation Validation 🔌
**Target:** Q2 2026
**Priority:** LOW

@@ -506,7 +722,7 @@ interface IUserExporter {
---
### Version 0.12.0 - Port-Adapter Pattern Validation 🔌
### Version 0.14.0 - Port-Adapter Pattern Validation 🔌
**Target:** Q2 2026
**Priority:** MEDIUM

@@ -545,7 +761,7 @@ class TwilioAdapter implements INotificationPort {
---
### Version 0.13.0 - Configuration File Support ⚙️
### Version 0.15.0 - Configuration File Support ⚙️
**Target:** Q3 2026
**Priority:** MEDIUM

@@ -596,7 +812,7 @@ export default {
---
### Version 0.14.0 - Command Query Separation (CQS/CQRS) 📝
### Version 0.16.0 - Command Query Separation (CQS/CQRS) 📝
**Target:** Q3 2026
**Priority:** MEDIUM

@@ -657,7 +873,7 @@ class GetUser { // Query
---
### Version 0.15.0 - Factory Pattern Validation 🏭
### Version 0.17.0 - Factory Pattern Validation 🏭
**Target:** Q3 2026
**Priority:** LOW

@@ -740,7 +956,7 @@ class Order {
---
### Version 0.16.0 - Specification Pattern Detection 🔍
### Version 0.18.0 - Specification Pattern Detection 🔍
**Target:** Q3 2026
**Priority:** MEDIUM

@@ -812,7 +1028,7 @@ class ApproveOrder {
---
### Version 0.17.0 - Layered Service Anti-pattern Detection ⚠️
### Version 0.19.0 - Layered Service Anti-pattern Detection ⚠️
**Target:** Q3 2026
**Priority:** MEDIUM

@@ -889,7 +1105,7 @@ class OrderService {
---
### Version 0.18.0 - Bounded Context Leak Detection 🚧
### Version 0.20.0 - Bounded Context Leak Detection 🚧
**Target:** Q3 2026
**Priority:** LOW

@@ -954,7 +1170,7 @@ class ProductPriceChangedHandler {
---
### Version 0.19.0 - Transaction Script vs Domain Model Detection 📜
### Version 0.21.0 - Transaction Script vs Domain Model Detection 📜
**Target:** Q3 2026
**Priority:** LOW

@@ -1021,7 +1237,7 @@ class Order {
---
### Version 0.20.0 - Persistence Ignorance Validation 💾
### Version 0.22.0 - Persistence Ignorance Validation 💾
**Target:** Q3 2026
**Priority:** MEDIUM

@@ -1107,7 +1323,7 @@ class UserEntityMapper {
---
### Version 0.21.0 - Null Object Pattern Detection 🎭
### Version 0.23.0 - Null Object Pattern Detection 🎭
**Target:** Q3 2026
**Priority:** LOW

@@ -1189,7 +1405,7 @@ class ProcessOrder {
---
### Version 0.22.0 - Primitive Obsession in Methods 🔢
### Version 0.24.0 - Primitive Obsession in Methods 🔢
**Target:** Q3 2026
**Priority:** MEDIUM

@@ -1256,7 +1472,7 @@ class Order {
---
### Version 0.23.0 - Service Locator Anti-pattern 🔍
### Version 0.25.0 - Service Locator Anti-pattern 🔍
**Target:** Q4 2026
**Priority:** MEDIUM

@@ -1316,7 +1532,7 @@ class CreateUser {
---
### Version 0.24.0 - Double Dispatch Pattern Validation 🎯
### Version 0.26.0 - Double Dispatch Pattern Validation 🎯
**Target:** Q4 2026
**Priority:** LOW

@@ -1393,7 +1609,7 @@ class ShippingCostCalculator implements IOrderItemVisitor {
---
### Version 0.25.0 - Entity Identity Validation 🆔
### Version 0.27.0 - Entity Identity Validation 🆔
**Target:** Q4 2026
**Priority:** MEDIUM

@@ -1486,7 +1702,7 @@ class UserId {
---
### Version 0.26.0 - Saga Pattern Detection 🔄
### Version 0.28.0 - Saga Pattern Detection 🔄
**Target:** Q4 2026
**Priority:** LOW

@@ -1584,7 +1800,7 @@ abstract class SagaStep {
---
### Version 0.27.0 - Anti-Corruption Layer Detection 🛡️
### Version 0.29.0 - Anti-Corruption Layer Detection 🛡️
**Target:** Q4 2026
**Priority:** MEDIUM

@@ -1670,7 +1886,7 @@ interface IOrderSyncPort {
---
### Version 0.28.0 - Ubiquitous Language Validation 📖
### Version 0.30.0 - Ubiquitous Language Validation 📖
**Target:** Q4 2026
**Priority:** HIGH

@@ -1857,5 +2073,5 @@ Until we reach 1.0.0, minor version bumps (0.x.0) may include breaking changes a
---
**Last Updated:** 2025-11-24
**Last Updated:** 2025-11-25
**Current Version:** 0.6.0
**Current Version:** 0.7.4
packages/guardian/ROADMAP_NEW.md (906 lines, new file)
@@ -0,0 +1,906 @@
# Guardian Roadmap 🗺️

**Last Updated:** 2025-01-24
**Current Version:** 0.5.1

This document outlines the current features and strategic roadmap for @puaros/guardian, prioritized based on market competition analysis and enterprise adoption requirements.

---

## 📊 Current State (v0.5.1) ✅

### ✨ Unique Competitive Advantages

Guardian currently has **5 unique features** that competitors don't offer:

| Feature | Status | Competitors |
|---------|--------|-------------|
| **Hardcode Detection + AI Suggestions** | ✅ Released | ❌ None |
| **Framework Leak Detection** | ✅ Released | ❌ None |
| **Entity Exposure Detection** | ✅ Released (v0.3.0) | ❌ None |
| **Dependency Direction Enforcement** | ✅ Released (v0.4.0) | ⚠️ dependency-cruiser (via rules) |
| **Repository Pattern Validation** | ✅ Released (v0.5.0) | ❌ None |

### 🛠️ Core Features (v0.1.0-v0.5.0)

**Detection Capabilities:**

- ✅ Hardcode detection (magic numbers, magic strings) with smart suggestions
- ✅ Circular dependency detection
- ✅ Naming convention enforcement (DDD layer-based rules)
- ✅ Clean Architecture layer violations
- ✅ Framework leak detection (domain importing frameworks)
- ✅ Entity exposure in API responses (v0.3.0)
- ✅ Dependency direction validation (v0.4.0)
- ✅ Repository pattern validation (v0.5.0)

**Developer Experience:**

- ✅ CLI interface with `guardian check` command
- ✅ Smart constant name suggestions
- ✅ Layer distribution analysis
- ✅ Detailed violation reports with file:line:column
- ✅ Context snippets for each issue

**Quality & Testing:**

- ✅ 194 tests across 7 test files (all passing)
- ✅ 80%+ code coverage on all metrics
- ✅ Self-analysis: 0 violations (100% clean codebase)

---

## 🎯 Strategic Roadmap Overview

### Phase 1: Market Parity (v0.6-v0.10) - Q1-Q2 2026

**Goal:** Match competitors' baseline features to enable enterprise adoption

- Configuration & Presets
- Visualization & Dependency Graphs
- CI/CD Integration Kit
- Auto-Fix & Code Generation (UNIQUE!)
- Metrics & Quality Score

### Phase 2: DDD Specialization (v0.11-v0.27) - Q3-Q4 2026

**Goal:** Deepen DDD/Clean Architecture expertise

- Advanced DDD pattern detection (25+ features)
- Aggregate boundaries, Domain Events, Value Objects
- CQRS, Saga Pattern, Anti-Corruption Layer
- Ubiquitous Language validation

### Phase 3: Enterprise Ecosystem (v1.0+) - Q4 2026+

**Goal:** Full-featured enterprise platform

- VS Code extension
- JetBrains plugin
- Web dashboard
- Team analytics
- Multi-language support

---

## 📅 Detailed Roadmap

## Version 0.6.0 - Configuration & Presets ⚙️

**Target:** Q1 2026 (January-February)
**Priority:** 🔥 CRITICAL

> **Why Critical:** All competitors (SonarQube, ESLint, dependency-cruiser) have configuration. Without this, Guardian cannot be customized for different teams/projects.

### Features

#### 1. Configuration File Support

```javascript
// guardian.config.js (primary)
export default {
    // Zero-config presets
    preset: 'clean-architecture', // or 'ddd', 'hexagonal', 'onion'

    // Rule configuration
    rules: {
        'hardcode/magic-numbers': 'error',
        'hardcode/magic-strings': 'warn',
        'architecture/layer-violation': 'error',
        'architecture/framework-leak': 'error',
        'architecture/entity-exposure': 'error',
        'circular-dependency': 'error',
        'naming-convention': 'warn',
        'dependency-direction': 'error',
        'repository-pattern': 'error',
    },

    // Custom layer paths
    layers: {
        domain: 'src/core/domain',
        application: 'src/core/application',
        infrastructure: 'src/adapters',
        shared: 'src/shared',
    },

    // Exclusions
    exclude: [
        '**/*.test.ts',
        '**/*.spec.ts',
        'scripts/',
        'migrations/',
        'node_modules/',
    ],

    // Per-rule ignores
    ignore: {
        'hardcode/magic-numbers': {
            'src/config/constants.ts': [3000, 8080],
        },
    },
}
```

#### 2. Built-in Presets

```javascript
// Preset: clean-architecture (default)
preset: 'clean-architecture'
// Enables: layer-violation, dependency-direction, naming-convention

// Preset: ddd
preset: 'ddd'
// Enables all DDD patterns: aggregates, value-objects, domain-events

// Preset: hexagonal (Ports & Adapters)
preset: 'hexagonal'
// Validates port/adapter separation

// Preset: minimal (for prototyping)
preset: 'minimal'
// Only critical rules: hardcode, circular-deps
```

#### 3. Framework-Specific Presets

```javascript
// NestJS
preset: 'nestjs-clean-architecture'

// Express
preset: 'express-clean-architecture'

// Next.js
preset: 'nextjs-clean-architecture'
```

#### 4. Configuration Discovery

Support multiple config file formats:

- `guardian.config.js` (ES modules)
- `guardian.config.cjs` (CommonJS)
- `.guardianrc` (JSON)
- `.guardianrc.json`
- `package.json` (`guardian` field)

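A minimal sketch of how that lookup order could be implemented; the `discoverConfig` helper and its exact behaviour are assumptions for illustration, not the final API:

```typescript
import { existsSync, readFileSync } from "node:fs"
import { join } from "node:path"
import { pathToFileURL } from "node:url"

// Lookup order mirrors the list above
const CONFIG_FILES = ["guardian.config.js", "guardian.config.cjs", ".guardianrc", ".guardianrc.json"]

export async function discoverConfig(rootDir: string): Promise<Record<string, unknown> | undefined> {
    for (const name of CONFIG_FILES) {
        const fullPath = join(rootDir, name)
        if (!existsSync(fullPath)) {
            continue
        }
        if (name.endsWith(".js") || name.endsWith(".cjs")) {
            // JS configs (ESM or CommonJS) are loaded via dynamic import
            const mod = await import(pathToFileURL(fullPath).href)
            return (mod.default ?? mod) as Record<string, unknown>
        }
        // .guardianrc variants are plain JSON
        return JSON.parse(readFileSync(fullPath, "utf-8"))
    }

    // Fall back to the "guardian" field in package.json
    const pkgPath = join(rootDir, "package.json")
    if (existsSync(pkgPath)) {
        return JSON.parse(readFileSync(pkgPath, "utf-8")).guardian
    }
    return undefined
}
```

First match wins, so a project-level `guardian.config.js` would always override the `package.json` field.
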
#### 5. CLI Override

```bash
# Override config from CLI
guardian check ./src --rule hardcode/magic-numbers=off

# Use specific config file
guardian check ./src --config custom-config.js

# Generate config
guardian init --preset clean-architecture
```

### Implementation Tasks

- [ ] Create config parser and validator
- [ ] Implement preset system
- [ ] Add config discovery logic
- [ ] Update AnalyzeProject use case to accept config
- [ ] CLI integration for config override
- [ ] Add `guardian init` command
- [ ] Documentation and examples
- [ ] Tests (config parsing, presets, overrides)

---

## Version 0.7.0 - Visualization & Dependency Graphs 🎨

**Target:** Q1 2026 (March)
**Priority:** 🔥 HIGH

> **Why High:** dependency-cruiser's main advantage is visualization. Guardian needs this to compete.

### Features

#### 1. Dependency Graph Visualization

```bash
# Generate SVG graph
guardian visualize ./src --output architecture.svg

# Interactive HTML
guardian visualize ./src --format html --output report.html

# Mermaid diagram for docs
guardian graph ./src --format mermaid > ARCHITECTURE.md

# ASCII tree for terminal
guardian visualize ./src --format ascii
```

#### 2. Layer Dependency Diagram

```mermaid
graph TD
    I[Infrastructure Layer] --> A[Application Layer]
    I --> D[Domain Layer]
    A --> D
    D --> S[Shared]
    A --> S
    I --> S

    style D fill:#4CAF50
    style A fill:#2196F3
    style I fill:#FF9800
    style S fill:#9E9E9E
```

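As a rough illustration of how the Mermaid output could be produced from an already-computed layer dependency graph (the function and type names here are hypothetical):

```typescript
// Hypothetical sketch: turn layer-to-layer edges into a Mermaid "graph TD" block
type LayerEdge = { from: string; to: string }

const NODE_LABELS: Record<string, string> = {
    domain: "D[Domain Layer]",
    application: "A[Application Layer]",
    infrastructure: "I[Infrastructure Layer]",
    shared: "S[Shared]",
}

export function toMermaid(edges: LayerEdge[]): string {
    const lines = ["graph TD"]
    const seen = new Set<string>()

    for (const { from, to } of edges) {
        const line = `    ${NODE_LABELS[from] ?? from} --> ${NODE_LABELS[to] ?? to}`
        if (!seen.has(line)) {
            seen.add(line)
            lines.push(line)
        }
    }

    return lines.join("\n")
}

// toMermaid([{ from: "application", to: "domain" }]) yields the
// "A[Application Layer] --> D[Domain Layer]" edge shown in the diagram above
```
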
#### 3. Violation Highlighting

Visualize violations on the graph:

- 🔴 Circular dependencies (red arrows)
- ⚠️ Framework leaks (yellow highlights)
- 🚫 Wrong dependency direction (dashed red arrows)
- ✅ Correct dependencies (green arrows)

#### 4. Metrics Overlay

```bash
guardian visualize ./src --show-metrics

# Shows on each node:
# - File count per layer
# - Hardcode violations count
# - Complexity score
```

#### 5. Export Formats

- SVG (for docs/website)
- PNG (for presentations)
- HTML (interactive, zoomable)
- Mermaid (for markdown docs)
- DOT (Graphviz format)
- JSON (for custom processing)

### Implementation Tasks

- [ ] Implement graph generation engine
- [ ] Add SVG/PNG renderer
- [ ] Create Mermaid diagram generator
- [ ] Build HTML interactive viewer
- [ ] Add violation highlighting
- [ ] Metrics overlay system
- [ ] CLI commands (`visualize`, `graph`)
- [ ] Documentation and examples
- [ ] Tests (graph generation, formats)

---

## Version 0.8.0 - CI/CD Integration Kit 🚀
|
||||||
|
**Target:** Q2 2026 (April)
|
||||||
|
**Priority:** 🔥 HIGH
|
||||||
|
|
||||||
|
> **Why High:** Enterprise requires CI/CD integration. SonarQube succeeds because of this.
|
||||||
|
|
||||||
|
### Features
|
||||||
|
|
||||||
|
#### 1. GitHub Actions
|
||||||
|
|
||||||
|
```yaml
|
||||||
|
# .github/workflows/guardian.yml (ready-to-use template)
|
||||||
|
name: Guardian Quality Check
|
||||||
|
|
||||||
|
on: [push, pull_request]
|
||||||
|
|
||||||
|
jobs:
|
||||||
|
guardian:
|
||||||
|
runs-on: ubuntu-latest
|
||||||
|
steps:
|
||||||
|
- uses: actions/checkout@v3
|
||||||
|
- uses: actions/setup-node@v3
|
||||||
|
|
||||||
|
- name: Guardian Analysis
|
||||||
|
uses: puaros/guardian-action@v1
|
||||||
|
with:
|
||||||
|
path: './src'
|
||||||
|
fail-on: 'error'
|
||||||
|
report-format: 'markdown'
|
||||||
|
|
||||||
|
- name: Comment PR
|
||||||
|
uses: actions/github-script@v6
|
||||||
|
if: github.event_name == 'pull_request'
|
||||||
|
with:
|
||||||
|
script: |
|
||||||
|
// Auto-comment violations on PR
|
||||||
|
```
|
||||||
|
|
||||||
|
#### 2. GitLab CI Template
|
||||||
|
|
||||||
|
```yaml
|
||||||
|
# .gitlab-ci.yml
|
||||||
|
include:
|
||||||
|
- template: Guardian.gitlab-ci.yml
|
||||||
|
|
||||||
|
guardian_check:
|
||||||
|
stage: test
|
||||||
|
extends: .guardian
|
||||||
|
variables:
|
||||||
|
GUARDIAN_FAIL_ON: "error"
|
||||||
|
GUARDIAN_FORMAT: "markdown"
|
||||||
|
```
|
||||||
|
|
||||||
|
#### 3. Quality Gate
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Fail build on violations
|
||||||
|
guardian check ./src --fail-on error
|
||||||
|
guardian check ./src --fail-on warning
|
||||||
|
|
||||||
|
# Threshold-based
|
||||||
|
guardian check ./src --max-violations 10
|
||||||
|
guardian check ./src --max-hardcode 5
|
||||||
|
```
|
||||||
|
|
||||||
|
#### 4. PR Auto-Comments
|
||||||
|
|
||||||
|
Automatically comment on PRs with:
|
||||||
|
- Summary of violations
|
||||||
|
- Comparison with base branch
|
||||||
|
- Quality score change
|
||||||
|
- Actionable suggestions
|
||||||
|
|
||||||
|
```markdown
|
||||||
|
## 🛡️ Guardian Report
|
||||||
|
|
||||||
|
**Quality Score:** 87/100 (⬆️ +3 from main)
|
||||||
|
|
||||||
|
### Violations Found: 5
|
||||||
|
|
||||||
|
#### 🔴 Critical (2)
|
||||||
|
- `src/api/server.ts:15` - Hardcoded port 3000
|
||||||
|
- `src/domain/User.ts:10` - Framework leak (Express)
|
||||||
|
|
||||||
|
#### ⚠️ Warnings (3)
|
||||||
|
- `src/services/UserService.ts` - Naming convention
|
||||||
|
- ...
|
||||||
|
|
||||||
|
[View Full Report](link)
|
||||||
|
```
|
||||||
|
|
||||||
|
#### 5. Pre-commit Hook
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Install via npx
|
||||||
|
npx guardian install-hooks
|
||||||
|
|
||||||
|
# Creates .husky/pre-commit
|
||||||
|
#!/bin/sh
|
||||||
|
guardian check --staged --fail-on error
|
||||||
|
```
|
||||||
|
|
||||||
|
#### 6. Status Checks
|
||||||
|
|
||||||
|
Integrate with GitHub/GitLab status checks:
|
||||||
|
- ✅ No violations
|
||||||
|
- ⚠️ Warnings only
|
||||||
|
- ❌ Errors found
|
||||||
|
|
||||||
|
### Implementation Tasks
|
||||||
|
- [ ] Create GitHub Action
|
||||||
|
- [ ] Create GitLab CI template
|
||||||
|
- [ ] Implement quality gate logic
|
||||||
|
- [ ] Build PR comment generator
|
||||||
|
- [ ] Pre-commit hook installer
|
||||||
|
- [ ] Status check integration
|
||||||
|
- [ ] Bitbucket Pipelines support
|
||||||
|
- [ ] Documentation and examples
|
||||||
|
- [ ] Tests (CI/CD scenarios)
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Version 0.9.0 - Auto-Fix & Code Generation 🤖
|
||||||
|
**Target:** Q2 2026 (May)
|
||||||
|
**Priority:** 🚀 GAME-CHANGER (UNIQUE!)
|
||||||
|
|
||||||
|
> **Why Game-Changer:** No competitor has intelligent auto-fix for architecture. This makes Guardian unique!
|
||||||
|
|
||||||
|
### Features
|
||||||
|
|
||||||
|
#### 1. Auto-Fix Hardcode
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Fix all hardcode violations automatically
|
||||||
|
guardian fix ./src --auto
|
||||||
|
|
||||||
|
# Preview changes
|
||||||
|
guardian fix ./src --dry-run
|
||||||
|
|
||||||
|
# Fix specific types
|
||||||
|
guardian fix ./src --type hardcode
|
||||||
|
guardian fix ./src --type naming
|
||||||
|
```
|
||||||
|
|
||||||
|
**Example:**
|
||||||
|
|
||||||
|
```typescript
|
||||||
|
// Before
|
||||||
|
const timeout = 5000
|
||||||
|
app.listen(3000)
|
||||||
|
|
||||||
|
// After (auto-generated constants.ts)
|
||||||
|
export const DEFAULT_TIMEOUT_MS = 5000
|
||||||
|
export const DEFAULT_PORT = 3000
|
||||||
|
|
||||||
|
// After (fixed code)
|
||||||
|
import { DEFAULT_TIMEOUT_MS, DEFAULT_PORT } from './constants'
|
||||||
|
const timeout = DEFAULT_TIMEOUT_MS
|
||||||
|
app.listen(DEFAULT_PORT)
|
||||||
|
```
|
||||||
|
|
||||||
|
#### 2. Generate Constants File
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Extract all hardcodes to constants
|
||||||
|
guardian generate constants ./src --output src/config/constants.ts
|
||||||
|
|
||||||
|
# Generated file:
|
||||||
|
// src/config/constants.ts
|
||||||
|
export const DEFAULT_TIMEOUT_MS = 5000
|
||||||
|
export const DEFAULT_PORT = 3000
|
||||||
|
export const MAX_RETRIES = 3
|
||||||
|
export const API_BASE_URL = 'http://localhost:8080'
|
||||||
|
```
|
||||||
|
|
||||||
|
#### 3. Fix Naming Violations
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Rename files to match conventions
|
||||||
|
guardian fix naming ./src --auto
|
||||||
|
|
||||||
|
# Before: src/application/use-cases/user.ts
|
||||||
|
# After: src/application/use-cases/CreateUser.ts
|
||||||
|
```
|
||||||
|
|
||||||
|
#### 4. AI-Friendly Fix Prompts
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Generate prompt for AI assistant
|
||||||
|
guardian check ./src --format ai-prompt > fix-prompt.txt
|
||||||
|
|
||||||
|
# Output (optimized for Claude/GPT):
|
||||||
|
"""
|
||||||
|
Fix the following Guardian violations:
|
||||||
|
|
||||||
|
1. HARDCODE (src/api/server.ts:15)
|
||||||
|
- Replace: app.listen(3000)
|
||||||
|
- With: Extract 3000 to DEFAULT_PORT constant
|
||||||
|
- Location: Create src/config/constants.ts
|
||||||
|
|
||||||
|
2. FRAMEWORK_LEAK (src/domain/User.ts:5)
|
||||||
|
- Remove: import { Request } from 'express'
|
||||||
|
- Reason: Domain layer cannot import frameworks
|
||||||
|
- Suggestion: Use dependency injection via interfaces
|
||||||
|
|
||||||
|
[Complete fix suggestions...]
|
||||||
|
"""
|
||||||
|
|
||||||
|
# Then feed to Claude:
|
||||||
|
# cat fix-prompt.txt | pbcopy
|
||||||
|
# Paste into Claude: "Fix these Guardian violations"
|
||||||
|
```
|
||||||
|
|
||||||
|
#### 5. Interactive Fix Mode
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Interactive fix selection
|
||||||
|
guardian fix ./src --interactive
|
||||||
|
|
||||||
|
# Prompts:
|
||||||
|
# ? Fix hardcode in server.ts:15 (3000)? (Y/n)
|
||||||
|
# ? Suggested constant name: DEFAULT_PORT
|
||||||
|
# [Edit name] [Skip] [Fix All]
|
||||||
|
```
|
||||||
|
|
||||||
|
#### 6. Refactoring Commands
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Break circular dependency
|
||||||
|
guardian refactor circular ./src/services/UserService.ts
|
||||||
|
# Suggests: Extract shared interface
|
||||||
|
|
||||||
|
# Fix layer violation
|
||||||
|
guardian refactor layer ./src/domain/entities/User.ts
|
||||||
|
# Suggests: Move framework imports to infrastructure
|
||||||
|
```
|
||||||
|
|
||||||
|
### Implementation Tasks
|
||||||
|
- [ ] Implement auto-fix engine (AST transformation)
|
||||||
|
- [ ] Constants extractor and generator
|
||||||
|
- [ ] File renaming system
|
||||||
|
- [ ] AI prompt generator
|
||||||
|
- [ ] Interactive fix mode
|
||||||
|
- [ ] Refactoring suggestions
|
||||||
|
- [ ] Safe rollback mechanism
|
||||||
|
- [ ] Documentation and examples
|
||||||
|
- [ ] Tests (fix scenarios, edge cases)
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Version 0.10.0 - Metrics & Quality Score 📊

**Target:** Q2 2026 (June)
**Priority:** 🔥 HIGH

> **Why High:** Enterprise needs metrics to justify investment. SonarQube's dashboard is a major selling point.

### Features

#### 1. Quality Score (0-100)

```bash
guardian score ./src

# Output:
# 🛡️ Guardian Quality Score: 87/100 (Good)
#
# Category Breakdown:
# ✅ Architecture: 95/100 (Excellent)
# ⚠️ Hardcode: 78/100 (Needs Improvement)
# ✅ Naming: 92/100 (Excellent)
# ✅ Dependencies: 89/100 (Good)
```

**Score Calculation:**

- Architecture violations: -5 per error
- Hardcode violations: -1 per occurrence
- Circular dependencies: -10 per cycle
- Naming violations: -2 per error

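A sketch of how these deductions could roll up into the overall number; the penalty weights come from the list above, while the clamping and the single-number aggregation are assumptions (the per-category breakdown shown earlier would need its own weighting):

```typescript
// Penalty weights taken from the "Score Calculation" list above
const PENALTIES = {
    architecture: 5,
    hardcode: 1,
    circular: 10,
    naming: 2,
} as const

interface ViolationCounts {
    architecture: number
    hardcode: number
    circular: number
    naming: number
}

export function calculateQualityScore(counts: ViolationCounts): number {
    const deduction =
        counts.architecture * PENALTIES.architecture +
        counts.hardcode * PENALTIES.hardcode +
        counts.circular * PENALTIES.circular +
        counts.naming * PENALTIES.naming

    // Clamp so a very noisy project bottoms out at 0 instead of going negative
    return Math.max(0, 100 - deduction)
}

// calculateQualityScore({ architecture: 1, hardcode: 5, circular: 0, naming: 0 }) === 90
```
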
#### 2. Metrics Dashboard (JSON/HTML)
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Export metrics
|
||||||
|
guardian metrics ./src --format json > metrics.json
|
||||||
|
guardian metrics ./src --format html > dashboard.html
|
||||||
|
|
||||||
|
# Metrics included:
|
||||||
|
{
|
||||||
|
"qualityScore": 87,
|
||||||
|
"violations": {
|
||||||
|
"hardcode": 12,
|
||||||
|
"circular": 0,
|
||||||
|
"architecture": 2,
|
||||||
|
"naming": 5
|
||||||
|
},
|
||||||
|
"metrics": {
|
||||||
|
"totalFiles": 45,
|
||||||
|
"totalLOC": 3500,
|
||||||
|
"hardcodePerKLOC": 3.4,
|
||||||
|
"averageFilesPerLayer": 11.25
|
||||||
|
},
|
||||||
|
"trends": {
|
||||||
|
"scoreChange": "+5",
|
||||||
|
"violationsChange": "-8"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
#### 3. Trend Analysis
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Compare with main branch
|
||||||
|
guardian metrics ./src --compare-with main
|
||||||
|
|
||||||
|
# Output:
|
||||||
|
# Quality Score: 87/100 (⬆️ +3 from main)
|
||||||
|
#
|
||||||
|
# Changes:
|
||||||
|
# ✅ Hardcode violations: 12 (⬇️ -5)
|
||||||
|
# ⚠️ Naming violations: 5 (⬆️ +2)
|
||||||
|
# ✅ Circular deps: 0 (⬇️ -1)
|
||||||
|
```
|
||||||
|
|
||||||
|
#### 4. Historical Tracking
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Store metrics history
|
||||||
|
guardian metrics ./src --save
|
||||||
|
|
||||||
|
# View trends
|
||||||
|
guardian trends --last 30d
|
||||||
|
|
||||||
|
# Output: ASCII graph showing quality score over time
|
||||||
|
```
|
||||||
|
|
||||||
|
#### 5. Export for Dashboards
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Prometheus format
|
||||||
|
guardian metrics ./src --format prometheus
|
||||||
|
|
||||||
|
# Grafana JSON
|
||||||
|
guardian metrics ./src --format grafana
|
||||||
|
|
||||||
|
# CSV for Excel
|
||||||
|
guardian metrics ./src --format csv
|
||||||
|
```
|
||||||
|
|
||||||
|
#### 6. Badge Generation
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Generate badge for README
|
||||||
|
guardian badge ./src --output badge.svg
|
||||||
|
|
||||||
|
# Markdown badge
|
||||||
|

|
||||||
|
```
|
||||||
|
|
||||||
|
### Implementation Tasks
|
||||||
|
- [ ] Quality score calculation algorithm
|
||||||
|
- [ ] Metrics collection system
|
||||||
|
- [ ] Trend analysis engine
|
||||||
|
- [ ] JSON/HTML/Prometheus exporters
|
||||||
|
- [ ] Historical data storage
|
||||||
|
- [ ] Badge generator
|
||||||
|
- [ ] CLI commands (`score`, `metrics`, `trends`, `badge`)
|
||||||
|
- [ ] Documentation and examples
|
||||||
|
- [ ] Tests (metrics calculation, exports)
|
||||||
|
|
||||||
|
---

## Version 0.11.0+ - DDD Specialization 🏗️

**Target:** Q3-Q4 2026
**Priority:** MEDIUM (After Market Parity)

Now we can focus on Guardian's unique DDD/Clean Architecture specialization:

### v0.11.0 - Aggregate Boundary Validation 🔒

- Detect entity references across aggregates
- Enforce ID-only references between aggregates
- Validate aggregate root access patterns

### v0.12.0 - Anemic Domain Model Detection 🩺

- Detect entities with only getters/setters
- Count methods vs properties ratio
- Suggest moving logic from services to entities

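For example, a very small heuristic along these lines could back the getter/setter check, assuming the method names have already been extracted by the parser; the accessor regex, the 30% threshold, and the function name are illustrative assumptions:

```typescript
// Hypothetical heuristic: an entity whose public methods are almost all
// getters/setters is flagged as an anemic domain model
export function looksAnemic(methodNames: string[]): boolean {
    const methods = methodNames.filter((name) => name !== "constructor")

    if (methods.length === 0) {
        return true // data only, no behavior at all
    }

    const accessors = methods.filter((name) => /^(get|set)[A-Z_]/.test(name)).length
    const behaviorRatio = (methods.length - accessors) / methods.length

    // Fewer than 30% behavior methods -> probably anemic (threshold is an assumption)
    return behaviorRatio < 0.3
}

// looksAnemic(["getName", "setName", "getEmail", "setEmail"]) === true
// looksAnemic(["rename", "changeEmail", "getName"]) === false
```
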
### v0.13.0 - Domain Event Validation 📢

- Validate event publishing pattern
- Check events inherit from DomainEvent base
- Detect direct infrastructure calls from entities

### v0.14.0 - Value Object Immutability 🔐

- Ensure Value Objects have readonly fields
- Detect public setters
- Verify equals() method exists

### v0.15.0 - Use Case Single Responsibility 🎯

- Check Use Case has single public method (execute)
- Detect multiple responsibilities
- Suggest splitting large Use Cases

### v0.16.0 - Interface Segregation 🔌

- Count methods per interface (> 10 = warning)
- Check method cohesion
- Suggest interface splitting

### v0.17.0 - Port-Adapter Pattern 🔌

- Check Ports (interfaces) are in application/domain
- Verify Adapters are in infrastructure
- Detect external library imports in use cases

### v0.18.0 - Command Query Separation (CQRS) 📝

- Detect methods that both change state and return data
- Check Use Case names for CQS violations
- Validate Command Use Cases return void

### v0.19.0 - Factory Pattern Validation 🏭

- Detect complex logic in entity constructors
- Check for `new Entity()` calls in use cases
- Suggest extracting construction to Factory

### v0.20.0 - Specification Pattern Detection 🔍

- Detect complex business rules in use cases
- Validate Specification classes in domain
- Suggest extracting rules to Specifications

### v0.21.0 - Layered Service Anti-pattern ⚠️

- Detect service methods operating on single entity
- Validate entities have behavior methods
- Suggest moving service methods to entities

### v0.22.0 - Bounded Context Leak Detection 🚧

- Detect entity imports across contexts
- Validate only ID references between contexts
- Verify event-based integration

### v0.23.0 - Transaction Script Detection 📜

- Detect procedural logic in use cases
- Check use case length (> 30-50 lines = warning)
- Suggest moving logic to domain entities

### v0.24.0 - Persistence Ignorance 💾

- Detect ORM decorators in domain entities
- Check for ORM library imports in domain
- Suggest persistence ignorance pattern

### v0.25.0 - Null Object Pattern Detection 🎭

- Count null checks in use cases
- Suggest Null Object pattern
- Detect repositories returning null vs Null Object

### v0.26.0 - Primitive Obsession Detection 🔢

- Detect methods with > 3 primitive parameters
- Check for common Value Object candidates
- Suggest creating Value Objects

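A possible shape for the parameter check; the primitive set and the helper name are assumptions:

```typescript
// Hypothetical check behind the "> 3 primitive parameters" rule
const PRIMITIVE_TYPES = new Set(["string", "number", "boolean", "bigint"])

export function hasPrimitiveObsession(parameterTypes: string[]): boolean {
    const primitiveCount = parameterTypes.filter((type) =>
        PRIMITIVE_TYPES.has(type.trim().toLowerCase()),
    ).length

    return primitiveCount > 3
}

// hasPrimitiveObsession(["string", "string", "string", "number"]) === true
// -> suggests grouping the values into a Value Object such as an Address or Money type
```
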
### v0.27.0 - Service Locator Anti-pattern 🔍

- Detect global ServiceLocator/Registry classes
- Validate constructor injection
- Suggest DI container usage

### v0.28.0 - Double Dispatch Pattern 🎯

- Detect frequent instanceof or type checking
- Check for long if-else/switch by type
- Suggest Visitor pattern

### v0.29.0 - Entity Identity Validation 🆔

- Detect public mutable ID fields
- Validate ID is Value Object
- Check for equals() method implementation

### v0.30.0 - Saga Pattern Detection 🔄

- Detect multiple external calls without compensation
- Validate compensating transactions
- Suggest Saga pattern for distributed operations

### v0.31.0 - Anti-Corruption Layer Detection 🛡️

- Detect direct legacy library imports
- Check for domain adaptation to external APIs
- Validate translator/adapter layer exists

### v0.32.0 - Ubiquitous Language Validation 📖

**Priority: HIGH**

- Detect synonyms for same concepts (User/Customer/Client)
- Check inconsistent verbs (Create/Register/SignUp)
- Require Ubiquitous Language glossary

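One way the synonym check could look, with a hard-coded glossary standing in for a project-supplied one (the names and groups are illustrative):

```typescript
// Hypothetical heuristic: flag identifiers that mix synonyms for one concept
const SYNONYM_GROUPS: string[][] = [
    ["user", "customer", "client"],
    ["create", "register", "signup"],
]

export function findLanguageDrift(identifiers: string[]): string[] {
    const findings: string[] = []
    const lowered = identifiers.map((id) => id.toLowerCase())

    for (const group of SYNONYM_GROUPS) {
        const used = group.filter((term) => lowered.some((id) => id.includes(term)))
        if (used.length > 1) {
            findings.push(`Mixed terms for one concept: ${used.join(", ")} - pick one in the glossary`)
        }
    }

    return findings
}

// findLanguageDrift(["CreateUser", "RegisterCustomer"]) reports both groups
```
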
---

## Version 1.0.0 - Stable Release 🚀

**Target:** Q4 2026 (December)
**Priority:** 🔥 CRITICAL

Production-ready stable release with ecosystem:

### Core Features

- ✅ All detection features stabilized
- ✅ Configuration & presets
- ✅ Visualization & graphs
- ✅ CI/CD integration
- ✅ Auto-fix & code generation
- ✅ Metrics & quality score
- ✅ 30+ DDD pattern detectors

### Ecosystem

#### VS Code Extension

- Real-time detection as you type
- Inline suggestions and quick fixes
- Problem panel integration
- Code actions for auto-fix

#### JetBrains Plugin

- IntelliJ IDEA, WebStorm support
- Inspection integration
- Quick fixes

#### Web Dashboard

- Team quality metrics
- Historical trends
- Per-developer analytics
- Project comparison

#### GitHub Integration

- GitHub App
- Code scanning integration
- Dependency insights
- Security alerts for architecture violations

---

## 💡 Future Ideas (Post-1.0.0)

### Multi-Language Support

- Python (Django/Flask + DDD)
- C# (.NET + Clean Architecture)
- Java (Spring Boot + DDD)
- Go (Clean Architecture)

### AI-Powered Features

- LLM-based fix suggestions
- AI generates code for complex refactorings
- Claude/GPT API integration
- Natural language architecture queries

### Team Analytics

- Per-developer quality metrics
- Team quality trends dashboard
- Technical debt tracking
- Leaderboards (gamification)

### Security Features

- Secrets detection (API keys, passwords)
- SQL injection pattern detection
- XSS vulnerability patterns
- Dependency vulnerability scanning

### Code Quality Metrics

- Maintainability index
- Technical debt estimation
- Code duplication detection
- Complexity trends

---

## 🎯 Success Criteria

### v0.10.0 (Market Parity Achieved)

- ✅ Configuration support (compete with ESLint)
- ✅ Visualization (compete with dependency-cruiser)
- ✅ CI/CD integration (compete with SonarQube)
- ✅ Auto-fix (UNIQUE! Game-changer)
- ✅ Metrics dashboard (compete with SonarQube)

### v1.0.0 (Enterprise Ready)

- ✅ 1000+ GitHub stars
- ✅ 100+ npm installs/week
- ✅ 10+ enterprise adopters
- ✅ 99%+ test coverage
- ✅ Complete documentation
- ✅ IDE extensions available

---

## 📊 Competitive Positioning

| Feature | Guardian v1.0 | SonarQube | dependency-cruiser | ArchUnit | FTA |
|---------|---------------|-----------|-------------------|----------|-----|
| TypeScript Focus | ✅✅ | ⚠️ | ✅✅ | ❌ | ✅✅ |
| Hardcode + AI Tips | ✅✅ UNIQUE | ⚠️ | ❌ | ❌ | ❌ |
| Architecture (DDD) | ✅✅ UNIQUE | ⚠️ | ⚠️ | ✅ | ❌ |
| Visualization | ✅ | ✅ | ✅✅ | ❌ | ⚠️ |
| Auto-Fix | ✅✅ UNIQUE | ❌ | ❌ | ❌ | ❌ |
| Configuration | ✅ | ✅✅ | ✅ | ✅ | ⚠️ |
| CI/CD | ✅ | ✅✅ | ✅ | ✅ | ⚠️ |
| Metrics | ✅ | ✅✅ | ⚠️ | ❌ | ✅✅ |
| Security (SAST) | ❌ | ✅✅ | ❌ | ❌ | ❌ |
| Multi-language | ❌ | ✅✅ | ⚠️ | ⚠️ | ❌ |

**Guardian's Position:** The AI-First Architecture Guardian for TypeScript/DDD teams

---

## 🤝 Contributing

Want to help build Guardian? Check out:

- [GitHub Issues](https://github.com/samiyev/puaros/issues)
- [CONTRIBUTING.md](../../CONTRIBUTING.md)
- [Discord Community](#) (coming soon)

---

## 📈 Versioning

Guardian follows [Semantic Versioning](https://semver.org/):

- **MAJOR** (1.0.0) - Breaking changes
- **MINOR** (0.x.0) - New features, backwards compatible
- **PATCH** (0.x.y) - Bug fixes

Until 1.0.0, minor versions may include breaking changes as we iterate on the API.
packages/guardian/docs/v0.6.0-CONFIGURATION-SPEC.md (1176 lines, new file)
File diff suppressed because it is too large
@@ -1,6 +1,6 @@
{
    "name": "@samiyev/guardian",
    "version": "0.7.1",
    "version": "0.7.6",
    "description": "Research-backed code quality guardian for AI-assisted development. Detects hardcodes, circular deps, framework leaks, entity exposure, and 8 architecture violations. Enforces Clean Architecture/DDD principles. Works with GitHub Copilot, Cursor, Windsurf, Claude, ChatGPT, Cline, and any AI coding tool.",
    "keywords": [
        "puaros",
@@ -11,18 +11,17 @@ import { IRepositoryPatternDetector } from "../../domain/services/RepositoryPatt
import { IAggregateBoundaryDetector } from "../../domain/services/IAggregateBoundaryDetector"
import { SourceFile } from "../../domain/entities/SourceFile"
import { DependencyGraph } from "../../domain/entities/DependencyGraph"
import { ProjectPath } from "../../domain/value-objects/ProjectPath"
import { FileCollectionStep } from "./pipeline/FileCollectionStep"
import { ParsingStep } from "./pipeline/ParsingStep"
import { DetectionPipeline } from "./pipeline/DetectionPipeline"
import { ResultAggregator } from "./pipeline/ResultAggregator"
import {
    ERROR_MESSAGES,
    HARDCODE_TYPES,
    LAYERS,
    NAMING_VIOLATION_TYPES,
    REGEX_PATTERNS,
    REPOSITORY_VIOLATION_TYPES,
    RULES,
    SEVERITY_ORDER,
    type SeverityLevel,
    VIOLATION_SEVERITY_MAP,
} from "../../shared/constants"

export interface AnalyzeProjectRequest {
@@ -173,442 +172,74 @@ export interface ProjectMetrics {
|
|||||||
|
|
||||||
/**
|
/**
|
||||||
* Main use case for analyzing a project's codebase
|
* Main use case for analyzing a project's codebase
|
||||||
|
* Orchestrates the analysis pipeline through focused components
|
||||||
*/
|
*/
|
||||||
export class AnalyzeProject extends UseCase<
|
export class AnalyzeProject extends UseCase<
|
||||||
AnalyzeProjectRequest,
|
AnalyzeProjectRequest,
|
||||||
ResponseDto<AnalyzeProjectResponse>
|
ResponseDto<AnalyzeProjectResponse>
|
||||||
> {
|
> {
|
||||||
|
private readonly fileCollectionStep: FileCollectionStep
|
||||||
|
private readonly parsingStep: ParsingStep
|
||||||
|
private readonly detectionPipeline: DetectionPipeline
|
||||||
|
private readonly resultAggregator: ResultAggregator
|
||||||
|
|
||||||
constructor(
|
constructor(
|
||||||
private readonly fileScanner: IFileScanner,
|
fileScanner: IFileScanner,
|
||||||
private readonly codeParser: ICodeParser,
|
codeParser: ICodeParser,
|
||||||
private readonly hardcodeDetector: IHardcodeDetector,
|
hardcodeDetector: IHardcodeDetector,
|
||||||
private readonly namingConventionDetector: INamingConventionDetector,
|
namingConventionDetector: INamingConventionDetector,
|
||||||
private readonly frameworkLeakDetector: IFrameworkLeakDetector,
|
frameworkLeakDetector: IFrameworkLeakDetector,
|
||||||
private readonly entityExposureDetector: IEntityExposureDetector,
|
entityExposureDetector: IEntityExposureDetector,
|
||||||
private readonly dependencyDirectionDetector: IDependencyDirectionDetector,
|
dependencyDirectionDetector: IDependencyDirectionDetector,
|
||||||
private readonly repositoryPatternDetector: IRepositoryPatternDetector,
|
repositoryPatternDetector: IRepositoryPatternDetector,
|
||||||
private readonly aggregateBoundaryDetector: IAggregateBoundaryDetector,
|
aggregateBoundaryDetector: IAggregateBoundaryDetector,
|
||||||
) {
|
) {
|
||||||
super()
|
super()
|
||||||
|
this.fileCollectionStep = new FileCollectionStep(fileScanner)
|
||||||
|
this.parsingStep = new ParsingStep(codeParser)
|
||||||
|
this.detectionPipeline = new DetectionPipeline(
|
||||||
|
hardcodeDetector,
|
||||||
|
namingConventionDetector,
|
||||||
|
frameworkLeakDetector,
|
||||||
|
entityExposureDetector,
|
||||||
|
dependencyDirectionDetector,
|
||||||
|
repositoryPatternDetector,
|
||||||
|
aggregateBoundaryDetector,
|
||||||
|
)
|
||||||
|
this.resultAggregator = new ResultAggregator()
|
||||||
}
|
}
|
||||||
|
|
||||||
public async execute(
|
public async execute(
|
||||||
request: AnalyzeProjectRequest,
|
request: AnalyzeProjectRequest,
|
||||||
): Promise<ResponseDto<AnalyzeProjectResponse>> {
|
): Promise<ResponseDto<AnalyzeProjectResponse>> {
|
||||||
try {
|
try {
|
||||||
const filePaths = await this.fileScanner.scan({
|
const { sourceFiles } = await this.fileCollectionStep.execute({
|
||||||
rootDir: request.rootDir,
|
rootDir: request.rootDir,
|
||||||
include: request.include,
|
include: request.include,
|
||||||
exclude: request.exclude,
|
exclude: request.exclude,
|
||||||
})
|
})
|
||||||
|
|
||||||
const sourceFiles: SourceFile[] = []
|
const { dependencyGraph, totalFunctions } = this.parsingStep.execute({
|
||||||
const dependencyGraph = new DependencyGraph()
|
sourceFiles,
|
||||||
let totalFunctions = 0
|
rootDir: request.rootDir,
|
||||||
|
|
||||||
for (const filePath of filePaths) {
|
|
||||||
const content = await this.fileScanner.readFile(filePath)
|
|
||||||
const projectPath = ProjectPath.create(filePath, request.rootDir)
|
|
||||||
|
|
||||||
const imports = this.extractImports(content)
|
|
||||||
const exports = this.extractExports(content)
|
|
||||||
|
|
||||||
const sourceFile = new SourceFile(projectPath, content, imports, exports)
|
|
||||||
|
|
||||||
sourceFiles.push(sourceFile)
|
|
||||||
dependencyGraph.addFile(sourceFile)
|
|
||||||
|
|
||||||
if (projectPath.isTypeScript()) {
|
|
||||||
const tree = this.codeParser.parseTypeScript(content)
|
|
||||||
const functions = this.codeParser.extractFunctions(tree)
|
|
||||||
totalFunctions += functions.length
|
|
||||||
}
|
|
||||||
|
|
||||||
for (const imp of imports) {
|
|
||||||
dependencyGraph.addDependency(
|
|
||||||
projectPath.relative,
|
|
||||||
this.resolveImportPath(imp, filePath, request.rootDir),
|
|
||||||
)
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
const violations = this.sortBySeverity(this.detectViolations(sourceFiles))
|
|
||||||
const hardcodeViolations = this.sortBySeverity(this.detectHardcode(sourceFiles))
|
|
||||||
const circularDependencyViolations = this.sortBySeverity(
|
|
||||||
this.detectCircularDependencies(dependencyGraph),
|
|
||||||
)
|
|
||||||
const namingViolations = this.sortBySeverity(this.detectNamingConventions(sourceFiles))
|
|
||||||
const frameworkLeakViolations = this.sortBySeverity(
|
|
||||||
this.detectFrameworkLeaks(sourceFiles),
|
|
||||||
)
|
|
||||||
const entityExposureViolations = this.sortBySeverity(
|
|
||||||
this.detectEntityExposures(sourceFiles),
|
|
||||||
)
|
|
||||||
const dependencyDirectionViolations = this.sortBySeverity(
|
|
||||||
this.detectDependencyDirections(sourceFiles),
|
|
||||||
)
|
|
||||||
const repositoryPatternViolations = this.sortBySeverity(
|
|
||||||
this.detectRepositoryPatternViolations(sourceFiles),
|
|
||||||
)
|
|
||||||
const aggregateBoundaryViolations = this.sortBySeverity(
|
|
||||||
this.detectAggregateBoundaryViolations(sourceFiles),
|
|
||||||
)
|
|
||||||
const metrics = this.calculateMetrics(sourceFiles, totalFunctions, dependencyGraph)
|
|
||||||
|
|
||||||
return ResponseDto.ok({
|
|
||||||
files: sourceFiles,
|
|
||||||
dependencyGraph,
|
|
||||||
violations,
|
|
||||||
hardcodeViolations,
|
|
||||||
circularDependencyViolations,
|
|
||||||
namingViolations,
|
|
||||||
frameworkLeakViolations,
|
|
||||||
entityExposureViolations,
|
|
||||||
dependencyDirectionViolations,
|
|
||||||
repositoryPatternViolations,
|
|
||||||
aggregateBoundaryViolations,
|
|
||||||
metrics,
|
|
||||||
})
|
})
|
||||||
|
|
||||||
|
const detectionResult = this.detectionPipeline.execute({
|
||||||
|
sourceFiles,
|
||||||
|
dependencyGraph,
|
||||||
|
})
|
||||||
|
|
||||||
|
const response = this.resultAggregator.execute({
|
||||||
|
sourceFiles,
|
||||||
|
dependencyGraph,
|
||||||
|
totalFunctions,
|
||||||
|
...detectionResult,
|
||||||
|
})
|
||||||
|
|
||||||
|
return ResponseDto.ok(response)
|
||||||
} catch (error) {
|
} catch (error) {
|
||||||
const errorMessage = `${ERROR_MESSAGES.FAILED_TO_ANALYZE}: ${error instanceof Error ? error.message : String(error)}`
|
const errorMessage = `${ERROR_MESSAGES.FAILED_TO_ANALYZE}: ${error instanceof Error ? error.message : String(error)}`
|
||||||
return ResponseDto.fail(errorMessage)
|
return ResponseDto.fail(errorMessage)
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
private extractImports(content: string): string[] {
|
|
||||||
const imports: string[] = []
|
|
||||||
let match
|
|
||||||
|
|
||||||
while ((match = REGEX_PATTERNS.IMPORT_STATEMENT.exec(content)) !== null) {
|
|
||||||
imports.push(match[1])
|
|
||||||
}
|
|
||||||
|
|
||||||
return imports
|
|
||||||
}
|
|
||||||
|
|
||||||
private extractExports(content: string): string[] {
|
|
||||||
const exports: string[] = []
|
|
||||||
let match
|
|
||||||
|
|
||||||
while ((match = REGEX_PATTERNS.EXPORT_STATEMENT.exec(content)) !== null) {
|
|
||||||
exports.push(match[1])
|
|
||||||
}
|
|
||||||
|
|
||||||
return exports
|
|
||||||
}
|
|
||||||
|
|
||||||
private resolveImportPath(importPath: string, _currentFile: string, _rootDir: string): string {
|
|
||||||
if (importPath.startsWith(".")) {
|
|
||||||
return importPath
|
|
||||||
}
|
|
||||||
return importPath
|
|
||||||
}
|
|
||||||
|
|
||||||
private detectViolations(sourceFiles: SourceFile[]): ArchitectureViolation[] {
|
|
||||||
const violations: ArchitectureViolation[] = []
|
|
||||||
|
|
||||||
const layerRules: Record<string, string[]> = {
|
|
||||||
[LAYERS.DOMAIN]: [LAYERS.SHARED],
|
|
||||||
[LAYERS.APPLICATION]: [LAYERS.DOMAIN, LAYERS.SHARED],
|
|
||||||
[LAYERS.INFRASTRUCTURE]: [LAYERS.DOMAIN, LAYERS.APPLICATION, LAYERS.SHARED],
|
|
||||||
[LAYERS.SHARED]: [],
|
|
||||||
}
|
|
||||||
|
|
||||||
for (const file of sourceFiles) {
|
|
||||||
if (!file.layer) {
|
|
||||||
continue
|
|
||||||
}
|
|
||||||
|
|
||||||
const allowedLayers = layerRules[file.layer]
|
|
||||||
|
|
||||||
for (const imp of file.imports) {
|
|
||||||
const importedLayer = this.detectLayerFromImport(imp)
|
|
||||||
|
|
||||||
if (
|
|
||||||
importedLayer &&
|
|
||||||
importedLayer !== file.layer &&
|
|
||||||
!allowedLayers.includes(importedLayer)
|
|
||||||
) {
|
|
||||||
violations.push({
|
|
||||||
rule: RULES.CLEAN_ARCHITECTURE,
|
|
||||||
message: `Layer "${file.layer}" cannot import from "${importedLayer}"`,
|
|
||||||
file: file.path.relative,
|
|
||||||
severity: VIOLATION_SEVERITY_MAP.ARCHITECTURE,
|
|
||||||
})
|
|
||||||
}
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
return violations
|
|
||||||
}
|
|
||||||
|
|
||||||
private detectLayerFromImport(importPath: string): string | undefined {
|
|
||||||
const layers = Object.values(LAYERS)
|
|
||||||
|
|
||||||
for (const layer of layers) {
|
|
||||||
if (importPath.toLowerCase().includes(layer)) {
|
|
||||||
return layer
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
return undefined
|
|
||||||
}
|
|
||||||
|
|
||||||
private detectHardcode(sourceFiles: SourceFile[]): HardcodeViolation[] {
|
|
||||||
const violations: HardcodeViolation[] = []
|
|
||||||
|
|
||||||
for (const file of sourceFiles) {
|
|
||||||
const hardcodedValues = this.hardcodeDetector.detectAll(
|
|
||||||
file.content,
|
|
||||||
file.path.relative,
|
|
||||||
)
|
|
||||||
|
|
||||||
for (const hardcoded of hardcodedValues) {
|
|
||||||
violations.push({
|
|
||||||
rule: RULES.HARDCODED_VALUE,
|
|
||||||
type: hardcoded.type,
|
|
||||||
value: hardcoded.value,
|
|
||||||
file: file.path.relative,
|
|
||||||
line: hardcoded.line,
|
|
||||||
column: hardcoded.column,
|
|
||||||
context: hardcoded.context,
|
|
||||||
suggestion: {
|
|
||||||
constantName: hardcoded.suggestConstantName(),
|
|
||||||
location: hardcoded.suggestLocation(file.layer),
|
|
||||||
},
|
|
||||||
severity: VIOLATION_SEVERITY_MAP.HARDCODE,
|
|
||||||
})
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
return violations
|
|
||||||
}
|
|
||||||
|
|
||||||
private detectCircularDependencies(
|
|
||||||
dependencyGraph: DependencyGraph,
|
|
||||||
): CircularDependencyViolation[] {
|
|
||||||
const violations: CircularDependencyViolation[] = []
|
|
||||||
const cycles = dependencyGraph.findCycles()
|
|
||||||
|
|
||||||
for (const cycle of cycles) {
|
|
||||||
const cycleChain = [...cycle, cycle[0]].join(" → ")
|
|
||||||
violations.push({
|
|
||||||
rule: RULES.CIRCULAR_DEPENDENCY,
|
|
||||||
message: `Circular dependency detected: ${cycleChain}`,
|
|
||||||
cycle,
|
|
||||||
severity: VIOLATION_SEVERITY_MAP.CIRCULAR_DEPENDENCY,
|
|
||||||
})
|
|
||||||
}
|
|
||||||
|
|
||||||
return violations
|
|
||||||
}
|
|
||||||
|
|
||||||
private detectNamingConventions(sourceFiles: SourceFile[]): NamingConventionViolation[] {
|
|
||||||
const violations: NamingConventionViolation[] = []
|
|
||||||
|
|
||||||
for (const file of sourceFiles) {
|
|
||||||
const namingViolations = this.namingConventionDetector.detectViolations(
|
|
||||||
file.path.filename,
|
|
||||||
file.layer,
|
|
||||||
file.path.relative,
|
|
||||||
)
|
|
||||||
|
|
||||||
for (const violation of namingViolations) {
|
|
||||||
violations.push({
|
|
||||||
rule: RULES.NAMING_CONVENTION,
|
|
||||||
type: violation.violationType,
|
|
||||||
fileName: violation.fileName,
|
|
||||||
layer: violation.layer,
|
|
||||||
file: violation.filePath,
|
|
||||||
expected: violation.expected,
|
|
||||||
actual: violation.actual,
|
|
||||||
                    message: violation.getMessage(),
                    suggestion: violation.suggestion,
                    severity: VIOLATION_SEVERITY_MAP.NAMING_CONVENTION,
                })
            }
        }

        return violations
    }

    private detectFrameworkLeaks(sourceFiles: SourceFile[]): FrameworkLeakViolation[] {
        const violations: FrameworkLeakViolation[] = []

        for (const file of sourceFiles) {
            const leaks = this.frameworkLeakDetector.detectLeaks(
                file.imports,
                file.path.relative,
                file.layer,
            )

            for (const leak of leaks) {
                violations.push({
                    rule: RULES.FRAMEWORK_LEAK,
                    packageName: leak.packageName,
                    category: leak.category,
                    categoryDescription: leak.getCategoryDescription(),
                    file: file.path.relative,
                    layer: leak.layer,
                    line: leak.line,
                    message: leak.getMessage(),
                    suggestion: leak.getSuggestion(),
                    severity: VIOLATION_SEVERITY_MAP.FRAMEWORK_LEAK,
                })
            }
        }

        return violations
    }

    private detectEntityExposures(sourceFiles: SourceFile[]): EntityExposureViolation[] {
        const violations: EntityExposureViolation[] = []

        for (const file of sourceFiles) {
            const exposures = this.entityExposureDetector.detectExposures(
                file.content,
                file.path.relative,
                file.layer,
            )

            for (const exposure of exposures) {
                violations.push({
                    rule: RULES.ENTITY_EXPOSURE,
                    entityName: exposure.entityName,
                    returnType: exposure.returnType,
                    file: file.path.relative,
                    layer: exposure.layer,
                    line: exposure.line,
                    methodName: exposure.methodName,
                    message: exposure.getMessage(),
                    suggestion: exposure.getSuggestion(),
                    severity: VIOLATION_SEVERITY_MAP.ENTITY_EXPOSURE,
                })
            }
        }

        return violations
    }

    private detectDependencyDirections(sourceFiles: SourceFile[]): DependencyDirectionViolation[] {
        const violations: DependencyDirectionViolation[] = []

        for (const file of sourceFiles) {
            const directionViolations = this.dependencyDirectionDetector.detectViolations(
                file.content,
                file.path.relative,
                file.layer,
            )

            for (const violation of directionViolations) {
                violations.push({
                    rule: RULES.DEPENDENCY_DIRECTION,
                    fromLayer: violation.fromLayer,
                    toLayer: violation.toLayer,
                    importPath: violation.importPath,
                    file: file.path.relative,
                    line: violation.line,
                    message: violation.getMessage(),
                    suggestion: violation.getSuggestion(),
                    severity: VIOLATION_SEVERITY_MAP.DEPENDENCY_DIRECTION,
                })
            }
        }

        return violations
    }

    private detectRepositoryPatternViolations(
        sourceFiles: SourceFile[],
    ): RepositoryPatternViolation[] {
        const violations: RepositoryPatternViolation[] = []

        for (const file of sourceFiles) {
            const patternViolations = this.repositoryPatternDetector.detectViolations(
                file.content,
                file.path.relative,
                file.layer,
            )

            for (const violation of patternViolations) {
                violations.push({
                    rule: RULES.REPOSITORY_PATTERN,
                    violationType: violation.violationType as
                        | typeof REPOSITORY_VIOLATION_TYPES.ORM_TYPE_IN_INTERFACE
                        | typeof REPOSITORY_VIOLATION_TYPES.CONCRETE_REPOSITORY_IN_USE_CASE
                        | typeof REPOSITORY_VIOLATION_TYPES.NEW_REPOSITORY_IN_USE_CASE
                        | typeof REPOSITORY_VIOLATION_TYPES.NON_DOMAIN_METHOD_NAME,
                    file: file.path.relative,
                    layer: violation.layer,
                    line: violation.line,
                    details: violation.details,
                    message: violation.getMessage(),
                    suggestion: violation.getSuggestion(),
                    severity: VIOLATION_SEVERITY_MAP.REPOSITORY_PATTERN,
                })
            }
        }

        return violations
    }

    private detectAggregateBoundaryViolations(
        sourceFiles: SourceFile[],
    ): AggregateBoundaryViolation[] {
        const violations: AggregateBoundaryViolation[] = []

        for (const file of sourceFiles) {
            const boundaryViolations = this.aggregateBoundaryDetector.detectViolations(
                file.content,
                file.path.relative,
                file.layer,
            )

            for (const violation of boundaryViolations) {
                violations.push({
                    rule: RULES.AGGREGATE_BOUNDARY,
                    fromAggregate: violation.fromAggregate,
                    toAggregate: violation.toAggregate,
                    entityName: violation.entityName,
                    importPath: violation.importPath,
                    file: file.path.relative,
                    line: violation.line,
                    message: violation.getMessage(),
                    suggestion: violation.getSuggestion(),
                    severity: VIOLATION_SEVERITY_MAP.AGGREGATE_BOUNDARY,
                })
            }
        }

        return violations
    }

    private calculateMetrics(
        sourceFiles: SourceFile[],
        totalFunctions: number,
        _dependencyGraph: DependencyGraph,
    ): ProjectMetrics {
        const layerDistribution: Record<string, number> = {}
        let totalImports = 0

        for (const file of sourceFiles) {
            if (file.layer) {
                layerDistribution[file.layer] = (layerDistribution[file.layer] || 0) + 1
            }
            totalImports += file.imports.length
        }

        return {
            totalFiles: sourceFiles.length,
            totalFunctions,
            totalImports,
            layerDistribution,
        }
    }

    private sortBySeverity<T extends { severity: SeverityLevel }>(violations: T[]): T[] {
        return violations.sort((a, b) => {
            return SEVERITY_ORDER[a.severity] - SEVERITY_ORDER[b.severity]
        })
    }
}
@@ -0,0 +1,373 @@
import { IHardcodeDetector } from "../../../domain/services/IHardcodeDetector"
import { INamingConventionDetector } from "../../../domain/services/INamingConventionDetector"
import { IFrameworkLeakDetector } from "../../../domain/services/IFrameworkLeakDetector"
import { IEntityExposureDetector } from "../../../domain/services/IEntityExposureDetector"
import { IDependencyDirectionDetector } from "../../../domain/services/IDependencyDirectionDetector"
import { IRepositoryPatternDetector } from "../../../domain/services/RepositoryPatternDetectorService"
import { IAggregateBoundaryDetector } from "../../../domain/services/IAggregateBoundaryDetector"
import { SourceFile } from "../../../domain/entities/SourceFile"
import { DependencyGraph } from "../../../domain/entities/DependencyGraph"
import {
    LAYERS,
    REPOSITORY_VIOLATION_TYPES,
    RULES,
    SEVERITY_ORDER,
    type SeverityLevel,
    VIOLATION_SEVERITY_MAP,
} from "../../../shared/constants"
import type {
    AggregateBoundaryViolation,
    ArchitectureViolation,
    CircularDependencyViolation,
    DependencyDirectionViolation,
    EntityExposureViolation,
    FrameworkLeakViolation,
    HardcodeViolation,
    NamingConventionViolation,
    RepositoryPatternViolation,
} from "../AnalyzeProject"

export interface DetectionRequest {
    sourceFiles: SourceFile[]
    dependencyGraph: DependencyGraph
}

export interface DetectionResult {
    violations: ArchitectureViolation[]
    hardcodeViolations: HardcodeViolation[]
    circularDependencyViolations: CircularDependencyViolation[]
    namingViolations: NamingConventionViolation[]
    frameworkLeakViolations: FrameworkLeakViolation[]
    entityExposureViolations: EntityExposureViolation[]
    dependencyDirectionViolations: DependencyDirectionViolation[]
    repositoryPatternViolations: RepositoryPatternViolation[]
    aggregateBoundaryViolations: AggregateBoundaryViolation[]
}

/**
 * Pipeline step responsible for running all detectors
 */
export class DetectionPipeline {
    constructor(
        private readonly hardcodeDetector: IHardcodeDetector,
        private readonly namingConventionDetector: INamingConventionDetector,
        private readonly frameworkLeakDetector: IFrameworkLeakDetector,
        private readonly entityExposureDetector: IEntityExposureDetector,
        private readonly dependencyDirectionDetector: IDependencyDirectionDetector,
        private readonly repositoryPatternDetector: IRepositoryPatternDetector,
        private readonly aggregateBoundaryDetector: IAggregateBoundaryDetector,
    ) {}

    public execute(request: DetectionRequest): DetectionResult {
        return {
            violations: this.sortBySeverity(this.detectViolations(request.sourceFiles)),
            hardcodeViolations: this.sortBySeverity(this.detectHardcode(request.sourceFiles)),
            circularDependencyViolations: this.sortBySeverity(
                this.detectCircularDependencies(request.dependencyGraph),
            ),
            namingViolations: this.sortBySeverity(
                this.detectNamingConventions(request.sourceFiles),
            ),
            frameworkLeakViolations: this.sortBySeverity(
                this.detectFrameworkLeaks(request.sourceFiles),
            ),
            entityExposureViolations: this.sortBySeverity(
                this.detectEntityExposures(request.sourceFiles),
            ),
            dependencyDirectionViolations: this.sortBySeverity(
                this.detectDependencyDirections(request.sourceFiles),
            ),
            repositoryPatternViolations: this.sortBySeverity(
                this.detectRepositoryPatternViolations(request.sourceFiles),
            ),
            aggregateBoundaryViolations: this.sortBySeverity(
                this.detectAggregateBoundaryViolations(request.sourceFiles),
            ),
        }
    }

    private detectViolations(sourceFiles: SourceFile[]): ArchitectureViolation[] {
        const violations: ArchitectureViolation[] = []

        const layerRules: Record<string, string[]> = {
            [LAYERS.DOMAIN]: [LAYERS.SHARED],
            [LAYERS.APPLICATION]: [LAYERS.DOMAIN, LAYERS.SHARED],
            [LAYERS.INFRASTRUCTURE]: [LAYERS.DOMAIN, LAYERS.APPLICATION, LAYERS.SHARED],
            [LAYERS.SHARED]: [],
        }

        for (const file of sourceFiles) {
            if (!file.layer) {
                continue
            }

            const allowedLayers = layerRules[file.layer]

            for (const imp of file.imports) {
                const importedLayer = this.detectLayerFromImport(imp)

                if (
                    importedLayer &&
                    importedLayer !== file.layer &&
                    !allowedLayers.includes(importedLayer)
                ) {
                    violations.push({
                        rule: RULES.CLEAN_ARCHITECTURE,
                        message: `Layer "${file.layer}" cannot import from "${importedLayer}"`,
                        file: file.path.relative,
                        severity: VIOLATION_SEVERITY_MAP.ARCHITECTURE,
                    })
                }
            }
        }

        return violations
    }

    private detectLayerFromImport(importPath: string): string | undefined {
        const layers = Object.values(LAYERS)

        for (const layer of layers) {
            if (importPath.toLowerCase().includes(layer)) {
                return layer
            }
        }

        return undefined
    }

    private detectHardcode(sourceFiles: SourceFile[]): HardcodeViolation[] {
        const violations: HardcodeViolation[] = []

        for (const file of sourceFiles) {
            const hardcodedValues = this.hardcodeDetector.detectAll(
                file.content,
                file.path.relative,
            )

            for (const hardcoded of hardcodedValues) {
                violations.push({
                    rule: RULES.HARDCODED_VALUE,
                    type: hardcoded.type,
                    value: hardcoded.value,
                    file: file.path.relative,
                    line: hardcoded.line,
                    column: hardcoded.column,
                    context: hardcoded.context,
                    suggestion: {
                        constantName: hardcoded.suggestConstantName(),
                        location: hardcoded.suggestLocation(file.layer),
                    },
                    severity: VIOLATION_SEVERITY_MAP.HARDCODE,
                })
            }
        }

        return violations
    }

    private detectCircularDependencies(
        dependencyGraph: DependencyGraph,
    ): CircularDependencyViolation[] {
        const violations: CircularDependencyViolation[] = []
        const cycles = dependencyGraph.findCycles()

        for (const cycle of cycles) {
            const cycleChain = [...cycle, cycle[0]].join(" → ")
            violations.push({
                rule: RULES.CIRCULAR_DEPENDENCY,
                message: `Circular dependency detected: ${cycleChain}`,
                cycle,
                severity: VIOLATION_SEVERITY_MAP.CIRCULAR_DEPENDENCY,
            })
        }

        return violations
    }

    private detectNamingConventions(sourceFiles: SourceFile[]): NamingConventionViolation[] {
        const violations: NamingConventionViolation[] = []

        for (const file of sourceFiles) {
            const namingViolations = this.namingConventionDetector.detectViolations(
                file.path.filename,
                file.layer,
                file.path.relative,
            )

            for (const violation of namingViolations) {
                violations.push({
                    rule: RULES.NAMING_CONVENTION,
                    type: violation.violationType,
                    fileName: violation.fileName,
                    layer: violation.layer,
                    file: violation.filePath,
                    expected: violation.expected,
                    actual: violation.actual,
                    message: violation.getMessage(),
                    suggestion: violation.suggestion,
                    severity: VIOLATION_SEVERITY_MAP.NAMING_CONVENTION,
                })
            }
        }

        return violations
    }

    private detectFrameworkLeaks(sourceFiles: SourceFile[]): FrameworkLeakViolation[] {
        const violations: FrameworkLeakViolation[] = []

        for (const file of sourceFiles) {
            const leaks = this.frameworkLeakDetector.detectLeaks(
                file.imports,
                file.path.relative,
                file.layer,
            )

            for (const leak of leaks) {
                violations.push({
                    rule: RULES.FRAMEWORK_LEAK,
                    packageName: leak.packageName,
                    category: leak.category,
                    categoryDescription: leak.getCategoryDescription(),
                    file: file.path.relative,
                    layer: leak.layer,
                    line: leak.line,
                    message: leak.getMessage(),
                    suggestion: leak.getSuggestion(),
                    severity: VIOLATION_SEVERITY_MAP.FRAMEWORK_LEAK,
                })
            }
        }

        return violations
    }

    private detectEntityExposures(sourceFiles: SourceFile[]): EntityExposureViolation[] {
        const violations: EntityExposureViolation[] = []

        for (const file of sourceFiles) {
            const exposures = this.entityExposureDetector.detectExposures(
                file.content,
                file.path.relative,
                file.layer,
            )

            for (const exposure of exposures) {
                violations.push({
                    rule: RULES.ENTITY_EXPOSURE,
                    entityName: exposure.entityName,
                    returnType: exposure.returnType,
                    file: file.path.relative,
                    layer: exposure.layer,
                    line: exposure.line,
                    methodName: exposure.methodName,
                    message: exposure.getMessage(),
                    suggestion: exposure.getSuggestion(),
                    severity: VIOLATION_SEVERITY_MAP.ENTITY_EXPOSURE,
                })
            }
        }

        return violations
    }

    private detectDependencyDirections(sourceFiles: SourceFile[]): DependencyDirectionViolation[] {
        const violations: DependencyDirectionViolation[] = []

        for (const file of sourceFiles) {
            const directionViolations = this.dependencyDirectionDetector.detectViolations(
                file.content,
                file.path.relative,
                file.layer,
            )

            for (const violation of directionViolations) {
                violations.push({
                    rule: RULES.DEPENDENCY_DIRECTION,
                    fromLayer: violation.fromLayer,
                    toLayer: violation.toLayer,
                    importPath: violation.importPath,
                    file: file.path.relative,
                    line: violation.line,
                    message: violation.getMessage(),
                    suggestion: violation.getSuggestion(),
                    severity: VIOLATION_SEVERITY_MAP.DEPENDENCY_DIRECTION,
                })
            }
        }

        return violations
    }

    private detectRepositoryPatternViolations(
        sourceFiles: SourceFile[],
    ): RepositoryPatternViolation[] {
        const violations: RepositoryPatternViolation[] = []

        for (const file of sourceFiles) {
            const patternViolations = this.repositoryPatternDetector.detectViolations(
                file.content,
                file.path.relative,
                file.layer,
            )

            for (const violation of patternViolations) {
                violations.push({
                    rule: RULES.REPOSITORY_PATTERN,
                    violationType: violation.violationType as
                        | typeof REPOSITORY_VIOLATION_TYPES.ORM_TYPE_IN_INTERFACE
                        | typeof REPOSITORY_VIOLATION_TYPES.CONCRETE_REPOSITORY_IN_USE_CASE
                        | typeof REPOSITORY_VIOLATION_TYPES.NEW_REPOSITORY_IN_USE_CASE
                        | typeof REPOSITORY_VIOLATION_TYPES.NON_DOMAIN_METHOD_NAME,
                    file: file.path.relative,
                    layer: violation.layer,
                    line: violation.line,
                    details: violation.details,
                    message: violation.getMessage(),
                    suggestion: violation.getSuggestion(),
                    severity: VIOLATION_SEVERITY_MAP.REPOSITORY_PATTERN,
                })
            }
        }

        return violations
    }

    private detectAggregateBoundaryViolations(
        sourceFiles: SourceFile[],
    ): AggregateBoundaryViolation[] {
        const violations: AggregateBoundaryViolation[] = []

        for (const file of sourceFiles) {
            const boundaryViolations = this.aggregateBoundaryDetector.detectViolations(
                file.content,
                file.path.relative,
                file.layer,
            )

            for (const violation of boundaryViolations) {
                violations.push({
                    rule: RULES.AGGREGATE_BOUNDARY,
                    fromAggregate: violation.fromAggregate,
                    toAggregate: violation.toAggregate,
                    entityName: violation.entityName,
                    importPath: violation.importPath,
                    file: file.path.relative,
                    line: violation.line,
                    message: violation.getMessage(),
                    suggestion: violation.getSuggestion(),
                    severity: VIOLATION_SEVERITY_MAP.AGGREGATE_BOUNDARY,
                })
            }
        }

        return violations
    }

    private sortBySeverity<T extends { severity: SeverityLevel }>(violations: T[]): T[] {
        return violations.sort((a, b) => {
            return SEVERITY_ORDER[a.severity] - SEVERITY_ORDER[b.severity]
        })
    }
}
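The class above is a pure orchestrator: it owns no I/O and simply fans the parsed sources out to the injected detectors, returning severity-sorted lists. A minimal usage sketch (the import path and the composition root that supplies the seven detector implementations are assumptions, not part of this diff):

```typescript
import { DetectionPipeline } from "./DetectionPipeline" // assumed path
import type { SourceFile } from "../../../domain/entities/SourceFile"
import type { DependencyGraph } from "../../../domain/entities/DependencyGraph"

declare const pipeline: DetectionPipeline // constructed elsewhere with the 7 detectors

export function countDetectedIssues(
    sourceFiles: SourceFile[],
    dependencyGraph: DependencyGraph,
): number {
    // Every array on the result is already sorted CRITICAL → HIGH → MEDIUM → LOW.
    const result = pipeline.execute({ sourceFiles, dependencyGraph })
    return result.violations.length + result.hardcodeViolations.length
}
```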
@@ -0,0 +1,66 @@
import { IFileScanner } from "../../../domain/services/IFileScanner"
import { SourceFile } from "../../../domain/entities/SourceFile"
import { ProjectPath } from "../../../domain/value-objects/ProjectPath"
import { REGEX_PATTERNS } from "../../../shared/constants"

export interface FileCollectionRequest {
    rootDir: string
    include?: string[]
    exclude?: string[]
}

export interface FileCollectionResult {
    sourceFiles: SourceFile[]
}

/**
 * Pipeline step responsible for file collection and basic parsing
 */
export class FileCollectionStep {
    constructor(private readonly fileScanner: IFileScanner) {}

    public async execute(request: FileCollectionRequest): Promise<FileCollectionResult> {
        const filePaths = await this.fileScanner.scan({
            rootDir: request.rootDir,
            include: request.include,
            exclude: request.exclude,
        })

        const sourceFiles: SourceFile[] = []

        for (const filePath of filePaths) {
            const content = await this.fileScanner.readFile(filePath)
            const projectPath = ProjectPath.create(filePath, request.rootDir)

            const imports = this.extractImports(content)
            const exports = this.extractExports(content)

            const sourceFile = new SourceFile(projectPath, content, imports, exports)
            sourceFiles.push(sourceFile)
        }

        return { sourceFiles }
    }

    private extractImports(content: string): string[] {
        const imports: string[] = []
        let match

        while ((match = REGEX_PATTERNS.IMPORT_STATEMENT.exec(content)) !== null) {
            imports.push(match[1])
        }

        return imports
    }

    private extractExports(content: string): string[] {
        const exports: string[] = []
        let match

        while ((match = REGEX_PATTERNS.EXPORT_STATEMENT.exec(content)) !== null) {
            exports.push(match[1])
        }

        return exports
    }
}
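A short usage sketch for the step above; the concrete IFileScanner implementation, the import paths, and the exclude list are assumptions for illustration only:

```typescript
import { FileCollectionStep } from "./FileCollectionStep" // assumed path
import type { IFileScanner } from "../../../domain/services/IFileScanner"

declare const fileScanner: IFileScanner // e.g. the infrastructure fs/glob scanner

export async function collectSources(rootDir: string) {
    const step = new FileCollectionStep(fileScanner)
    const { sourceFiles } = await step.execute({
        rootDir,
        exclude: ["node_modules", "dist"], // hypothetical excludes
    })
    return sourceFiles
}
```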
@@ -0,0 +1,51 @@
import { ICodeParser } from "../../../domain/services/ICodeParser"
import { SourceFile } from "../../../domain/entities/SourceFile"
import { DependencyGraph } from "../../../domain/entities/DependencyGraph"

export interface ParsingRequest {
    sourceFiles: SourceFile[]
    rootDir: string
}

export interface ParsingResult {
    dependencyGraph: DependencyGraph
    totalFunctions: number
}

/**
 * Pipeline step responsible for AST parsing and dependency graph construction
 */
export class ParsingStep {
    constructor(private readonly codeParser: ICodeParser) {}

    public execute(request: ParsingRequest): ParsingResult {
        const dependencyGraph = new DependencyGraph()
        let totalFunctions = 0

        for (const sourceFile of request.sourceFiles) {
            dependencyGraph.addFile(sourceFile)

            if (sourceFile.path.isTypeScript()) {
                const tree = this.codeParser.parseTypeScript(sourceFile.content)
                const functions = this.codeParser.extractFunctions(tree)
                totalFunctions += functions.length
            }

            for (const imp of sourceFile.imports) {
                dependencyGraph.addDependency(
                    sourceFile.path.relative,
                    this.resolveImportPath(imp, sourceFile.path.relative, request.rootDir),
                )
            }
        }

        return { dependencyGraph, totalFunctions }
    }

    private resolveImportPath(importPath: string, _currentFile: string, _rootDir: string): string {
        if (importPath.startsWith(".")) {
            return importPath
        }
        return importPath
    }
}
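Usage sketch for the parsing step; the ICodeParser implementation and import paths are assumptions, not shown in this diff:

```typescript
import { ParsingStep } from "./ParsingStep" // assumed path
import type { ICodeParser } from "../../../domain/services/ICodeParser"
import type { SourceFile } from "../../../domain/entities/SourceFile"

declare const codeParser: ICodeParser // concrete parser lives in infrastructure

export function buildDependencyGraph(sourceFiles: SourceFile[], rootDir: string) {
    const step = new ParsingStep(codeParser)
    // Returns the populated graph plus the function count that feeds ProjectMetrics.
    return step.execute({ sourceFiles, rootDir })
}
```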
@@ -0,0 +1,81 @@
import { SourceFile } from "../../../domain/entities/SourceFile"
import { DependencyGraph } from "../../../domain/entities/DependencyGraph"
import type {
    AggregateBoundaryViolation,
    AnalyzeProjectResponse,
    ArchitectureViolation,
    CircularDependencyViolation,
    DependencyDirectionViolation,
    EntityExposureViolation,
    FrameworkLeakViolation,
    HardcodeViolation,
    NamingConventionViolation,
    ProjectMetrics,
    RepositoryPatternViolation,
} from "../AnalyzeProject"

export interface AggregationRequest {
    sourceFiles: SourceFile[]
    dependencyGraph: DependencyGraph
    totalFunctions: number
    violations: ArchitectureViolation[]
    hardcodeViolations: HardcodeViolation[]
    circularDependencyViolations: CircularDependencyViolation[]
    namingViolations: NamingConventionViolation[]
    frameworkLeakViolations: FrameworkLeakViolation[]
    entityExposureViolations: EntityExposureViolation[]
    dependencyDirectionViolations: DependencyDirectionViolation[]
    repositoryPatternViolations: RepositoryPatternViolation[]
    aggregateBoundaryViolations: AggregateBoundaryViolation[]
}

/**
 * Pipeline step responsible for building final response DTO
 */
export class ResultAggregator {
    public execute(request: AggregationRequest): AnalyzeProjectResponse {
        const metrics = this.calculateMetrics(
            request.sourceFiles,
            request.totalFunctions,
            request.dependencyGraph,
        )

        return {
            files: request.sourceFiles,
            dependencyGraph: request.dependencyGraph,
            violations: request.violations,
            hardcodeViolations: request.hardcodeViolations,
            circularDependencyViolations: request.circularDependencyViolations,
            namingViolations: request.namingViolations,
            frameworkLeakViolations: request.frameworkLeakViolations,
            entityExposureViolations: request.entityExposureViolations,
            dependencyDirectionViolations: request.dependencyDirectionViolations,
            repositoryPatternViolations: request.repositoryPatternViolations,
            aggregateBoundaryViolations: request.aggregateBoundaryViolations,
            metrics,
        }
    }

    private calculateMetrics(
        sourceFiles: SourceFile[],
        totalFunctions: number,
        _dependencyGraph: DependencyGraph,
    ): ProjectMetrics {
        const layerDistribution: Record<string, number> = {}
        let totalImports = 0

        for (const file of sourceFiles) {
            if (file.layer) {
                layerDistribution[file.layer] = (layerDistribution[file.layer] || 0) + 1
            }
            totalImports += file.imports.length
        }

        return {
            totalFiles: sourceFiles.length,
            totalFunctions,
            totalImports,
            layerDistribution,
        }
    }
}
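Taken together, the four new files suggest how the slimmed-down use-case could chain them: collect files, parse, detect, aggregate. The sketch below is only an assumption about that wiring (the actual AnalyzeProject.ts body is not part of this diff); note that DetectionResult's fields line up with AggregationRequest, so the detection output can be spread straight into the aggregator:

```typescript
import { FileCollectionStep } from "./FileCollectionStep" // assumed paths
import { ParsingStep } from "./ParsingStep"
import { DetectionPipeline } from "./DetectionPipeline"
import { ResultAggregator } from "./ResultAggregator"

declare const fileCollection: FileCollectionStep
declare const parsing: ParsingStep
declare const detection: DetectionPipeline
declare const aggregator: ResultAggregator

export async function analyze(rootDir: string) {
    const { sourceFiles } = await fileCollection.execute({ rootDir })
    const { dependencyGraph, totalFunctions } = parsing.execute({ sourceFiles, rootDir })
    const detected = detection.execute({ sourceFiles, dependencyGraph })

    // Hypothetical composition: violation lists from detection feed the response DTO.
    return aggregator.execute({ sourceFiles, dependencyGraph, totalFunctions, ...detected })
}
```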
@@ -150,4 +150,30 @@ export const CLI_HELP_TEXT = {
    FIX_REPOSITORY:
        " Repository pattern → Create IUserRepository in domain, implement in infra\n\n",
    FOOTER: "Each violation includes a 💡 Suggestion with specific fix instructions.\n",
+   AI_AGENT_HEADER: "AI AGENT INSTRUCTIONS:\n",
+   AI_AGENT_INTRO:
+       " When an AI coding assistant (Claude, Copilot, Cursor, etc.) uses Guardian:\n\n",
+   AI_AGENT_STEP1: " STEP 1: Run initial scan\n",
+   AI_AGENT_STEP1_CMD: " $ guardian check ./src --only-critical --limit 5\n\n",
+   AI_AGENT_STEP2: " STEP 2: For each violation in output:\n",
+   AI_AGENT_STEP2_DETAIL:
+       " - Read the file at reported location (file:line:column)\n" +
+       " - Apply the 💡 Suggestion provided\n" +
+       " - The suggestion contains exact fix instructions\n\n",
+   AI_AGENT_STEP3: " STEP 3: After fixing, verify:\n",
+   AI_AGENT_STEP3_CMD: " $ guardian check ./src --only-critical\n\n",
+   AI_AGENT_STEP4: " STEP 4: Expand scope progressively:\n",
+   AI_AGENT_STEP4_CMDS:
+       " $ guardian check ./src --min-severity high # Fix HIGH issues\n" +
+       " $ guardian check ./src --min-severity medium # Fix MEDIUM issues\n" +
+       " $ guardian check ./src # Full scan\n\n",
+   AI_AGENT_OUTPUT: " OUTPUT FORMAT (parse this):\n",
+   AI_AGENT_OUTPUT_DETAIL:
+       " <index>. <file>:<line>:<column>\n" +
+       " Severity: <emoji> <LEVEL>\n" +
+       " Type: <violation-type>\n" +
+       " Value: <problematic-value>\n" +
+       " Context: <code-snippet>\n" +
+       " 💡 Suggestion: <exact-fix-instruction>\n\n",
+   AI_AGENT_PRIORITY: " PRIORITY ORDER: CRITICAL → HIGH → MEDIUM → LOW\n\n",
} as const
packages/guardian/src/cli/formatters/OutputFormatter.ts (new file, 190 lines)
@@ -0,0 +1,190 @@
import { SEVERITY_LEVELS, type SeverityLevel } from "../../shared/constants"
import type {
    AggregateBoundaryViolation,
    ArchitectureViolation,
    CircularDependencyViolation,
    DependencyDirectionViolation,
    EntityExposureViolation,
    FrameworkLeakViolation,
    HardcodeViolation,
    NamingConventionViolation,
    RepositoryPatternViolation,
} from "../../application/use-cases/AnalyzeProject"
import { SEVERITY_DISPLAY_LABELS, SEVERITY_SECTION_HEADERS } from "../constants"
import { ViolationGrouper } from "../groupers/ViolationGrouper"

const SEVERITY_LABELS: Record<SeverityLevel, string> = {
    [SEVERITY_LEVELS.CRITICAL]: SEVERITY_DISPLAY_LABELS.CRITICAL,
    [SEVERITY_LEVELS.HIGH]: SEVERITY_DISPLAY_LABELS.HIGH,
    [SEVERITY_LEVELS.MEDIUM]: SEVERITY_DISPLAY_LABELS.MEDIUM,
    [SEVERITY_LEVELS.LOW]: SEVERITY_DISPLAY_LABELS.LOW,
}

const SEVERITY_HEADER: Record<SeverityLevel, string> = {
    [SEVERITY_LEVELS.CRITICAL]: SEVERITY_SECTION_HEADERS.CRITICAL,
    [SEVERITY_LEVELS.HIGH]: SEVERITY_SECTION_HEADERS.HIGH,
    [SEVERITY_LEVELS.MEDIUM]: SEVERITY_SECTION_HEADERS.MEDIUM,
    [SEVERITY_LEVELS.LOW]: SEVERITY_SECTION_HEADERS.LOW,
}

export class OutputFormatter {
    private readonly grouper = new ViolationGrouper()

    displayGroupedViolations<T extends { severity: SeverityLevel }>(
        violations: T[],
        displayFn: (v: T, index: number) => void,
        limit?: number,
    ): void {
        const grouped = this.grouper.groupBySeverity(violations)
        const severities: SeverityLevel[] = [
            SEVERITY_LEVELS.CRITICAL,
            SEVERITY_LEVELS.HIGH,
            SEVERITY_LEVELS.MEDIUM,
            SEVERITY_LEVELS.LOW,
        ]

        let totalDisplayed = 0
        const totalAvailable = violations.length

        for (const severity of severities) {
            const items = grouped.get(severity)
            if (items && items.length > 0) {
                console.warn(SEVERITY_HEADER[severity])
                console.warn(`Found ${String(items.length)} issue(s)\n`)

                const itemsToDisplay =
                    limit !== undefined ? items.slice(0, limit - totalDisplayed) : items
                itemsToDisplay.forEach((item, index) => {
                    displayFn(item, totalDisplayed + index)
                })
                totalDisplayed += itemsToDisplay.length

                if (limit !== undefined && totalDisplayed >= limit) {
                    break
                }
            }
        }

        if (limit !== undefined && totalAvailable > limit) {
            console.warn(
                `\n⚠️ Showing first ${String(limit)} of ${String(totalAvailable)} issues (use --limit to adjust)\n`,
            )
        }
    }

    formatArchitectureViolation(v: ArchitectureViolation, index: number): void {
        console.log(`${String(index + 1)}. ${v.file}`)
        console.log(` Severity: ${SEVERITY_LABELS[v.severity]}`)
        console.log(` Rule: ${v.rule}`)
        console.log(` ${v.message}`)
        console.log("")
    }

    formatCircularDependency(cd: CircularDependencyViolation, index: number): void {
        console.log(`${String(index + 1)}. ${cd.message}`)
        console.log(` Severity: ${SEVERITY_LABELS[cd.severity]}`)
        console.log(" Cycle path:")
        cd.cycle.forEach((file, i) => {
            console.log(` ${String(i + 1)}. ${file}`)
        })
        console.log(` ${String(cd.cycle.length + 1)}. ${cd.cycle[0]} (back to start)`)
        console.log("")
    }

    formatNamingViolation(nc: NamingConventionViolation, index: number): void {
        console.log(`${String(index + 1)}. ${nc.file}`)
        console.log(` Severity: ${SEVERITY_LABELS[nc.severity]}`)
        console.log(` File: ${nc.fileName}`)
        console.log(` Layer: ${nc.layer}`)
        console.log(` Type: ${nc.type}`)
        console.log(` Message: ${nc.message}`)
        if (nc.suggestion) {
            console.log(` 💡 Suggestion: ${nc.suggestion}`)
        }
        console.log("")
    }

    formatFrameworkLeak(fl: FrameworkLeakViolation, index: number): void {
        console.log(`${String(index + 1)}. ${fl.file}`)
        console.log(` Severity: ${SEVERITY_LABELS[fl.severity]}`)
        console.log(` Package: ${fl.packageName}`)
        console.log(` Category: ${fl.categoryDescription}`)
        console.log(` Layer: ${fl.layer}`)
        console.log(` Rule: ${fl.rule}`)
        console.log(` ${fl.message}`)
        console.log(` 💡 Suggestion: ${fl.suggestion}`)
        console.log("")
    }

    formatEntityExposure(ee: EntityExposureViolation, index: number): void {
        const location = ee.line ? `${ee.file}:${String(ee.line)}` : ee.file
        console.log(`${String(index + 1)}. ${location}`)
        console.log(` Severity: ${SEVERITY_LABELS[ee.severity]}`)
        console.log(` Entity: ${ee.entityName}`)
        console.log(` Return Type: ${ee.returnType}`)
        if (ee.methodName) {
            console.log(` Method: ${ee.methodName}`)
        }
        console.log(` Layer: ${ee.layer}`)
        console.log(` Rule: ${ee.rule}`)
        console.log(` ${ee.message}`)
        console.log(" 💡 Suggestion:")
        ee.suggestion.split("\n").forEach((line) => {
            if (line.trim()) {
                console.log(` ${line}`)
            }
        })
        console.log("")
    }

    formatDependencyDirection(dd: DependencyDirectionViolation, index: number): void {
        console.log(`${String(index + 1)}. ${dd.file}`)
        console.log(` Severity: ${SEVERITY_LABELS[dd.severity]}`)
        console.log(` From Layer: ${dd.fromLayer}`)
        console.log(` To Layer: ${dd.toLayer}`)
        console.log(` Import: ${dd.importPath}`)
        console.log(` ${dd.message}`)
        console.log(` 💡 Suggestion: ${dd.suggestion}`)
        console.log("")
    }

    formatRepositoryPattern(rp: RepositoryPatternViolation, index: number): void {
        console.log(`${String(index + 1)}. ${rp.file}`)
        console.log(` Severity: ${SEVERITY_LABELS[rp.severity]}`)
        console.log(` Layer: ${rp.layer}`)
        console.log(` Type: ${rp.violationType}`)
        console.log(` Details: ${rp.details}`)
        console.log(` ${rp.message}`)
        console.log(` 💡 Suggestion: ${rp.suggestion}`)
        console.log("")
    }

    formatAggregateBoundary(ab: AggregateBoundaryViolation, index: number): void {
        const location = ab.line ? `${ab.file}:${String(ab.line)}` : ab.file
        console.log(`${String(index + 1)}. ${location}`)
        console.log(` Severity: ${SEVERITY_LABELS[ab.severity]}`)
        console.log(` From Aggregate: ${ab.fromAggregate}`)
        console.log(` To Aggregate: ${ab.toAggregate}`)
        console.log(` Entity: ${ab.entityName}`)
        console.log(` Import: ${ab.importPath}`)
        console.log(` ${ab.message}`)
        console.log(" 💡 Suggestion:")
        ab.suggestion.split("\n").forEach((line) => {
            if (line.trim()) {
                console.log(` ${line}`)
            }
        })
        console.log("")
    }

    formatHardcodeViolation(hc: HardcodeViolation, index: number): void {
        console.log(`${String(index + 1)}. ${hc.file}:${String(hc.line)}:${String(hc.column)}`)
        console.log(` Severity: ${SEVERITY_LABELS[hc.severity]}`)
        console.log(` Type: ${hc.type}`)
        console.log(` Value: ${JSON.stringify(hc.value)}`)
        console.log(` Context: ${hc.context.trim()}`)
        console.log(` 💡 Suggested: ${hc.suggestion.constantName}`)
        console.log(` 📁 Location: ${hc.suggestion.location}`)
        console.log("")
    }
}
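A short usage sketch mirroring how cli/index.ts drives the formatter below in the diff; the violation data and the type import path are placeholders:

```typescript
import { OutputFormatter } from "./formatters/OutputFormatter" // path as imported by cli/index.ts
import type { HardcodeViolation } from "../application/use-cases/AnalyzeProject" // assumed path

declare const hardcodeViolations: HardcodeViolation[]

const formatter = new OutputFormatter()
// Prints per-severity section headers and stops after 5 items across all sections.
formatter.displayGroupedViolations(
    hardcodeViolations,
    (hc, i) => formatter.formatHardcodeViolation(hc, i),
    5,
)
```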
packages/guardian/src/cli/formatters/StatisticsFormatter.ts (new file, 59 lines)
@@ -0,0 +1,59 @@
import { CLI_LABELS, CLI_MESSAGES } from "../constants"

interface ProjectMetrics {
    totalFiles: number
    totalFunctions: number
    totalImports: number
    layerDistribution: Record<string, number>
}

export class StatisticsFormatter {
    displayMetrics(metrics: ProjectMetrics): void {
        console.log(CLI_MESSAGES.METRICS_HEADER)
        console.log(` ${CLI_LABELS.FILES_ANALYZED} ${String(metrics.totalFiles)}`)
        console.log(` ${CLI_LABELS.TOTAL_FUNCTIONS} ${String(metrics.totalFunctions)}`)
        console.log(` ${CLI_LABELS.TOTAL_IMPORTS} ${String(metrics.totalImports)}`)

        if (Object.keys(metrics.layerDistribution).length > 0) {
            console.log(CLI_MESSAGES.LAYER_DISTRIBUTION_HEADER)
            for (const [layer, count] of Object.entries(metrics.layerDistribution)) {
                console.log(` ${layer}: ${String(count)} ${CLI_LABELS.FILES}`)
            }
        }
    }

    displaySummary(totalIssues: number, verbose: boolean): void {
        if (totalIssues === 0) {
            console.log(CLI_MESSAGES.NO_ISSUES)
            process.exit(0)
        } else {
            console.log(
                `${CLI_MESSAGES.ISSUES_TOTAL} ${String(totalIssues)} ${CLI_LABELS.ISSUES_TOTAL}`,
            )
            console.log(CLI_MESSAGES.TIP)

            if (verbose) {
                console.log(CLI_MESSAGES.HELP_FOOTER)
            }

            process.exit(1)
        }
    }

    displaySeverityFilterMessage(onlyCritical: boolean, minSeverity?: string): void {
        if (onlyCritical) {
            console.log("\n🔴 Filtering: Showing only CRITICAL severity issues\n")
        } else if (minSeverity) {
            console.log(
                `\n⚠️ Filtering: Showing ${minSeverity.toUpperCase()} severity and above\n`,
            )
        }
    }

    displayError(message: string): void {
        console.error(`\n❌ ${CLI_MESSAGES.ERROR_PREFIX}`)
        console.error(message)
        console.error("")
        process.exit(1)
    }
}
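Usage sketch with illustrative numbers (the metric values below are made up; the import path follows cli/index.ts):

```typescript
import { StatisticsFormatter } from "./formatters/StatisticsFormatter" // assumed path

const stats = new StatisticsFormatter()
stats.displayMetrics({
    totalFiles: 42, // illustrative numbers only
    totalFunctions: 310,
    totalImports: 128,
    layerDistribution: { domain: 12, application: 9, infrastructure: 15, shared: 6 },
})
// displaySummary() and displayError() call process.exit(), so they must come last.
stats.displaySummary(0, false)
```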
packages/guardian/src/cli/groupers/ViolationGrouper.ts (new file, 29 lines)
@@ -0,0 +1,29 @@
import { SEVERITY_ORDER, type SeverityLevel } from "../../shared/constants"

export class ViolationGrouper {
    groupBySeverity<T extends { severity: SeverityLevel }>(
        violations: T[],
    ): Map<SeverityLevel, T[]> {
        const grouped = new Map<SeverityLevel, T[]>()

        for (const violation of violations) {
            const existing = grouped.get(violation.severity) ?? []
            existing.push(violation)
            grouped.set(violation.severity, existing)
        }

        return grouped
    }

    filterBySeverity<T extends { severity: SeverityLevel }>(
        violations: T[],
        minSeverity?: SeverityLevel,
    ): T[] {
        if (!minSeverity) {
            return violations
        }

        const minSeverityOrder = SEVERITY_ORDER[minSeverity]
        return violations.filter((v) => SEVERITY_ORDER[v.severity] <= minSeverityOrder)
    }
}
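Usage sketch for the grouper; the Issue shape is illustrative, and it assumes CRITICAL has the lowest SEVERITY_ORDER value, which is what the ascending sort in the pipeline implies:

```typescript
import { ViolationGrouper } from "./groupers/ViolationGrouper" // path as imported by cli/index.ts
import { SEVERITY_LEVELS, type SeverityLevel } from "../shared/constants"

interface Issue {
    severity: SeverityLevel
    message: string
}

declare const issues: Issue[]

const grouper = new ViolationGrouper()
// Keep HIGH and more severe, then bucket what remains by severity.
const highAndAbove = grouper.filterBySeverity(issues, SEVERITY_LEVELS.HIGH)
const bySeverity = grouper.groupBySeverity(highAndAbove)
console.log(bySeverity.get(SEVERITY_LEVELS.CRITICAL)?.length ?? 0)
```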
@@ -11,92 +11,11 @@ import {
    CLI_MESSAGES,
    CLI_OPTIONS,
    DEFAULT_EXCLUDES,
-   SEVERITY_DISPLAY_LABELS,
-   SEVERITY_SECTION_HEADERS,
} from "./constants"
-import { SEVERITY_LEVELS, SEVERITY_ORDER, type SeverityLevel } from "../shared/constants"
+import { SEVERITY_LEVELS, type SeverityLevel } from "../shared/constants"
+import { ViolationGrouper } from "./groupers/ViolationGrouper"
+import { OutputFormatter } from "./formatters/OutputFormatter"
+import { StatisticsFormatter } from "./formatters/StatisticsFormatter"
-
-const SEVERITY_LABELS: Record<SeverityLevel, string> = {
-    [SEVERITY_LEVELS.CRITICAL]: SEVERITY_DISPLAY_LABELS.CRITICAL,
-    [SEVERITY_LEVELS.HIGH]: SEVERITY_DISPLAY_LABELS.HIGH,
-    [SEVERITY_LEVELS.MEDIUM]: SEVERITY_DISPLAY_LABELS.MEDIUM,
-    [SEVERITY_LEVELS.LOW]: SEVERITY_DISPLAY_LABELS.LOW,
-}
-
-const SEVERITY_HEADER: Record<SeverityLevel, string> = {
-    [SEVERITY_LEVELS.CRITICAL]: SEVERITY_SECTION_HEADERS.CRITICAL,
-    [SEVERITY_LEVELS.HIGH]: SEVERITY_SECTION_HEADERS.HIGH,
-    [SEVERITY_LEVELS.MEDIUM]: SEVERITY_SECTION_HEADERS.MEDIUM,
-    [SEVERITY_LEVELS.LOW]: SEVERITY_SECTION_HEADERS.LOW,
-}
-
-function groupBySeverity<T extends { severity: SeverityLevel }>(
-    violations: T[],
-): Map<SeverityLevel, T[]> {
-    const grouped = new Map<SeverityLevel, T[]>()
-
-    for (const violation of violations) {
-        const existing = grouped.get(violation.severity) ?? []
-        existing.push(violation)
-        grouped.set(violation.severity, existing)
-    }
-
-    return grouped
-}
-
-function filterBySeverity<T extends { severity: SeverityLevel }>(
-    violations: T[],
-    minSeverity?: SeverityLevel,
-): T[] {
-    if (!minSeverity) {
-        return violations
-    }
-
-    const minSeverityOrder = SEVERITY_ORDER[minSeverity]
-    return violations.filter((v) => SEVERITY_ORDER[v.severity] <= minSeverityOrder)
-}
-
-function displayGroupedViolations<T extends { severity: SeverityLevel }>(
-    violations: T[],
-    displayFn: (v: T, index: number) => void,
-    limit?: number,
-): void {
-    const grouped = groupBySeverity(violations)
-    const severities: SeverityLevel[] = [
-        SEVERITY_LEVELS.CRITICAL,
-        SEVERITY_LEVELS.HIGH,
-        SEVERITY_LEVELS.MEDIUM,
-        SEVERITY_LEVELS.LOW,
-    ]
-
-    let totalDisplayed = 0
-    const totalAvailable = violations.length
-
-    for (const severity of severities) {
-        const items = grouped.get(severity)
-        if (items && items.length > 0) {
-            console.warn(SEVERITY_HEADER[severity])
-            console.warn(`Found ${String(items.length)} issue(s)\n`)
-
-            const itemsToDisplay =
-                limit !== undefined ? items.slice(0, limit - totalDisplayed) : items
-            itemsToDisplay.forEach((item, index) => {
-                displayFn(item, totalDisplayed + index)
-            })
-            totalDisplayed += itemsToDisplay.length
-
-            if (limit !== undefined && totalDisplayed >= limit) {
-                break
-            }
-        }
-    }
-
-    if (limit !== undefined && totalAvailable > limit) {
-        console.warn(
-            `\n⚠️ Showing first ${String(limit)} of ${String(totalAvailable)} issues (use --limit to adjust)\n`,
-        )
-    }
-}
-
const program = new Command()

@@ -122,7 +41,20 @@ program
            CLI_HELP_TEXT.FIX_ENTITY +
            CLI_HELP_TEXT.FIX_DEPENDENCY +
            CLI_HELP_TEXT.FIX_REPOSITORY +
-           CLI_HELP_TEXT.FOOTER,
+           CLI_HELP_TEXT.FOOTER +
+           CLI_HELP_TEXT.AI_AGENT_HEADER +
+           CLI_HELP_TEXT.AI_AGENT_INTRO +
+           CLI_HELP_TEXT.AI_AGENT_STEP1 +
+           CLI_HELP_TEXT.AI_AGENT_STEP1_CMD +
+           CLI_HELP_TEXT.AI_AGENT_STEP2 +
+           CLI_HELP_TEXT.AI_AGENT_STEP2_DETAIL +
+           CLI_HELP_TEXT.AI_AGENT_STEP3 +
+           CLI_HELP_TEXT.AI_AGENT_STEP3_CMD +
+           CLI_HELP_TEXT.AI_AGENT_STEP4 +
+           CLI_HELP_TEXT.AI_AGENT_STEP4_CMDS +
+           CLI_HELP_TEXT.AI_AGENT_OUTPUT +
+           CLI_HELP_TEXT.AI_AGENT_OUTPUT_DETAIL +
+           CLI_HELP_TEXT.AI_AGENT_PRIORITY,
    )

program
@@ -137,6 +69,10 @@ program
    .option(CLI_OPTIONS.ONLY_CRITICAL, CLI_DESCRIPTIONS.ONLY_CRITICAL_OPTION, false)
    .option(CLI_OPTIONS.LIMIT, CLI_DESCRIPTIONS.LIMIT_OPTION)
    .action(async (path: string, options) => {
+       const grouper = new ViolationGrouper()
+       const outputFormatter = new OutputFormatter()
+       const statsFormatter = new StatisticsFormatter()
+
        try {
            console.log(CLI_MESSAGES.ANALYZING)

@@ -169,270 +105,159 @@ program
                : undefined

            if (minSeverity) {
-               violations = filterBySeverity(violations, minSeverity)
-               hardcodeViolations = filterBySeverity(hardcodeViolations, minSeverity)
-               circularDependencyViolations = filterBySeverity(
+               violations = grouper.filterBySeverity(violations, minSeverity)
+               hardcodeViolations = grouper.filterBySeverity(hardcodeViolations, minSeverity)
+               circularDependencyViolations = grouper.filterBySeverity(
                    circularDependencyViolations,
                    minSeverity,
                )
-               namingViolations = filterBySeverity(namingViolations, minSeverity)
-               frameworkLeakViolations = filterBySeverity(frameworkLeakViolations, minSeverity)
-               entityExposureViolations = filterBySeverity(entityExposureViolations, minSeverity)
-               dependencyDirectionViolations = filterBySeverity(
+               namingViolations = grouper.filterBySeverity(namingViolations, minSeverity)
+               frameworkLeakViolations = grouper.filterBySeverity(
+                   frameworkLeakViolations,
+                   minSeverity,
+               )
+               entityExposureViolations = grouper.filterBySeverity(
+                   entityExposureViolations,
+                   minSeverity,
+               )
+               dependencyDirectionViolations = grouper.filterBySeverity(
                    dependencyDirectionViolations,
                    minSeverity,
                )
-               repositoryPatternViolations = filterBySeverity(
+               repositoryPatternViolations = grouper.filterBySeverity(
                    repositoryPatternViolations,
                    minSeverity,
                )
-               aggregateBoundaryViolations = filterBySeverity(
+               aggregateBoundaryViolations = grouper.filterBySeverity(
                    aggregateBoundaryViolations,
                    minSeverity,
                )

-               if (options.onlyCritical) {
-                   console.log("\n🔴 Filtering: Showing only CRITICAL severity issues\n")
-               } else {
-                   console.log(
-                       `\n⚠️ Filtering: Showing ${minSeverity.toUpperCase()} severity and above\n`,
-                   )
-               }
+               statsFormatter.displaySeverityFilterMessage(
+                   options.onlyCritical,
+                   options.minSeverity,
+               )
            }

-           // Display metrics
-           console.log(CLI_MESSAGES.METRICS_HEADER)
-           console.log(` ${CLI_LABELS.FILES_ANALYZED} ${String(metrics.totalFiles)}`)
-           console.log(` ${CLI_LABELS.TOTAL_FUNCTIONS} ${String(metrics.totalFunctions)}`)
-           console.log(` ${CLI_LABELS.TOTAL_IMPORTS} ${String(metrics.totalImports)}`)
-
-           if (Object.keys(metrics.layerDistribution).length > 0) {
-               console.log(CLI_MESSAGES.LAYER_DISTRIBUTION_HEADER)
-               for (const [layer, count] of Object.entries(metrics.layerDistribution)) {
-                   console.log(` ${layer}: ${String(count)} ${CLI_LABELS.FILES}`)
-               }
-           }
-
-           // Architecture violations
+           statsFormatter.displayMetrics(metrics)

            if (options.architecture && violations.length > 0) {
                console.log(
                    `\n${CLI_MESSAGES.VIOLATIONS_HEADER} ${String(violations.length)} ${CLI_LABELS.ARCHITECTURE_VIOLATIONS}`,
                )
-               displayGroupedViolations(
+               outputFormatter.displayGroupedViolations(
                    violations,
-                   (v, index) => {
-                       console.log(`${String(index + 1)}. ${v.file}`)
-                       console.log(` Severity: ${SEVERITY_LABELS[v.severity]}`)
-                       console.log(` Rule: ${v.rule}`)
-                       console.log(` ${v.message}`)
-                       console.log("")
+                   (v, i) => {
+                       outputFormatter.formatArchitectureViolation(v, i)
                    },
                    limit,
                )
            }

-           // Circular dependency violations
            if (options.architecture && circularDependencyViolations.length > 0) {
                console.log(
                    `\n${CLI_MESSAGES.CIRCULAR_DEPS_HEADER} ${String(circularDependencyViolations.length)} ${CLI_LABELS.CIRCULAR_DEPENDENCIES}`,
                )
-               displayGroupedViolations(
+               outputFormatter.displayGroupedViolations(
                    circularDependencyViolations,
-                   (cd, index) => {
-                       console.log(`${String(index + 1)}. ${cd.message}`)
-                       console.log(` Severity: ${SEVERITY_LABELS[cd.severity]}`)
-                       console.log(" Cycle path:")
-                       cd.cycle.forEach((file, i) => {
-                           console.log(` ${String(i + 1)}. ${file}`)
-                       })
-                       console.log(
-                           ` ${String(cd.cycle.length + 1)}. ${cd.cycle[0]} (back to start)`,
-                       )
-                       console.log("")
+                   (cd, i) => {
+                       outputFormatter.formatCircularDependency(cd, i)
                    },
                    limit,
                )
            }

-           // Naming convention violations
            if (options.architecture && namingViolations.length > 0) {
                console.log(
                    `\n${CLI_MESSAGES.NAMING_VIOLATIONS_HEADER} ${String(namingViolations.length)} ${CLI_LABELS.NAMING_VIOLATIONS}`,
                )
-               displayGroupedViolations(
+               outputFormatter.displayGroupedViolations(
                    namingViolations,
-                   (nc, index) => {
-                       console.log(`${String(index + 1)}. ${nc.file}`)
-                       console.log(` Severity: ${SEVERITY_LABELS[nc.severity]}`)
-                       console.log(` File: ${nc.fileName}`)
-                       console.log(` Layer: ${nc.layer}`)
-                       console.log(` Type: ${nc.type}`)
-                       console.log(` Message: ${nc.message}`)
-                       if (nc.suggestion) {
-                           console.log(` 💡 Suggestion: ${nc.suggestion}`)
-                       }
-                       console.log("")
+                   (nc, i) => {
+                       outputFormatter.formatNamingViolation(nc, i)
                    },
                    limit,
                )
            }

-           // Framework leak violations
            if (options.architecture && frameworkLeakViolations.length > 0) {
                console.log(
                    `\n🏗️ Found ${String(frameworkLeakViolations.length)} framework leak(s)`,
                )
-               displayGroupedViolations(
+               outputFormatter.displayGroupedViolations(
                    frameworkLeakViolations,
-                   (fl, index) => {
-                       console.log(`${String(index + 1)}. ${fl.file}`)
-                       console.log(` Severity: ${SEVERITY_LABELS[fl.severity]}`)
-                       console.log(` Package: ${fl.packageName}`)
-                       console.log(` Category: ${fl.categoryDescription}`)
-                       console.log(` Layer: ${fl.layer}`)
-                       console.log(` Rule: ${fl.rule}`)
-                       console.log(` ${fl.message}`)
-                       console.log(` 💡 Suggestion: ${fl.suggestion}`)
-                       console.log("")
+                   (fl, i) => {
+                       outputFormatter.formatFrameworkLeak(fl, i)
                    },
                    limit,
                )
            }

-           // Entity exposure violations
            if (options.architecture && entityExposureViolations.length > 0) {
                console.log(
                    `\n🎭 Found ${String(entityExposureViolations.length)} entity exposure(s)`,
                )
-               displayGroupedViolations(
+               outputFormatter.displayGroupedViolations(
                    entityExposureViolations,
-                   (ee, index) => {
-                       const location = ee.line ? `${ee.file}:${String(ee.line)}` : ee.file
-                       console.log(`${String(index + 1)}. ${location}`)
-                       console.log(` Severity: ${SEVERITY_LABELS[ee.severity]}`)
-                       console.log(` Entity: ${ee.entityName}`)
-                       console.log(` Return Type: ${ee.returnType}`)
-                       if (ee.methodName) {
-                           console.log(` Method: ${ee.methodName}`)
-                       }
-                       console.log(` Layer: ${ee.layer}`)
-                       console.log(` Rule: ${ee.rule}`)
-                       console.log(` ${ee.message}`)
-                       console.log(" 💡 Suggestion:")
-                       ee.suggestion.split("\n").forEach((line) => {
-                           if (line.trim()) {
-                               console.log(` ${line}`)
-                           }
-                       })
-                       console.log("")
+                   (ee, i) => {
+                       outputFormatter.formatEntityExposure(ee, i)
                    },
                    limit,
                )
            }

-           // Dependency direction violations
            if (options.architecture && dependencyDirectionViolations.length > 0) {
                console.log(
                    `\n⚠️ Found ${String(dependencyDirectionViolations.length)} dependency direction violation(s)`,
                )
-               displayGroupedViolations(
+               outputFormatter.displayGroupedViolations(
                    dependencyDirectionViolations,
-                   (dd, index) => {
-                       console.log(`${String(index + 1)}. ${dd.file}`)
-                       console.log(` Severity: ${SEVERITY_LABELS[dd.severity]}`)
-                       console.log(` From Layer: ${dd.fromLayer}`)
-                       console.log(` To Layer: ${dd.toLayer}`)
-                       console.log(` Import: ${dd.importPath}`)
-                       console.log(` ${dd.message}`)
-                       console.log(` 💡 Suggestion: ${dd.suggestion}`)
-                       console.log("")
+                   (dd, i) => {
+                       outputFormatter.formatDependencyDirection(dd, i)
                    },
                    limit,
                )
            }

-           // Repository pattern violations
            if (options.architecture && repositoryPatternViolations.length > 0) {
                console.log(
                    `\n📦 Found ${String(repositoryPatternViolations.length)} repository pattern violation(s)`,
                )
-               displayGroupedViolations(
+               outputFormatter.displayGroupedViolations(
                    repositoryPatternViolations,
-                   (rp, index) => {
-                       console.log(`${String(index + 1)}. ${rp.file}`)
-                       console.log(` Severity: ${SEVERITY_LABELS[rp.severity]}`)
-                       console.log(` Layer: ${rp.layer}`)
-                       console.log(` Type: ${rp.violationType}`)
-                       console.log(` Details: ${rp.details}`)
-                       console.log(` ${rp.message}`)
-                       console.log(` 💡 Suggestion: ${rp.suggestion}`)
-                       console.log("")
+                   (rp, i) => {
+                       outputFormatter.formatRepositoryPattern(rp, i)
                    },
                    limit,
                )
            }

-           // Aggregate boundary violations
            if (options.architecture && aggregateBoundaryViolations.length > 0) {
                console.log(
                    `\n🔒 Found ${String(aggregateBoundaryViolations.length)} aggregate boundary violation(s)`,
                )
-               displayGroupedViolations(
+               outputFormatter.displayGroupedViolations(
                    aggregateBoundaryViolations,
-                   (ab, index) => {
-                       const location = ab.line ? `${ab.file}:${String(ab.line)}` : ab.file
-                       console.log(`${String(index + 1)}. ${location}`)
-                       console.log(` Severity: ${SEVERITY_LABELS[ab.severity]}`)
-                       console.log(` From Aggregate: ${ab.fromAggregate}`)
-                       console.log(` To Aggregate: ${ab.toAggregate}`)
-                       console.log(` Entity: ${ab.entityName}`)
-                       console.log(` Import: ${ab.importPath}`)
-                       console.log(` ${ab.message}`)
-                       console.log(" 💡 Suggestion:")
-                       ab.suggestion.split("\n").forEach((line) => {
-                           if (line.trim()) {
-                               console.log(` ${line}`)
-                           }
-                       })
-                       console.log("")
+                   (ab, i) => {
+                       outputFormatter.formatAggregateBoundary(ab, i)
                    },
                    limit,
                )
            }

-           // Hardcode violations
            if (options.hardcode && hardcodeViolations.length > 0) {
                console.log(
                    `\n${CLI_MESSAGES.HARDCODE_VIOLATIONS_HEADER} ${String(hardcodeViolations.length)} ${CLI_LABELS.HARDCODE_VIOLATIONS}`,
                )
-               displayGroupedViolations(
+               outputFormatter.displayGroupedViolations(
                    hardcodeViolations,
-                   (hc, index) => {
-                       console.log(
-                           `${String(index + 1)}. ${hc.file}:${String(hc.line)}:${String(hc.column)}`,
-                       )
-                       console.log(` Severity: ${SEVERITY_LABELS[hc.severity]}`)
-                       console.log(` Type: ${hc.type}`)
-                       console.log(` Value: ${JSON.stringify(hc.value)}`)
-                       console.log(` Context: ${hc.context.trim()}`)
-                       console.log(` 💡 Suggested: ${hc.suggestion.constantName}`)
-                       console.log(` 📁 Location: ${hc.suggestion.location}`)
-                       console.log("")
+                   (hc, i) => {
+                       outputFormatter.formatHardcodeViolation(hc, i)
                    },
                    limit,
                )
            }

-           // Summary
            const totalIssues =
                violations.length +
                hardcodeViolations.length +
@@ -444,26 +269,9 @@ program
|
|||||||
repositoryPatternViolations.length +
|
repositoryPatternViolations.length +
|
||||||
aggregateBoundaryViolations.length
|
aggregateBoundaryViolations.length
|
||||||
|
|
||||||
if (totalIssues === 0) {
|
statsFormatter.displaySummary(totalIssues, options.verbose)
|
||||||
console.log(CLI_MESSAGES.NO_ISSUES)
|
|
||||||
process.exit(0)
|
|
||||||
} else {
|
|
||||||
console.log(
|
|
||||||
`${CLI_MESSAGES.ISSUES_TOTAL} ${String(totalIssues)} ${CLI_LABELS.ISSUES_TOTAL}`,
|
|
||||||
)
|
|
||||||
console.log(CLI_MESSAGES.TIP)
|
|
||||||
|
|
||||||
if (options.verbose) {
|
|
||||||
console.log(CLI_MESSAGES.HELP_FOOTER)
|
|
||||||
}
|
|
||||||
|
|
||||||
process.exit(1)
|
|
||||||
}
|
|
||||||
} catch (error) {
|
} catch (error) {
|
||||||
console.error(`\n❌ ${CLI_MESSAGES.ERROR_PREFIX}`)
|
statsFormatter.displayError(error instanceof Error ? error.message : String(error))
|
||||||
console.error(error instanceof Error ? error.message : String(error))
|
|
||||||
console.error("")
|
|
||||||
process.exit(1)
|
|
||||||
}
|
}
|
||||||
})
|
})
|
||||||
|
|
||||||
|
|||||||
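The pattern behind this hunk: every per-violation `console.log` block moves into a dedicated formatter method, and the CLI callback shrinks to one delegation call per violation type. Below is a minimal, hedged TypeScript sketch of that shape only; `ExampleFormatter`, the violation fields, and the "... and N more" overflow line are illustrative assumptions, not the package's actual `OutputFormatter`.

```typescript
// Illustrative sketch of the delegation pattern only; ExampleFormatter, the
// violation shape, and the overflow line are assumptions, not the real formatter.
interface DependencyDirectionViolation {
    file: string
    fromLayer: string
    toLayer: string
    importPath: string
    message: string
    suggestion: string
}

class ExampleFormatter {
    // Generic helper: prints up to `limit` items via a per-item callback.
    public displayGroupedViolations<T>(
        items: T[],
        formatOne: (item: T, index: number) => void,
        limit: number,
    ): void {
        items.slice(0, limit).forEach((item, index) => {
            formatOne(item, index)
        })
        if (items.length > limit) {
            console.log(`   ... and ${String(items.length - limit)} more`)
        }
    }

    // One method per violation kind keeps the CLI entry point thin.
    public formatDependencyDirection(dd: DependencyDirectionViolation, index: number): void {
        console.log(`${String(index + 1)}. ${dd.file}`)
        console.log(`   From Layer: ${dd.fromLayer} -> To Layer: ${dd.toLayer}`)
        console.log(`   Import: ${dd.importPath}`)
        console.log(`   ${dd.message}`)
        console.log(`   💡 Suggestion: ${dd.suggestion}`)
        console.log("")
    }
}

// The CLI callback then collapses to a single delegation per violation type.
const formatter = new ExampleFormatter()
formatter.displayGroupedViolations(
    [
        {
            file: "src/domain/services/UserService.ts",
            fromLayer: "domain",
            toLayer: "infrastructure",
            importPath: "../../infrastructure/db/UserRepository",
            message: "Domain layer must not import from infrastructure",
            suggestion: "Depend on a domain port instead of the concrete repository",
        },
    ],
    (dd, i) => {
        formatter.formatDependencyDirection(dd, i)
    },
    5,
)
```
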
@@ -54,6 +54,8 @@ export class AggregateBoundaryDetector implements IAggregateBoundaryDetector {
         DDD_FOLDER_NAMES.REPOSITORIES,
         DDD_FOLDER_NAMES.SERVICES,
         DDD_FOLDER_NAMES.SPECIFICATIONS,
+        DDD_FOLDER_NAMES.ERRORS,
+        DDD_FOLDER_NAMES.EXCEPTIONS,
     ])
     private readonly nonAggregateFolderNames = new Set<string>([
         DDD_FOLDER_NAMES.VALUE_OBJECTS,
@@ -69,6 +71,8 @@ export class AggregateBoundaryDetector implements IAggregateBoundaryDetector {
         DDD_FOLDER_NAMES.FACTORIES,
         DDD_FOLDER_NAMES.PORTS,
         DDD_FOLDER_NAMES.INTERFACES,
+        DDD_FOLDER_NAMES.ERRORS,
+        DDD_FOLDER_NAMES.EXCEPTIONS,
     ])

     /**
@@ -191,6 +195,11 @@ export class AggregateBoundaryDetector implements IAggregateBoundaryDetector {
             return false
         }

+        // Check if import stays within the same bounded context
+        if (this.isInternalBoundedContextImport(normalizedPath)) {
+            return false
+        }
+
         const targetAggregate = this.extractAggregateFromImport(normalizedPath)
         if (!targetAggregate || targetAggregate === currentAggregate) {
             return false
@@ -203,6 +212,36 @@ export class AggregateBoundaryDetector implements IAggregateBoundaryDetector {
         return this.seemsLikeEntityImport(normalizedPath)
     }

+    /**
+     * Checks if the import is internal to the same bounded context
+     *
+     * An import like "../aggregates/Entity" from "repositories/Repo" stays within
+     * the same bounded context (one level up goes to the bounded context root).
+     *
+     * An import like "../../other-context/Entity" crosses bounded context boundaries.
+     */
+    private isInternalBoundedContextImport(normalizedPath: string): boolean {
+        const parts = normalizedPath.split("/")
+        const dotDotCount = parts.filter((p) => p === "..").length
+
+        /*
+         * If only one ".." and path goes into aggregates/entities folder,
+         * it's likely an internal import within the same bounded context
+         */
+        if (dotDotCount === 1) {
+            const nonDotParts = parts.filter((p) => p !== ".." && p !== ".")
+            if (nonDotParts.length >= 1) {
+                const firstFolder = nonDotParts[0]
+                // Importing from aggregates/entities within same bounded context is allowed
+                if (this.entityFolderNames.has(firstFolder)) {
+                    return true
+                }
+            }
+        }
+
+        return false
+    }
+
     /**
      * Checks if the import path is from an allowed folder (value-objects, events, etc.)
      */

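As a standalone, runnable sketch of the heuristic added above: count the ".." segments, and if the import climbs exactly one level and lands in an entity/aggregate folder, treat it as internal to the bounded context. The folder names in `entityFolderNames` here ("aggregates", "entities") are assumptions mirroring `DDD_FOLDER_NAMES`; the detector's own set is not shown in this hunk.

```typescript
// Assumed folder names; the real detector reads these from DDD_FOLDER_NAMES.
const entityFolderNames = new Set(["aggregates", "entities"])

function isInternalBoundedContextImport(normalizedPath: string): boolean {
    const parts = normalizedPath.split("/")
    const dotDotCount = parts.filter((p) => p === "..").length

    // One ".." only climbs to the bounded-context root, so landing in an
    // aggregates/entities folder stays inside the same context.
    if (dotDotCount === 1) {
        const nonDotParts = parts.filter((p) => p !== ".." && p !== ".")
        const firstFolder = nonDotParts[0]
        if (firstFolder !== undefined && entityFolderNames.has(firstFolder)) {
            return true
        }
    }

    return false
}

console.log(isInternalBoundedContextImport("../aggregates/Order"))
// true  -> one level up, straight into an aggregates folder: same bounded context
console.log(isInternalBoundedContextImport("../../billing/aggregates/Invoice"))
// false -> two levels up: crosses into another bounded context
```
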
@@ -26,6 +26,19 @@ export class HardcodeDetector implements IHardcodeDetector {

     private readonly ALLOWED_STRING_PATTERNS = [/^[a-z]$/i, /^\/$/, /^\\$/, /^\s+$/, /^,$/, /^\.$/]
+
+    /**
+     * Patterns to detect TypeScript type contexts where strings should be ignored
+     */
+    private readonly TYPE_CONTEXT_PATTERNS = [
+        /^\s*type\s+\w+\s*=/i, // type Foo = ...
+        /^\s*interface\s+\w+/i, // interface Foo { ... }
+        /^\s*\w+\s*:\s*['"`]/, // property: 'value' (in type or interface)
+        /\s+as\s+['"`]/, // ... as 'type'
+        /Record<.*,\s*import\(/, // Record with import type
+        /typeof\s+\w+\s*===\s*['"`]/, // typeof x === 'string'
+        /['"`]\s*===\s*typeof\s+\w+/, // 'string' === typeof x
+    ]

     /**
      * Detects all hardcoded values (both numbers and strings) in the given code
      *
@@ -43,14 +56,15 @@ export class HardcodeDetector implements IHardcodeDetector {
     }

     /**
-     * Check if a file is a constants definition file
+     * Check if a file is a constants definition file or DI tokens file
      */
     private isConstantsFile(filePath: string): boolean {
         const _fileName = filePath.split("/").pop() ?? ""
         const constantsPatterns = [
             /^constants?\.(ts|js)$/i,
             /constants?\/.*\.(ts|js)$/i,
-            /\/(constants|config|settings|defaults)\.ts$/i,
+            /\/(constants|config|settings|defaults|tokens)\.ts$/i,
+            /\/di\/tokens\.(ts|js)$/i,
         ]
         return constantsPatterns.some((pattern) => pattern.test(filePath))
     }
@@ -341,6 +355,18 @@ export class HardcodeDetector implements IHardcodeDetector {
             return false
         }

+        if (this.isInTypeContext(line)) {
+            return false
+        }
+
+        if (this.isInSymbolCall(line, value)) {
+            return false
+        }
+
+        if (this.isInImportCall(line, value)) {
+            return false
+        }
+
         if (value.includes(DETECTION_KEYWORDS.HTTP) || value.includes(DETECTION_KEYWORDS.API)) {
             return true
         }
@@ -388,4 +414,46 @@ export class HardcodeDetector implements IHardcodeDetector {
         const end = Math.min(line.length, index + 30)
         return line.substring(start, end)
     }
+
+    /**
+     * Check if a line is in a TypeScript type definition context
+     * Examples:
+     * - type Foo = 'a' | 'b'
+     * - interface Bar { prop: 'value' }
+     * - Record<X, import('path')>
+     * - ... as 'type'
+     */
+    private isInTypeContext(line: string): boolean {
+        const trimmedLine = line.trim()
+
+        if (this.TYPE_CONTEXT_PATTERNS.some((pattern) => pattern.test(trimmedLine))) {
+            return true
+        }
+
+        if (trimmedLine.includes("|") && /['"`][^'"`]+['"`]\s*\|/.test(trimmedLine)) {
+            return true
+        }
+
+        return false
+    }
+
+    /**
+     * Check if a string is inside a Symbol() call
+     * Example: Symbol('TOKEN_NAME')
+     */
+    private isInSymbolCall(line: string, stringValue: string): boolean {
+        const symbolPattern = new RegExp(
+            `Symbol\\s*\\(\\s*['"\`]${stringValue.replace(/[.*+?^${}()|[\]\\]/g, "\\$&")}['"\`]\\s*\\)`,
+        )
+        return symbolPattern.test(line)
+    }
+
+    /**
+     * Check if a string is inside an import() call
+     * Example: import('../../path/to/module.js')
+     */
+    private isInImportCall(line: string, stringValue: string): boolean {
+        const importPattern = /import\s*\(\s*['"`][^'"`]+['"`]\s*\)/
+        return importPattern.test(line) && line.includes(stringValue)
+    }
 }

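The subtlest part of the new checks is escaping the candidate string before embedding it in the `Symbol()` regex, so values containing regex metacharacters cannot break the pattern. A self-contained sketch of that check follows; the `escapeRegExp` helper is illustrative and not part of the detector, which inlines the same replace call.

```typescript
// Escape regex metacharacters so the string can be embedded in a RegExp safely.
function escapeRegExp(value: string): string {
    return value.replace(/[.*+?^${}()|[\]\\]/g, "\\$&")
}

// Same idea as the detector's isInSymbolCall: the string only counts as a
// DI token if the line wraps exactly that value in Symbol('...').
function isInSymbolCall(line: string, stringValue: string): boolean {
    const symbolPattern = new RegExp(
        `Symbol\\s*\\(\\s*['"\`]${escapeRegExp(stringValue)}['"\`]\\s*\\)`,
    )
    return symbolPattern.test(line)
}

console.log(isInSymbolCall("export const LOGGER = Symbol('LOGGER')", "LOGGER")) // true
console.log(isInSymbolCall("const name = 'LOGGER'", "LOGGER"))                  // false
```
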
@@ -84,6 +84,8 @@ export const DDD_FOLDER_NAMES = {
     FACTORIES: "factories",
     PORTS: "ports",
     INTERFACES: "interfaces",
+    ERRORS: "errors",
+    EXCEPTIONS: "exceptions",
 } as const

 /**

@@ -468,4 +468,102 @@ const b = 2`
             expect(result[0].context).toContain("5000")
         })
     })

+    describe("TypeScript type contexts (false positive reduction)", () => {
+        it("should NOT detect strings in union types", () => {
+            const code = `type Status = 'active' | 'inactive' | 'pending'`
+            const result = detector.detectMagicStrings(code, "test.ts")
+
+            expect(result).toHaveLength(0)
+        })
+
+        it("should NOT detect strings in interface property types", () => {
+            const code = `interface Config { mode: 'development' | 'production' }`
+            const result = detector.detectMagicStrings(code, "test.ts")
+
+            expect(result).toHaveLength(0)
+        })
+
+        it("should NOT detect strings in type aliases", () => {
+            const code = `type Theme = 'light' | 'dark'`
+            const result = detector.detectMagicStrings(code, "test.ts")
+
+            expect(result).toHaveLength(0)
+        })
+
+        it("should NOT detect strings in type assertions", () => {
+            const code = `const mode = getMode() as 'read' | 'write'`
+            const result = detector.detectMagicStrings(code, "test.ts")
+
+            expect(result).toHaveLength(0)
+        })
+
+        it("should NOT detect strings in Symbol() calls", () => {
+            const code = `const TOKEN = Symbol('MY_TOKEN')`
+            const result = detector.detectMagicStrings(code, "test.ts")
+
+            expect(result).toHaveLength(0)
+        })
+
+        it("should NOT detect strings in multiple Symbol() calls", () => {
+            const code = `
+                export const LOGGER = Symbol('LOGGER')
+                export const DATABASE = Symbol('DATABASE')
+                export const CACHE = Symbol('CACHE')
+            `
+            const result = detector.detectMagicStrings(code, "test.ts")
+
+            expect(result).toHaveLength(0)
+        })
+
+        it("should NOT detect strings in import() calls", () => {
+            const code = `const module = import('../../path/to/module.js')`
+            const result = detector.detectMagicStrings(code, "test.ts")
+
+            expect(result).toHaveLength(0)
+        })
+
+        it("should NOT detect strings in typeof checks", () => {
+            const code = `if (typeof x === 'string') { }`
+            const result = detector.detectMagicStrings(code, "test.ts")
+
+            expect(result).toHaveLength(0)
+        })
+
+        it("should NOT detect strings in reverse typeof checks", () => {
+            const code = `if ('number' === typeof count) { }`
+            const result = detector.detectMagicStrings(code, "test.ts")
+
+            expect(result).toHaveLength(0)
+        })
+
+        it("should skip tokens.ts files completely", () => {
+            const code = `
+                export const LOGGER = Symbol('LOGGER')
+                export const DATABASE = Symbol('DATABASE')
+                const url = "http://localhost:8080"
+            `
+            const result = detector.detectAll(code, "src/di/tokens.ts")
+
+            expect(result).toHaveLength(0)
+        })
+
+        it("should skip tokens.js files completely", () => {
+            const code = `const TOKEN = Symbol('TOKEN')`
+            const result = detector.detectAll(code, "src/di/tokens.js")
+
+            expect(result).toHaveLength(0)
+        })
+
+        it("should detect real magic strings even with type contexts nearby", () => {
+            const code = `
+                type Mode = 'read' | 'write'
+                const apiKey = "secret-key-12345"
+            `
+            const result = detector.detectMagicStrings(code, "test.ts")
+
+            expect(result.length).toBeGreaterThan(0)
+            expect(result.some((r) => r.value === "secret-key-12345")).toBe(true)
+        })
+    })
 })