mirror of
https://github.com/samiyev/puaros.git
synced 2025-12-28 07:16:53 +05:00
Compare commits
27 Commits
| Author | SHA1 | Date | |
|---|---|---|---|
|
|
a6b4c69b75 | ||
|
|
1d6c2a0e00 | ||
|
|
db8a97202e | ||
|
|
0b1cc5a79a | ||
|
|
8d400c9517 | ||
|
|
9fb9beb311 | ||
|
|
5a43fbf116 | ||
|
|
669e764718 | ||
|
|
0b9b8564bf | ||
|
|
0da25d9046 | ||
|
|
7fea9a8fdb | ||
|
|
b5f54fc3f8 | ||
|
|
8a2c6fdc0e | ||
|
|
2479bde9a8 | ||
|
|
f6bb65f2f1 | ||
|
|
8916ce9eab | ||
|
|
24f54d4b57 | ||
|
|
d038f90bd2 | ||
|
|
e79874e420 | ||
|
|
1663d191ee | ||
|
|
7b4cb60f13 | ||
|
|
33d763c41b | ||
|
|
3cd97c6197 | ||
|
|
8dd445995d | ||
|
|
c75738ba51 | ||
|
|
83b5dccee4 | ||
|
|
5a648e2c29 |
@@ -5,6 +5,440 @@ All notable changes to @samiyev/guardian will be documented in this file.
|
|||||||
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
|
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
|
||||||
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
|
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
|
||||||
|
|
||||||
|
## [0.9.0] - 2025-11-26
|
||||||
|
|
||||||
|
### Added
|
||||||
|
|
||||||
|
- 🏛️ **Anemic Model Detection** - NEW feature to detect anemic domain models lacking business logic:
|
||||||
|
- Detects entities with only getters/setters (violates DDD principles)
|
||||||
|
- Identifies classes with public setters (breaks encapsulation)
|
||||||
|
- Analyzes method-to-property ratio to find data-heavy, logic-light classes
|
||||||
|
- Provides detailed suggestions: add business methods, move logic from services, encapsulate invariants
|
||||||
|
- New `AnemicModelDetector` infrastructure component
|
||||||
|
- New `AnemicModelViolation` value object with rich example fixes
|
||||||
|
- New `IAnemicModelDetector` domain interface
|
||||||
|
- Integrated into CLI with detailed violation reports
|
||||||
|
- 12 comprehensive tests for anemic model detection
|
||||||
|
|
||||||
|
- 📦 **New shared constants** - Centralized constants for better code maintainability:
|
||||||
|
- `CLASS_KEYWORDS` - TypeScript class and method keywords (constructor, public, private, protected)
|
||||||
|
- `EXAMPLE_CODE_CONSTANTS` - Documentation example code strings (ORDER_STATUS_PENDING, ORDER_STATUS_APPROVED, CANNOT_APPROVE_ERROR)
|
||||||
|
- `ANEMIC_MODEL_MESSAGES` - 8 suggestion messages for fixing anemic models
|
||||||
|
|
||||||
|
- 📚 **Example files** - Added DDD examples demonstrating anemic vs rich domain models:
|
||||||
|
- `examples/bad/domain/entities/anemic-model-only-getters-setters.ts`
|
||||||
|
- `examples/bad/domain/entities/anemic-model-public-setters.ts`
|
||||||
|
- `examples/good-architecture/domain/entities/Customer.ts`
|
||||||
|
- `examples/good-architecture/domain/entities/Order.ts`
|
||||||
|
|
||||||
|
### Changed
|
||||||
|
|
||||||
|
- ♻️ **Refactored hardcoded values** - Extracted all remaining hardcoded values to centralized constants:
|
||||||
|
- Updated `AnemicModelDetector.ts` to use `CLASS_KEYWORDS` constants
|
||||||
|
- Updated `AnemicModelViolation.ts` to use `EXAMPLE_CODE_CONSTANTS` for example fix strings
|
||||||
|
- Replaced local constants with shared constants from `shared/constants`
|
||||||
|
- Improved code maintainability and consistency
|
||||||
|
|
||||||
|
- 🎯 **Enhanced violation detection pipeline** - Added anemic model detection to `ExecuteDetection.ts`
|
||||||
|
- 📊 **Updated API** - Added anemic model violations to response DTO
|
||||||
|
- 🔧 **CLI improvements** - Added anemic model section to output formatting
|
||||||
|
|
||||||
|
### Quality
|
||||||
|
|
||||||
|
- ✅ **Guardian self-check** - 0 issues (was 5) - 100% clean codebase
|
||||||
|
- ✅ **All tests pass** - 578/578 tests passing (added 12 new tests)
|
||||||
|
- ✅ **Build successful** - TypeScript compilation with no errors
|
||||||
|
- ✅ **Linter clean** - 0 errors, 3 acceptable warnings (complexity, params)
|
||||||
|
- ✅ **Format verified** - All files properly formatted with 4-space indentation
|
||||||
|
|
||||||
|
## [0.8.1] - 2025-11-25
|
||||||
|
|
||||||
|
### Fixed
|
||||||
|
|
||||||
|
- 🧹 **Code quality improvements** - Fixed all 63 hardcoded value issues detected by Guardian self-check:
|
||||||
|
- Fixed 1 CRITICAL: Removed hardcoded Slack token from documentation examples
|
||||||
|
- Fixed 1 HIGH: Removed aws-sdk framework leak from domain layer examples
|
||||||
|
- Fixed 4 MEDIUM: Renamed pipeline files to follow verb-noun convention
|
||||||
|
- Fixed 57 LOW: Extracted all magic strings to reusable constants
|
||||||
|
|
||||||
|
### Added
|
||||||
|
|
||||||
|
- 📦 **New constants file** - `domain/constants/SecretExamples.ts`:
|
||||||
|
- 32 secret keyword constants (AWS, GitHub, NPM, SSH, Slack, etc.)
|
||||||
|
- 15 secret type name constants
|
||||||
|
- 7 example secret values for documentation
|
||||||
|
- Regex patterns and encoding constants
|
||||||
|
|
||||||
|
### Changed
|
||||||
|
|
||||||
|
- ♻️ **Refactored pipeline naming** - Updated use case files to follow naming conventions:
|
||||||
|
- `DetectionPipeline.ts` → `ExecuteDetection.ts`
|
||||||
|
- `FileCollectionStep.ts` → `CollectFiles.ts`
|
||||||
|
- `ParsingStep.ts` → `ParseSourceFiles.ts`
|
||||||
|
- `ResultAggregator.ts` → `AggregateResults.ts`
|
||||||
|
- Added `Aggregate`, `Collect`, `Parse` to `USE_CASE_VERBS` list
|
||||||
|
- 🔧 **Updated 3 core files to use constants**:
|
||||||
|
- `SecretViolation.ts`: All secret examples use constants, `getSeverity()` returns `typeof SEVERITY_LEVELS.CRITICAL`
|
||||||
|
- `SecretDetector.ts`: All secret keywords use constants
|
||||||
|
- `MagicStringMatcher.ts`: Regex patterns extracted to constants
|
||||||
|
- 📝 **Test updates** - Updated 2 tests to match new example fix messages
|
||||||
|
|
||||||
|
### Quality
|
||||||
|
|
||||||
|
- ✅ **Guardian self-check** - 0 issues (was 63) - 100% clean codebase
|
||||||
|
- ✅ **All tests pass** - 566/566 tests passing
|
||||||
|
- ✅ **Build successful** - TypeScript compilation with no errors
|
||||||
|
- ✅ **Linter clean** - 0 errors, 2 acceptable warnings (complexity, params)
|
||||||
|
- ✅ **Format verified** - All files properly formatted with 4-space indentation
|
||||||
|
|
||||||
|
## [0.8.0] - 2025-11-25
|
||||||
|
|
||||||
|
### Added
|
||||||
|
|
||||||
|
- 🔐 **Secret Detection** - NEW CRITICAL security feature using industry-standard Secretlint:
|
||||||
|
- Detects 350+ types of hardcoded secrets (AWS keys, GitHub tokens, NPM tokens, SSH keys, API keys, etc.)
|
||||||
|
- All secrets marked as **CRITICAL severity** for immediate attention
|
||||||
|
- Context-aware remediation suggestions for each secret type
|
||||||
|
- Integrated seamlessly with existing detectors
|
||||||
|
- New `SecretDetector` infrastructure component using `@secretlint/node`
|
||||||
|
- New `SecretViolation` value object with rich examples
|
||||||
|
- New `ISecretDetector` domain interface
|
||||||
|
- CLI output with "🔐 Found X hardcoded secrets - CRITICAL SECURITY RISK" section
|
||||||
|
- Added dependencies: `@secretlint/node`, `@secretlint/core`, `@secretlint/types`, `@secretlint/secretlint-rule-preset-recommend`
|
||||||
|
|
||||||
|
### Changed
|
||||||
|
|
||||||
|
- 🔄 **Pipeline async support** - `DetectionPipeline.execute()` now async for secret detection
|
||||||
|
- 📊 **Test suite expanded** - Added 47 new tests (23 for SecretViolation, 24 for SecretDetector)
|
||||||
|
- Total: 566 tests (was 519), 100% pass rate
|
||||||
|
- Coverage: 93.3% statements, 83.74% branches, 98.17% functions
|
||||||
|
- SecretViolation: 100% coverage
|
||||||
|
- 📝 **Documentation updated**:
|
||||||
|
- README.md: Added Secret Detection section with examples
|
||||||
|
- ROADMAP.md: Marked v0.8.0 as released
|
||||||
|
- Updated package description to mention secrets detection
|
||||||
|
|
||||||
|
### Security
|
||||||
|
|
||||||
|
- 🛡️ **Prevents credentials in version control** - catches AWS, GitHub, NPM, SSH, Slack, GCP secrets before commit
|
||||||
|
- ⚠️ **CRITICAL violations** - all hardcoded secrets immediately flagged with highest severity
|
||||||
|
- 💡 **Smart remediation** - provides specific guidance per secret type (environment variables, secret managers, etc.)
|
||||||
|
|
||||||
|
## [0.7.9] - 2025-11-25
|
||||||
|
|
||||||
|
### Changed
|
||||||
|
|
||||||
|
- ♻️ **Refactored large detectors** - significantly improved maintainability and reduced complexity:
|
||||||
|
- **AggregateBoundaryDetector**: Reduced from 381 to 162 lines (57% reduction)
|
||||||
|
- **HardcodeDetector**: Reduced from 459 to 89 lines (81% reduction)
|
||||||
|
- **RepositoryPatternDetector**: Reduced from 479 to 106 lines (78% reduction)
|
||||||
|
- Extracted 13 focused strategy classes for single responsibilities
|
||||||
|
- All 519 tests pass, no breaking changes
|
||||||
|
- Zero ESLint errors (1 pre-existing warning unrelated to refactoring)
|
||||||
|
- Improved code organization and separation of concerns
|
||||||
|
|
||||||
|
### Added
|
||||||
|
|
||||||
|
- 🏗️ **13 new strategy classes** for focused responsibilities:
|
||||||
|
- `FolderRegistry` - Centralized DDD folder name management
|
||||||
|
- `AggregatePathAnalyzer` - Path parsing and aggregate extraction
|
||||||
|
- `ImportValidator` - Import validation logic
|
||||||
|
- `BraceTracker` - Brace and bracket counting
|
||||||
|
- `ConstantsFileChecker` - Constants file detection
|
||||||
|
- `ExportConstantAnalyzer` - Export const analysis
|
||||||
|
- `MagicNumberMatcher` - Magic number detection
|
||||||
|
- `MagicStringMatcher` - Magic string detection
|
||||||
|
- `OrmTypeMatcher` - ORM type matching
|
||||||
|
- `MethodNameValidator` - Repository method validation
|
||||||
|
- `RepositoryFileAnalyzer` - File role detection
|
||||||
|
- `RepositoryViolationDetector` - Violation detection logic
|
||||||
|
- Enhanced testability with smaller, focused classes
|
||||||
|
|
||||||
|
### Improved
|
||||||
|
|
||||||
|
- 📊 **Code quality metrics**:
|
||||||
|
- Reduced cyclomatic complexity across all three detectors
|
||||||
|
- Better separation of concerns with strategy pattern
|
||||||
|
- More maintainable and extensible codebase
|
||||||
|
- Easier to add new detection patterns
|
||||||
|
- Improved code readability and self-documentation
|
||||||
|
|
||||||
|
## [0.7.8] - 2025-11-25
|
||||||
|
|
||||||
|
### Added
|
||||||
|
|
||||||
|
- 🧪 **Comprehensive E2E test suite** - full pipeline and CLI integration tests:
|
||||||
|
- Added `tests/e2e/AnalyzeProject.e2e.test.ts` - 21 tests for full analysis pipeline
|
||||||
|
- Added `tests/e2e/CLI.e2e.test.ts` - 22 tests for CLI command execution and output
|
||||||
|
- Added `tests/e2e/JSONOutput.e2e.test.ts` - 19 tests for JSON structure validation
|
||||||
|
- Total of 62 new E2E tests covering all major use cases
|
||||||
|
- Tests validate `examples/good-architecture/` returns zero violations
|
||||||
|
- Tests validate `examples/bad/` detects specific violations
|
||||||
|
- CLI smoke tests with process spawning and output verification
|
||||||
|
- JSON serialization and structure validation for all violation types
|
||||||
|
- Total test count increased from 457 to 519 tests
|
||||||
|
- **100% test pass rate achieved** 🎉 (519/519 tests passing)
|
||||||
|
|
||||||
|
### Changed
|
||||||
|
|
||||||
|
- 🔧 **Improved test robustness**:
|
||||||
|
- E2E tests handle exit codes gracefully (CLI exits with non-zero when violations found)
|
||||||
|
- Added helper function `runCLI()` for consistent error handling
|
||||||
|
- Made validation tests conditional for better reliability
|
||||||
|
- Fixed metrics structure assertions to match actual implementation
|
||||||
|
- Enhanced error handling in CLI process spawning tests
|
||||||
|
|
||||||
|
### Fixed
|
||||||
|
|
||||||
|
- 🐛 **Test reliability improvements**:
|
||||||
|
- Fixed CLI tests expecting zero exit codes when violations present
|
||||||
|
- Updated metrics assertions to use correct field names (totalFiles, totalFunctions, totalImports, layerDistribution)
|
||||||
|
- Corrected violation structure property names in E2E tests
|
||||||
|
- Made bad example tests conditional to handle empty results gracefully
|
||||||
|
|
||||||
|
## [0.7.7] - 2025-11-25
|
||||||
|
|
||||||
|
### Added
|
||||||
|
|
||||||
|
- 🧪 **Comprehensive test coverage for under-tested domain files**:
|
||||||
|
- Added 31 tests for `SourceFile.ts` - coverage improved from 46% to 100%
|
||||||
|
- Added 31 tests for `ProjectPath.ts` - coverage improved from 50% to 100%
|
||||||
|
- Added 18 tests for `ValueObject.ts` - coverage improved from 25% to 100%
|
||||||
|
- Added 32 tests for `RepositoryViolation.ts` - coverage improved from 58% to 92.68%
|
||||||
|
- Total test count increased from 345 to 457 tests
|
||||||
|
- Overall coverage improved to 95.4% statements, 86.25% branches, 96.68% functions
|
||||||
|
- All tests pass with no breaking changes
|
||||||
|
|
||||||
|
### Changed
|
||||||
|
|
||||||
|
- 📊 **Improved code quality and maintainability**:
|
||||||
|
- Enhanced test suite for core domain entities and value objects
|
||||||
|
- Better coverage of edge cases and error handling
|
||||||
|
- Increased confidence in domain layer correctness
|
||||||
|
|
||||||
|
## [0.7.6] - 2025-11-25
|
||||||
|
|
||||||
|
### Changed
|
||||||
|
|
||||||
|
- ♻️ **Refactored CLI module** - improved maintainability and separation of concerns:
|
||||||
|
- Split 484-line `cli/index.ts` into focused modules
|
||||||
|
- Created `cli/groupers/ViolationGrouper.ts` for severity grouping and filtering (29 lines)
|
||||||
|
- Created `cli/formatters/OutputFormatter.ts` for violation formatting (190 lines)
|
||||||
|
- Created `cli/formatters/StatisticsFormatter.ts` for metrics and summary (58 lines)
|
||||||
|
- Reduced `cli/index.ts` from 484 to 260 lines (46% reduction)
|
||||||
|
- All 345 tests pass, CLI output identical to before
|
||||||
|
- No breaking changes
|
||||||
|
|
||||||
|
## [0.7.5] - 2025-11-25
|
||||||
|
|
||||||
|
### Changed
|
||||||
|
|
||||||
|
- ♻️ **Refactored AnalyzeProject use-case** - improved maintainability and testability:
|
||||||
|
- Split 615-line God Use-Case into focused pipeline components
|
||||||
|
- Created `FileCollectionStep.ts` for file scanning and basic parsing (66 lines)
|
||||||
|
- Created `ParsingStep.ts` for AST parsing and dependency graph construction (51 lines)
|
||||||
|
- Created `DetectionPipeline.ts` for running all 7 detectors (371 lines)
|
||||||
|
- Created `ResultAggregator.ts` for building response DTO (81 lines)
|
||||||
|
- Reduced `AnalyzeProject.ts` from 615 to 245 lines (60% reduction)
|
||||||
|
- All 345 tests pass, no breaking changes
|
||||||
|
- Improved separation of concerns and single responsibility
|
||||||
|
- Easier to test and modify individual pipeline steps
|
||||||
|
|
||||||
|
### Added
|
||||||
|
|
||||||
|
- 🤖 **AI Agent Instructions in CLI help** - dedicated section for AI coding assistants:
|
||||||
|
- Step-by-step workflow: scan → fix → verify → expand scope
|
||||||
|
- Recommended commands for each step (`--only-critical --limit 5`)
|
||||||
|
- Output format description for easy parsing
|
||||||
|
- Priority order guidance (CRITICAL → HIGH → MEDIUM → LOW)
|
||||||
|
- Helps Claude, Copilot, Cursor, and other AI agents immediately take action
|
||||||
|
|
||||||
|
Run `guardian --help` to see the new "AI AGENT INSTRUCTIONS" section.
|
||||||
|
|
||||||
|
## [0.7.4] - 2025-11-25
|
||||||
|
|
||||||
|
### Fixed
|
||||||
|
|
||||||
|
- 🐛 **TypeScript-aware hardcode detection** - dramatically reduces false positives by 35.7%:
|
||||||
|
- Ignore strings in TypeScript union types (`type Status = 'active' | 'inactive'`)
|
||||||
|
- Ignore strings in interface property types (`interface { mode: 'development' | 'production' }`)
|
||||||
|
- Ignore strings in type assertions (`as 'read' | 'write'`)
|
||||||
|
- Ignore strings in typeof checks (`typeof x === 'string'`)
|
||||||
|
- Ignore strings in Symbol() calls for DI tokens (`Symbol('LOGGER')`)
|
||||||
|
- Ignore strings in dynamic import() calls (`import('../../module.js')`)
|
||||||
|
- Exclude tokens.ts/tokens.js files completely (DI container files)
|
||||||
|
- Tested on real-world TypeScript project: 985 → 633 issues (352 false positives eliminated)
|
||||||
|
- ✅ **Added 13 new tests** for TypeScript type context filtering
|
||||||
|
|
||||||
|
## [0.7.3] - 2025-11-25
|
||||||
|
|
||||||
|
### Fixed
|
||||||
|
|
||||||
|
- 🐛 **False positive: repository importing its own aggregate:**
|
||||||
|
- Added `isInternalBoundedContextImport()` method to detect internal imports
|
||||||
|
- Imports like `../aggregates/Entity` from `repositories/Repo` are now allowed
|
||||||
|
- This correctly allows `ICodeProjectRepository` to import `CodeProject` from the same bounded context
|
||||||
|
- Cross-aggregate imports (with multiple `../..`) are still detected as violations
|
||||||
|
|
||||||
|
## [0.7.2] - 2025-11-25
|
||||||
|
|
||||||
|
### Fixed
|
||||||
|
|
||||||
|
- 🐛 **False positive: `errors` folder detected as aggregate:**
|
||||||
|
- Added `errors` and `exceptions` to `DDD_FOLDER_NAMES` constants
|
||||||
|
- Added to `nonAggregateFolderNames` — these folders are no longer detected as aggregates
|
||||||
|
- Added to `allowedFolderNames` — imports from `errors`/`exceptions` folders are allowed across aggregates
|
||||||
|
- Fixes issue where `domain/code-analysis/errors/` was incorrectly identified as a separate aggregate named "errors"
|
||||||
|
|
||||||
|
## [0.7.1] - 2025-11-25
|
||||||
|
|
||||||
|
### Fixed
|
||||||
|
|
||||||
|
- 🐛 **Aggregate Boundary Detection for relative paths:**
|
||||||
|
- Fixed regex pattern to support paths starting with `domain/` (without leading `/`)
|
||||||
|
- Now correctly detects violations in projects scanned from parent directories
|
||||||
|
|
||||||
|
- 🐛 **Reduced false positives in Repository Pattern detection:**
|
||||||
|
- Removed `findAll`, `exists`, `count` from ORM technical methods blacklist
|
||||||
|
- These are now correctly recognized as valid domain method names
|
||||||
|
- Added `exists`, `count`, `countBy[A-Z]` to domain method patterns
|
||||||
|
|
||||||
|
- 🐛 **Non-aggregate folder exclusions:**
|
||||||
|
- Added exclusions for standard DDD folders: `constants`, `shared`, `factories`, `ports`, `interfaces`
|
||||||
|
- Prevents false positives when domain layer has shared utilities
|
||||||
|
|
||||||
|
### Changed
|
||||||
|
|
||||||
|
- ♻️ **Extracted magic strings to constants:**
|
||||||
|
- DDD folder names (`entities`, `aggregates`, `value-objects`, etc.) moved to `DDD_FOLDER_NAMES`
|
||||||
|
- Repository method suggestions moved to `REPOSITORY_METHOD_SUGGESTIONS`
|
||||||
|
- Fallback suggestions moved to `REPOSITORY_FALLBACK_SUGGESTIONS`
|
||||||
|
|
||||||
|
### Added
|
||||||
|
|
||||||
|
- 📁 **Aggregate boundary test examples:**
|
||||||
|
- Added `examples/aggregate-boundary/domain/` with Order, User, Product aggregates
|
||||||
|
- Demonstrates cross-aggregate entity reference violations
|
||||||
|
|
||||||
|
## [0.7.0] - 2025-11-25
|
||||||
|
|
||||||
|
### Added
|
||||||
|
|
||||||
|
**🔒 Aggregate Boundary Validation**
|
||||||
|
|
||||||
|
New DDD feature to enforce aggregate boundaries and prevent tight coupling between aggregates.
|
||||||
|
|
||||||
|
- ✅ **Aggregate Boundary Detector:**
|
||||||
|
- Detects direct entity references across aggregate boundaries
|
||||||
|
- Validates that aggregates reference each other only by ID or Value Objects
|
||||||
|
- Supports multiple folder structure patterns:
|
||||||
|
- `domain/aggregates/order/Order.ts`
|
||||||
|
- `domain/order/Order.ts`
|
||||||
|
- `domain/entities/order/Order.ts`
|
||||||
|
|
||||||
|
- ✅ **Smart Import Analysis:**
|
||||||
|
- Parses ES6 imports and CommonJS require statements
|
||||||
|
- Identifies entity imports from other aggregates
|
||||||
|
- Allows imports from value-objects, events, services, specifications folders
|
||||||
|
|
||||||
|
- ✅ **Actionable Suggestions:**
|
||||||
|
- Reference by ID instead of entity
|
||||||
|
- Use Value Objects to store needed data from other aggregates
|
||||||
|
- Maintain aggregate independence
|
||||||
|
|
||||||
|
- ✅ **CLI Integration:**
|
||||||
|
- `--architecture` flag includes aggregate boundary checks
|
||||||
|
- CRITICAL severity for violations
|
||||||
|
- Detailed violation messages with file:line references
|
||||||
|
|
||||||
|
- ✅ **Test Coverage:**
|
||||||
|
- 41 new tests for aggregate boundary detection
|
||||||
|
- 333 total tests passing (100% pass rate)
|
||||||
|
- Examples in `examples/aggregate-boundary/`
|
||||||
|
|
||||||
|
### Technical
|
||||||
|
|
||||||
|
- New `AggregateBoundaryDetector` in infrastructure layer
|
||||||
|
- New `AggregateBoundaryViolation` value object in domain layer
|
||||||
|
- New `IAggregateBoundaryDetector` interface for dependency inversion
|
||||||
|
- Integrated into `AnalyzeProject` use case
|
||||||
|
|
||||||
|
## [0.6.4] - 2025-11-24
|
||||||
|
|
||||||
|
### Added
|
||||||
|
|
||||||
|
**🎯 Smart Context-Aware Suggestions for Repository Method Names**
|
||||||
|
|
||||||
|
Guardian now provides intelligent, context-specific suggestions when it detects non-domain method names in repositories.
|
||||||
|
|
||||||
|
- ✅ **Intelligent method name analysis:**
|
||||||
|
- `queryUsers()` → Suggests: `search`, `findBy[Property]`
|
||||||
|
- `selectById()` → Suggests: `findBy[Property]`, `get[Entity]`
|
||||||
|
- `insertUser()` → Suggests: `create`, `add[Entity]`, `store[Entity]`
|
||||||
|
- `updateRecord()` → Suggests: `update`, `modify[Entity]`
|
||||||
|
- `upsertUser()` → Suggests: `save`, `store[Entity]`
|
||||||
|
- `removeUser()` → Suggests: `delete`, `removeBy[Property]`
|
||||||
|
- `fetchUserData()` → Suggests: `findBy[Property]`, `get[Entity]`
|
||||||
|
- And more technical patterns detected automatically!
|
||||||
|
|
||||||
|
- 🎯 **Impact:**
|
||||||
|
- Developers get actionable, relevant suggestions instead of generic examples
|
||||||
|
- Faster refactoring with specific naming alternatives
|
||||||
|
- Better learning experience for developers new to DDD
|
||||||
|
|
||||||
|
### Fixed
|
||||||
|
|
||||||
|
- ✅ **Expanded domain method patterns support:**
|
||||||
|
- `find*()` methods - e.g., `findNodes()`, `findNodeById()`, `findSimilar()`
|
||||||
|
- `saveAll()` - batch save operations
|
||||||
|
- `deleteBy*()` methods - e.g., `deleteByPath()`, `deleteById()`
|
||||||
|
- `deleteAll()` - clear all entities
|
||||||
|
- `add*()` methods - e.g., `addRelationship()`, `addItem()`
|
||||||
|
- `initializeCollection()` - collection initialization
|
||||||
|
|
||||||
|
- 🐛 **Removed `findAll` from technical methods blacklist:**
|
||||||
|
- `findAll()` is now correctly recognized as a standard domain method
|
||||||
|
- Reduced false positives for repositories using this common pattern
|
||||||
|
|
||||||
|
### Technical
|
||||||
|
|
||||||
|
- Added `suggestDomainMethodName()` method in `RepositoryPatternDetector.ts` with keyword-based suggestion mapping
|
||||||
|
- Updated `getNonDomainMethodSuggestion()` in `RepositoryViolation.ts` to extract and use smart suggestions
|
||||||
|
- Refactored suggestion logic to reduce cyclomatic complexity (22 → 9)
|
||||||
|
- Enhanced `domainMethodPatterns` with 9 additional patterns
|
||||||
|
- All 333 tests passing
|
||||||
|
|
||||||
|
## [0.6.3] - 2025-11-24
|
||||||
|
|
||||||
|
### Fixed
|
||||||
|
|
||||||
|
**🐛 Repository Pattern Detection - Reduced False Positives**
|
||||||
|
|
||||||
|
Fixed overly strict repository method name validation that was flagging valid DDD patterns as violations.
|
||||||
|
|
||||||
|
- ✅ **Added support for common DDD repository patterns:**
|
||||||
|
- `has*()` methods - e.g., `hasProject()`, `hasPermission()`
|
||||||
|
- `is*()` methods - e.g., `isCached()`, `isActive()`
|
||||||
|
- `exists*()` methods - e.g., `existsById()`, `existsByEmail()`
|
||||||
|
- `clear*()` methods - e.g., `clearCache()`, `clearAll()`
|
||||||
|
- `store*()` methods - e.g., `storeMetadata()`, `storeFile()`
|
||||||
|
- Lifecycle methods: `initialize()`, `close()`, `connect()`, `disconnect()`
|
||||||
|
|
||||||
|
- 🎯 **Impact:**
|
||||||
|
- Reduced false positives in real-world DDD projects
|
||||||
|
- Better alignment with Domain-Driven Design best practices
|
||||||
|
- More practical for cache repositories, connection management, and business queries
|
||||||
|
|
||||||
|
- 📚 **Why these patterns are valid:**
|
||||||
|
- Martin Fowler's Repository Pattern allows domain-specific query methods
|
||||||
|
- DDD recommends using ubiquitous language in method names
|
||||||
|
- Lifecycle methods are standard for resource management in repositories
|
||||||
|
|
||||||
|
### Technical
|
||||||
|
|
||||||
|
- Updated `domainMethodPatterns` in `RepositoryPatternDetector.ts` with 11 additional valid patterns
|
||||||
|
- All existing functionality remains unchanged
|
||||||
|
|
||||||
## [0.6.2] - 2025-11-24
|
## [0.6.2] - 2025-11-24
|
||||||
|
|
||||||
### Added
|
### Added
|
||||||
|
|||||||
895
packages/guardian/COMPARISON.md
Normal file
895
packages/guardian/COMPARISON.md
Normal file
@@ -0,0 +1,895 @@
|
|||||||
|
# Guardian vs Competitors: Comprehensive Comparison 🔍
|
||||||
|
|
||||||
|
**Last Updated:** 2025-01-24
|
||||||
|
|
||||||
|
This document provides an in-depth comparison of Guardian against major competitors in the static analysis and architecture enforcement space.
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 🎯 TL;DR - When to Use Each Tool
|
||||||
|
|
||||||
|
| Your Need | Recommended Tool | Why |
|
||||||
|
|-----------|------------------|-----|
|
||||||
|
| **TypeScript + AI coding + DDD** | ✅ **Guardian** | Only tool built for AI-assisted DDD development |
|
||||||
|
| **Multi-language + Security** | SonarQube | 35+ languages, deep security scanning |
|
||||||
|
| **Dependency visualization** | dependency-cruiser + Guardian | Best visualization + architecture rules |
|
||||||
|
| **Java architecture** | ArchUnit | Java-specific with unit test integration |
|
||||||
|
| **TypeScript complexity metrics** | FTA + Guardian | Fast metrics + architecture enforcement |
|
||||||
|
| **Python architecture** | import-linter + Guardian (future) | Python layer enforcement |
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 📊 Feature Comparison Matrix
|
||||||
|
|
||||||
|
### Core Capabilities
|
||||||
|
|
||||||
|
| Feature | Guardian | SonarQube | dependency-cruiser | ArchUnit | FTA | ESLint |
|
||||||
|
|---------|----------|-----------|-------------------|----------|-----|--------|
|
||||||
|
| **Languages** | JS/TS | 35+ | JS/TS/Vue | Java | TS/JS | JS/TS |
|
||||||
|
| **Setup Complexity** | ⚡ Simple | 🐌 Complex | ⚡ Simple | ⚙️ Medium | ⚡ Simple | ⚡ Simple |
|
||||||
|
| **Price** | 🆓 Free | 💰 Freemium | 🆓 Free | 🆓 Free | 🆓 Free | 🆓 Free |
|
||||||
|
| **GitHub Stars** | - | - | 6.2k | 3.1k | - | 24k+ |
|
||||||
|
|
||||||
|
### Detection Capabilities
|
||||||
|
|
||||||
|
| Feature | Guardian | SonarQube | dependency-cruiser | ArchUnit | FTA | ESLint |
|
||||||
|
|---------|----------|-----------|-------------------|----------|-----|--------|
|
||||||
|
| **Hardcode Detection** | ✅✅ (with AI tips) | ⚠️ (secrets only) | ❌ | ❌ | ❌ | ❌ |
|
||||||
|
| **Circular Dependencies** | ✅ | ✅ | ✅✅ (visual) | ✅ | ❌ | ✅ |
|
||||||
|
| **Architecture Layers** | ✅✅ (DDD/Clean) | ⚠️ (generic) | ✅ (via rules) | ✅✅ | ❌ | ⚠️ |
|
||||||
|
| **Framework Leak** | ✅✅ UNIQUE | ❌ | ⚠️ (via rules) | ⚠️ | ❌ | ❌ |
|
||||||
|
| **Entity Exposure** | ✅✅ UNIQUE | ❌ | ❌ | ❌ | ❌ | ❌ |
|
||||||
|
| **Naming Conventions** | ✅ (DDD-specific) | ✅ (generic) | ❌ | ✅ | ❌ | ✅ |
|
||||||
|
| **Repository Pattern** | ✅✅ UNIQUE | ❌ | ❌ | ⚠️ | ❌ | ❌ |
|
||||||
|
| **Dependency Direction** | ✅✅ | ❌ | ✅ (via rules) | ✅ | ❌ | ❌ |
|
||||||
|
| **Security (SAST)** | ❌ | ✅✅ | ❌ | ❌ | ❌ | ⚠️ |
|
||||||
|
| **Dependency Risks (SCA)** | ❌ | ✅✅ | ❌ | ❌ | ❌ | ❌ |
|
||||||
|
| **Complexity Metrics** | ❌ | ✅ | ❌ | ❌ | ✅✅ | ⚠️ |
|
||||||
|
| **Code Duplication** | ❌ | ✅✅ | ❌ | ❌ | ❌ | ❌ |
|
||||||
|
|
||||||
|
### Developer Experience
|
||||||
|
|
||||||
|
| Feature | Guardian | SonarQube | dependency-cruiser | ArchUnit | FTA | ESLint |
|
||||||
|
|---------|----------|-----------|-------------------|----------|-----|--------|
|
||||||
|
| **CLI** | ✅ | ✅ | ✅ | ❌ (lib) | ✅ | ✅ |
|
||||||
|
| **Configuration** | ✅ (v0.6+) | ✅✅ | ✅ | ✅ | ⚠️ | ✅✅ |
|
||||||
|
| **Visualization** | ✅ (v0.7+) | ✅✅ (dashboard) | ✅✅ (graphs) | ❌ | ⚠️ | ❌ |
|
||||||
|
| **Auto-Fix** | ✅✅ (v0.9+) UNIQUE | ❌ | ❌ | ❌ | ❌ | ✅ |
|
||||||
|
| **AI Workflow** | ✅✅ UNIQUE | ❌ | ❌ | ❌ | ❌ | ❌ |
|
||||||
|
| **CI/CD Integration** | ✅ (v0.8+) | ✅✅ | ✅ | ✅ | ⚠️ | ✅✅ |
|
||||||
|
| **IDE Extensions** | 🔜 (v1.0+) | ✅ | ❌ | ❌ | ⚠️ | ✅✅ |
|
||||||
|
| **Metrics Dashboard** | ✅ (v0.10+) | ✅✅ | ⚠️ | ❌ | ✅ | ❌ |
|
||||||
|
|
||||||
|
**Legend:**
|
||||||
|
- ✅✅ = Excellent support
|
||||||
|
- ✅ = Good support
|
||||||
|
- ⚠️ = Limited/partial support
|
||||||
|
- ❌ = Not available
|
||||||
|
- 🔜 = Planned/Coming soon
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 🔥 Guardian's Unique Advantages
|
||||||
|
|
||||||
|
Guardian has **7 unique features** that no competitor offers:
|
||||||
|
|
||||||
|
### 1. ✨ Hardcode Detection with AI Suggestions
|
||||||
|
|
||||||
|
**Guardian:**
|
||||||
|
```typescript
|
||||||
|
// Detected:
|
||||||
|
app.listen(3000)
|
||||||
|
|
||||||
|
// Suggestion:
|
||||||
|
💡 Extract to: DEFAULT_PORT
|
||||||
|
📁 Location: infrastructure/config/constants.ts
|
||||||
|
🤖 AI Prompt: "Extract port 3000 to DEFAULT_PORT constant in config"
|
||||||
|
```
|
||||||
|
|
||||||
|
**Competitors:**
|
||||||
|
- SonarQube: Only detects hardcoded secrets (API keys), not magic numbers
|
||||||
|
- Others: No hardcode detection at all
|
||||||
|
|
||||||
|
### 2. 🔌 Framework Leak Detection
|
||||||
|
|
||||||
|
**Guardian:**
|
||||||
|
```typescript
|
||||||
|
// domain/entities/User.ts
|
||||||
|
import { Request } from 'express' // ❌ VIOLATION!
|
||||||
|
|
||||||
|
// Detected: Framework leak in domain layer
|
||||||
|
// Suggestion: Use dependency injection via interfaces
|
||||||
|
```
|
||||||
|
|
||||||
|
**Competitors:**
|
||||||
|
- ArchUnit: Can check via custom rules (not built-in)
|
||||||
|
- Others: Not available
|
||||||
|
|
||||||
|
### 3. 🎭 Entity Exposure Detection
|
||||||
|
|
||||||
|
**Guardian:**
|
||||||
|
```typescript
|
||||||
|
// ❌ Bad: Domain entity exposed
|
||||||
|
async getUser(): Promise<User> { }
|
||||||
|
|
||||||
|
// ✅ Good: Use DTOs
|
||||||
|
async getUser(): Promise<UserDto> { }
|
||||||
|
|
||||||
|
// Guardian detects this automatically!
|
||||||
|
```
|
||||||
|
|
||||||
|
**Competitors:**
|
||||||
|
- None have this built-in
|
||||||
|
|
||||||
|
### 4. 📚 Repository Pattern Validation
|
||||||
|
|
||||||
|
**Guardian:**
|
||||||
|
```typescript
|
||||||
|
// Detects ORM types in domain interfaces:
|
||||||
|
interface IUserRepository {
|
||||||
|
findOne(query: { where: ... }) // ❌ Prisma-specific!
|
||||||
|
}
|
||||||
|
|
||||||
|
// Detects concrete repos in use cases:
|
||||||
|
constructor(private prisma: PrismaClient) {} // ❌ VIOLATION!
|
||||||
|
```
|
||||||
|
|
||||||
|
**Competitors:**
|
||||||
|
- None validate repository pattern
|
||||||
|
|
||||||
|
### 5. 🤖 AI-First Workflow
|
||||||
|
|
||||||
|
**Guardian:**
|
||||||
|
```bash
|
||||||
|
# Generate AI-friendly fix prompt
|
||||||
|
guardian check ./src --format ai-prompt > fix.txt
|
||||||
|
|
||||||
|
# Feed to Claude/GPT:
|
||||||
|
"Fix these Guardian violations: $(cat fix.txt)"
|
||||||
|
|
||||||
|
# AI fixes → Run Guardian again → Ship it!
|
||||||
|
```
|
||||||
|
|
||||||
|
**Competitors:**
|
||||||
|
- Generic output, not optimized for AI assistants
|
||||||
|
|
||||||
|
### 6. 🛠️ Auto-Fix for Architecture (v0.9+)
|
||||||
|
|
||||||
|
**Guardian:**
|
||||||
|
```bash
|
||||||
|
# Automatically extract hardcodes to constants
|
||||||
|
guardian fix ./src --auto
|
||||||
|
|
||||||
|
# Rename files to match conventions
|
||||||
|
guardian fix naming ./src --auto
|
||||||
|
|
||||||
|
# Interactive mode
|
||||||
|
guardian fix ./src --interactive
|
||||||
|
```
|
||||||
|
|
||||||
|
**Competitors:**
|
||||||
|
- ESLint has `--fix` but only for syntax
|
||||||
|
- None fix architecture violations
|
||||||
|
|
||||||
|
### 7. 🎯 DDD Pattern Detection (30+)
|
||||||
|
|
||||||
|
**Guardian Roadmap:**
|
||||||
|
- Aggregate boundaries
|
||||||
|
- Anemic domain model
|
||||||
|
- Domain events
|
||||||
|
- Value Object immutability
|
||||||
|
- CQRS violations
|
||||||
|
- Saga pattern
|
||||||
|
- Ubiquitous language
|
||||||
|
- And 23+ more DDD patterns!
|
||||||
|
|
||||||
|
**Competitors:**
|
||||||
|
- Generic architecture checks only
|
||||||
|
- No DDD-specific patterns
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 📈 Detailed Tool Comparisons
|
||||||
|
|
||||||
|
## vs SonarQube
|
||||||
|
|
||||||
|
### When SonarQube Wins
|
||||||
|
|
||||||
|
✅ **Multi-language projects**
|
||||||
|
```
|
||||||
|
Java + Python + TypeScript → Use SonarQube
|
||||||
|
TypeScript only → Consider Guardian
|
||||||
|
```
|
||||||
|
|
||||||
|
✅ **Security-critical applications**
|
||||||
|
```
|
||||||
|
SonarQube: SAST, SCA, OWASP Top 10, CVE detection
|
||||||
|
Guardian: Architecture only (security coming later)
|
||||||
|
```
|
||||||
|
|
||||||
|
✅ **Large enterprise with compliance**
|
||||||
|
```
|
||||||
|
SonarQube: Compliance reports, audit trails, enterprise support
|
||||||
|
Guardian: Lightweight, developer-focused
|
||||||
|
```
|
||||||
|
|
||||||
|
✅ **Existing SonarQube investment**
|
||||||
|
```
|
||||||
|
Already using SonarQube? Add Guardian for DDD-specific checks
|
||||||
|
```
|
||||||
|
|
||||||
|
### When Guardian Wins
|
||||||
|
|
||||||
|
✅ **TypeScript + AI coding workflow**
|
||||||
|
```typescript
|
||||||
|
// AI generates code → Guardian checks → AI fixes → Ship
|
||||||
|
// 10x faster than manual review
|
||||||
|
```
|
||||||
|
|
||||||
|
✅ **Clean Architecture / DDD enforcement**
|
||||||
|
```typescript
|
||||||
|
// Guardian understands DDD out-of-the-box
|
||||||
|
// SonarQube requires custom rules
|
||||||
|
```
|
||||||
|
|
||||||
|
✅ **Fast setup (< 5 minutes)**
|
||||||
|
```bash
|
||||||
|
npm install -g @samiyev/guardian
|
||||||
|
guardian check ./src
|
||||||
|
# Done! (vs hours of SonarQube setup)
|
||||||
|
```
|
||||||
|
|
||||||
|
✅ **Hardcode detection with context**
|
||||||
|
```typescript
|
||||||
|
// Guardian knows the difference between:
|
||||||
|
const port = 3000 // ❌ Should be constant
|
||||||
|
const increment = 1 // ✅ Allowed (semantic)
|
||||||
|
```
|
||||||
|
|
||||||
|
### Side-by-Side Example
|
||||||
|
|
||||||
|
**Scenario:** Detect hardcoded port in Express app
|
||||||
|
|
||||||
|
```typescript
|
||||||
|
// src/server.ts
|
||||||
|
app.listen(3000)
|
||||||
|
```
|
||||||
|
|
||||||
|
**SonarQube:**
|
||||||
|
```
|
||||||
|
❌ No violation (not a secret)
|
||||||
|
```
|
||||||
|
|
||||||
|
**Guardian:**
|
||||||
|
```
|
||||||
|
✅ Hardcode detected:
|
||||||
|
Type: magic-number
|
||||||
|
Value: 3000
|
||||||
|
💡 Suggested: DEFAULT_PORT
|
||||||
|
📁 Location: infrastructure/config/constants.ts
|
||||||
|
🤖 AI Fix: "Extract 3000 to DEFAULT_PORT constant"
|
||||||
|
```
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## vs dependency-cruiser
|
||||||
|
|
||||||
|
### When dependency-cruiser Wins
|
||||||
|
|
||||||
|
✅ **Visualization priority**
|
||||||
|
```bash
|
||||||
|
# Best-in-class dependency graphs
|
||||||
|
depcruise src --output-type dot | dot -T svg > graph.svg
|
||||||
|
```
|
||||||
|
|
||||||
|
✅ **Custom dependency rules**
|
||||||
|
```javascript
|
||||||
|
// Highly flexible rule system
|
||||||
|
forbidden: [
|
||||||
|
{
|
||||||
|
from: { path: '^src/domain' },
|
||||||
|
to: { path: '^src/infrastructure' }
|
||||||
|
}
|
||||||
|
]
|
||||||
|
```
|
||||||
|
|
||||||
|
✅ **Multi-framework support**
|
||||||
|
```
|
||||||
|
JS, TS, Vue, Svelte, JSX, CoffeeScript
|
||||||
|
```
|
||||||
|
|
||||||
|
### When Guardian Wins
|
||||||
|
|
||||||
|
✅ **DDD/Clean Architecture out-of-the-box**
|
||||||
|
```typescript
|
||||||
|
// Guardian knows these patterns:
|
||||||
|
// - Domain/Application/Infrastructure layers
|
||||||
|
// - Entity exposure
|
||||||
|
// - Repository pattern
|
||||||
|
// - Framework leaks
|
||||||
|
|
||||||
|
// dependency-cruiser: Write custom rules for each
|
||||||
|
```
|
||||||
|
|
||||||
|
✅ **Hardcode detection**
|
||||||
|
```typescript
|
||||||
|
// Guardian finds:
|
||||||
|
setTimeout(() => {}, 5000) // Magic number
|
||||||
|
const url = "http://..." // Magic string
|
||||||
|
|
||||||
|
// dependency-cruiser: Doesn't check this
|
||||||
|
```
|
||||||
|
|
||||||
|
✅ **AI workflow integration**
|
||||||
|
```bash
|
||||||
|
guardian check ./src --format ai-prompt
|
||||||
|
# Optimized for Claude/GPT
|
||||||
|
|
||||||
|
depcruise src
|
||||||
|
# Generic output
|
||||||
|
```
|
||||||
|
|
||||||
|
### Complementary Usage
|
||||||
|
|
||||||
|
**Best approach:** Use both!
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Guardian for architecture + hardcode
|
||||||
|
guardian check ./src
|
||||||
|
|
||||||
|
# dependency-cruiser for visualization
|
||||||
|
depcruise src --output-type svg > architecture.svg
|
||||||
|
```
|
||||||
|
|
||||||
|
**Coming in Guardian v0.7.0:**
|
||||||
|
```bash
|
||||||
|
# Guardian will have built-in visualization!
|
||||||
|
guardian visualize ./src --output architecture.svg
|
||||||
|
```
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## vs ArchUnit (Java)
|
||||||
|
|
||||||
|
### When ArchUnit Wins
|
||||||
|
|
||||||
|
✅ **Java projects**
|
||||||
|
```java
|
||||||
|
// ArchUnit is built for Java
|
||||||
|
@ArchTest
|
||||||
|
void domainShouldNotDependOnInfrastructure(JavaClasses classes) {
|
||||||
|
noClasses().that().resideInPackage("..domain..")
|
||||||
|
.should().dependOnClassesThat().resideInPackage("..infrastructure..")
|
||||||
|
.check(classes);
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
✅ **Test-based architecture validation**
|
||||||
|
```java
|
||||||
|
// Architecture rules = unit tests
|
||||||
|
// Runs in your CI with other tests
|
||||||
|
```
|
||||||
|
|
||||||
|
✅ **Mature Java ecosystem**
|
||||||
|
```
|
||||||
|
Spring Boot, Hibernate, JPA patterns
|
||||||
|
Built-in rules for layered/onion architecture
|
||||||
|
```
|
||||||
|
|
||||||
|
### When Guardian Wins
|
||||||
|
|
||||||
|
✅ **TypeScript/JavaScript projects**
|
||||||
|
```typescript
|
||||||
|
// Guardian is built for TypeScript
|
||||||
|
// ArchUnit doesn't support TS
|
||||||
|
```
|
||||||
|
|
||||||
|
✅ **AI coding workflow**
|
||||||
|
```bash
|
||||||
|
# Guardian → AI → Fix → Ship
|
||||||
|
# ArchUnit is test-based (slower feedback)
|
||||||
|
```
|
||||||
|
|
||||||
|
✅ **Zero-config DDD**
|
||||||
|
```bash
|
||||||
|
guardian check ./src
|
||||||
|
# Works immediately with DDD structure
|
||||||
|
|
||||||
|
# ArchUnit requires writing tests for each rule
|
||||||
|
```
|
||||||
|
|
||||||
|
### Philosophical Difference
|
||||||
|
|
||||||
|
**ArchUnit:**
|
||||||
|
```java
|
||||||
|
// Architecture = Tests
|
||||||
|
// You write explicit tests for each rule
|
||||||
|
```
|
||||||
|
|
||||||
|
**Guardian:**
|
||||||
|
```bash
|
||||||
|
# Architecture = Linter
|
||||||
|
# Pre-configured DDD rules out-of-the-box
|
||||||
|
```
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## vs FTA (Fast TypeScript Analyzer)
|
||||||
|
|
||||||
|
### When FTA Wins
|
||||||
|
|
||||||
|
✅ **Complexity metrics focus**
|
||||||
|
```bash
|
||||||
|
# FTA provides:
|
||||||
|
# - Cyclomatic complexity
|
||||||
|
# - Halstead metrics
|
||||||
|
# - Line counts
|
||||||
|
# - Technical debt estimation
|
||||||
|
```
|
||||||
|
|
||||||
|
✅ **Performance (Rust-based)**
|
||||||
|
```
|
||||||
|
FTA: 1600 files/second
|
||||||
|
Guardian: ~500 files/second (Node.js)
|
||||||
|
```
|
||||||
|
|
||||||
|
✅ **Simplicity**
|
||||||
|
```bash
|
||||||
|
# FTA does one thing well: metrics
|
||||||
|
fta src/
|
||||||
|
```
|
||||||
|
|
||||||
|
### When Guardian Wins
|
||||||
|
|
||||||
|
✅ **Architecture enforcement**
|
||||||
|
```typescript
|
||||||
|
// Guardian checks:
|
||||||
|
// - Layer violations
|
||||||
|
// - Framework leaks
|
||||||
|
// - Circular dependencies
|
||||||
|
// - Repository pattern
|
||||||
|
|
||||||
|
// FTA: Only measures complexity, no architecture checks
|
||||||
|
```
|
||||||
|
|
||||||
|
✅ **Hardcode detection**
|
||||||
|
```typescript
|
||||||
|
// Guardian finds magic numbers/strings
|
||||||
|
// FTA doesn't check this
|
||||||
|
```
|
||||||
|
|
||||||
|
✅ **AI workflow**
|
||||||
|
```bash
|
||||||
|
# Guardian provides actionable suggestions
|
||||||
|
# FTA provides metrics only
|
||||||
|
```
|
||||||
|
|
||||||
|
### Complementary Usage
|
||||||
|
|
||||||
|
**Best approach:** Use both!
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Guardian for architecture
|
||||||
|
guardian check ./src
|
||||||
|
|
||||||
|
# FTA for complexity metrics
|
||||||
|
fta src/ --threshold complexity:15
|
||||||
|
```
|
||||||
|
|
||||||
|
**Coming in Guardian v0.10.0:**
|
||||||
|
```bash
|
||||||
|
# Guardian will include complexity metrics!
|
||||||
|
guardian metrics ./src --include-complexity
|
||||||
|
```
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## vs ESLint + Plugins
|
||||||
|
|
||||||
|
### When ESLint Wins
|
||||||
|
|
||||||
|
✅ **General code quality**
|
||||||
|
```javascript
|
||||||
|
// Best for:
|
||||||
|
// - Code style
|
||||||
|
// - Common bugs
|
||||||
|
// - TypeScript errors
|
||||||
|
// - React/Vue specific rules
|
||||||
|
```
|
||||||
|
|
||||||
|
✅ **Huge ecosystem**
|
||||||
|
```bash
|
||||||
|
# 10,000+ plugins
|
||||||
|
eslint-plugin-react
|
||||||
|
eslint-plugin-vue
|
||||||
|
eslint-plugin-security
|
||||||
|
# ...and many more
|
||||||
|
```
|
||||||
|
|
||||||
|
✅ **Auto-fix for syntax**
|
||||||
|
```bash
|
||||||
|
eslint --fix
|
||||||
|
# Fixes semicolons, quotes, formatting, etc.
|
||||||
|
```
|
||||||
|
|
||||||
|
### When Guardian Wins
|
||||||
|
|
||||||
|
✅ **Architecture enforcement**
|
||||||
|
```typescript
|
||||||
|
// ESLint doesn't understand:
|
||||||
|
// - Clean Architecture layers
|
||||||
|
// - DDD patterns
|
||||||
|
// - Framework leaks
|
||||||
|
// - Entity exposure
|
||||||
|
|
||||||
|
// Guardian does!
|
||||||
|
```
|
||||||
|
|
||||||
|
✅ **Hardcode detection with context**
|
||||||
|
```typescript
|
||||||
|
// ESLint plugins check patterns
|
||||||
|
// Guardian understands semantic context
|
||||||
|
```
|
||||||
|
|
||||||
|
✅ **AI workflow integration**
|
||||||
|
```bash
|
||||||
|
# Guardian optimized for AI assistants
|
||||||
|
# ESLint generic output
|
||||||
|
```
|
||||||
|
|
||||||
|
### Complementary Usage
|
||||||
|
|
||||||
|
**Best approach:** Use both!
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# ESLint for code quality
|
||||||
|
eslint src/
|
||||||
|
|
||||||
|
# Guardian for architecture
|
||||||
|
guardian check ./src
|
||||||
|
```
|
||||||
|
|
||||||
|
**Many teams run both in CI:**
|
||||||
|
```yaml
|
||||||
|
# .github/workflows/quality.yml
|
||||||
|
- name: ESLint
|
||||||
|
run: npm run lint
|
||||||
|
|
||||||
|
- name: Guardian
|
||||||
|
run: guardian check ./src --fail-on error
|
||||||
|
```
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## vs import-linter (Python)
|
||||||
|
|
||||||
|
### When import-linter Wins
|
||||||
|
|
||||||
|
✅ **Python projects**
|
||||||
|
```ini
|
||||||
|
# .importlinter
|
||||||
|
[importlinter]
|
||||||
|
root_package = myproject
|
||||||
|
|
||||||
|
[importlinter:contract:1]
|
||||||
|
name = Layers contract
|
||||||
|
type = layers
|
||||||
|
layers =
|
||||||
|
myproject.domain
|
||||||
|
myproject.application
|
||||||
|
myproject.infrastructure
|
||||||
|
```
|
||||||
|
|
||||||
|
✅ **Mature Python ecosystem**
|
||||||
|
```python
|
||||||
|
# Django, Flask, FastAPI integration
|
||||||
|
```
|
||||||
|
|
||||||
|
### When Guardian Wins
|
||||||
|
|
||||||
|
✅ **TypeScript/JavaScript**
|
||||||
|
```typescript
|
||||||
|
// Guardian is for TS/JS
|
||||||
|
// import-linter is Python-only
|
||||||
|
```
|
||||||
|
|
||||||
|
✅ **More than import checking**
|
||||||
|
```typescript
|
||||||
|
// Guardian checks:
|
||||||
|
// - Hardcode
|
||||||
|
// - Entity exposure
|
||||||
|
// - Repository pattern
|
||||||
|
// - Framework leaks
|
||||||
|
|
||||||
|
// import-linter: Only imports
|
||||||
|
```
|
||||||
|
|
||||||
|
### Future Integration
|
||||||
|
|
||||||
|
**Guardian v2.0+ (Planned):**
|
||||||
|
```bash
|
||||||
|
# Multi-language support coming
|
||||||
|
guardian check ./python-src --language python
|
||||||
|
guardian check ./ts-src --language typescript
|
||||||
|
```
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 💰 Cost Comparison
|
||||||
|
|
||||||
|
| Tool | Free Tier | Paid Plans | Enterprise |
|
||||||
|
|------|-----------|------------|------------|
|
||||||
|
| **Guardian** | ✅ MIT License (100% free) | - | - |
|
||||||
|
| **SonarQube** | ✅ Community Edition | Developer: $150/yr | Custom pricing |
|
||||||
|
| **dependency-cruiser** | ✅ MIT License | - | - |
|
||||||
|
| **ArchUnit** | ✅ Apache 2.0 | - | - |
|
||||||
|
| **FTA** | ✅ Open Source | - | - |
|
||||||
|
| **ESLint** | ✅ MIT License | - | - |
|
||||||
|
|
||||||
|
**Guardian will always be free and open-source (MIT License)**
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 🚀 Setup Time Comparison
|
||||||
|
|
||||||
|
| Tool | Setup Time | Configuration Required |
|
||||||
|
|------|------------|------------------------|
|
||||||
|
| **Guardian** | ⚡ 2 minutes | ❌ Zero-config (DDD) |
|
||||||
|
| **SonarQube** | 🐌 2-4 hours | ✅ Extensive setup |
|
||||||
|
| **dependency-cruiser** | ⚡ 5 minutes | ⚠️ Rules configuration |
|
||||||
|
| **ArchUnit** | ⚙️ 30 minutes | ✅ Write test rules |
|
||||||
|
| **FTA** | ⚡ 1 minute | ❌ Zero-config |
|
||||||
|
| **ESLint** | ⚡ 10 minutes | ⚠️ Plugin configuration |
|
||||||
|
|
||||||
|
**Guardian Setup:**
|
||||||
|
```bash
|
||||||
|
# 1. Install (30 seconds)
|
||||||
|
npm install -g @samiyev/guardian
|
||||||
|
|
||||||
|
# 2. Run (90 seconds)
|
||||||
|
cd your-project
|
||||||
|
guardian check ./src
|
||||||
|
|
||||||
|
# Done! 🎉
|
||||||
|
```
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 📊 Real-World Performance
|
||||||
|
|
||||||
|
### Analysis Speed (1000 TypeScript files)
|
||||||
|
|
||||||
|
| Tool | Time | Notes |
|
||||||
|
|------|------|-------|
|
||||||
|
| **FTA** | ~0.6s | ⚡ Fastest (Rust) |
|
||||||
|
| **Guardian** | ~2s | Fast (Node.js, tree-sitter) |
|
||||||
|
| **dependency-cruiser** | ~3s | Fast |
|
||||||
|
| **ESLint** | ~5s | Depends on rules |
|
||||||
|
| **SonarQube** | ~15s | Slower (comprehensive) |
|
||||||
|
|
||||||
|
### Memory Usage
|
||||||
|
|
||||||
|
| Tool | RAM | Notes |
|
||||||
|
|------|-----|-------|
|
||||||
|
| **Guardian** | ~150MB | Efficient |
|
||||||
|
| **FTA** | ~50MB | Minimal (Rust) |
|
||||||
|
| **dependency-cruiser** | ~200MB | Moderate |
|
||||||
|
| **ESLint** | ~300MB | Varies by plugins |
|
||||||
|
| **SonarQube** | ~2GB | Heavy (server) |
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 🎯 Use Case Recommendations
|
||||||
|
|
||||||
|
### Scenario 1: TypeScript Startup Using AI Coding
|
||||||
|
|
||||||
|
**Best Stack:**
|
||||||
|
```bash
|
||||||
|
✅ Guardian (architecture + hardcode)
|
||||||
|
✅ ESLint (code quality)
|
||||||
|
✅ Prettier (formatting)
|
||||||
|
```
|
||||||
|
|
||||||
|
**Why:**
|
||||||
|
- Fast setup
|
||||||
|
- AI workflow integration
|
||||||
|
- Zero-config DDD
|
||||||
|
- Catches AI mistakes (hardcode)
|
||||||
|
|
||||||
|
### Scenario 2: Enterprise Multi-Language
|
||||||
|
|
||||||
|
**Best Stack:**
|
||||||
|
```bash
|
||||||
|
✅ SonarQube (security + multi-language)
|
||||||
|
✅ Guardian (TypeScript DDD specialization)
|
||||||
|
✅ ArchUnit (Java architecture)
|
||||||
|
```
|
||||||
|
|
||||||
|
**Why:**
|
||||||
|
- Comprehensive coverage
|
||||||
|
- Security scanning
|
||||||
|
- Language-specific depth
|
||||||
|
|
||||||
|
### Scenario 3: Clean Architecture Refactoring
|
||||||
|
|
||||||
|
**Best Stack:**
|
||||||
|
```bash
|
||||||
|
✅ Guardian (architecture enforcement)
|
||||||
|
✅ dependency-cruiser (visualization)
|
||||||
|
✅ Guardian v0.9+ (auto-fix)
|
||||||
|
```
|
||||||
|
|
||||||
|
**Why:**
|
||||||
|
- Visualize current state
|
||||||
|
- Detect violations
|
||||||
|
- Auto-fix issues
|
||||||
|
|
||||||
|
### Scenario 4: Python + TypeScript Monorepo
|
||||||
|
|
||||||
|
**Best Stack:**
|
||||||
|
```bash
|
||||||
|
✅ Guardian (TypeScript)
|
||||||
|
✅ import-linter (Python)
|
||||||
|
✅ SonarQube (security, both languages)
|
||||||
|
```
|
||||||
|
|
||||||
|
**Why:**
|
||||||
|
- Language-specific depth
|
||||||
|
- Unified security scanning
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 🏆 Winner by Category
|
||||||
|
|
||||||
|
| Category | Winner | Runner-up |
|
||||||
|
|----------|--------|-----------|
|
||||||
|
| **TypeScript Architecture** | 🥇 Guardian | dependency-cruiser |
|
||||||
|
| **Multi-Language** | 🥇 SonarQube | - |
|
||||||
|
| **Visualization** | 🥇 dependency-cruiser | SonarQube |
|
||||||
|
| **AI Workflow** | 🥇 Guardian | - (no competitor) |
|
||||||
|
| **Security** | 🥇 SonarQube | - |
|
||||||
|
| **Hardcode Detection** | 🥇 Guardian | - (no competitor) |
|
||||||
|
| **DDD Patterns** | 🥇 Guardian | ArchUnit (Java) |
|
||||||
|
| **Auto-Fix** | 🥇 ESLint (syntax) | Guardian v0.9+ (architecture) |
|
||||||
|
| **Complexity Metrics** | 🥇 FTA | SonarQube |
|
||||||
|
| **Setup Speed** | 🥇 FTA | Guardian |
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 🔮 Future Roadmap Comparison
|
||||||
|
|
||||||
|
### Guardian v1.0.0 (Q4 2026)
|
||||||
|
- ✅ Configuration & presets (v0.6)
|
||||||
|
- ✅ Visualization (v0.7)
|
||||||
|
- ✅ CI/CD kit (v0.8)
|
||||||
|
- ✅ Auto-fix (v0.9) **UNIQUE!**
|
||||||
|
- ✅ Metrics dashboard (v0.10)
|
||||||
|
- ✅ 30+ DDD patterns (v0.11-v0.32)
|
||||||
|
- ✅ VS Code extension
|
||||||
|
- ✅ JetBrains plugin
|
||||||
|
|
||||||
|
### Competitors
|
||||||
|
- **SonarQube**: Incremental improvements, AI-powered fixes (experimental)
|
||||||
|
- **dependency-cruiser**: Stable, no major changes planned
|
||||||
|
- **ArchUnit**: Java focus, incremental improvements
|
||||||
|
- **FTA**: Adding more metrics
|
||||||
|
- **ESLint**: Flat config, performance improvements
|
||||||
|
|
||||||
|
**Guardian's Advantage:** Only tool actively expanding DDD/architecture detection
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 💡 Migration Guides
|
||||||
|
|
||||||
|
### From SonarQube to Guardian
|
||||||
|
|
||||||
|
**When to migrate:**
|
||||||
|
- TypeScript-only project
|
||||||
|
- Want faster iteration
|
||||||
|
- Need DDD-specific checks
|
||||||
|
- Don't need multi-language/security
|
||||||
|
|
||||||
|
**How to migrate:**
|
||||||
|
```bash
|
||||||
|
# Keep SonarQube for security
|
||||||
|
# Add Guardian for architecture
|
||||||
|
npm install -g @samiyev/guardian
|
||||||
|
guardian check ./src
|
||||||
|
|
||||||
|
# CI/CD: Run both
|
||||||
|
# SonarQube (security) → Guardian (architecture)
|
||||||
|
```
|
||||||
|
|
||||||
|
### From ESLint-only to ESLint + Guardian
|
||||||
|
|
||||||
|
**Why add Guardian:**
|
||||||
|
```typescript
|
||||||
|
// ESLint checks syntax
|
||||||
|
// Guardian checks architecture
|
||||||
|
```
|
||||||
|
|
||||||
|
**How to add:**
|
||||||
|
```bash
|
||||||
|
# Keep ESLint
|
||||||
|
npm run lint
|
||||||
|
|
||||||
|
# Add Guardian
|
||||||
|
guardian check ./src
|
||||||
|
|
||||||
|
# Both in CI:
|
||||||
|
npm run lint && guardian check ./src
|
||||||
|
```
|
||||||
|
|
||||||
|
### From dependency-cruiser to Guardian
|
||||||
|
|
||||||
|
**Why migrate:**
|
||||||
|
- Need more than circular deps
|
||||||
|
- Want hardcode detection
|
||||||
|
- Need DDD patterns
|
||||||
|
- Want auto-fix (v0.9+)
|
||||||
|
|
||||||
|
**How to migrate:**
|
||||||
|
```bash
|
||||||
|
# Replace:
|
||||||
|
depcruise src --config .dependency-cruiser.js
|
||||||
|
|
||||||
|
# With:
|
||||||
|
guardian check ./src
|
||||||
|
|
||||||
|
# Or keep both:
|
||||||
|
# dependency-cruiser → visualization
|
||||||
|
# Guardian → architecture + hardcode
|
||||||
|
```
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 📚 Additional Resources
|
||||||
|
|
||||||
|
### Guardian
|
||||||
|
- [GitHub Repository](https://github.com/samiyev/puaros)
|
||||||
|
- [Documentation](https://puaros.ailabs.uz)
|
||||||
|
- [npm Package](https://www.npmjs.com/package/@samiyev/guardian)
|
||||||
|
|
||||||
|
### Competitors
|
||||||
|
- [SonarQube](https://www.sonarsource.com/products/sonarqube/)
|
||||||
|
- [dependency-cruiser](https://github.com/sverweij/dependency-cruiser)
|
||||||
|
- [ArchUnit](https://www.archunit.org/)
|
||||||
|
- [FTA](https://ftaproject.dev/)
|
||||||
|
- [import-linter](https://import-linter.readthedocs.io/)
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 🤝 Community & Support
|
||||||
|
|
||||||
|
| Tool | Community | Support |
|
||||||
|
|------|-----------|---------|
|
||||||
|
| **Guardian** | GitHub Issues | Community (planned: Discord) |
|
||||||
|
| **SonarQube** | Community Forum | Commercial support available |
|
||||||
|
| **dependency-cruiser** | GitHub Issues | Community |
|
||||||
|
| **ArchUnit** | GitHub Issues | Community |
|
||||||
|
| **ESLint** | Discord, Twitter | Community |
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
**Guardian's Position in the Market:**
|
||||||
|
|
||||||
|
> **"The AI-First Architecture Guardian for TypeScript teams practicing DDD/Clean Architecture"**
|
||||||
|
|
||||||
|
**Guardian is NOT:**
|
||||||
|
- ❌ A replacement for SonarQube's security scanning
|
||||||
|
- ❌ A replacement for ESLint's code quality checks
|
||||||
|
- ❌ A multi-language tool (yet)
|
||||||
|
|
||||||
|
**Guardian IS:**
|
||||||
|
- ✅ The best tool for TypeScript DDD/Clean Architecture
|
||||||
|
- ✅ The only tool optimized for AI-assisted coding
|
||||||
|
- ✅ The only tool with intelligent hardcode detection
|
||||||
|
- ✅ The only tool with auto-fix for architecture (v0.9+)
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
**Questions? Feedback?**
|
||||||
|
|
||||||
|
- 📧 Email: fozilbek.samiyev@gmail.com
|
||||||
|
- 🐙 GitHub: https://github.com/samiyev/puaros/issues
|
||||||
|
- 🌐 Website: https://puaros.ailabs.uz
|
||||||
323
packages/guardian/COMPETITIVE_ANALYSIS_SUMMARY.md
Normal file
323
packages/guardian/COMPETITIVE_ANALYSIS_SUMMARY.md
Normal file
@@ -0,0 +1,323 @@
|
|||||||
|
# Competitive Analysis & Roadmap - Summary
|
||||||
|
|
||||||
|
**Date:** 2025-01-24
|
||||||
|
**Prepared for:** Puaros Guardian
|
||||||
|
**Documents Created:**
|
||||||
|
1. ROADMAP_NEW.md - Updated roadmap with reprioritized features
|
||||||
|
2. COMPARISON.md - Comprehensive competitor comparison
|
||||||
|
3. docs/v0.6.0-CONFIGURATION-SPEC.md - Configuration feature specification
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 🎯 Executive Summary
|
||||||
|
|
||||||
|
Guardian has **5 unique features** that no competitor offers, positioning it as the **only tool built for AI-assisted DDD/Clean Architecture development**. However, to achieve enterprise adoption, we need to first match competitors' baseline features (configuration, visualization, CI/CD, metrics).
|
||||||
|
|
||||||
|
### Current Position (v0.5.1)
|
||||||
|
|
||||||
|
**Strengths:**
|
||||||
|
- ✅ Hardcode detection with AI suggestions (UNIQUE)
|
||||||
|
- ✅ Framework leak detection (UNIQUE)
|
||||||
|
- ✅ Entity exposure detection (UNIQUE)
|
||||||
|
- ✅ Repository pattern validation (UNIQUE)
|
||||||
|
- ✅ DDD-specific naming conventions (UNIQUE)
|
||||||
|
|
||||||
|
**Gaps:**
|
||||||
|
- ❌ No configuration file support
|
||||||
|
- ❌ No visualization/graphs
|
||||||
|
- ❌ No ready-to-use CI/CD templates
|
||||||
|
- ❌ No metrics/quality score
|
||||||
|
- ❌ No auto-fix capabilities
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 📊 Competitive Landscape
|
||||||
|
|
||||||
|
### Main Competitors
|
||||||
|
|
||||||
|
| Tool | Strength | Weakness | Market Position |
|
||||||
|
|------|----------|----------|-----------------|
|
||||||
|
| **SonarQube** | Multi-language + Security | Complex setup, expensive | Enterprise leader |
|
||||||
|
| **dependency-cruiser** | Best visualization | No hardcode/DDD | Dependency specialist |
|
||||||
|
| **ArchUnit** | Java architecture | Java-only | Java ecosystem |
|
||||||
|
| **FTA** | Fast metrics (Rust) | No architecture checks | Metrics tool |
|
||||||
|
| **ESLint** | Huge ecosystem | No architecture | Code quality standard |
|
||||||
|
|
||||||
|
### Guardian's Unique Position
|
||||||
|
|
||||||
|
> **"The AI-First Architecture Guardian for TypeScript teams practicing DDD/Clean Architecture"**
|
||||||
|
|
||||||
|
**Market Gap Filled:**
|
||||||
|
- No tool optimizes for AI-assisted coding workflow
|
||||||
|
- No tool deeply understands DDD patterns (except ArchUnit for Java)
|
||||||
|
- No tool combines hardcode detection + architecture enforcement
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 🚀 Strategic Roadmap
|
||||||
|
|
||||||
|
### Phase 1: Market Parity (v0.6-v0.10) - Q1-Q2 2026
|
||||||
|
|
||||||
|
**Goal:** Match competitors' baseline features
|
||||||
|
|
||||||
|
| Version | Feature | Why Critical | Competitor |
|
||||||
|
|---------|---------|--------------|------------|
|
||||||
|
| v0.6.0 | Configuration & Presets | All competitors have this | ESLint, SonarQube |
|
||||||
|
| v0.7.0 | Visualization | dependency-cruiser's main advantage | dependency-cruiser |
|
||||||
|
| v0.8.0 | CI/CD Integration Kit | Enterprise requirement | SonarQube |
|
||||||
|
| v0.9.0 | **Auto-Fix (UNIQUE!)** | Game-changer, no one has this | None |
|
||||||
|
| v0.10.0 | Metrics & Quality Score | Enterprise adoption | SonarQube |
|
||||||
|
|
||||||
|
**After v0.10.0:** Guardian competes with SonarQube/dependency-cruiser on features
|
||||||
|
|
||||||
|
### Phase 2: DDD Specialization (v0.11-v0.32) - Q3-Q4 2026
|
||||||
|
|
||||||
|
**Goal:** Deepen DDD/Clean Architecture expertise
|
||||||
|
|
||||||
|
30+ DDD pattern detectors:
|
||||||
|
- Aggregate boundaries
|
||||||
|
- Anemic domain model
|
||||||
|
- Domain events
|
||||||
|
- Value Object immutability
|
||||||
|
- CQRS validation
|
||||||
|
- Saga pattern
|
||||||
|
- Anti-Corruption Layer
|
||||||
|
- Ubiquitous Language
|
||||||
|
- And 22+ more...
|
||||||
|
|
||||||
|
**After Phase 2:** Guardian = THE tool for DDD/Clean Architecture
|
||||||
|
|
||||||
|
### Phase 3: Enterprise Ecosystem (v1.0+) - Q4 2026+
|
||||||
|
|
||||||
|
**Goal:** Full enterprise platform
|
||||||
|
|
||||||
|
- VS Code extension
|
||||||
|
- JetBrains plugin
|
||||||
|
- Web dashboard
|
||||||
|
- Team analytics
|
||||||
|
- Multi-language support (Python, C#, Java)
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 🔥 Critical Changes to Current Roadmap
|
||||||
|
|
||||||
|
### Old Roadmap Issues
|
||||||
|
|
||||||
|
❌ **v0.6.0 was "Aggregate Boundaries"** → Too early for DDD-specific features
|
||||||
|
❌ **v0.12.0 was "Configuration"** → Way too late! Critical feature postponed
|
||||||
|
❌ **Missing:** Visualization, CI/CD, Auto-fix, Metrics
|
||||||
|
❌ **Too many consecutive DDD features** → Need market parity first
|
||||||
|
|
||||||
|
### New Roadmap Priorities
|
||||||
|
|
||||||
|
✅ **v0.6.0 = Configuration (MOVED UP)** → Critical for adoption
|
||||||
|
✅ **v0.7.0 = Visualization (NEW)** → Compete with dependency-cruiser
|
||||||
|
✅ **v0.8.0 = CI/CD Kit (NEW)** → Enterprise requirement
|
||||||
|
✅ **v0.9.0 = Auto-Fix (NEW, UNIQUE!)** → Game-changing differentiator
|
||||||
|
✅ **v0.10.0 = Metrics (NEW)** → Compete with SonarQube
|
||||||
|
✅ **v0.11+ = DDD Features** → After market parity
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 💡 Key Recommendations
|
||||||
|
|
||||||
|
### Immediate Actions (Next 2 Weeks)
|
||||||
|
|
||||||
|
1. **Review & Approve New Roadmap**
|
||||||
|
- Read ROADMAP_NEW.md
|
||||||
|
- Approve priority changes
|
||||||
|
- Create GitHub milestones
|
||||||
|
|
||||||
|
2. **Start v0.6.0 Configuration**
|
||||||
|
- Read v0.6.0-CONFIGURATION-SPEC.md
|
||||||
|
- Create implementation tasks
|
||||||
|
- Start Phase 1 development
|
||||||
|
|
||||||
|
3. **Update Documentation**
|
||||||
|
- Update main README.md with comparison table
|
||||||
|
- Add "Guardian vs Competitors" section
|
||||||
|
- Link to COMPARISON.md
|
||||||
|
|
||||||
|
### Next 3 Months (Q1 2026)
|
||||||
|
|
||||||
|
4. **Complete v0.6.0 (Configuration)**
|
||||||
|
- 8-week timeline
|
||||||
|
- Beta test with community
|
||||||
|
- Stable release
|
||||||
|
|
||||||
|
5. **Start v0.7.0 (Visualization)**
|
||||||
|
- Design graph system
|
||||||
|
- Choose visualization library
|
||||||
|
- Prototype SVG/Mermaid output
|
||||||
|
|
||||||
|
6. **Marketing & Positioning**
|
||||||
|
- Create comparison blog post
|
||||||
|
- Submit to Product Hunt
|
||||||
|
- Share on Reddit/HackerNews
|
||||||
|
|
||||||
|
### Next 6 Months (Q1-Q2 2026)
|
||||||
|
|
||||||
|
7. **Complete Market Parity (v0.6-v0.10)**
|
||||||
|
- Configuration ✅
|
||||||
|
- Visualization ✅
|
||||||
|
- CI/CD Integration ✅
|
||||||
|
- Auto-Fix ✅ (UNIQUE!)
|
||||||
|
- Metrics ✅
|
||||||
|
|
||||||
|
8. **Community Growth**
|
||||||
|
- 1000+ GitHub stars
|
||||||
|
- 100+ weekly npm installs
|
||||||
|
- 10+ enterprise adopters
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 📈 Success Metrics
|
||||||
|
|
||||||
|
### v0.10.0 (Market Parity Achieved) - June 2026
|
||||||
|
|
||||||
|
**Feature Parity:**
|
||||||
|
- ✅ Configuration support (compete with ESLint)
|
||||||
|
- ✅ Visualization (compete with dependency-cruiser)
|
||||||
|
- ✅ CI/CD integration (compete with SonarQube)
|
||||||
|
- ✅ Auto-fix (UNIQUE! Game-changer)
|
||||||
|
- ✅ Metrics dashboard (compete with SonarQube)
|
||||||
|
|
||||||
|
**Adoption Metrics:**
|
||||||
|
- 1,000+ GitHub stars
|
||||||
|
- 100+ weekly npm installs
|
||||||
|
- 50+ projects with guardian.config.js
|
||||||
|
- 10+ enterprise teams
|
||||||
|
|
||||||
|
### v1.0.0 (Enterprise Ready) - December 2026
|
||||||
|
|
||||||
|
**Feature Completeness:**
|
||||||
|
- ✅ All baseline features
|
||||||
|
- ✅ 30+ DDD pattern detectors
|
||||||
|
- ✅ IDE extensions (VS Code, JetBrains)
|
||||||
|
- ✅ Web dashboard
|
||||||
|
- ✅ Team analytics
|
||||||
|
|
||||||
|
**Market Position:**
|
||||||
|
- #1 tool for TypeScript DDD/Clean Architecture
|
||||||
|
- Top 3 in static analysis for TypeScript
|
||||||
|
- Known in enterprise as "the AI code reviewer"
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 🎯 Positioning Strategy
|
||||||
|
|
||||||
|
### Target Segments
|
||||||
|
|
||||||
|
1. **Primary:** TypeScript developers using AI coding assistants (GitHub Copilot, Cursor, Windsurf, Claude, ChatGPT, Cline)
|
||||||
|
2. **Secondary:** Teams implementing DDD/Clean Architecture
|
||||||
|
3. **Tertiary:** Startups/scale-ups needing fast quality enforcement
|
||||||
|
|
||||||
|
### Messaging
|
||||||
|
|
||||||
|
**Tagline:** "The AI-First Architecture Guardian"
|
||||||
|
|
||||||
|
**Key Messages:**
|
||||||
|
- "Catches the #1 AI mistake: hardcoded values everywhere"
|
||||||
|
- "Enforces Clean Architecture that AI often ignores"
|
||||||
|
- "Closes the AI feedback loop for cleaner code"
|
||||||
|
- "The only tool with auto-fix for architecture" (v0.9+)
|
||||||
|
|
||||||
|
### Differentiation
|
||||||
|
|
||||||
|
**Guardian ≠ SonarQube:** We're specialized for TypeScript DDD, not multi-language security
|
||||||
|
**Guardian ≠ dependency-cruiser:** We detect patterns, not just dependencies
|
||||||
|
**Guardian ≠ ESLint:** We enforce architecture, not syntax
|
||||||
|
|
||||||
|
**Guardian = ESLint for architecture + AI code reviewer**
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 📚 Document Guide
|
||||||
|
|
||||||
|
### ROADMAP_NEW.md
|
||||||
|
|
||||||
|
**Purpose:** Complete technical roadmap with reprioritized features
|
||||||
|
**Audience:** Development team, contributors
|
||||||
|
**Key Sections:**
|
||||||
|
- Current state analysis
|
||||||
|
- Phase 1: Market Parity (v0.6-v0.10)
|
||||||
|
- Phase 2: DDD Specialization (v0.11-v0.32)
|
||||||
|
- Phase 3: Enterprise Ecosystem (v1.0+)
|
||||||
|
|
||||||
|
### COMPARISON.md
|
||||||
|
|
||||||
|
**Purpose:** Marketing-focused comparison with all competitors
|
||||||
|
**Audience:** Users, potential adopters, marketing
|
||||||
|
**Key Sections:**
|
||||||
|
- Feature comparison matrix
|
||||||
|
- Detailed tool comparisons
|
||||||
|
- When to use each tool
|
||||||
|
- Use case recommendations
|
||||||
|
- Winner by category
|
||||||
|
|
||||||
|
### v0.6.0-CONFIGURATION-SPEC.md
|
||||||
|
|
||||||
|
**Purpose:** Technical specification for Configuration feature
|
||||||
|
**Audience:** Development team
|
||||||
|
**Key Sections:**
|
||||||
|
- Configuration file format
|
||||||
|
- Preset system design
|
||||||
|
- Rule configuration
|
||||||
|
- Implementation plan (8 weeks)
|
||||||
|
- Testing strategy
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 🎬 Next Steps
|
||||||
|
|
||||||
|
### Week 1-2: Planning & Kickoff
|
||||||
|
|
||||||
|
- [ ] Review all three documents
|
||||||
|
- [ ] Approve new roadmap priorities
|
||||||
|
- [ ] Create GitHub milestones for v0.6.0-v0.10.0
|
||||||
|
- [ ] Create implementation issues for v0.6.0
|
||||||
|
- [ ] Update main README.md with comparison table
|
||||||
|
|
||||||
|
### Week 3-10: v0.6.0 Development
|
||||||
|
|
||||||
|
- [ ] Phase 1: Core Configuration (Week 3-4)
|
||||||
|
- [ ] Phase 2: Rule Configuration (Week 4-5)
|
||||||
|
- [ ] Phase 3: Preset System (Week 5-6)
|
||||||
|
- [ ] Phase 4: Ignore Patterns (Week 6-7)
|
||||||
|
- [ ] Phase 5: CLI Integration (Week 7-8)
|
||||||
|
- [ ] Phase 6: Documentation (Week 8-9)
|
||||||
|
- [ ] Phase 7: Beta & Release (Week 9-10)
|
||||||
|
|
||||||
|
### Post-v0.6.0
|
||||||
|
|
||||||
|
- [ ] Start v0.7.0 (Visualization) planning
|
||||||
|
- [ ] Marketing push (blog, Product Hunt, etc.)
|
||||||
|
- [ ] Community feedback gathering
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## ❓ Questions?
|
||||||
|
|
||||||
|
**For technical questions:**
|
||||||
|
- Email: fozilbek.samiyev@gmail.com
|
||||||
|
- GitHub Issues: https://github.com/samiyev/puaros/issues
|
||||||
|
|
||||||
|
**For strategic decisions:**
|
||||||
|
- Review sessions: Schedule with team
|
||||||
|
- Roadmap adjustments: Create GitHub discussion
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 📝 Changelog
|
||||||
|
|
||||||
|
**2025-01-24:** Initial competitive analysis and roadmap revision
|
||||||
|
- Created comprehensive competitor comparison
|
||||||
|
- Reprioritized roadmap (Configuration moved to v0.6.0)
|
||||||
|
- Added market parity phase (v0.6-v0.10)
|
||||||
|
- Designed v0.6.0 Configuration specification
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
**Status:** ✅ Analysis complete, ready for implementation
|
||||||
|
|
||||||
|
**Confidence Level:** HIGH - Analysis based on thorough competitor research and market positioning
|
||||||
@@ -72,6 +72,24 @@ Code quality guardian for vibe coders and enterprise teams - because AI writes f
|
|||||||
- Prevents "new Repository()" anti-pattern
|
- Prevents "new Repository()" anti-pattern
|
||||||
- 📚 *Based on: Martin Fowler's Repository Pattern, DDD (Evans 2003)* → [Why?](./docs/WHY.md#repository-pattern)
|
- 📚 *Based on: Martin Fowler's Repository Pattern, DDD (Evans 2003)* → [Why?](./docs/WHY.md#repository-pattern)
|
||||||
|
|
||||||
|
🔒 **Aggregate Boundary Validation**
|
||||||
|
- Detects direct entity references across DDD aggregates
|
||||||
|
- Enforces reference-by-ID or Value Object pattern
|
||||||
|
- Prevents tight coupling between aggregates
|
||||||
|
- Supports multiple folder structures (domain/aggregates/*, domain/*, domain/entities/*)
|
||||||
|
- Filters allowed imports (value-objects, events, repositories, services)
|
||||||
|
- Critical severity for maintaining aggregate independence
|
||||||
|
- 📚 *Based on: Domain-Driven Design (Evans 2003), Implementing DDD (Vernon 2013)* → [Why?](./docs/WHY.md#aggregate-boundaries)
|
||||||
|
|
||||||
|
🔐 **Secret Detection** ✨ NEW in v0.8.0
|
||||||
|
- Detects 350+ types of hardcoded secrets using industry-standard Secretlint
|
||||||
|
- Catches AWS keys, GitHub tokens, NPM tokens, SSH keys, API keys, and more
|
||||||
|
- All secrets marked as **CRITICAL severity** - immediate security risk
|
||||||
|
- Context-aware remediation suggestions for each secret type
|
||||||
|
- Prevents credentials from reaching version control
|
||||||
|
- Integrates seamlessly with existing detectors
|
||||||
|
- 📚 *Based on: OWASP Top 10, CWE-798 (Hardcoded Credentials), NIST Security Guidelines* → [Learn more](https://owasp.org/www-community/vulnerabilities/Use_of_hard-coded_password)
|
||||||
|
|
||||||
🏗️ **Clean Architecture Enforcement**
|
🏗️ **Clean Architecture Enforcement**
|
||||||
- Built with DDD principles
|
- Built with DDD principles
|
||||||
- Layered architecture (Domain, Application, Infrastructure)
|
- Layered architecture (Domain, Application, Infrastructure)
|
||||||
@@ -357,6 +375,15 @@ const result = await analyzeProject({
|
|||||||
})
|
})
|
||||||
|
|
||||||
console.log(`Found ${result.hardcodeViolations.length} hardcoded values`)
|
console.log(`Found ${result.hardcodeViolations.length} hardcoded values`)
|
||||||
|
console.log(`Found ${result.secretViolations.length} hardcoded secrets 🔐`)
|
||||||
|
|
||||||
|
// Check for critical security issues first!
|
||||||
|
result.secretViolations.forEach((violation) => {
|
||||||
|
console.log(`🔐 CRITICAL: ${violation.file}:${violation.line}`)
|
||||||
|
console.log(` Secret Type: ${violation.secretType}`)
|
||||||
|
console.log(` ${violation.message}`)
|
||||||
|
console.log(` ⚠️ Rotate this secret immediately!`)
|
||||||
|
})
|
||||||
|
|
||||||
result.hardcodeViolations.forEach((violation) => {
|
result.hardcodeViolations.forEach((violation) => {
|
||||||
console.log(`${violation.file}:${violation.line}`)
|
console.log(`${violation.file}:${violation.line}`)
|
||||||
@@ -385,9 +412,9 @@ npx @samiyev/guardian check ./src --verbose
|
|||||||
npx @samiyev/guardian check ./src --no-hardcode # Skip hardcode detection
|
npx @samiyev/guardian check ./src --no-hardcode # Skip hardcode detection
|
||||||
npx @samiyev/guardian check ./src --no-architecture # Skip architecture checks
|
npx @samiyev/guardian check ./src --no-architecture # Skip architecture checks
|
||||||
|
|
||||||
# Filter by severity
|
# Filter by severity (perfect for finding secrets first!)
|
||||||
npx @samiyev/guardian check ./src --min-severity high # Show high, critical only
|
npx @samiyev/guardian check ./src --only-critical # Show only critical issues (secrets, circular deps)
|
||||||
npx @samiyev/guardian check ./src --only-critical # Show only critical issues
|
npx @samiyev/guardian check ./src --min-severity high # Show high and critical only
|
||||||
|
|
||||||
# Limit detailed output (useful for large codebases)
|
# Limit detailed output (useful for large codebases)
|
||||||
npx @samiyev/guardian check ./src --limit 10 # Show first 10 violations per category
|
npx @samiyev/guardian check ./src --limit 10 # Show first 10 violations per category
|
||||||
|
|||||||
@@ -2,9 +2,22 @@
|
|||||||
|
|
||||||
This document outlines the current features and future plans for @puaros/guardian.
|
This document outlines the current features and future plans for @puaros/guardian.
|
||||||
|
|
||||||
## Current Version: 0.6.0 ✅ RELEASED
|
## Current Version: 0.9.0 ✅ RELEASED
|
||||||
|
|
||||||
**Released:** 2025-11-24
|
**Released:** 2025-11-26
|
||||||
|
|
||||||
|
### What's New in 0.9.0
|
||||||
|
|
||||||
|
- 🏛️ **Anemic Model Detection** - NEW feature to detect anemic domain models lacking business logic
|
||||||
|
- ✅ **100% clean codebase** - Guardian now passes its own self-check with 0 issues
|
||||||
|
- 📦 **New shared constants** - Added CLASS_KEYWORDS and EXAMPLE_CODE_CONSTANTS
|
||||||
|
- ✅ **All 578 tests passing** - Added 12 new tests for anemic model detection
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Previous Version: 0.8.1 ✅ RELEASED
|
||||||
|
|
||||||
|
**Released:** 2025-11-25
|
||||||
|
|
||||||
### Features Included in 0.1.0
|
### Features Included in 0.1.0
|
||||||
|
|
||||||
@@ -256,11 +269,10 @@ Internal refactoring to eliminate hardcoded values and improve maintainability:
|
|||||||
|
|
||||||
---
|
---
|
||||||
|
|
||||||
## Future Roadmap
|
## Version 0.7.0 - Aggregate Boundary Validation 🔒 ✅ RELEASED
|
||||||
|
|
||||||
### Version 0.6.0 - Aggregate Boundary Validation 🔒
|
**Released:** 2025-11-24
|
||||||
**Target:** Q1 2026
|
**Priority:** CRITICAL
|
||||||
**Priority:** MEDIUM
|
|
||||||
|
|
||||||
Validate aggregate boundaries in DDD:
|
Validate aggregate boundaries in DDD:
|
||||||
|
|
||||||
@@ -286,16 +298,265 @@ class Order {
|
|||||||
}
|
}
|
||||||
```
|
```
|
||||||
|
|
||||||
**Planned Features:**
|
**Implemented Features:**
|
||||||
- Detect entity references across aggregates
|
- ✅ Detect entity references across aggregates
|
||||||
- Allow only ID or Value Object references
|
- ✅ Allow only ID or Value Object references from other aggregates
|
||||||
- Detect circular dependencies between aggregates
|
- ✅ Filter allowed imports (value-objects, events, repositories, services)
|
||||||
- Validate aggregate root access patterns
|
- ✅ Support for multiple aggregate folder structures (domain/aggregates/name, domain/name, domain/entities/name)
|
||||||
- Support for aggregate folder structure
|
- ✅ 41 comprehensive tests with 100% pass rate
|
||||||
|
- ✅ Examples of good and bad patterns
|
||||||
|
- ✅ CLI output with 🔒 icon and detailed violation info
|
||||||
|
- ✅ Critical severity level for aggregate boundary violations
|
||||||
|
|
||||||
---
|
---
|
||||||
|
|
||||||
### Version 0.8.0 - Anemic Domain Model Detection 🩺
|
## Future Roadmap
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
### Version 0.7.5 - Refactor AnalyzeProject Use-Case 🔧 ✅ RELEASED
|
||||||
|
|
||||||
|
**Released:** 2025-11-25
|
||||||
|
**Priority:** HIGH
|
||||||
|
**Scope:** Single session (~128K tokens)
|
||||||
|
|
||||||
|
Split `AnalyzeProject.ts` (615 lines) into focused pipeline components.
|
||||||
|
|
||||||
|
**Problem:**
|
||||||
|
- God Use-Case with 615 lines
|
||||||
|
- Mixing: file scanning, parsing, detection, aggregation
|
||||||
|
- Hard to test and modify individual steps
|
||||||
|
|
||||||
|
**Solution:**
|
||||||
|
```
|
||||||
|
application/use-cases/
|
||||||
|
├── AnalyzeProject.ts # Orchestrator (245 lines)
|
||||||
|
├── pipeline/
|
||||||
|
│ ├── FileCollectionStep.ts # File scanning (66 lines)
|
||||||
|
│ ├── ParsingStep.ts # AST + dependency graph (51 lines)
|
||||||
|
│ ├── DetectionPipeline.ts # All 7 detectors (371 lines)
|
||||||
|
│ └── ResultAggregator.ts # Build response DTO (81 lines)
|
||||||
|
```
|
||||||
|
|
||||||
|
**Deliverables:**
|
||||||
|
- ✅ Extract 4 pipeline components
|
||||||
|
- ✅ Reduce `AnalyzeProject.ts` from 615 to 245 lines (60% reduction)
|
||||||
|
- ✅ All 345 tests pass, no breaking changes
|
||||||
|
- ✅ Publish to npm
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
### Version 0.7.6 - Refactor CLI Module 🔧 ✅ RELEASED
|
||||||
|
|
||||||
|
**Released:** 2025-11-25
|
||||||
|
**Priority:** MEDIUM
|
||||||
|
**Scope:** Single session (~128K tokens)
|
||||||
|
|
||||||
|
Split `cli/index.ts` (484 lines) into focused formatters.
|
||||||
|
|
||||||
|
**Problem:**
|
||||||
|
- CLI file has 484 lines
|
||||||
|
- Mixing: command setup, formatting, grouping, statistics
|
||||||
|
|
||||||
|
**Solution:**
|
||||||
|
```
|
||||||
|
cli/
|
||||||
|
├── index.ts # Commands only (260 lines)
|
||||||
|
├── formatters/
|
||||||
|
│ ├── OutputFormatter.ts # Violation formatting (190 lines)
|
||||||
|
│ └── StatisticsFormatter.ts # Metrics & summary (58 lines)
|
||||||
|
├── groupers/
|
||||||
|
│ └── ViolationGrouper.ts # Sorting & grouping (29 lines)
|
||||||
|
```
|
||||||
|
|
||||||
|
**Deliverables:**
|
||||||
|
- ✅ Extract formatters and groupers
|
||||||
|
- ✅ Reduce `cli/index.ts` from 484 to 260 lines (46% reduction)
|
||||||
|
- ✅ CLI output identical to before
|
||||||
|
- ✅ All 345 tests pass, no breaking changes
|
||||||
|
- ✅ Publish to npm
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
### Version 0.7.7 - Improve Test Coverage 🧪 ✅ RELEASED
|
||||||
|
|
||||||
|
**Released:** 2025-11-25
|
||||||
|
**Priority:** MEDIUM
|
||||||
|
**Scope:** Single session (~128K tokens)
|
||||||
|
|
||||||
|
Increase coverage for under-tested domain files.
|
||||||
|
|
||||||
|
**Results:**
|
||||||
|
| File | Before | After |
|
||||||
|
|------|--------|-------|
|
||||||
|
| SourceFile.ts | 46% | 100% ✅ |
|
||||||
|
| ProjectPath.ts | 50% | 100% ✅ |
|
||||||
|
| ValueObject.ts | 25% | 100% ✅ |
|
||||||
|
| RepositoryViolation.ts | 58% | 92.68% ✅ |
|
||||||
|
|
||||||
|
**Deliverables:**
|
||||||
|
- ✅ SourceFile.ts → 100% (31 tests)
|
||||||
|
- ✅ ProjectPath.ts → 100% (31 tests)
|
||||||
|
- ✅ ValueObject.ts → 100% (18 tests)
|
||||||
|
- ✅ RepositoryViolation.ts → 92.68% (32 tests)
|
||||||
|
- ✅ All 457 tests passing
|
||||||
|
- ✅ Overall coverage: 95.4% statements, 86.25% branches, 96.68% functions
|
||||||
|
- ✅ Publish to npm
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
### Version 0.7.8 - Add E2E Tests 🧪 ✅ RELEASED
|
||||||
|
|
||||||
|
**Released:** 2025-11-25
|
||||||
|
**Priority:** MEDIUM
|
||||||
|
**Scope:** Single session (~128K tokens)
|
||||||
|
|
||||||
|
Add integration tests for full pipeline and CLI.
|
||||||
|
|
||||||
|
**Deliverables:**
|
||||||
|
- ✅ E2E test: `AnalyzeProject` full pipeline (21 tests)
|
||||||
|
- ✅ CLI smoke test (spawn process, check output) (22 tests)
|
||||||
|
- ✅ Test `examples/good-architecture/` → 0 violations
|
||||||
|
- ✅ Test `examples/bad/` → specific violations
|
||||||
|
- ✅ Test JSON output format (19 tests)
|
||||||
|
- ✅ 519 total tests (519 passing, **100% pass rate** 🎉)
|
||||||
|
- ✅ Comprehensive E2E coverage for API and CLI
|
||||||
|
- ✅ 3 new E2E test files with full pipeline coverage
|
||||||
|
- ✅ Publish to npm
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
### Version 0.7.9 - Refactor Large Detectors 🔧 ✅ RELEASED
|
||||||
|
|
||||||
|
**Released:** 2025-11-25
|
||||||
|
**Priority:** LOW
|
||||||
|
**Scope:** Single session (~128K tokens)
|
||||||
|
|
||||||
|
Refactored largest detectors to reduce complexity and improve maintainability.
|
||||||
|
|
||||||
|
**Results:**
|
||||||
|
| Detector | Before | After | Reduction |
|
||||||
|
|----------|--------|-------|-----------|
|
||||||
|
| AggregateBoundaryDetector | 381 lines | 162 lines | 57% ✅ |
|
||||||
|
| HardcodeDetector | 459 lines | 89 lines | 81% ✅ |
|
||||||
|
| RepositoryPatternDetector | 479 lines | 106 lines | 78% ✅ |
|
||||||
|
|
||||||
|
**Implemented Features:**
|
||||||
|
- ✅ Extracted 13 strategy classes for focused responsibilities
|
||||||
|
- ✅ Reduced file sizes by 57-81%
|
||||||
|
- ✅ Improved code organization and maintainability
|
||||||
|
- ✅ All 519 tests passing
|
||||||
|
- ✅ Zero ESLint errors, 1 pre-existing warning
|
||||||
|
- ✅ No breaking changes
|
||||||
|
|
||||||
|
**New Strategy Classes:**
|
||||||
|
- `FolderRegistry` - Centralized DDD folder name management
|
||||||
|
- `AggregatePathAnalyzer` - Path parsing and aggregate extraction
|
||||||
|
- `ImportValidator` - Import validation logic
|
||||||
|
- `BraceTracker` - Brace and bracket counting
|
||||||
|
- `ConstantsFileChecker` - Constants file detection
|
||||||
|
- `ExportConstantAnalyzer` - Export const analysis
|
||||||
|
- `MagicNumberMatcher` - Magic number detection
|
||||||
|
- `MagicStringMatcher` - Magic string detection
|
||||||
|
- `OrmTypeMatcher` - ORM type matching
|
||||||
|
- `MethodNameValidator` - Repository method validation
|
||||||
|
- `RepositoryFileAnalyzer` - File role detection
|
||||||
|
- `RepositoryViolationDetector` - Violation detection logic
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
### Version 0.8.0 - Secret Detection 🔐 ✅ RELEASED
|
||||||
|
|
||||||
|
**Released:** 2025-11-25
|
||||||
|
**Priority:** CRITICAL
|
||||||
|
|
||||||
|
Detect hardcoded secrets (API keys, tokens, credentials) using industry-standard Secretlint library.
|
||||||
|
|
||||||
|
**🎯 SecretDetector - NEW standalone detector:**
|
||||||
|
|
||||||
|
```typescript
|
||||||
|
// ❌ CRITICAL: Hardcoded AWS credentials
|
||||||
|
const AWS_KEY = "AKIA1234567890ABCDEF" // VIOLATION!
|
||||||
|
const AWS_SECRET = "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY" // VIOLATION!
|
||||||
|
|
||||||
|
// ❌ CRITICAL: Hardcoded GitHub token
|
||||||
|
const GITHUB_TOKEN = "ghp_1234567890abcdefghijklmnopqrstuv" // VIOLATION!
|
||||||
|
|
||||||
|
// ❌ CRITICAL: SSH Private Key in code
|
||||||
|
const privateKey = `-----BEGIN RSA PRIVATE KEY-----
|
||||||
|
MIIEpAIBAAKCAQEA...` // VIOLATION!
|
||||||
|
|
||||||
|
// ❌ CRITICAL: NPM token
|
||||||
|
//registry.npmjs.org/:_authToken=npm_abc123xyz // VIOLATION!
|
||||||
|
|
||||||
|
// ✅ GOOD: Use environment variables
|
||||||
|
const AWS_KEY = process.env.AWS_ACCESS_KEY_ID
|
||||||
|
const AWS_SECRET = process.env.AWS_SECRET_ACCESS_KEY
|
||||||
|
const GITHUB_TOKEN = process.env.GITHUB_TOKEN
|
||||||
|
```
|
||||||
|
|
||||||
|
**Planned Features:**
|
||||||
|
- ✅ **SecretDetector** - Standalone detector (separate from HardcodeDetector)
|
||||||
|
- ✅ **Secretlint Integration** - Industry-standard library (@secretlint/node)
|
||||||
|
- ✅ **350+ Secret Patterns** - AWS, GitHub, NPM, SSH, GCP, Slack, Basic Auth, etc.
|
||||||
|
- ✅ **CRITICAL Severity** - All secret violations marked as critical
|
||||||
|
- ✅ **Smart Suggestions** - Context-aware remediation per secret type
|
||||||
|
- ✅ **Clean Architecture** - New ISecretDetector interface, SecretViolation value object
|
||||||
|
- ✅ **CLI Integration** - New "🔐 Secrets" section in output
|
||||||
|
- ✅ **Parallel Execution** - Runs alongside existing detectors
|
||||||
|
|
||||||
|
**Secret Types Detected:**
|
||||||
|
- AWS Access Keys & Secret Keys
|
||||||
|
- GitHub Tokens (ghp_, github_pat_, gho_, etc.)
|
||||||
|
- NPM tokens in .npmrc and code
|
||||||
|
- SSH Private Keys
|
||||||
|
- GCP Service Account credentials
|
||||||
|
- Slack tokens (xoxb-, xoxp-, etc.)
|
||||||
|
- Basic Auth credentials
|
||||||
|
- JWT tokens
|
||||||
|
- Private encryption keys
|
||||||
|
|
||||||
|
**Architecture:**
|
||||||
|
```typescript
|
||||||
|
// New domain layer
|
||||||
|
interface ISecretDetector {
|
||||||
|
detectAll(code: string, filePath: string): Promise<SecretViolation[]>
|
||||||
|
}
|
||||||
|
|
||||||
|
class SecretViolation {
|
||||||
|
file: string
|
||||||
|
line: number
|
||||||
|
secretType: string // AWS, GitHub, NPM, etc.
|
||||||
|
message: string
|
||||||
|
severity: "critical"
|
||||||
|
suggestion: string // Context-aware guidance
|
||||||
|
}
|
||||||
|
|
||||||
|
// New infrastructure implementation
|
||||||
|
class SecretDetector implements ISecretDetector {
|
||||||
|
// Uses @secretlint/node internally
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
**Why Secretlint?**
|
||||||
|
- ✅ Actively maintained (updates weekly)
|
||||||
|
- ✅ TypeScript native
|
||||||
|
- ✅ Pluggable architecture
|
||||||
|
- ✅ Low false positives
|
||||||
|
- ✅ Industry standard
|
||||||
|
|
||||||
|
**Why NOT custom implementation?**
|
||||||
|
- ❌ No good npm library for magic numbers/strings
|
||||||
|
- ❌ Our HardcodeDetector is better than existing solutions
|
||||||
|
- ✅ Secretlint is perfect for secrets (don't reinvent the wheel)
|
||||||
|
- ✅ Two focused detectors better than one bloated detector
|
||||||
|
|
||||||
|
**Impact:**
|
||||||
|
Guardian will now catch critical security issues BEFORE they reach production, complementing existing magic number/string detection.
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
### Version 0.9.0 - Anemic Domain Model Detection 🩺
|
||||||
**Target:** Q2 2026
|
**Target:** Q2 2026
|
||||||
**Priority:** MEDIUM
|
**Priority:** MEDIUM
|
||||||
|
|
||||||
@@ -336,7 +597,7 @@ class Order {
|
|||||||
|
|
||||||
---
|
---
|
||||||
|
|
||||||
### Version 0.8.0 - Domain Event Usage Validation 📢
|
### Version 0.10.0 - Domain Event Usage Validation 📢
|
||||||
**Target:** Q2 2026
|
**Target:** Q2 2026
|
||||||
**Priority:** MEDIUM
|
**Priority:** MEDIUM
|
||||||
|
|
||||||
@@ -375,7 +636,7 @@ class Order {
|
|||||||
|
|
||||||
---
|
---
|
||||||
|
|
||||||
### Version 0.9.0 - Value Object Immutability Check 🔐
|
### Version 0.11.0 - Value Object Immutability Check 🔐
|
||||||
**Target:** Q2 2026
|
**Target:** Q2 2026
|
||||||
**Priority:** MEDIUM
|
**Priority:** MEDIUM
|
||||||
|
|
||||||
@@ -418,7 +679,7 @@ class Email {
|
|||||||
|
|
||||||
---
|
---
|
||||||
|
|
||||||
### Version 0.10.0 - Use Case Single Responsibility 🎯
|
### Version 0.12.0 - Use Case Single Responsibility 🎯
|
||||||
**Target:** Q2 2026
|
**Target:** Q2 2026
|
||||||
**Priority:** LOW
|
**Priority:** LOW
|
||||||
|
|
||||||
@@ -455,7 +716,7 @@ class SendWelcomeEmail {
|
|||||||
|
|
||||||
---
|
---
|
||||||
|
|
||||||
### Version 0.11.0 - Interface Segregation Validation 🔌
|
### Version 0.13.0 - Interface Segregation Validation 🔌
|
||||||
**Target:** Q2 2026
|
**Target:** Q2 2026
|
||||||
**Priority:** LOW
|
**Priority:** LOW
|
||||||
|
|
||||||
@@ -500,7 +761,7 @@ interface IUserExporter {
|
|||||||
|
|
||||||
---
|
---
|
||||||
|
|
||||||
### Version 0.12.0 - Port-Adapter Pattern Validation 🔌
|
### Version 0.14.0 - Port-Adapter Pattern Validation 🔌
|
||||||
**Target:** Q2 2026
|
**Target:** Q2 2026
|
||||||
**Priority:** MEDIUM
|
**Priority:** MEDIUM
|
||||||
|
|
||||||
@@ -539,7 +800,7 @@ class TwilioAdapter implements INotificationPort {
|
|||||||
|
|
||||||
---
|
---
|
||||||
|
|
||||||
### Version 0.13.0 - Configuration File Support ⚙️
|
### Version 0.15.0 - Configuration File Support ⚙️
|
||||||
**Target:** Q3 2026
|
**Target:** Q3 2026
|
||||||
**Priority:** MEDIUM
|
**Priority:** MEDIUM
|
||||||
|
|
||||||
@@ -590,7 +851,7 @@ export default {
|
|||||||
|
|
||||||
---
|
---
|
||||||
|
|
||||||
### Version 0.14.0 - Command Query Separation (CQS/CQRS) 📝
|
### Version 0.16.0 - Command Query Separation (CQS/CQRS) 📝
|
||||||
**Target:** Q3 2026
|
**Target:** Q3 2026
|
||||||
**Priority:** MEDIUM
|
**Priority:** MEDIUM
|
||||||
|
|
||||||
@@ -651,7 +912,7 @@ class GetUser { // Query
|
|||||||
|
|
||||||
---
|
---
|
||||||
|
|
||||||
### Version 0.15.0 - Factory Pattern Validation 🏭
|
### Version 0.17.0 - Factory Pattern Validation 🏭
|
||||||
**Target:** Q3 2026
|
**Target:** Q3 2026
|
||||||
**Priority:** LOW
|
**Priority:** LOW
|
||||||
|
|
||||||
@@ -734,7 +995,7 @@ class Order {
|
|||||||
|
|
||||||
---
|
---
|
||||||
|
|
||||||
### Version 0.16.0 - Specification Pattern Detection 🔍
|
### Version 0.18.0 - Specification Pattern Detection 🔍
|
||||||
**Target:** Q3 2026
|
**Target:** Q3 2026
|
||||||
**Priority:** MEDIUM
|
**Priority:** MEDIUM
|
||||||
|
|
||||||
@@ -806,7 +1067,7 @@ class ApproveOrder {
|
|||||||
|
|
||||||
---
|
---
|
||||||
|
|
||||||
### Version 0.17.0 - Layered Service Anti-pattern Detection ⚠️
|
### Version 0.19.0 - Layered Service Anti-pattern Detection ⚠️
|
||||||
**Target:** Q3 2026
|
**Target:** Q3 2026
|
||||||
**Priority:** MEDIUM
|
**Priority:** MEDIUM
|
||||||
|
|
||||||
@@ -883,7 +1144,7 @@ class OrderService {
|
|||||||
|
|
||||||
---
|
---
|
||||||
|
|
||||||
### Version 0.18.0 - Bounded Context Leak Detection 🚧
|
### Version 0.20.0 - Bounded Context Leak Detection 🚧
|
||||||
**Target:** Q3 2026
|
**Target:** Q3 2026
|
||||||
**Priority:** LOW
|
**Priority:** LOW
|
||||||
|
|
||||||
@@ -948,7 +1209,7 @@ class ProductPriceChangedHandler {
|
|||||||
|
|
||||||
---
|
---
|
||||||
|
|
||||||
### Version 0.19.0 - Transaction Script vs Domain Model Detection 📜
|
### Version 0.21.0 - Transaction Script vs Domain Model Detection 📜
|
||||||
**Target:** Q3 2026
|
**Target:** Q3 2026
|
||||||
**Priority:** LOW
|
**Priority:** LOW
|
||||||
|
|
||||||
@@ -1015,7 +1276,7 @@ class Order {
|
|||||||
|
|
||||||
---
|
---
|
||||||
|
|
||||||
### Version 0.20.0 - Persistence Ignorance Validation 💾
|
### Version 0.22.0 - Persistence Ignorance Validation 💾
|
||||||
**Target:** Q3 2026
|
**Target:** Q3 2026
|
||||||
**Priority:** MEDIUM
|
**Priority:** MEDIUM
|
||||||
|
|
||||||
@@ -1101,7 +1362,7 @@ class UserEntityMapper {
|
|||||||
|
|
||||||
---
|
---
|
||||||
|
|
||||||
### Version 0.21.0 - Null Object Pattern Detection 🎭
|
### Version 0.23.0 - Null Object Pattern Detection 🎭
|
||||||
**Target:** Q3 2026
|
**Target:** Q3 2026
|
||||||
**Priority:** LOW
|
**Priority:** LOW
|
||||||
|
|
||||||
@@ -1183,7 +1444,7 @@ class ProcessOrder {
|
|||||||
|
|
||||||
---
|
---
|
||||||
|
|
||||||
### Version 0.22.0 - Primitive Obsession in Methods 🔢
|
### Version 0.24.0 - Primitive Obsession in Methods 🔢
|
||||||
**Target:** Q3 2026
|
**Target:** Q3 2026
|
||||||
**Priority:** MEDIUM
|
**Priority:** MEDIUM
|
||||||
|
|
||||||
@@ -1250,7 +1511,7 @@ class Order {
|
|||||||
|
|
||||||
---
|
---
|
||||||
|
|
||||||
### Version 0.23.0 - Service Locator Anti-pattern 🔍
|
### Version 0.25.0 - Service Locator Anti-pattern 🔍
|
||||||
**Target:** Q4 2026
|
**Target:** Q4 2026
|
||||||
**Priority:** MEDIUM
|
**Priority:** MEDIUM
|
||||||
|
|
||||||
@@ -1310,7 +1571,7 @@ class CreateUser {
|
|||||||
|
|
||||||
---
|
---
|
||||||
|
|
||||||
### Version 0.24.0 - Double Dispatch Pattern Validation 🎯
|
### Version 0.26.0 - Double Dispatch Pattern Validation 🎯
|
||||||
**Target:** Q4 2026
|
**Target:** Q4 2026
|
||||||
**Priority:** LOW
|
**Priority:** LOW
|
||||||
|
|
||||||
@@ -1387,7 +1648,7 @@ class ShippingCostCalculator implements IOrderItemVisitor {
|
|||||||
|
|
||||||
---
|
---
|
||||||
|
|
||||||
### Version 0.25.0 - Entity Identity Validation 🆔
|
### Version 0.27.0 - Entity Identity Validation 🆔
|
||||||
**Target:** Q4 2026
|
**Target:** Q4 2026
|
||||||
**Priority:** MEDIUM
|
**Priority:** MEDIUM
|
||||||
|
|
||||||
@@ -1480,7 +1741,7 @@ class UserId {
|
|||||||
|
|
||||||
---
|
---
|
||||||
|
|
||||||
### Version 0.26.0 - Saga Pattern Detection 🔄
|
### Version 0.28.0 - Saga Pattern Detection 🔄
|
||||||
**Target:** Q4 2026
|
**Target:** Q4 2026
|
||||||
**Priority:** LOW
|
**Priority:** LOW
|
||||||
|
|
||||||
@@ -1578,7 +1839,7 @@ abstract class SagaStep {
|
|||||||
|
|
||||||
---
|
---
|
||||||
|
|
||||||
### Version 0.27.0 - Anti-Corruption Layer Detection 🛡️
|
### Version 0.29.0 - Anti-Corruption Layer Detection 🛡️
|
||||||
**Target:** Q4 2026
|
**Target:** Q4 2026
|
||||||
**Priority:** MEDIUM
|
**Priority:** MEDIUM
|
||||||
|
|
||||||
@@ -1664,7 +1925,7 @@ interface IOrderSyncPort {
|
|||||||
|
|
||||||
---
|
---
|
||||||
|
|
||||||
### Version 0.28.0 - Ubiquitous Language Validation 📖
|
### Version 0.30.0 - Ubiquitous Language Validation 📖
|
||||||
**Target:** Q4 2026
|
**Target:** Q4 2026
|
||||||
**Priority:** HIGH
|
**Priority:** HIGH
|
||||||
|
|
||||||
@@ -1851,5 +2112,5 @@ Until we reach 1.0.0, minor version bumps (0.x.0) may include breaking changes a
|
|||||||
|
|
||||||
---
|
---
|
||||||
|
|
||||||
**Last Updated:** 2025-11-24
|
**Last Updated:** 2025-11-25
|
||||||
**Current Version:** 0.6.0
|
**Current Version:** 0.7.7
|
||||||
|
|||||||
906
packages/guardian/ROADMAP_NEW.md
Normal file
906
packages/guardian/ROADMAP_NEW.md
Normal file
@@ -0,0 +1,906 @@
|
|||||||
|
# Guardian Roadmap 🗺️
|
||||||
|
|
||||||
|
**Last Updated:** 2025-01-24
|
||||||
|
**Current Version:** 0.5.1
|
||||||
|
|
||||||
|
This document outlines the current features and strategic roadmap for @puaros/guardian, prioritized based on market competition analysis and enterprise adoption requirements.
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 📊 Current State (v0.5.1) ✅
|
||||||
|
|
||||||
|
### ✨ Unique Competitive Advantages
|
||||||
|
|
||||||
|
Guardian currently has **5 unique features** that competitors don't offer:
|
||||||
|
|
||||||
|
| Feature | Status | Competitors |
|
||||||
|
|---------|--------|-------------|
|
||||||
|
| **Hardcode Detection + AI Suggestions** | ✅ Released | ❌ None |
|
||||||
|
| **Framework Leak Detection** | ✅ Released | ❌ None |
|
||||||
|
| **Entity Exposure Detection** | ✅ Released (v0.3.0) | ❌ None |
|
||||||
|
| **Dependency Direction Enforcement** | ✅ Released (v0.4.0) | ⚠️ dependency-cruiser (via rules) |
|
||||||
|
| **Repository Pattern Validation** | ✅ Released (v0.5.0) | ❌ None |
|
||||||
|
|
||||||
|
### 🛠️ Core Features (v0.1.0-v0.5.0)
|
||||||
|
|
||||||
|
**Detection Capabilities:**
|
||||||
|
- ✅ Hardcode detection (magic numbers, magic strings) with smart suggestions
|
||||||
|
- ✅ Circular dependency detection
|
||||||
|
- ✅ Naming convention enforcement (DDD layer-based rules)
|
||||||
|
- ✅ Clean Architecture layer violations
|
||||||
|
- ✅ Framework leak detection (domain importing frameworks)
|
||||||
|
- ✅ Entity exposure in API responses (v0.3.0)
|
||||||
|
- ✅ Dependency direction validation (v0.4.0)
|
||||||
|
- ✅ Repository pattern validation (v0.5.0)
|
||||||
|
|
||||||
|
**Developer Experience:**
|
||||||
|
- ✅ CLI interface with `guardian check` command
|
||||||
|
- ✅ Smart constant name suggestions
|
||||||
|
- ✅ Layer distribution analysis
|
||||||
|
- ✅ Detailed violation reports with file:line:column
|
||||||
|
- ✅ Context snippets for each issue
|
||||||
|
|
||||||
|
**Quality & Testing:**
|
||||||
|
- ✅ 194 tests across 7 test files (all passing)
|
||||||
|
- ✅ 80%+ code coverage on all metrics
|
||||||
|
- ✅ Self-analysis: 0 violations (100% clean codebase)
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 🎯 Strategic Roadmap Overview
|
||||||
|
|
||||||
|
### Phase 1: Market Parity (v0.6-v0.10) - Q1-Q2 2026
|
||||||
|
**Goal:** Match competitors' baseline features to enable enterprise adoption
|
||||||
|
|
||||||
|
- Configuration & Presets
|
||||||
|
- Visualization & Dependency Graphs
|
||||||
|
- CI/CD Integration Kit
|
||||||
|
- Auto-Fix & Code Generation (UNIQUE!)
|
||||||
|
- Metrics & Quality Score
|
||||||
|
|
||||||
|
### Phase 2: DDD Specialization (v0.11-v0.27) - Q3-Q4 2026
|
||||||
|
**Goal:** Deepen DDD/Clean Architecture expertise
|
||||||
|
|
||||||
|
- Advanced DDD pattern detection (25+ features)
|
||||||
|
- Aggregate boundaries, Domain Events, Value Objects
|
||||||
|
- CQRS, Saga Pattern, Anti-Corruption Layer
|
||||||
|
- Ubiquitous Language validation
|
||||||
|
|
||||||
|
### Phase 3: Enterprise Ecosystem (v1.0+) - Q4 2026+
|
||||||
|
**Goal:** Full-featured enterprise platform
|
||||||
|
|
||||||
|
- VS Code extension
|
||||||
|
- JetBrains plugin
|
||||||
|
- Web dashboard
|
||||||
|
- Team analytics
|
||||||
|
- Multi-language support
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 📅 Detailed Roadmap
|
||||||
|
|
||||||
|
## Version 0.6.0 - Configuration & Presets ⚙️
|
||||||
|
**Target:** Q1 2026 (January-February)
|
||||||
|
**Priority:** 🔥 CRITICAL
|
||||||
|
|
||||||
|
> **Why Critical:** All competitors (SonarQube, ESLint, dependency-cruiser) have configuration. Without this, Guardian cannot be customized for different teams/projects.
|
||||||
|
|
||||||
|
### Features
|
||||||
|
|
||||||
|
#### 1. Configuration File Support
|
||||||
|
|
||||||
|
```javascript
|
||||||
|
// guardian.config.js (primary)
|
||||||
|
export default {
|
||||||
|
// Zero-config presets
|
||||||
|
preset: 'clean-architecture', // or 'ddd', 'hexagonal', 'onion'
|
||||||
|
|
||||||
|
// Rule configuration
|
||||||
|
rules: {
|
||||||
|
'hardcode/magic-numbers': 'error',
|
||||||
|
'hardcode/magic-strings': 'warn',
|
||||||
|
'architecture/layer-violation': 'error',
|
||||||
|
'architecture/framework-leak': 'error',
|
||||||
|
'architecture/entity-exposure': 'error',
|
||||||
|
'circular-dependency': 'error',
|
||||||
|
'naming-convention': 'warn',
|
||||||
|
'dependency-direction': 'error',
|
||||||
|
'repository-pattern': 'error',
|
||||||
|
},
|
||||||
|
|
||||||
|
// Custom layer paths
|
||||||
|
layers: {
|
||||||
|
domain: 'src/core/domain',
|
||||||
|
application: 'src/core/application',
|
||||||
|
infrastructure: 'src/adapters',
|
||||||
|
shared: 'src/shared',
|
||||||
|
},
|
||||||
|
|
||||||
|
// Exclusions
|
||||||
|
exclude: [
|
||||||
|
'**/*.test.ts',
|
||||||
|
'**/*.spec.ts',
|
||||||
|
'scripts/',
|
||||||
|
'migrations/',
|
||||||
|
'node_modules/',
|
||||||
|
],
|
||||||
|
|
||||||
|
// Per-rule ignores
|
||||||
|
ignore: {
|
||||||
|
'hardcode/magic-numbers': {
|
||||||
|
'src/config/constants.ts': [3000, 8080],
|
||||||
|
},
|
||||||
|
},
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
#### 2. Built-in Presets
|
||||||
|
|
||||||
|
```javascript
|
||||||
|
// Preset: clean-architecture (default)
|
||||||
|
preset: 'clean-architecture'
|
||||||
|
// Enables: layer-violation, dependency-direction, naming-convention
|
||||||
|
|
||||||
|
// Preset: ddd
|
||||||
|
preset: 'ddd'
|
||||||
|
// Enables all DDD patterns: aggregates, value-objects, domain-events
|
||||||
|
|
||||||
|
// Preset: hexagonal (Ports & Adapters)
|
||||||
|
preset: 'hexagonal'
|
||||||
|
// Validates port/adapter separation
|
||||||
|
|
||||||
|
// Preset: minimal (for prototyping)
|
||||||
|
preset: 'minimal'
|
||||||
|
// Only critical rules: hardcode, circular-deps
|
||||||
|
```
|
||||||
|
|
||||||
|
#### 3. Framework-Specific Presets
|
||||||
|
|
||||||
|
```javascript
|
||||||
|
// NestJS
|
||||||
|
preset: 'nestjs-clean-architecture'
|
||||||
|
|
||||||
|
// Express
|
||||||
|
preset: 'express-clean-architecture'
|
||||||
|
|
||||||
|
// Next.js
|
||||||
|
preset: 'nextjs-clean-architecture'
|
||||||
|
```
|
||||||
|
|
||||||
|
#### 4. Configuration Discovery
|
||||||
|
|
||||||
|
Support multiple config file formats:
|
||||||
|
- `guardian.config.js` (ES modules)
|
||||||
|
- `guardian.config.cjs` (CommonJS)
|
||||||
|
- `.guardianrc` (JSON)
|
||||||
|
- `.guardianrc.json`
|
||||||
|
- `package.json` (`guardian` field)
|
||||||
|
|
||||||
|
#### 5. CLI Override
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Override config from CLI
|
||||||
|
guardian check ./src --rule hardcode/magic-numbers=off
|
||||||
|
|
||||||
|
# Use specific config file
|
||||||
|
guardian check ./src --config custom-config.js
|
||||||
|
|
||||||
|
# Generate config
|
||||||
|
guardian init --preset clean-architecture
|
||||||
|
```
|
||||||
|
|
||||||
|
### Implementation Tasks
|
||||||
|
- [ ] Create config parser and validator
|
||||||
|
- [ ] Implement preset system
|
||||||
|
- [ ] Add config discovery logic
|
||||||
|
- [ ] Update AnalyzeProject use case to accept config
|
||||||
|
- [ ] CLI integration for config override
|
||||||
|
- [ ] Add `guardian init` command
|
||||||
|
- [ ] Documentation and examples
|
||||||
|
- [ ] Tests (config parsing, presets, overrides)
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Version 0.7.0 - Visualization & Dependency Graphs 🎨
|
||||||
|
**Target:** Q1 2026 (March)
|
||||||
|
**Priority:** 🔥 HIGH
|
||||||
|
|
||||||
|
> **Why High:** dependency-cruiser's main advantage is visualization. Guardian needs this to compete.
|
||||||
|
|
||||||
|
### Features
|
||||||
|
|
||||||
|
#### 1. Dependency Graph Visualization
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Generate SVG graph
|
||||||
|
guardian visualize ./src --output architecture.svg
|
||||||
|
|
||||||
|
# Interactive HTML
|
||||||
|
guardian visualize ./src --format html --output report.html
|
||||||
|
|
||||||
|
# Mermaid diagram for docs
|
||||||
|
guardian graph ./src --format mermaid > ARCHITECTURE.md
|
||||||
|
|
||||||
|
# ASCII tree for terminal
|
||||||
|
guardian visualize ./src --format ascii
|
||||||
|
```
|
||||||
|
|
||||||
|
#### 2. Layer Dependency Diagram
|
||||||
|
|
||||||
|
```mermaid
|
||||||
|
graph TD
|
||||||
|
I[Infrastructure Layer] --> A[Application Layer]
|
||||||
|
I --> D[Domain Layer]
|
||||||
|
A --> D
|
||||||
|
D --> S[Shared]
|
||||||
|
A --> S
|
||||||
|
I --> S
|
||||||
|
|
||||||
|
style D fill:#4CAF50
|
||||||
|
style A fill:#2196F3
|
||||||
|
style I fill:#FF9800
|
||||||
|
style S fill:#9E9E9E
|
||||||
|
```
|
||||||
|
|
||||||
|
#### 3. Violation Highlighting
|
||||||
|
|
||||||
|
Visualize violations on graph:
|
||||||
|
- 🔴 Circular dependencies (red arrows)
|
||||||
|
- ⚠️ Framework leaks (yellow highlights)
|
||||||
|
- 🚫 Wrong dependency direction (dashed red arrows)
|
||||||
|
- ✅ Correct dependencies (green arrows)
|
||||||
|
|
||||||
|
#### 4. Metrics Overlay
|
||||||
|
|
||||||
|
```bash
|
||||||
|
guardian visualize ./src --show-metrics
|
||||||
|
|
||||||
|
# Shows on each node:
|
||||||
|
# - File count per layer
|
||||||
|
# - Hardcode violations count
|
||||||
|
# - Complexity score
|
||||||
|
```
|
||||||
|
|
||||||
|
#### 5. Export Formats
|
||||||
|
|
||||||
|
- SVG (for docs/website)
|
||||||
|
- PNG (for presentations)
|
||||||
|
- HTML (interactive, zoomable)
|
||||||
|
- Mermaid (for markdown docs)
|
||||||
|
- DOT (Graphviz format)
|
||||||
|
- JSON (for custom processing)
|
||||||
|
|
||||||
|
### Implementation Tasks
|
||||||
|
- [ ] Implement graph generation engine
|
||||||
|
- [ ] Add SVG/PNG renderer
|
||||||
|
- [ ] Create Mermaid diagram generator
|
||||||
|
- [ ] Build HTML interactive viewer
|
||||||
|
- [ ] Add violation highlighting
|
||||||
|
- [ ] Metrics overlay system
|
||||||
|
- [ ] CLI commands (`visualize`, `graph`)
|
||||||
|
- [ ] Documentation and examples
|
||||||
|
- [ ] Tests (graph generation, formats)
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Version 0.8.0 - CI/CD Integration Kit 🚀
|
||||||
|
**Target:** Q2 2026 (April)
|
||||||
|
**Priority:** 🔥 HIGH
|
||||||
|
|
||||||
|
> **Why High:** Enterprise requires CI/CD integration. SonarQube succeeds because of this.
|
||||||
|
|
||||||
|
### Features
|
||||||
|
|
||||||
|
#### 1. GitHub Actions
|
||||||
|
|
||||||
|
```yaml
|
||||||
|
# .github/workflows/guardian.yml (ready-to-use template)
|
||||||
|
name: Guardian Quality Check
|
||||||
|
|
||||||
|
on: [push, pull_request]
|
||||||
|
|
||||||
|
jobs:
|
||||||
|
guardian:
|
||||||
|
runs-on: ubuntu-latest
|
||||||
|
steps:
|
||||||
|
- uses: actions/checkout@v3
|
||||||
|
- uses: actions/setup-node@v3
|
||||||
|
|
||||||
|
- name: Guardian Analysis
|
||||||
|
uses: puaros/guardian-action@v1
|
||||||
|
with:
|
||||||
|
path: './src'
|
||||||
|
fail-on: 'error'
|
||||||
|
report-format: 'markdown'
|
||||||
|
|
||||||
|
- name: Comment PR
|
||||||
|
uses: actions/github-script@v6
|
||||||
|
if: github.event_name == 'pull_request'
|
||||||
|
with:
|
||||||
|
script: |
|
||||||
|
// Auto-comment violations on PR
|
||||||
|
```
|
||||||
|
|
||||||
|
#### 2. GitLab CI Template
|
||||||
|
|
||||||
|
```yaml
|
||||||
|
# .gitlab-ci.yml
|
||||||
|
include:
|
||||||
|
- template: Guardian.gitlab-ci.yml
|
||||||
|
|
||||||
|
guardian_check:
|
||||||
|
stage: test
|
||||||
|
extends: .guardian
|
||||||
|
variables:
|
||||||
|
GUARDIAN_FAIL_ON: "error"
|
||||||
|
GUARDIAN_FORMAT: "markdown"
|
||||||
|
```
|
||||||
|
|
||||||
|
#### 3. Quality Gate
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Fail build on violations
|
||||||
|
guardian check ./src --fail-on error
|
||||||
|
guardian check ./src --fail-on warning
|
||||||
|
|
||||||
|
# Threshold-based
|
||||||
|
guardian check ./src --max-violations 10
|
||||||
|
guardian check ./src --max-hardcode 5
|
||||||
|
```
|
||||||
|
|
||||||
|
#### 4. PR Auto-Comments
|
||||||
|
|
||||||
|
Automatically comment on PRs with:
|
||||||
|
- Summary of violations
|
||||||
|
- Comparison with base branch
|
||||||
|
- Quality score change
|
||||||
|
- Actionable suggestions
|
||||||
|
|
||||||
|
```markdown
|
||||||
|
## 🛡️ Guardian Report
|
||||||
|
|
||||||
|
**Quality Score:** 87/100 (⬆️ +3 from main)
|
||||||
|
|
||||||
|
### Violations Found: 5
|
||||||
|
|
||||||
|
#### 🔴 Critical (2)
|
||||||
|
- `src/api/server.ts:15` - Hardcoded port 3000
|
||||||
|
- `src/domain/User.ts:10` - Framework leak (Express)
|
||||||
|
|
||||||
|
#### ⚠️ Warnings (3)
|
||||||
|
- `src/services/UserService.ts` - Naming convention
|
||||||
|
- ...
|
||||||
|
|
||||||
|
[View Full Report](link)
|
||||||
|
```
|
||||||
|
|
||||||
|
#### 5. Pre-commit Hook
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Install via npx
|
||||||
|
npx guardian install-hooks
|
||||||
|
|
||||||
|
# Creates .husky/pre-commit
|
||||||
|
#!/bin/sh
|
||||||
|
guardian check --staged --fail-on error
|
||||||
|
```
|
||||||
|
|
||||||
|
#### 6. Status Checks
|
||||||
|
|
||||||
|
Integrate with GitHub/GitLab status checks:
|
||||||
|
- ✅ No violations
|
||||||
|
- ⚠️ Warnings only
|
||||||
|
- ❌ Errors found
|
||||||
|
|
||||||
|
### Implementation Tasks
|
||||||
|
- [ ] Create GitHub Action
|
||||||
|
- [ ] Create GitLab CI template
|
||||||
|
- [ ] Implement quality gate logic
|
||||||
|
- [ ] Build PR comment generator
|
||||||
|
- [ ] Pre-commit hook installer
|
||||||
|
- [ ] Status check integration
|
||||||
|
- [ ] Bitbucket Pipelines support
|
||||||
|
- [ ] Documentation and examples
|
||||||
|
- [ ] Tests (CI/CD scenarios)
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Version 0.9.0 - Auto-Fix & Code Generation 🤖
|
||||||
|
**Target:** Q2 2026 (May)
|
||||||
|
**Priority:** 🚀 GAME-CHANGER (UNIQUE!)
|
||||||
|
|
||||||
|
> **Why Game-Changer:** No competitor has intelligent auto-fix for architecture. This makes Guardian unique!
|
||||||
|
|
||||||
|
### Features
|
||||||
|
|
||||||
|
#### 1. Auto-Fix Hardcode
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Fix all hardcode violations automatically
|
||||||
|
guardian fix ./src --auto
|
||||||
|
|
||||||
|
# Preview changes
|
||||||
|
guardian fix ./src --dry-run
|
||||||
|
|
||||||
|
# Fix specific types
|
||||||
|
guardian fix ./src --type hardcode
|
||||||
|
guardian fix ./src --type naming
|
||||||
|
```
|
||||||
|
|
||||||
|
**Example:**
|
||||||
|
|
||||||
|
```typescript
|
||||||
|
// Before
|
||||||
|
const timeout = 5000
|
||||||
|
app.listen(3000)
|
||||||
|
|
||||||
|
// After (auto-generated constants.ts)
|
||||||
|
export const DEFAULT_TIMEOUT_MS = 5000
|
||||||
|
export const DEFAULT_PORT = 3000
|
||||||
|
|
||||||
|
// After (fixed code)
|
||||||
|
import { DEFAULT_TIMEOUT_MS, DEFAULT_PORT } from './constants'
|
||||||
|
const timeout = DEFAULT_TIMEOUT_MS
|
||||||
|
app.listen(DEFAULT_PORT)
|
||||||
|
```
|
||||||
|
|
||||||
|
#### 2. Generate Constants File
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Extract all hardcodes to constants
|
||||||
|
guardian generate constants ./src --output src/config/constants.ts
|
||||||
|
|
||||||
|
# Generated file:
|
||||||
|
// src/config/constants.ts
|
||||||
|
export const DEFAULT_TIMEOUT_MS = 5000
|
||||||
|
export const DEFAULT_PORT = 3000
|
||||||
|
export const MAX_RETRIES = 3
|
||||||
|
export const API_BASE_URL = 'http://localhost:8080'
|
||||||
|
```
|
||||||
|
|
||||||
|
#### 3. Fix Naming Violations
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Rename files to match conventions
|
||||||
|
guardian fix naming ./src --auto
|
||||||
|
|
||||||
|
# Before: src/application/use-cases/user.ts
|
||||||
|
# After: src/application/use-cases/CreateUser.ts
|
||||||
|
```
|
||||||
|
|
||||||
|
#### 4. AI-Friendly Fix Prompts
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Generate prompt for AI assistant
|
||||||
|
guardian check ./src --format ai-prompt > fix-prompt.txt
|
||||||
|
|
||||||
|
# Output (optimized for Claude/GPT):
|
||||||
|
"""
|
||||||
|
Fix the following Guardian violations:
|
||||||
|
|
||||||
|
1. HARDCODE (src/api/server.ts:15)
|
||||||
|
- Replace: app.listen(3000)
|
||||||
|
- With: Extract 3000 to DEFAULT_PORT constant
|
||||||
|
- Location: Create src/config/constants.ts
|
||||||
|
|
||||||
|
2. FRAMEWORK_LEAK (src/domain/User.ts:5)
|
||||||
|
- Remove: import { Request } from 'express'
|
||||||
|
- Reason: Domain layer cannot import frameworks
|
||||||
|
- Suggestion: Use dependency injection via interfaces
|
||||||
|
|
||||||
|
[Complete fix suggestions...]
|
||||||
|
"""
|
||||||
|
|
||||||
|
# Then feed to Claude:
|
||||||
|
# cat fix-prompt.txt | pbcopy
|
||||||
|
# Paste into Claude: "Fix these Guardian violations"
|
||||||
|
```
|
||||||
|
|
||||||
|
#### 5. Interactive Fix Mode
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Interactive fix selection
|
||||||
|
guardian fix ./src --interactive
|
||||||
|
|
||||||
|
# Prompts:
|
||||||
|
# ? Fix hardcode in server.ts:15 (3000)? (Y/n)
|
||||||
|
# ? Suggested constant name: DEFAULT_PORT
|
||||||
|
# [Edit name] [Skip] [Fix All]
|
||||||
|
```
|
||||||
|
|
||||||
|
#### 6. Refactoring Commands
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Break circular dependency
|
||||||
|
guardian refactor circular ./src/services/UserService.ts
|
||||||
|
# Suggests: Extract shared interface
|
||||||
|
|
||||||
|
# Fix layer violation
|
||||||
|
guardian refactor layer ./src/domain/entities/User.ts
|
||||||
|
# Suggests: Move framework imports to infrastructure
|
||||||
|
```
|
||||||
|
|
||||||
|
### Implementation Tasks
|
||||||
|
- [ ] Implement auto-fix engine (AST transformation)
|
||||||
|
- [ ] Constants extractor and generator
|
||||||
|
- [ ] File renaming system
|
||||||
|
- [ ] AI prompt generator
|
||||||
|
- [ ] Interactive fix mode
|
||||||
|
- [ ] Refactoring suggestions
|
||||||
|
- [ ] Safe rollback mechanism
|
||||||
|
- [ ] Documentation and examples
|
||||||
|
- [ ] Tests (fix scenarios, edge cases)
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Version 0.10.0 - Metrics & Quality Score 📊
|
||||||
|
**Target:** Q2 2026 (June)
|
||||||
|
**Priority:** 🔥 HIGH
|
||||||
|
|
||||||
|
> **Why High:** Enterprise needs metrics to justify investment. SonarQube's dashboard is a major selling point.
|
||||||
|
|
||||||
|
### Features
|
||||||
|
|
||||||
|
#### 1. Quality Score (0-100)
|
||||||
|
|
||||||
|
```bash
|
||||||
|
guardian score ./src
|
||||||
|
|
||||||
|
# Output:
|
||||||
|
# 🛡️ Guardian Quality Score: 87/100 (Good)
|
||||||
|
#
|
||||||
|
# Category Breakdown:
|
||||||
|
# ✅ Architecture: 95/100 (Excellent)
|
||||||
|
# ⚠️ Hardcode: 78/100 (Needs Improvement)
|
||||||
|
# ✅ Naming: 92/100 (Excellent)
|
||||||
|
# ✅ Dependencies: 89/100 (Good)
|
||||||
|
```
|
||||||
|
|
||||||
|
**Score Calculation:**
|
||||||
|
- Architecture violations: -5 per error
|
||||||
|
- Hardcode violations: -1 per occurrence
|
||||||
|
- Circular dependencies: -10 per cycle
|
||||||
|
- Naming violations: -2 per error
|
||||||
|
|
||||||
|
#### 2. Metrics Dashboard (JSON/HTML)
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Export metrics
|
||||||
|
guardian metrics ./src --format json > metrics.json
|
||||||
|
guardian metrics ./src --format html > dashboard.html
|
||||||
|
|
||||||
|
# Metrics included:
|
||||||
|
{
|
||||||
|
"qualityScore": 87,
|
||||||
|
"violations": {
|
||||||
|
"hardcode": 12,
|
||||||
|
"circular": 0,
|
||||||
|
"architecture": 2,
|
||||||
|
"naming": 5
|
||||||
|
},
|
||||||
|
"metrics": {
|
||||||
|
"totalFiles": 45,
|
||||||
|
"totalLOC": 3500,
|
||||||
|
"hardcodePerKLOC": 3.4,
|
||||||
|
"averageFilesPerLayer": 11.25
|
||||||
|
},
|
||||||
|
"trends": {
|
||||||
|
"scoreChange": "+5",
|
||||||
|
"violationsChange": "-8"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
#### 3. Trend Analysis
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Compare with main branch
|
||||||
|
guardian metrics ./src --compare-with main
|
||||||
|
|
||||||
|
# Output:
|
||||||
|
# Quality Score: 87/100 (⬆️ +3 from main)
|
||||||
|
#
|
||||||
|
# Changes:
|
||||||
|
# ✅ Hardcode violations: 12 (⬇️ -5)
|
||||||
|
# ⚠️ Naming violations: 5 (⬆️ +2)
|
||||||
|
# ✅ Circular deps: 0 (⬇️ -1)
|
||||||
|
```
|
||||||
|
|
||||||
|
#### 4. Historical Tracking
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Store metrics history
|
||||||
|
guardian metrics ./src --save
|
||||||
|
|
||||||
|
# View trends
|
||||||
|
guardian trends --last 30d
|
||||||
|
|
||||||
|
# Output: ASCII graph showing quality score over time
|
||||||
|
```
|
||||||
|
|
||||||
|
#### 5. Export for Dashboards
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Prometheus format
|
||||||
|
guardian metrics ./src --format prometheus
|
||||||
|
|
||||||
|
# Grafana JSON
|
||||||
|
guardian metrics ./src --format grafana
|
||||||
|
|
||||||
|
# CSV for Excel
|
||||||
|
guardian metrics ./src --format csv
|
||||||
|
```
|
||||||
|
|
||||||
|
#### 6. Badge Generation
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Generate badge for README
|
||||||
|
guardian badge ./src --output badge.svg
|
||||||
|
|
||||||
|
# Markdown badge
|
||||||
|

|
||||||
|
```
|
||||||
|
|
||||||
|
### Implementation Tasks
|
||||||
|
- [ ] Quality score calculation algorithm
|
||||||
|
- [ ] Metrics collection system
|
||||||
|
- [ ] Trend analysis engine
|
||||||
|
- [ ] JSON/HTML/Prometheus exporters
|
||||||
|
- [ ] Historical data storage
|
||||||
|
- [ ] Badge generator
|
||||||
|
- [ ] CLI commands (`score`, `metrics`, `trends`, `badge`)
|
||||||
|
- [ ] Documentation and examples
|
||||||
|
- [ ] Tests (metrics calculation, exports)
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Version 0.11.0+ - DDD Specialization 🏗️
|
||||||
|
**Target:** Q3-Q4 2026
|
||||||
|
**Priority:** MEDIUM (After Market Parity)
|
||||||
|
|
||||||
|
Now we can focus on Guardian's unique DDD/Clean Architecture specialization:
|
||||||
|
|
||||||
|
### v0.11.0 - Aggregate Boundary Validation 🔒
|
||||||
|
- Detect entity references across aggregates
|
||||||
|
- Enforce ID-only references between aggregates
|
||||||
|
- Validate aggregate root access patterns
|
||||||
|
|
||||||
|
### v0.12.0 - Anemic Domain Model Detection 🩺
|
||||||
|
- Detect entities with only getters/setters
|
||||||
|
- Count methods vs properties ratio
|
||||||
|
- Suggest moving logic from services to entities
|
||||||
|
|
||||||
|
### v0.13.0 - Domain Event Validation 📢
|
||||||
|
- Validate event publishing pattern
|
||||||
|
- Check events inherit from DomainEvent base
|
||||||
|
- Detect direct infrastructure calls from entities
|
||||||
|
|
||||||
|
### v0.14.0 - Value Object Immutability 🔐
|
||||||
|
- Ensure Value Objects have readonly fields
|
||||||
|
- Detect public setters
|
||||||
|
- Verify equals() method exists
|
||||||
|
|
||||||
|
### v0.15.0 - Use Case Single Responsibility 🎯
|
||||||
|
- Check Use Case has single public method (execute)
|
||||||
|
- Detect multiple responsibilities
|
||||||
|
- Suggest splitting large Use Cases
|
||||||
|
|
||||||
|
### v0.16.0 - Interface Segregation 🔌
|
||||||
|
- Count methods per interface (> 10 = warning)
|
||||||
|
- Check method cohesion
|
||||||
|
- Suggest interface splitting
|
||||||
|
|
||||||
|
### v0.17.0 - Port-Adapter Pattern 🔌
|
||||||
|
- Check Ports (interfaces) are in application/domain
|
||||||
|
- Verify Adapters are in infrastructure
|
||||||
|
- Detect external library imports in use cases
|
||||||
|
|
||||||
|
### v0.18.0 - Command Query Separation (CQRS) 📝
|
||||||
|
- Detect methods that both change state and return data
|
||||||
|
- Check Use Case names for CQS violations
|
||||||
|
- Validate Command Use Cases return void
|
||||||
|
|
||||||
|
### v0.19.0 - Factory Pattern Validation 🏭
|
||||||
|
- Detect complex logic in entity constructors
|
||||||
|
- Check for `new Entity()` calls in use cases
|
||||||
|
- Suggest extracting construction to Factory
|
||||||
|
|
||||||
|
### v0.20.0 - Specification Pattern Detection 🔍
|
||||||
|
- Detect complex business rules in use cases
|
||||||
|
- Validate Specification classes in domain
|
||||||
|
- Suggest extracting rules to Specifications
|
||||||
|
|
||||||
|
### v0.21.0 - Layered Service Anti-pattern ⚠️
|
||||||
|
- Detect service methods operating on single entity
|
||||||
|
- Validate entities have behavior methods
|
||||||
|
- Suggest moving service methods to entities
|
||||||
|
|
||||||
|
### v0.22.0 - Bounded Context Leak Detection 🚧
|
||||||
|
- Detect entity imports across contexts
|
||||||
|
- Validate only ID references between contexts
|
||||||
|
- Verify event-based integration
|
||||||
|
|
||||||
|
### v0.23.0 - Transaction Script Detection 📜
|
||||||
|
- Detect procedural logic in use cases
|
||||||
|
- Check use case length (> 30-50 lines = warning)
|
||||||
|
- Suggest moving logic to domain entities
|
||||||
|
|
||||||
|
### v0.24.0 - Persistence Ignorance 💾
|
||||||
|
- Detect ORM decorators in domain entities
|
||||||
|
- Check for ORM library imports in domain
|
||||||
|
- Suggest persistence ignorance pattern
|
||||||
|
|
||||||
|
### v0.25.0 - Null Object Pattern Detection 🎭
|
||||||
|
- Count null checks in use cases
|
||||||
|
- Suggest Null Object pattern
|
||||||
|
- Detect repositories returning null vs Null Object
|
||||||
|
|
||||||
|
### v0.26.0 - Primitive Obsession Detection 🔢
|
||||||
|
- Detect methods with > 3 primitive parameters
|
||||||
|
- Check for common Value Object candidates
|
||||||
|
- Suggest creating Value Objects
|
||||||
|
|
||||||
|
### v0.27.0 - Service Locator Anti-pattern 🔍
|
||||||
|
- Detect global ServiceLocator/Registry classes
|
||||||
|
- Validate constructor injection
|
||||||
|
- Suggest DI container usage
|
||||||
|
|
||||||
|
### v0.28.0 - Double Dispatch Pattern 🎯
|
||||||
|
- Detect frequent instanceof or type checking
|
||||||
|
- Check for long if-else/switch by type
|
||||||
|
- Suggest Visitor pattern
|
||||||
|
|
||||||
|
### v0.29.0 - Entity Identity Validation 🆔
|
||||||
|
- Detect public mutable ID fields
|
||||||
|
- Validate ID is Value Object
|
||||||
|
- Check for equals() method implementation
|
||||||
|
|
||||||
|
### v0.30.0 - Saga Pattern Detection 🔄
|
||||||
|
- Detect multiple external calls without compensation
|
||||||
|
- Validate compensating transactions
|
||||||
|
- Suggest Saga pattern for distributed operations
|
||||||
|
|
||||||
|
### v0.31.0 - Anti-Corruption Layer Detection 🛡️
|
||||||
|
- Detect direct legacy library imports
|
||||||
|
- Check for domain adaptation to external APIs
|
||||||
|
- Validate translator/adapter layer exists
|
||||||
|
|
||||||
|
### v0.32.0 - Ubiquitous Language Validation 📖
|
||||||
|
**Priority: HIGH**
|
||||||
|
- Detect synonyms for same concepts (User/Customer/Client)
|
||||||
|
- Check inconsistent verbs (Create/Register/SignUp)
|
||||||
|
- Require Ubiquitous Language glossary
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Version 1.0.0 - Stable Release 🚀
|
||||||
|
**Target:** Q4 2026 (December)
|
||||||
|
**Priority:** 🔥 CRITICAL
|
||||||
|
|
||||||
|
Production-ready stable release with ecosystem:
|
||||||
|
|
||||||
|
### Core Features
|
||||||
|
- ✅ All detection features stabilized
|
||||||
|
- ✅ Configuration & presets
|
||||||
|
- ✅ Visualization & graphs
|
||||||
|
- ✅ CI/CD integration
|
||||||
|
- ✅ Auto-fix & code generation
|
||||||
|
- ✅ Metrics & quality score
|
||||||
|
- ✅ 30+ DDD pattern detectors
|
||||||
|
|
||||||
|
### Ecosystem
|
||||||
|
|
||||||
|
#### VS Code Extension
|
||||||
|
- Real-time detection as you type
|
||||||
|
- Inline suggestions and quick fixes
|
||||||
|
- Problem panel integration
|
||||||
|
- Code actions for auto-fix
|
||||||
|
|
||||||
|
#### JetBrains Plugin
|
||||||
|
- IntelliJ IDEA, WebStorm support
|
||||||
|
- Inspection integration
|
||||||
|
- Quick fixes
|
||||||
|
|
||||||
|
#### Web Dashboard
|
||||||
|
- Team quality metrics
|
||||||
|
- Historical trends
|
||||||
|
- Per-developer analytics
|
||||||
|
- Project comparison
|
||||||
|
|
||||||
|
#### GitHub Integration
|
||||||
|
- GitHub App
|
||||||
|
- Code scanning integration
|
||||||
|
- Dependency insights
|
||||||
|
- Security alerts for architecture violations
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 💡 Future Ideas (Post-1.0.0)
|
||||||
|
|
||||||
|
### Multi-Language Support
|
||||||
|
- Python (Django/Flask + DDD)
|
||||||
|
- C# (.NET + Clean Architecture)
|
||||||
|
- Java (Spring Boot + DDD)
|
||||||
|
- Go (Clean Architecture)
|
||||||
|
|
||||||
|
### AI-Powered Features
|
||||||
|
- LLM-based fix suggestions
|
||||||
|
- AI generates code for complex refactorings
|
||||||
|
- Claude/GPT API integration
|
||||||
|
- Natural language architecture queries
|
||||||
|
|
||||||
|
### Team Analytics
|
||||||
|
- Per-developer quality metrics
|
||||||
|
- Team quality trends dashboard
|
||||||
|
- Technical debt tracking
|
||||||
|
- Leaderboards (gamification)
|
||||||
|
|
||||||
|
### Security Features
|
||||||
|
- Secrets detection (API keys, passwords)
|
||||||
|
- SQL injection pattern detection
|
||||||
|
- XSS vulnerability patterns
|
||||||
|
- Dependency vulnerability scanning
|
||||||
|
|
||||||
|
### Code Quality Metrics
|
||||||
|
- Maintainability index
|
||||||
|
- Technical debt estimation
|
||||||
|
- Code duplication detection
|
||||||
|
- Complexity trends
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 🎯 Success Criteria
|
||||||
|
|
||||||
|
### v0.10.0 (Market Parity Achieved)
|
||||||
|
- ✅ Configuration support (compete with ESLint)
|
||||||
|
- ✅ Visualization (compete with dependency-cruiser)
|
||||||
|
- ✅ CI/CD integration (compete with SonarQube)
|
||||||
|
- ✅ Auto-fix (UNIQUE! Game-changer)
|
||||||
|
- ✅ Metrics dashboard (compete with SonarQube)
|
||||||
|
|
||||||
|
### v1.0.0 (Enterprise Ready)
|
||||||
|
- ✅ 1000+ GitHub stars
|
||||||
|
- ✅ 100+ npm installs/week
|
||||||
|
- ✅ 10+ enterprise adopters
|
||||||
|
- ✅ 99%+ test coverage
|
||||||
|
- ✅ Complete documentation
|
||||||
|
- ✅ IDE extensions available
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 📊 Competitive Positioning
|
||||||
|
|
||||||
|
| Feature | Guardian v1.0 | SonarQube | dependency-cruiser | ArchUnit | FTA |
|
||||||
|
|---------|---------------|-----------|-------------------|----------|-----|
|
||||||
|
| TypeScript Focus | ✅✅ | ⚠️ | ✅✅ | ❌ | ✅✅ |
|
||||||
|
| Hardcode + AI Tips | ✅✅ UNIQUE | ⚠️ | ❌ | ❌ | ❌ |
|
||||||
|
| Architecture (DDD) | ✅✅ UNIQUE | ⚠️ | ⚠️ | ✅ | ❌ |
|
||||||
|
| Visualization | ✅ | ✅ | ✅✅ | ❌ | ⚠️ |
|
||||||
|
| Auto-Fix | ✅✅ UNIQUE | ❌ | ❌ | ❌ | ❌ |
|
||||||
|
| Configuration | ✅ | ✅✅ | ✅ | ✅ | ⚠️ |
|
||||||
|
| CI/CD | ✅ | ✅✅ | ✅ | ✅ | ⚠️ |
|
||||||
|
| Metrics | ✅ | ✅✅ | ⚠️ | ❌ | ✅✅ |
|
||||||
|
| Security (SAST) | ❌ | ✅✅ | ❌ | ❌ | ❌ |
|
||||||
|
| Multi-language | ❌ | ✅✅ | ⚠️ | ⚠️ | ❌ |
|
||||||
|
|
||||||
|
**Guardian's Position:** The AI-First Architecture Guardian for TypeScript/DDD teams
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 🤝 Contributing
|
||||||
|
|
||||||
|
Want to help build Guardian? Check out:
|
||||||
|
- [GitHub Issues](https://github.com/samiyev/puaros/issues)
|
||||||
|
- [CONTRIBUTING.md](../../CONTRIBUTING.md)
|
||||||
|
- [Discord Community](#) (coming soon)
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 📈 Versioning
|
||||||
|
|
||||||
|
Guardian follows [Semantic Versioning](https://semver.org/):
|
||||||
|
- **MAJOR** (1.0.0) - Breaking changes
|
||||||
|
- **MINOR** (0.x.0) - New features, backwards compatible
|
||||||
|
- **PATCH** (0.x.y) - Bug fixes
|
||||||
|
|
||||||
|
Until 1.0.0, minor versions may include breaking changes as we iterate on the API.
|
||||||
1176
packages/guardian/docs/v0.6.0-CONFIGURATION-SPEC.md
Normal file
1176
packages/guardian/docs/v0.6.0-CONFIGURATION-SPEC.md
Normal file
File diff suppressed because it is too large
Load Diff
@@ -0,0 +1,40 @@
|
|||||||
|
/**
|
||||||
|
* ❌ BAD EXAMPLE: Direct Entity Reference Across Aggregates
|
||||||
|
*
|
||||||
|
* Violation: Order aggregate directly imports and uses User entity from User aggregate
|
||||||
|
*
|
||||||
|
* Problems:
|
||||||
|
* 1. Creates tight coupling between aggregates
|
||||||
|
* 2. Changes to User entity affect Order aggregate
|
||||||
|
* 3. Violates aggregate boundary principles in DDD
|
||||||
|
* 4. Makes aggregates not independently modifiable
|
||||||
|
*/
|
||||||
|
|
||||||
|
import { User } from "../user/User"
|
||||||
|
import { Product } from "../product/Product"
|
||||||
|
|
||||||
|
export class Order {
|
||||||
|
private id: string
|
||||||
|
private user: User
|
||||||
|
private product: Product
|
||||||
|
private quantity: number
|
||||||
|
|
||||||
|
constructor(id: string, user: User, product: Product, quantity: number) {
|
||||||
|
this.id = id
|
||||||
|
this.user = user
|
||||||
|
this.product = product
|
||||||
|
this.quantity = quantity
|
||||||
|
}
|
||||||
|
|
||||||
|
getUserEmail(): string {
|
||||||
|
return this.user.email
|
||||||
|
}
|
||||||
|
|
||||||
|
getProductPrice(): number {
|
||||||
|
return this.product.price
|
||||||
|
}
|
||||||
|
|
||||||
|
calculateTotal(): number {
|
||||||
|
return this.product.price * this.quantity
|
||||||
|
}
|
||||||
|
}
|
||||||
@@ -0,0 +1,16 @@
|
|||||||
|
import { User } from "../user/User"
|
||||||
|
import { Product } from "../product/Product"
|
||||||
|
|
||||||
|
export class Order {
|
||||||
|
private id: string
|
||||||
|
private user: User
|
||||||
|
private product: Product
|
||||||
|
private quantity: number
|
||||||
|
|
||||||
|
constructor(id: string, user: User, product: Product, quantity: number) {
|
||||||
|
this.id = id
|
||||||
|
this.user = user
|
||||||
|
this.product = product
|
||||||
|
this.quantity = quantity
|
||||||
|
}
|
||||||
|
}
|
||||||
@@ -0,0 +1,7 @@
|
|||||||
|
export class Product {
|
||||||
|
public price: number
|
||||||
|
|
||||||
|
constructor(price: number) {
|
||||||
|
this.price = price
|
||||||
|
}
|
||||||
|
}
|
||||||
@@ -0,0 +1,7 @@
|
|||||||
|
export class User {
|
||||||
|
public email: string
|
||||||
|
|
||||||
|
constructor(email: string) {
|
||||||
|
this.email = email
|
||||||
|
}
|
||||||
|
}
|
||||||
@@ -0,0 +1,40 @@
|
|||||||
|
/**
|
||||||
|
* ✅ GOOD EXAMPLE: Reference by ID
|
||||||
|
*
|
||||||
|
* Best Practice: Order aggregate references other aggregates only by their IDs
|
||||||
|
*
|
||||||
|
* Benefits:
|
||||||
|
* 1. Loose coupling between aggregates
|
||||||
|
* 2. Each aggregate can be modified independently
|
||||||
|
* 3. Follows DDD aggregate boundary principles
|
||||||
|
* 4. Clear separation of concerns
|
||||||
|
*/
|
||||||
|
|
||||||
|
import { UserId } from "../user/value-objects/UserId"
|
||||||
|
import { ProductId } from "../product/value-objects/ProductId"
|
||||||
|
|
||||||
|
export class Order {
|
||||||
|
private id: string
|
||||||
|
private userId: UserId
|
||||||
|
private productId: ProductId
|
||||||
|
private quantity: number
|
||||||
|
|
||||||
|
constructor(id: string, userId: UserId, productId: ProductId, quantity: number) {
|
||||||
|
this.id = id
|
||||||
|
this.userId = userId
|
||||||
|
this.productId = productId
|
||||||
|
this.quantity = quantity
|
||||||
|
}
|
||||||
|
|
||||||
|
getUserId(): UserId {
|
||||||
|
return this.userId
|
||||||
|
}
|
||||||
|
|
||||||
|
getProductId(): ProductId {
|
||||||
|
return this.productId
|
||||||
|
}
|
||||||
|
|
||||||
|
getQuantity(): number {
|
||||||
|
return this.quantity
|
||||||
|
}
|
||||||
|
}
|
||||||
@@ -0,0 +1,61 @@
|
|||||||
|
/**
|
||||||
|
* ✅ GOOD EXAMPLE: Using Value Objects for Needed Data
|
||||||
|
*
|
||||||
|
* Best Practice: When Order needs specific data from other aggregates,
|
||||||
|
* use Value Objects to store that data (denormalization)
|
||||||
|
*
|
||||||
|
* Benefits:
|
||||||
|
* 1. Order aggregate has all data it needs
|
||||||
|
* 2. No runtime dependency on other aggregates
|
||||||
|
* 3. Better performance (no joins needed)
|
||||||
|
* 4. Clear contract through Value Objects
|
||||||
|
*/
|
||||||
|
|
||||||
|
import { UserId } from "../user/value-objects/UserId"
|
||||||
|
import { ProductId } from "../product/value-objects/ProductId"
|
||||||
|
|
||||||
|
export class CustomerInfo {
|
||||||
|
constructor(
|
||||||
|
readonly customerId: UserId,
|
||||||
|
readonly customerName: string,
|
||||||
|
readonly customerEmail: string,
|
||||||
|
) {}
|
||||||
|
}
|
||||||
|
|
||||||
|
export class ProductInfo {
|
||||||
|
constructor(
|
||||||
|
readonly productId: ProductId,
|
||||||
|
readonly productName: string,
|
||||||
|
readonly productPrice: number,
|
||||||
|
) {}
|
||||||
|
}
|
||||||
|
|
||||||
|
export class Order {
|
||||||
|
private id: string
|
||||||
|
private customer: CustomerInfo
|
||||||
|
private product: ProductInfo
|
||||||
|
private quantity: number
|
||||||
|
|
||||||
|
constructor(id: string, customer: CustomerInfo, product: ProductInfo, quantity: number) {
|
||||||
|
this.id = id
|
||||||
|
this.customer = customer
|
||||||
|
this.product = product
|
||||||
|
this.quantity = quantity
|
||||||
|
}
|
||||||
|
|
||||||
|
getCustomerEmail(): string {
|
||||||
|
return this.customer.customerEmail
|
||||||
|
}
|
||||||
|
|
||||||
|
calculateTotal(): number {
|
||||||
|
return this.product.productPrice * this.quantity
|
||||||
|
}
|
||||||
|
|
||||||
|
getCustomerInfo(): CustomerInfo {
|
||||||
|
return this.customer
|
||||||
|
}
|
||||||
|
|
||||||
|
getProductInfo(): ProductInfo {
|
||||||
|
return this.product
|
||||||
|
}
|
||||||
|
}
|
||||||
@@ -0,0 +1,38 @@
|
|||||||
|
/**
|
||||||
|
* BAD EXAMPLE: Anemic Domain Model
|
||||||
|
*
|
||||||
|
* This Order class only has getters and setters without any business logic.
|
||||||
|
* All business logic is likely scattered in services (procedural approach).
|
||||||
|
*
|
||||||
|
* This violates Domain-Driven Design principles.
|
||||||
|
*/
|
||||||
|
|
||||||
|
class Order {
|
||||||
|
private status: string
|
||||||
|
private total: number
|
||||||
|
private items: any[]
|
||||||
|
|
||||||
|
getStatus(): string {
|
||||||
|
return this.status
|
||||||
|
}
|
||||||
|
|
||||||
|
setStatus(status: string): void {
|
||||||
|
this.status = status
|
||||||
|
}
|
||||||
|
|
||||||
|
getTotal(): number {
|
||||||
|
return this.total
|
||||||
|
}
|
||||||
|
|
||||||
|
setTotal(total: number): void {
|
||||||
|
this.total = total
|
||||||
|
}
|
||||||
|
|
||||||
|
getItems(): any[] {
|
||||||
|
return this.items
|
||||||
|
}
|
||||||
|
|
||||||
|
setItems(items: any[]): void {
|
||||||
|
this.items = items
|
||||||
|
}
|
||||||
|
}
|
||||||
@@ -0,0 +1,34 @@
|
|||||||
|
/**
|
||||||
|
* BAD EXAMPLE: Anemic Domain Model with Public Setters
|
||||||
|
*
|
||||||
|
* This User class has public setters which is an anti-pattern in DDD.
|
||||||
|
* Public setters allow uncontrolled state changes without validation or business rules.
|
||||||
|
*
|
||||||
|
* This violates Domain-Driven Design principles and encapsulation.
|
||||||
|
*/
|
||||||
|
|
||||||
|
class User {
|
||||||
|
private email: string
|
||||||
|
private password: string
|
||||||
|
private status: string
|
||||||
|
|
||||||
|
public setEmail(email: string): void {
|
||||||
|
this.email = email
|
||||||
|
}
|
||||||
|
|
||||||
|
public getEmail(): string {
|
||||||
|
return this.email
|
||||||
|
}
|
||||||
|
|
||||||
|
public setPassword(password: string): void {
|
||||||
|
this.password = password
|
||||||
|
}
|
||||||
|
|
||||||
|
public setStatus(status: string): void {
|
||||||
|
this.status = status
|
||||||
|
}
|
||||||
|
|
||||||
|
public getStatus(): string {
|
||||||
|
return this.status
|
||||||
|
}
|
||||||
|
}
|
||||||
@@ -0,0 +1,139 @@
|
|||||||
|
/**
|
||||||
|
* GOOD EXAMPLE: Rich Domain Model with Business Logic
|
||||||
|
*
|
||||||
|
* This Customer class encapsulates business rules and state transitions.
|
||||||
|
* No public setters - all changes go through business methods.
|
||||||
|
*
|
||||||
|
* This follows Domain-Driven Design and encapsulation principles.
|
||||||
|
*/
|
||||||
|
|
||||||
|
interface Address {
|
||||||
|
street: string
|
||||||
|
city: string
|
||||||
|
country: string
|
||||||
|
postalCode: string
|
||||||
|
}
|
||||||
|
|
||||||
|
interface DomainEvent {
|
||||||
|
type: string
|
||||||
|
data: any
|
||||||
|
}
|
||||||
|
|
||||||
|
class Customer {
|
||||||
|
private readonly id: string
|
||||||
|
private email: string
|
||||||
|
private isActive: boolean
|
||||||
|
private loyaltyPoints: number
|
||||||
|
private address: Address | null
|
||||||
|
private readonly events: DomainEvent[] = []
|
||||||
|
|
||||||
|
constructor(id: string, email: string) {
|
||||||
|
this.id = id
|
||||||
|
this.email = email
|
||||||
|
this.isActive = true
|
||||||
|
this.loyaltyPoints = 0
|
||||||
|
this.address = null
|
||||||
|
}
|
||||||
|
|
||||||
|
public activate(): void {
|
||||||
|
if (this.isActive) {
|
||||||
|
throw new Error("Customer is already active")
|
||||||
|
}
|
||||||
|
this.isActive = true
|
||||||
|
this.events.push({
|
||||||
|
type: "CustomerActivated",
|
||||||
|
data: { customerId: this.id },
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
public deactivate(reason: string): void {
|
||||||
|
if (!this.isActive) {
|
||||||
|
throw new Error("Customer is already inactive")
|
||||||
|
}
|
||||||
|
this.isActive = false
|
||||||
|
this.events.push({
|
||||||
|
type: "CustomerDeactivated",
|
||||||
|
data: { customerId: this.id, reason },
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
public changeEmail(newEmail: string): void {
|
||||||
|
if (!this.isValidEmail(newEmail)) {
|
||||||
|
throw new Error("Invalid email format")
|
||||||
|
}
|
||||||
|
if (this.email === newEmail) {
|
||||||
|
return
|
||||||
|
}
|
||||||
|
const oldEmail = this.email
|
||||||
|
this.email = newEmail
|
||||||
|
this.events.push({
|
||||||
|
type: "EmailChanged",
|
||||||
|
data: { customerId: this.id, oldEmail, newEmail },
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
public updateAddress(address: Address): void {
|
||||||
|
if (!this.isValidAddress(address)) {
|
||||||
|
throw new Error("Invalid address")
|
||||||
|
}
|
||||||
|
this.address = address
|
||||||
|
this.events.push({
|
||||||
|
type: "AddressUpdated",
|
||||||
|
data: { customerId: this.id },
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
public addLoyaltyPoints(points: number): void {
|
||||||
|
if (points <= 0) {
|
||||||
|
throw new Error("Points must be positive")
|
||||||
|
}
|
||||||
|
if (!this.isActive) {
|
||||||
|
throw new Error("Cannot add points to inactive customer")
|
||||||
|
}
|
||||||
|
this.loyaltyPoints += points
|
||||||
|
this.events.push({
|
||||||
|
type: "LoyaltyPointsAdded",
|
||||||
|
data: { customerId: this.id, points },
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
public redeemLoyaltyPoints(points: number): void {
|
||||||
|
if (points <= 0) {
|
||||||
|
throw new Error("Points must be positive")
|
||||||
|
}
|
||||||
|
if (this.loyaltyPoints < points) {
|
||||||
|
throw new Error("Insufficient loyalty points")
|
||||||
|
}
|
||||||
|
this.loyaltyPoints -= points
|
||||||
|
this.events.push({
|
||||||
|
type: "LoyaltyPointsRedeemed",
|
||||||
|
data: { customerId: this.id, points },
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
public getEmail(): string {
|
||||||
|
return this.email
|
||||||
|
}
|
||||||
|
|
||||||
|
public getLoyaltyPoints(): number {
|
||||||
|
return this.loyaltyPoints
|
||||||
|
}
|
||||||
|
|
||||||
|
public getAddress(): Address | null {
|
||||||
|
return this.address ? { ...this.address } : null
|
||||||
|
}
|
||||||
|
|
||||||
|
public getEvents(): DomainEvent[] {
|
||||||
|
return [...this.events]
|
||||||
|
}
|
||||||
|
|
||||||
|
private isValidEmail(email: string): boolean {
|
||||||
|
return /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(email)
|
||||||
|
}
|
||||||
|
|
||||||
|
private isValidAddress(address: Address): boolean {
|
||||||
|
return !!address.street && !!address.city && !!address.country && !!address.postalCode
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
export { Customer }
|
||||||
@@ -0,0 +1,104 @@
|
|||||||
|
/**
|
||||||
|
* GOOD EXAMPLE: Rich Domain Model
|
||||||
|
*
|
||||||
|
* This Order class contains business logic and enforces business rules.
|
||||||
|
* State changes are made through business methods, not setters.
|
||||||
|
*
|
||||||
|
* This follows Domain-Driven Design principles.
|
||||||
|
*/
|
||||||
|
|
||||||
|
type OrderStatus = "pending" | "approved" | "rejected" | "shipped"
|
||||||
|
|
||||||
|
interface OrderItem {
|
||||||
|
productId: string
|
||||||
|
quantity: number
|
||||||
|
price: number
|
||||||
|
}
|
||||||
|
|
||||||
|
interface DomainEvent {
|
||||||
|
type: string
|
||||||
|
data: any
|
||||||
|
}
|
||||||
|
|
||||||
|
class Order {
|
||||||
|
private readonly id: string
|
||||||
|
private status: OrderStatus
|
||||||
|
private items: OrderItem[]
|
||||||
|
private readonly events: DomainEvent[] = []
|
||||||
|
|
||||||
|
constructor(id: string, items: OrderItem[]) {
|
||||||
|
this.id = id
|
||||||
|
this.status = "pending"
|
||||||
|
this.items = items
|
||||||
|
}
|
||||||
|
|
||||||
|
public approve(): void {
|
||||||
|
if (!this.canBeApproved()) {
|
||||||
|
throw new Error("Cannot approve order in current state")
|
||||||
|
}
|
||||||
|
this.status = "approved"
|
||||||
|
this.events.push({
|
||||||
|
type: "OrderApproved",
|
||||||
|
data: { orderId: this.id },
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
public reject(reason: string): void {
|
||||||
|
if (!this.canBeRejected()) {
|
||||||
|
throw new Error("Cannot reject order in current state")
|
||||||
|
}
|
||||||
|
this.status = "rejected"
|
||||||
|
this.events.push({
|
||||||
|
type: "OrderRejected",
|
||||||
|
data: { orderId: this.id, reason },
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
public ship(): void {
|
||||||
|
if (!this.canBeShipped()) {
|
||||||
|
throw new Error("Order must be approved before shipping")
|
||||||
|
}
|
||||||
|
this.status = "shipped"
|
||||||
|
this.events.push({
|
||||||
|
type: "OrderShipped",
|
||||||
|
data: { orderId: this.id },
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
public addItem(item: OrderItem): void {
|
||||||
|
if (this.status !== "pending") {
|
||||||
|
throw new Error("Cannot modify approved or shipped order")
|
||||||
|
}
|
||||||
|
this.items.push(item)
|
||||||
|
}
|
||||||
|
|
||||||
|
public calculateTotal(): number {
|
||||||
|
return this.items.reduce((sum, item) => sum + item.price * item.quantity, 0)
|
||||||
|
}
|
||||||
|
|
||||||
|
public getStatus(): OrderStatus {
|
||||||
|
return this.status
|
||||||
|
}
|
||||||
|
|
||||||
|
public getItems(): OrderItem[] {
|
||||||
|
return [...this.items]
|
||||||
|
}
|
||||||
|
|
||||||
|
public getEvents(): DomainEvent[] {
|
||||||
|
return [...this.events]
|
||||||
|
}
|
||||||
|
|
||||||
|
private canBeApproved(): boolean {
|
||||||
|
return this.status === "pending" && this.items.length > 0
|
||||||
|
}
|
||||||
|
|
||||||
|
private canBeRejected(): boolean {
|
||||||
|
return this.status === "pending"
|
||||||
|
}
|
||||||
|
|
||||||
|
private canBeShipped(): boolean {
|
||||||
|
return this.status === "approved"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
export { Order }
|
||||||
@@ -1,7 +1,7 @@
|
|||||||
{
|
{
|
||||||
"name": "@samiyev/guardian",
|
"name": "@samiyev/guardian",
|
||||||
"version": "0.6.2",
|
"version": "0.9.0",
|
||||||
"description": "Research-backed code quality guardian for AI-assisted development. Detects hardcodes, circular deps, framework leaks, entity exposure, and 8 architecture violations. Enforces Clean Architecture/DDD principles. Works with GitHub Copilot, Cursor, Windsurf, Claude, ChatGPT, Cline, and any AI coding tool.",
|
"description": "Research-backed code quality guardian for AI-assisted development. Detects hardcodes, secrets, circular deps, framework leaks, entity exposure, and 9 architecture violations. Enforces Clean Architecture/DDD principles. Works with GitHub Copilot, Cursor, Windsurf, Claude, ChatGPT, Cline, and any AI coding tool.",
|
||||||
"keywords": [
|
"keywords": [
|
||||||
"puaros",
|
"puaros",
|
||||||
"guardian",
|
"guardian",
|
||||||
@@ -82,6 +82,10 @@
|
|||||||
"guardian": "./bin/guardian.js"
|
"guardian": "./bin/guardian.js"
|
||||||
},
|
},
|
||||||
"dependencies": {
|
"dependencies": {
|
||||||
|
"@secretlint/core": "^11.2.5",
|
||||||
|
"@secretlint/node": "^11.2.5",
|
||||||
|
"@secretlint/secretlint-rule-preset-recommend": "^11.2.5",
|
||||||
|
"@secretlint/types": "^11.2.5",
|
||||||
"commander": "^12.1.0",
|
"commander": "^12.1.0",
|
||||||
"simple-git": "^3.30.0",
|
"simple-git": "^3.30.0",
|
||||||
"tree-sitter": "^0.21.1",
|
"tree-sitter": "^0.21.1",
|
||||||
|
|||||||
@@ -11,6 +11,9 @@ import { IFrameworkLeakDetector } from "./domain/services/IFrameworkLeakDetector
|
|||||||
import { IEntityExposureDetector } from "./domain/services/IEntityExposureDetector"
|
import { IEntityExposureDetector } from "./domain/services/IEntityExposureDetector"
|
||||||
import { IDependencyDirectionDetector } from "./domain/services/IDependencyDirectionDetector"
|
import { IDependencyDirectionDetector } from "./domain/services/IDependencyDirectionDetector"
|
||||||
import { IRepositoryPatternDetector } from "./domain/services/RepositoryPatternDetectorService"
|
import { IRepositoryPatternDetector } from "./domain/services/RepositoryPatternDetectorService"
|
||||||
|
import { IAggregateBoundaryDetector } from "./domain/services/IAggregateBoundaryDetector"
|
||||||
|
import { ISecretDetector } from "./domain/services/ISecretDetector"
|
||||||
|
import { IAnemicModelDetector } from "./domain/services/IAnemicModelDetector"
|
||||||
import { FileScanner } from "./infrastructure/scanners/FileScanner"
|
import { FileScanner } from "./infrastructure/scanners/FileScanner"
|
||||||
import { CodeParser } from "./infrastructure/parsers/CodeParser"
|
import { CodeParser } from "./infrastructure/parsers/CodeParser"
|
||||||
import { HardcodeDetector } from "./infrastructure/analyzers/HardcodeDetector"
|
import { HardcodeDetector } from "./infrastructure/analyzers/HardcodeDetector"
|
||||||
@@ -19,6 +22,9 @@ import { FrameworkLeakDetector } from "./infrastructure/analyzers/FrameworkLeakD
|
|||||||
import { EntityExposureDetector } from "./infrastructure/analyzers/EntityExposureDetector"
|
import { EntityExposureDetector } from "./infrastructure/analyzers/EntityExposureDetector"
|
||||||
import { DependencyDirectionDetector } from "./infrastructure/analyzers/DependencyDirectionDetector"
|
import { DependencyDirectionDetector } from "./infrastructure/analyzers/DependencyDirectionDetector"
|
||||||
import { RepositoryPatternDetector } from "./infrastructure/analyzers/RepositoryPatternDetector"
|
import { RepositoryPatternDetector } from "./infrastructure/analyzers/RepositoryPatternDetector"
|
||||||
|
import { AggregateBoundaryDetector } from "./infrastructure/analyzers/AggregateBoundaryDetector"
|
||||||
|
import { SecretDetector } from "./infrastructure/analyzers/SecretDetector"
|
||||||
|
import { AnemicModelDetector } from "./infrastructure/analyzers/AnemicModelDetector"
|
||||||
import { ERROR_MESSAGES } from "./shared/constants"
|
import { ERROR_MESSAGES } from "./shared/constants"
|
||||||
|
|
||||||
/**
|
/**
|
||||||
@@ -76,6 +82,9 @@ export async function analyzeProject(
|
|||||||
const dependencyDirectionDetector: IDependencyDirectionDetector =
|
const dependencyDirectionDetector: IDependencyDirectionDetector =
|
||||||
new DependencyDirectionDetector()
|
new DependencyDirectionDetector()
|
||||||
const repositoryPatternDetector: IRepositoryPatternDetector = new RepositoryPatternDetector()
|
const repositoryPatternDetector: IRepositoryPatternDetector = new RepositoryPatternDetector()
|
||||||
|
const aggregateBoundaryDetector: IAggregateBoundaryDetector = new AggregateBoundaryDetector()
|
||||||
|
const secretDetector: ISecretDetector = new SecretDetector()
|
||||||
|
const anemicModelDetector: IAnemicModelDetector = new AnemicModelDetector()
|
||||||
const useCase = new AnalyzeProject(
|
const useCase = new AnalyzeProject(
|
||||||
fileScanner,
|
fileScanner,
|
||||||
codeParser,
|
codeParser,
|
||||||
@@ -85,6 +94,9 @@ export async function analyzeProject(
|
|||||||
entityExposureDetector,
|
entityExposureDetector,
|
||||||
dependencyDirectionDetector,
|
dependencyDirectionDetector,
|
||||||
repositoryPatternDetector,
|
repositoryPatternDetector,
|
||||||
|
aggregateBoundaryDetector,
|
||||||
|
secretDetector,
|
||||||
|
anemicModelDetector,
|
||||||
)
|
)
|
||||||
|
|
||||||
const result = await useCase.execute(options)
|
const result = await useCase.execute(options)
|
||||||
@@ -107,5 +119,7 @@ export type {
|
|||||||
EntityExposureViolation,
|
EntityExposureViolation,
|
||||||
DependencyDirectionViolation,
|
DependencyDirectionViolation,
|
||||||
RepositoryPatternViolation,
|
RepositoryPatternViolation,
|
||||||
|
AggregateBoundaryViolation,
|
||||||
|
AnemicModelViolation,
|
||||||
ProjectMetrics,
|
ProjectMetrics,
|
||||||
} from "./application/use-cases/AnalyzeProject"
|
} from "./application/use-cases/AnalyzeProject"
|
||||||
|
|||||||
@@ -8,20 +8,22 @@ import { IFrameworkLeakDetector } from "../../domain/services/IFrameworkLeakDete
|
|||||||
import { IEntityExposureDetector } from "../../domain/services/IEntityExposureDetector"
|
import { IEntityExposureDetector } from "../../domain/services/IEntityExposureDetector"
|
||||||
import { IDependencyDirectionDetector } from "../../domain/services/IDependencyDirectionDetector"
|
import { IDependencyDirectionDetector } from "../../domain/services/IDependencyDirectionDetector"
|
||||||
import { IRepositoryPatternDetector } from "../../domain/services/RepositoryPatternDetectorService"
|
import { IRepositoryPatternDetector } from "../../domain/services/RepositoryPatternDetectorService"
|
||||||
|
import { IAggregateBoundaryDetector } from "../../domain/services/IAggregateBoundaryDetector"
|
||||||
|
import { ISecretDetector } from "../../domain/services/ISecretDetector"
|
||||||
|
import { IAnemicModelDetector } from "../../domain/services/IAnemicModelDetector"
|
||||||
import { SourceFile } from "../../domain/entities/SourceFile"
|
import { SourceFile } from "../../domain/entities/SourceFile"
|
||||||
import { DependencyGraph } from "../../domain/entities/DependencyGraph"
|
import { DependencyGraph } from "../../domain/entities/DependencyGraph"
|
||||||
import { ProjectPath } from "../../domain/value-objects/ProjectPath"
|
import { CollectFiles } from "./pipeline/CollectFiles"
|
||||||
|
import { ParseSourceFiles } from "./pipeline/ParseSourceFiles"
|
||||||
|
import { ExecuteDetection } from "./pipeline/ExecuteDetection"
|
||||||
|
import { AggregateResults } from "./pipeline/AggregateResults"
|
||||||
import {
|
import {
|
||||||
ERROR_MESSAGES,
|
ERROR_MESSAGES,
|
||||||
HARDCODE_TYPES,
|
HARDCODE_TYPES,
|
||||||
LAYERS,
|
|
||||||
NAMING_VIOLATION_TYPES,
|
NAMING_VIOLATION_TYPES,
|
||||||
REGEX_PATTERNS,
|
|
||||||
REPOSITORY_VIOLATION_TYPES,
|
REPOSITORY_VIOLATION_TYPES,
|
||||||
RULES,
|
RULES,
|
||||||
SEVERITY_ORDER,
|
|
||||||
type SeverityLevel,
|
type SeverityLevel,
|
||||||
VIOLATION_SEVERITY_MAP,
|
|
||||||
} from "../../shared/constants"
|
} from "../../shared/constants"
|
||||||
|
|
||||||
export interface AnalyzeProjectRequest {
|
export interface AnalyzeProjectRequest {
|
||||||
@@ -41,6 +43,9 @@ export interface AnalyzeProjectResponse {
|
|||||||
entityExposureViolations: EntityExposureViolation[]
|
entityExposureViolations: EntityExposureViolation[]
|
||||||
dependencyDirectionViolations: DependencyDirectionViolation[]
|
dependencyDirectionViolations: DependencyDirectionViolation[]
|
||||||
repositoryPatternViolations: RepositoryPatternViolation[]
|
repositoryPatternViolations: RepositoryPatternViolation[]
|
||||||
|
aggregateBoundaryViolations: AggregateBoundaryViolation[]
|
||||||
|
secretViolations: SecretViolation[]
|
||||||
|
anemicModelViolations: AnemicModelViolation[]
|
||||||
metrics: ProjectMetrics
|
metrics: ProjectMetrics
|
||||||
}
|
}
|
||||||
|
|
||||||
@@ -149,6 +154,45 @@ export interface RepositoryPatternViolation {
|
|||||||
severity: SeverityLevel
|
severity: SeverityLevel
|
||||||
}
|
}
|
||||||
|
|
||||||
|
export interface AggregateBoundaryViolation {
|
||||||
|
rule: typeof RULES.AGGREGATE_BOUNDARY
|
||||||
|
fromAggregate: string
|
||||||
|
toAggregate: string
|
||||||
|
entityName: string
|
||||||
|
importPath: string
|
||||||
|
file: string
|
||||||
|
line?: number
|
||||||
|
message: string
|
||||||
|
suggestion: string
|
||||||
|
severity: SeverityLevel
|
||||||
|
}
|
||||||
|
|
||||||
|
export interface SecretViolation {
|
||||||
|
rule: typeof RULES.SECRET_EXPOSURE
|
||||||
|
secretType: string
|
||||||
|
file: string
|
||||||
|
line: number
|
||||||
|
column: number
|
||||||
|
message: string
|
||||||
|
suggestion: string
|
||||||
|
severity: SeverityLevel
|
||||||
|
}
|
||||||
|
|
||||||
|
export interface AnemicModelViolation {
|
||||||
|
rule: typeof RULES.ANEMIC_MODEL
|
||||||
|
className: string
|
||||||
|
file: string
|
||||||
|
layer: string
|
||||||
|
line?: number
|
||||||
|
methodCount: number
|
||||||
|
propertyCount: number
|
||||||
|
hasOnlyGettersSetters: boolean
|
||||||
|
hasPublicSetters: boolean
|
||||||
|
message: string
|
||||||
|
suggestion: string
|
||||||
|
severity: SeverityLevel
|
||||||
|
}
|
||||||
|
|
||||||
export interface ProjectMetrics {
|
export interface ProjectMetrics {
|
||||||
totalFiles: number
|
totalFiles: number
|
||||||
totalFunctions: number
|
totalFunctions: number
|
||||||
@@ -158,406 +202,78 @@ export interface ProjectMetrics {
|
|||||||
|
|
||||||
/**
|
/**
|
||||||
* Main use case for analyzing a project's codebase
|
* Main use case for analyzing a project's codebase
|
||||||
|
* Orchestrates the analysis pipeline through focused components
|
||||||
*/
|
*/
|
||||||
export class AnalyzeProject extends UseCase<
|
export class AnalyzeProject extends UseCase<
|
||||||
AnalyzeProjectRequest,
|
AnalyzeProjectRequest,
|
||||||
ResponseDto<AnalyzeProjectResponse>
|
ResponseDto<AnalyzeProjectResponse>
|
||||||
> {
|
> {
|
||||||
|
private readonly fileCollectionStep: CollectFiles
|
||||||
|
private readonly parsingStep: ParseSourceFiles
|
||||||
|
private readonly detectionPipeline: ExecuteDetection
|
||||||
|
private readonly resultAggregator: AggregateResults
|
||||||
|
|
||||||
constructor(
|
constructor(
|
||||||
private readonly fileScanner: IFileScanner,
|
fileScanner: IFileScanner,
|
||||||
private readonly codeParser: ICodeParser,
|
codeParser: ICodeParser,
|
||||||
private readonly hardcodeDetector: IHardcodeDetector,
|
hardcodeDetector: IHardcodeDetector,
|
||||||
private readonly namingConventionDetector: INamingConventionDetector,
|
namingConventionDetector: INamingConventionDetector,
|
||||||
private readonly frameworkLeakDetector: IFrameworkLeakDetector,
|
frameworkLeakDetector: IFrameworkLeakDetector,
|
||||||
private readonly entityExposureDetector: IEntityExposureDetector,
|
entityExposureDetector: IEntityExposureDetector,
|
||||||
private readonly dependencyDirectionDetector: IDependencyDirectionDetector,
|
dependencyDirectionDetector: IDependencyDirectionDetector,
|
||||||
private readonly repositoryPatternDetector: IRepositoryPatternDetector,
|
repositoryPatternDetector: IRepositoryPatternDetector,
|
||||||
|
aggregateBoundaryDetector: IAggregateBoundaryDetector,
|
||||||
|
secretDetector: ISecretDetector,
|
||||||
|
anemicModelDetector: IAnemicModelDetector,
|
||||||
) {
|
) {
|
||||||
super()
|
super()
|
||||||
|
this.fileCollectionStep = new CollectFiles(fileScanner)
|
||||||
|
this.parsingStep = new ParseSourceFiles(codeParser)
|
||||||
|
this.detectionPipeline = new ExecuteDetection(
|
||||||
|
hardcodeDetector,
|
||||||
|
namingConventionDetector,
|
||||||
|
frameworkLeakDetector,
|
||||||
|
entityExposureDetector,
|
||||||
|
dependencyDirectionDetector,
|
||||||
|
repositoryPatternDetector,
|
||||||
|
aggregateBoundaryDetector,
|
||||||
|
secretDetector,
|
||||||
|
anemicModelDetector,
|
||||||
|
)
|
||||||
|
this.resultAggregator = new AggregateResults()
|
||||||
}
|
}
|
||||||
|
|
||||||
public async execute(
|
public async execute(
|
||||||
request: AnalyzeProjectRequest,
|
request: AnalyzeProjectRequest,
|
||||||
): Promise<ResponseDto<AnalyzeProjectResponse>> {
|
): Promise<ResponseDto<AnalyzeProjectResponse>> {
|
||||||
try {
|
try {
|
||||||
const filePaths = await this.fileScanner.scan({
|
const { sourceFiles } = await this.fileCollectionStep.execute({
|
||||||
rootDir: request.rootDir,
|
rootDir: request.rootDir,
|
||||||
include: request.include,
|
include: request.include,
|
||||||
exclude: request.exclude,
|
exclude: request.exclude,
|
||||||
})
|
})
|
||||||
|
|
||||||
const sourceFiles: SourceFile[] = []
|
const { dependencyGraph, totalFunctions } = this.parsingStep.execute({
|
||||||
const dependencyGraph = new DependencyGraph()
|
sourceFiles,
|
||||||
let totalFunctions = 0
|
rootDir: request.rootDir,
|
||||||
|
|
||||||
for (const filePath of filePaths) {
|
|
||||||
const content = await this.fileScanner.readFile(filePath)
|
|
||||||
const projectPath = ProjectPath.create(filePath, request.rootDir)
|
|
||||||
|
|
||||||
const imports = this.extractImports(content)
|
|
||||||
const exports = this.extractExports(content)
|
|
||||||
|
|
||||||
const sourceFile = new SourceFile(projectPath, content, imports, exports)
|
|
||||||
|
|
||||||
sourceFiles.push(sourceFile)
|
|
||||||
dependencyGraph.addFile(sourceFile)
|
|
||||||
|
|
||||||
if (projectPath.isTypeScript()) {
|
|
||||||
const tree = this.codeParser.parseTypeScript(content)
|
|
||||||
const functions = this.codeParser.extractFunctions(tree)
|
|
||||||
totalFunctions += functions.length
|
|
||||||
}
|
|
||||||
|
|
||||||
for (const imp of imports) {
|
|
||||||
dependencyGraph.addDependency(
|
|
||||||
projectPath.relative,
|
|
||||||
this.resolveImportPath(imp, filePath, request.rootDir),
|
|
||||||
)
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
const violations = this.sortBySeverity(this.detectViolations(sourceFiles))
|
|
||||||
const hardcodeViolations = this.sortBySeverity(this.detectHardcode(sourceFiles))
|
|
||||||
const circularDependencyViolations = this.sortBySeverity(
|
|
||||||
this.detectCircularDependencies(dependencyGraph),
|
|
||||||
)
|
|
||||||
const namingViolations = this.sortBySeverity(this.detectNamingConventions(sourceFiles))
|
|
||||||
const frameworkLeakViolations = this.sortBySeverity(
|
|
||||||
this.detectFrameworkLeaks(sourceFiles),
|
|
||||||
)
|
|
||||||
const entityExposureViolations = this.sortBySeverity(
|
|
||||||
this.detectEntityExposures(sourceFiles),
|
|
||||||
)
|
|
||||||
const dependencyDirectionViolations = this.sortBySeverity(
|
|
||||||
this.detectDependencyDirections(sourceFiles),
|
|
||||||
)
|
|
||||||
const repositoryPatternViolations = this.sortBySeverity(
|
|
||||||
this.detectRepositoryPatternViolations(sourceFiles),
|
|
||||||
)
|
|
||||||
const metrics = this.calculateMetrics(sourceFiles, totalFunctions, dependencyGraph)
|
|
||||||
|
|
||||||
return ResponseDto.ok({
|
|
||||||
files: sourceFiles,
|
|
||||||
dependencyGraph,
|
|
||||||
violations,
|
|
||||||
hardcodeViolations,
|
|
||||||
circularDependencyViolations,
|
|
||||||
namingViolations,
|
|
||||||
frameworkLeakViolations,
|
|
||||||
entityExposureViolations,
|
|
||||||
dependencyDirectionViolations,
|
|
||||||
repositoryPatternViolations,
|
|
||||||
metrics,
|
|
||||||
})
|
})
|
||||||
|
|
||||||
|
const detectionResult = await this.detectionPipeline.execute({
|
||||||
|
sourceFiles,
|
||||||
|
dependencyGraph,
|
||||||
|
})
|
||||||
|
|
||||||
|
const response = this.resultAggregator.execute({
|
||||||
|
sourceFiles,
|
||||||
|
dependencyGraph,
|
||||||
|
totalFunctions,
|
||||||
|
...detectionResult,
|
||||||
|
})
|
||||||
|
|
||||||
|
return ResponseDto.ok(response)
|
||||||
} catch (error) {
|
} catch (error) {
|
||||||
const errorMessage = `${ERROR_MESSAGES.FAILED_TO_ANALYZE}: ${error instanceof Error ? error.message : String(error)}`
|
const errorMessage = `${ERROR_MESSAGES.FAILED_TO_ANALYZE}: ${error instanceof Error ? error.message : String(error)}`
|
||||||
return ResponseDto.fail(errorMessage)
|
return ResponseDto.fail(errorMessage)
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
private extractImports(content: string): string[] {
|
|
||||||
const imports: string[] = []
|
|
||||||
let match
|
|
||||||
|
|
||||||
while ((match = REGEX_PATTERNS.IMPORT_STATEMENT.exec(content)) !== null) {
|
|
||||||
imports.push(match[1])
|
|
||||||
}
|
|
||||||
|
|
||||||
return imports
|
|
||||||
}
|
|
||||||
|
|
||||||
private extractExports(content: string): string[] {
|
|
||||||
const exports: string[] = []
|
|
||||||
let match
|
|
||||||
|
|
||||||
while ((match = REGEX_PATTERNS.EXPORT_STATEMENT.exec(content)) !== null) {
|
|
||||||
exports.push(match[1])
|
|
||||||
}
|
|
||||||
|
|
||||||
return exports
|
|
||||||
}
|
|
||||||
|
|
||||||
private resolveImportPath(importPath: string, _currentFile: string, _rootDir: string): string {
|
|
||||||
if (importPath.startsWith(".")) {
|
|
||||||
return importPath
|
|
||||||
}
|
|
||||||
return importPath
|
|
||||||
}
|
|
||||||
|
|
||||||
private detectViolations(sourceFiles: SourceFile[]): ArchitectureViolation[] {
|
|
||||||
const violations: ArchitectureViolation[] = []
|
|
||||||
|
|
||||||
const layerRules: Record<string, string[]> = {
|
|
||||||
[LAYERS.DOMAIN]: [LAYERS.SHARED],
|
|
||||||
[LAYERS.APPLICATION]: [LAYERS.DOMAIN, LAYERS.SHARED],
|
|
||||||
[LAYERS.INFRASTRUCTURE]: [LAYERS.DOMAIN, LAYERS.APPLICATION, LAYERS.SHARED],
|
|
||||||
[LAYERS.SHARED]: [],
|
|
||||||
}
|
|
||||||
|
|
||||||
for (const file of sourceFiles) {
|
|
||||||
if (!file.layer) {
|
|
||||||
continue
|
|
||||||
}
|
|
||||||
|
|
||||||
const allowedLayers = layerRules[file.layer]
|
|
||||||
|
|
||||||
for (const imp of file.imports) {
|
|
||||||
const importedLayer = this.detectLayerFromImport(imp)
|
|
||||||
|
|
||||||
if (
|
|
||||||
importedLayer &&
|
|
||||||
importedLayer !== file.layer &&
|
|
||||||
!allowedLayers.includes(importedLayer)
|
|
||||||
) {
|
|
||||||
violations.push({
|
|
||||||
rule: RULES.CLEAN_ARCHITECTURE,
|
|
||||||
message: `Layer "${file.layer}" cannot import from "${importedLayer}"`,
|
|
||||||
file: file.path.relative,
|
|
||||||
severity: VIOLATION_SEVERITY_MAP.ARCHITECTURE,
|
|
||||||
})
|
|
||||||
}
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
return violations
|
|
||||||
}
|
|
||||||
|
|
||||||
private detectLayerFromImport(importPath: string): string | undefined {
|
|
||||||
const layers = Object.values(LAYERS)
|
|
||||||
|
|
||||||
for (const layer of layers) {
|
|
||||||
if (importPath.toLowerCase().includes(layer)) {
|
|
||||||
return layer
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
return undefined
|
|
||||||
}
|
|
||||||
|
|
||||||
private detectHardcode(sourceFiles: SourceFile[]): HardcodeViolation[] {
|
|
||||||
const violations: HardcodeViolation[] = []
|
|
||||||
|
|
||||||
for (const file of sourceFiles) {
|
|
||||||
const hardcodedValues = this.hardcodeDetector.detectAll(
|
|
||||||
file.content,
|
|
||||||
file.path.relative,
|
|
||||||
)
|
|
||||||
|
|
||||||
for (const hardcoded of hardcodedValues) {
|
|
||||||
violations.push({
|
|
||||||
rule: RULES.HARDCODED_VALUE,
|
|
||||||
type: hardcoded.type,
|
|
||||||
value: hardcoded.value,
|
|
||||||
file: file.path.relative,
|
|
||||||
line: hardcoded.line,
|
|
||||||
column: hardcoded.column,
|
|
||||||
context: hardcoded.context,
|
|
||||||
suggestion: {
|
|
||||||
constantName: hardcoded.suggestConstantName(),
|
|
||||||
location: hardcoded.suggestLocation(file.layer),
|
|
||||||
},
|
|
||||||
severity: VIOLATION_SEVERITY_MAP.HARDCODE,
|
|
||||||
})
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
return violations
|
|
||||||
}
|
|
||||||
|
|
||||||
private detectCircularDependencies(
|
|
||||||
dependencyGraph: DependencyGraph,
|
|
||||||
): CircularDependencyViolation[] {
|
|
||||||
const violations: CircularDependencyViolation[] = []
|
|
||||||
const cycles = dependencyGraph.findCycles()
|
|
||||||
|
|
||||||
for (const cycle of cycles) {
|
|
||||||
const cycleChain = [...cycle, cycle[0]].join(" → ")
|
|
||||||
violations.push({
|
|
||||||
rule: RULES.CIRCULAR_DEPENDENCY,
|
|
||||||
message: `Circular dependency detected: ${cycleChain}`,
|
|
||||||
cycle,
|
|
||||||
severity: VIOLATION_SEVERITY_MAP.CIRCULAR_DEPENDENCY,
|
|
||||||
})
|
|
||||||
}
|
|
||||||
|
|
||||||
return violations
|
|
||||||
}
|
|
||||||
|
|
||||||
private detectNamingConventions(sourceFiles: SourceFile[]): NamingConventionViolation[] {
|
|
||||||
const violations: NamingConventionViolation[] = []
|
|
||||||
|
|
||||||
for (const file of sourceFiles) {
|
|
||||||
const namingViolations = this.namingConventionDetector.detectViolations(
|
|
||||||
file.path.filename,
|
|
||||||
file.layer,
|
|
||||||
file.path.relative,
|
|
||||||
)
|
|
||||||
|
|
||||||
for (const violation of namingViolations) {
|
|
||||||
violations.push({
|
|
||||||
rule: RULES.NAMING_CONVENTION,
|
|
||||||
type: violation.violationType,
|
|
||||||
fileName: violation.fileName,
|
|
||||||
layer: violation.layer,
|
|
||||||
file: violation.filePath,
|
|
||||||
expected: violation.expected,
|
|
||||||
actual: violation.actual,
|
|
||||||
message: violation.getMessage(),
|
|
||||||
suggestion: violation.suggestion,
|
|
||||||
severity: VIOLATION_SEVERITY_MAP.NAMING_CONVENTION,
|
|
||||||
})
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
return violations
|
|
||||||
}
|
|
||||||
|
|
||||||
private detectFrameworkLeaks(sourceFiles: SourceFile[]): FrameworkLeakViolation[] {
|
|
||||||
const violations: FrameworkLeakViolation[] = []
|
|
||||||
|
|
||||||
for (const file of sourceFiles) {
|
|
||||||
const leaks = this.frameworkLeakDetector.detectLeaks(
|
|
||||||
file.imports,
|
|
||||||
file.path.relative,
|
|
||||||
file.layer,
|
|
||||||
)
|
|
||||||
|
|
||||||
for (const leak of leaks) {
|
|
||||||
violations.push({
|
|
||||||
rule: RULES.FRAMEWORK_LEAK,
|
|
||||||
packageName: leak.packageName,
|
|
||||||
category: leak.category,
|
|
||||||
categoryDescription: leak.getCategoryDescription(),
|
|
||||||
file: file.path.relative,
|
|
||||||
layer: leak.layer,
|
|
||||||
line: leak.line,
|
|
||||||
message: leak.getMessage(),
|
|
||||||
suggestion: leak.getSuggestion(),
|
|
||||||
severity: VIOLATION_SEVERITY_MAP.FRAMEWORK_LEAK,
|
|
||||||
})
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
return violations
|
|
||||||
}
|
|
||||||
|
|
||||||
private detectEntityExposures(sourceFiles: SourceFile[]): EntityExposureViolation[] {
|
|
||||||
const violations: EntityExposureViolation[] = []
|
|
||||||
|
|
||||||
for (const file of sourceFiles) {
|
|
||||||
const exposures = this.entityExposureDetector.detectExposures(
|
|
||||||
file.content,
|
|
||||||
file.path.relative,
|
|
||||||
file.layer,
|
|
||||||
)
|
|
||||||
|
|
||||||
for (const exposure of exposures) {
|
|
||||||
violations.push({
|
|
||||||
rule: RULES.ENTITY_EXPOSURE,
|
|
||||||
entityName: exposure.entityName,
|
|
||||||
returnType: exposure.returnType,
|
|
||||||
file: file.path.relative,
|
|
||||||
layer: exposure.layer,
|
|
||||||
line: exposure.line,
|
|
||||||
methodName: exposure.methodName,
|
|
||||||
message: exposure.getMessage(),
|
|
||||||
suggestion: exposure.getSuggestion(),
|
|
||||||
severity: VIOLATION_SEVERITY_MAP.ENTITY_EXPOSURE,
|
|
||||||
})
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
return violations
|
|
||||||
}
|
|
||||||
|
|
||||||
private detectDependencyDirections(sourceFiles: SourceFile[]): DependencyDirectionViolation[] {
|
|
||||||
const violations: DependencyDirectionViolation[] = []
|
|
||||||
|
|
||||||
for (const file of sourceFiles) {
|
|
||||||
const directionViolations = this.dependencyDirectionDetector.detectViolations(
|
|
||||||
file.content,
|
|
||||||
file.path.relative,
|
|
||||||
file.layer,
|
|
||||||
)
|
|
||||||
|
|
||||||
for (const violation of directionViolations) {
|
|
||||||
violations.push({
|
|
||||||
rule: RULES.DEPENDENCY_DIRECTION,
|
|
||||||
fromLayer: violation.fromLayer,
|
|
||||||
toLayer: violation.toLayer,
|
|
||||||
importPath: violation.importPath,
|
|
||||||
file: file.path.relative,
|
|
||||||
line: violation.line,
|
|
||||||
message: violation.getMessage(),
|
|
||||||
suggestion: violation.getSuggestion(),
|
|
||||||
severity: VIOLATION_SEVERITY_MAP.DEPENDENCY_DIRECTION,
|
|
||||||
})
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
return violations
|
|
||||||
}
|
|
||||||
|
|
||||||
private detectRepositoryPatternViolations(
|
|
||||||
sourceFiles: SourceFile[],
|
|
||||||
): RepositoryPatternViolation[] {
|
|
||||||
const violations: RepositoryPatternViolation[] = []
|
|
||||||
|
|
||||||
for (const file of sourceFiles) {
|
|
||||||
const patternViolations = this.repositoryPatternDetector.detectViolations(
|
|
||||||
file.content,
|
|
||||||
file.path.relative,
|
|
||||||
file.layer,
|
|
||||||
)
|
|
||||||
|
|
||||||
for (const violation of patternViolations) {
|
|
||||||
violations.push({
|
|
||||||
rule: RULES.REPOSITORY_PATTERN,
|
|
||||||
violationType: violation.violationType as
|
|
||||||
| typeof REPOSITORY_VIOLATION_TYPES.ORM_TYPE_IN_INTERFACE
|
|
||||||
| typeof REPOSITORY_VIOLATION_TYPES.CONCRETE_REPOSITORY_IN_USE_CASE
|
|
||||||
| typeof REPOSITORY_VIOLATION_TYPES.NEW_REPOSITORY_IN_USE_CASE
|
|
||||||
| typeof REPOSITORY_VIOLATION_TYPES.NON_DOMAIN_METHOD_NAME,
|
|
||||||
file: file.path.relative,
|
|
||||||
layer: violation.layer,
|
|
||||||
line: violation.line,
|
|
||||||
details: violation.details,
|
|
||||||
message: violation.getMessage(),
|
|
||||||
suggestion: violation.getSuggestion(),
|
|
||||||
severity: VIOLATION_SEVERITY_MAP.REPOSITORY_PATTERN,
|
|
||||||
})
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
return violations
|
|
||||||
}
|
|
||||||
|
|
||||||
private calculateMetrics(
|
|
||||||
sourceFiles: SourceFile[],
|
|
||||||
totalFunctions: number,
|
|
||||||
_dependencyGraph: DependencyGraph,
|
|
||||||
): ProjectMetrics {
|
|
||||||
const layerDistribution: Record<string, number> = {}
|
|
||||||
let totalImports = 0
|
|
||||||
|
|
||||||
for (const file of sourceFiles) {
|
|
||||||
if (file.layer) {
|
|
||||||
layerDistribution[file.layer] = (layerDistribution[file.layer] || 0) + 1
|
|
||||||
}
|
|
||||||
totalImports += file.imports.length
|
|
||||||
}
|
|
||||||
|
|
||||||
return {
|
|
||||||
totalFiles: sourceFiles.length,
|
|
||||||
totalFunctions,
|
|
||||||
totalImports,
|
|
||||||
layerDistribution,
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
private sortBySeverity<T extends { severity: SeverityLevel }>(violations: T[]): T[] {
|
|
||||||
return violations.sort((a, b) => {
|
|
||||||
return SEVERITY_ORDER[a.severity] - SEVERITY_ORDER[b.severity]
|
|
||||||
})
|
|
||||||
}
|
|
||||||
}
|
}
|
||||||
|
|||||||
@@ -0,0 +1,87 @@
|
|||||||
|
import { SourceFile } from "../../../domain/entities/SourceFile"
|
||||||
|
import { DependencyGraph } from "../../../domain/entities/DependencyGraph"
|
||||||
|
import type {
|
||||||
|
AggregateBoundaryViolation,
|
||||||
|
AnalyzeProjectResponse,
|
||||||
|
AnemicModelViolation,
|
||||||
|
ArchitectureViolation,
|
||||||
|
CircularDependencyViolation,
|
||||||
|
DependencyDirectionViolation,
|
||||||
|
EntityExposureViolation,
|
||||||
|
FrameworkLeakViolation,
|
||||||
|
HardcodeViolation,
|
||||||
|
NamingConventionViolation,
|
||||||
|
ProjectMetrics,
|
||||||
|
RepositoryPatternViolation,
|
||||||
|
SecretViolation,
|
||||||
|
} from "../AnalyzeProject"
|
||||||
|
|
||||||
|
export interface AggregationRequest {
|
||||||
|
sourceFiles: SourceFile[]
|
||||||
|
dependencyGraph: DependencyGraph
|
||||||
|
totalFunctions: number
|
||||||
|
violations: ArchitectureViolation[]
|
||||||
|
hardcodeViolations: HardcodeViolation[]
|
||||||
|
circularDependencyViolations: CircularDependencyViolation[]
|
||||||
|
namingViolations: NamingConventionViolation[]
|
||||||
|
frameworkLeakViolations: FrameworkLeakViolation[]
|
||||||
|
entityExposureViolations: EntityExposureViolation[]
|
||||||
|
dependencyDirectionViolations: DependencyDirectionViolation[]
|
||||||
|
repositoryPatternViolations: RepositoryPatternViolation[]
|
||||||
|
aggregateBoundaryViolations: AggregateBoundaryViolation[]
|
||||||
|
secretViolations: SecretViolation[]
|
||||||
|
anemicModelViolations: AnemicModelViolation[]
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Pipeline step responsible for building final response DTO
|
||||||
|
*/
|
||||||
|
export class AggregateResults {
|
||||||
|
public execute(request: AggregationRequest): AnalyzeProjectResponse {
|
||||||
|
const metrics = this.calculateMetrics(
|
||||||
|
request.sourceFiles,
|
||||||
|
request.totalFunctions,
|
||||||
|
request.dependencyGraph,
|
||||||
|
)
|
||||||
|
|
||||||
|
return {
|
||||||
|
files: request.sourceFiles,
|
||||||
|
dependencyGraph: request.dependencyGraph,
|
||||||
|
violations: request.violations,
|
||||||
|
hardcodeViolations: request.hardcodeViolations,
|
||||||
|
circularDependencyViolations: request.circularDependencyViolations,
|
||||||
|
namingViolations: request.namingViolations,
|
||||||
|
frameworkLeakViolations: request.frameworkLeakViolations,
|
||||||
|
entityExposureViolations: request.entityExposureViolations,
|
||||||
|
dependencyDirectionViolations: request.dependencyDirectionViolations,
|
||||||
|
repositoryPatternViolations: request.repositoryPatternViolations,
|
||||||
|
aggregateBoundaryViolations: request.aggregateBoundaryViolations,
|
||||||
|
secretViolations: request.secretViolations,
|
||||||
|
anemicModelViolations: request.anemicModelViolations,
|
||||||
|
metrics,
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
private calculateMetrics(
|
||||||
|
sourceFiles: SourceFile[],
|
||||||
|
totalFunctions: number,
|
||||||
|
_dependencyGraph: DependencyGraph,
|
||||||
|
): ProjectMetrics {
|
||||||
|
const layerDistribution: Record<string, number> = {}
|
||||||
|
let totalImports = 0
|
||||||
|
|
||||||
|
for (const file of sourceFiles) {
|
||||||
|
if (file.layer) {
|
||||||
|
layerDistribution[file.layer] = (layerDistribution[file.layer] || 0) + 1
|
||||||
|
}
|
||||||
|
totalImports += file.imports.length
|
||||||
|
}
|
||||||
|
|
||||||
|
return {
|
||||||
|
totalFiles: sourceFiles.length,
|
||||||
|
totalFunctions,
|
||||||
|
totalImports,
|
||||||
|
layerDistribution,
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
@@ -0,0 +1,66 @@
|
|||||||
|
import { IFileScanner } from "../../../domain/services/IFileScanner"
|
||||||
|
import { SourceFile } from "../../../domain/entities/SourceFile"
|
||||||
|
import { ProjectPath } from "../../../domain/value-objects/ProjectPath"
|
||||||
|
import { REGEX_PATTERNS } from "../../../shared/constants"
|
||||||
|
|
||||||
|
export interface FileCollectionRequest {
|
||||||
|
rootDir: string
|
||||||
|
include?: string[]
|
||||||
|
exclude?: string[]
|
||||||
|
}
|
||||||
|
|
||||||
|
export interface FileCollectionResult {
|
||||||
|
sourceFiles: SourceFile[]
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Pipeline step responsible for file collection and basic parsing
|
||||||
|
*/
|
||||||
|
export class CollectFiles {
|
||||||
|
constructor(private readonly fileScanner: IFileScanner) {}
|
||||||
|
|
||||||
|
public async execute(request: FileCollectionRequest): Promise<FileCollectionResult> {
|
||||||
|
const filePaths = await this.fileScanner.scan({
|
||||||
|
rootDir: request.rootDir,
|
||||||
|
include: request.include,
|
||||||
|
exclude: request.exclude,
|
||||||
|
})
|
||||||
|
|
||||||
|
const sourceFiles: SourceFile[] = []
|
||||||
|
|
||||||
|
for (const filePath of filePaths) {
|
||||||
|
const content = await this.fileScanner.readFile(filePath)
|
||||||
|
const projectPath = ProjectPath.create(filePath, request.rootDir)
|
||||||
|
|
||||||
|
const imports = this.extractImports(content)
|
||||||
|
const exports = this.extractExports(content)
|
||||||
|
|
||||||
|
const sourceFile = new SourceFile(projectPath, content, imports, exports)
|
||||||
|
sourceFiles.push(sourceFile)
|
||||||
|
}
|
||||||
|
|
||||||
|
return { sourceFiles }
|
||||||
|
}
|
||||||
|
|
||||||
|
private extractImports(content: string): string[] {
|
||||||
|
const imports: string[] = []
|
||||||
|
let match
|
||||||
|
|
||||||
|
while ((match = REGEX_PATTERNS.IMPORT_STATEMENT.exec(content)) !== null) {
|
||||||
|
imports.push(match[1])
|
||||||
|
}
|
||||||
|
|
||||||
|
return imports
|
||||||
|
}
|
||||||
|
|
||||||
|
private extractExports(content: string): string[] {
|
||||||
|
const exports: string[] = []
|
||||||
|
let match
|
||||||
|
|
||||||
|
while ((match = REGEX_PATTERNS.EXPORT_STATEMENT.exec(content)) !== null) {
|
||||||
|
exports.push(match[1])
|
||||||
|
}
|
||||||
|
|
||||||
|
return exports
|
||||||
|
}
|
||||||
|
}
|
||||||
@@ -0,0 +1,444 @@
|
|||||||
|
import { IHardcodeDetector } from "../../../domain/services/IHardcodeDetector"
|
||||||
|
import { INamingConventionDetector } from "../../../domain/services/INamingConventionDetector"
|
||||||
|
import { IFrameworkLeakDetector } from "../../../domain/services/IFrameworkLeakDetector"
|
||||||
|
import { IEntityExposureDetector } from "../../../domain/services/IEntityExposureDetector"
|
||||||
|
import { IDependencyDirectionDetector } from "../../../domain/services/IDependencyDirectionDetector"
|
||||||
|
import { IRepositoryPatternDetector } from "../../../domain/services/RepositoryPatternDetectorService"
|
||||||
|
import { IAggregateBoundaryDetector } from "../../../domain/services/IAggregateBoundaryDetector"
|
||||||
|
import { ISecretDetector } from "../../../domain/services/ISecretDetector"
|
||||||
|
import { IAnemicModelDetector } from "../../../domain/services/IAnemicModelDetector"
|
||||||
|
import { SourceFile } from "../../../domain/entities/SourceFile"
|
||||||
|
import { DependencyGraph } from "../../../domain/entities/DependencyGraph"
|
||||||
|
import {
|
||||||
|
LAYERS,
|
||||||
|
REPOSITORY_VIOLATION_TYPES,
|
||||||
|
RULES,
|
||||||
|
SEVERITY_ORDER,
|
||||||
|
type SeverityLevel,
|
||||||
|
VIOLATION_SEVERITY_MAP,
|
||||||
|
} from "../../../shared/constants"
|
||||||
|
import type {
|
||||||
|
AggregateBoundaryViolation,
|
||||||
|
AnemicModelViolation,
|
||||||
|
ArchitectureViolation,
|
||||||
|
CircularDependencyViolation,
|
||||||
|
DependencyDirectionViolation,
|
||||||
|
EntityExposureViolation,
|
||||||
|
FrameworkLeakViolation,
|
||||||
|
HardcodeViolation,
|
||||||
|
NamingConventionViolation,
|
||||||
|
RepositoryPatternViolation,
|
||||||
|
SecretViolation,
|
||||||
|
} from "../AnalyzeProject"
|
||||||
|
|
||||||
|
export interface DetectionRequest {
|
||||||
|
sourceFiles: SourceFile[]
|
||||||
|
dependencyGraph: DependencyGraph
|
||||||
|
}
|
||||||
|
|
||||||
|
export interface DetectionResult {
|
||||||
|
violations: ArchitectureViolation[]
|
||||||
|
hardcodeViolations: HardcodeViolation[]
|
||||||
|
circularDependencyViolations: CircularDependencyViolation[]
|
||||||
|
namingViolations: NamingConventionViolation[]
|
||||||
|
frameworkLeakViolations: FrameworkLeakViolation[]
|
||||||
|
entityExposureViolations: EntityExposureViolation[]
|
||||||
|
dependencyDirectionViolations: DependencyDirectionViolation[]
|
||||||
|
repositoryPatternViolations: RepositoryPatternViolation[]
|
||||||
|
aggregateBoundaryViolations: AggregateBoundaryViolation[]
|
||||||
|
secretViolations: SecretViolation[]
|
||||||
|
anemicModelViolations: AnemicModelViolation[]
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Pipeline step responsible for running all detectors
|
||||||
|
*/
|
||||||
|
export class ExecuteDetection {
|
||||||
|
constructor(
|
||||||
|
private readonly hardcodeDetector: IHardcodeDetector,
|
||||||
|
private readonly namingConventionDetector: INamingConventionDetector,
|
||||||
|
private readonly frameworkLeakDetector: IFrameworkLeakDetector,
|
||||||
|
private readonly entityExposureDetector: IEntityExposureDetector,
|
||||||
|
private readonly dependencyDirectionDetector: IDependencyDirectionDetector,
|
||||||
|
private readonly repositoryPatternDetector: IRepositoryPatternDetector,
|
||||||
|
private readonly aggregateBoundaryDetector: IAggregateBoundaryDetector,
|
||||||
|
private readonly secretDetector: ISecretDetector,
|
||||||
|
private readonly anemicModelDetector: IAnemicModelDetector,
|
||||||
|
) {}
|
||||||
|
|
||||||
|
public async execute(request: DetectionRequest): Promise<DetectionResult> {
|
||||||
|
const secretViolations = await this.detectSecrets(request.sourceFiles)
|
||||||
|
|
||||||
|
return {
|
||||||
|
violations: this.sortBySeverity(this.detectViolations(request.sourceFiles)),
|
||||||
|
hardcodeViolations: this.sortBySeverity(this.detectHardcode(request.sourceFiles)),
|
||||||
|
circularDependencyViolations: this.sortBySeverity(
|
||||||
|
this.detectCircularDependencies(request.dependencyGraph),
|
||||||
|
),
|
||||||
|
namingViolations: this.sortBySeverity(
|
||||||
|
this.detectNamingConventions(request.sourceFiles),
|
||||||
|
),
|
||||||
|
frameworkLeakViolations: this.sortBySeverity(
|
||||||
|
this.detectFrameworkLeaks(request.sourceFiles),
|
||||||
|
),
|
||||||
|
entityExposureViolations: this.sortBySeverity(
|
||||||
|
this.detectEntityExposures(request.sourceFiles),
|
||||||
|
),
|
||||||
|
dependencyDirectionViolations: this.sortBySeverity(
|
||||||
|
this.detectDependencyDirections(request.sourceFiles),
|
||||||
|
),
|
||||||
|
repositoryPatternViolations: this.sortBySeverity(
|
||||||
|
this.detectRepositoryPatternViolations(request.sourceFiles),
|
||||||
|
),
|
||||||
|
aggregateBoundaryViolations: this.sortBySeverity(
|
||||||
|
this.detectAggregateBoundaryViolations(request.sourceFiles),
|
||||||
|
),
|
||||||
|
secretViolations: this.sortBySeverity(secretViolations),
|
||||||
|
anemicModelViolations: this.sortBySeverity(
|
||||||
|
this.detectAnemicModels(request.sourceFiles),
|
||||||
|
),
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
private detectViolations(sourceFiles: SourceFile[]): ArchitectureViolation[] {
|
||||||
|
const violations: ArchitectureViolation[] = []
|
||||||
|
|
||||||
|
const layerRules: Record<string, string[]> = {
|
||||||
|
[LAYERS.DOMAIN]: [LAYERS.SHARED],
|
||||||
|
[LAYERS.APPLICATION]: [LAYERS.DOMAIN, LAYERS.SHARED],
|
||||||
|
[LAYERS.INFRASTRUCTURE]: [LAYERS.DOMAIN, LAYERS.APPLICATION, LAYERS.SHARED],
|
||||||
|
[LAYERS.SHARED]: [],
|
||||||
|
}
|
||||||
|
|
||||||
|
for (const file of sourceFiles) {
|
||||||
|
if (!file.layer) {
|
||||||
|
continue
|
||||||
|
}
|
||||||
|
|
||||||
|
const allowedLayers = layerRules[file.layer]
|
||||||
|
|
||||||
|
for (const imp of file.imports) {
|
||||||
|
const importedLayer = this.detectLayerFromImport(imp)
|
||||||
|
|
||||||
|
if (
|
||||||
|
importedLayer &&
|
||||||
|
importedLayer !== file.layer &&
|
||||||
|
!allowedLayers.includes(importedLayer)
|
||||||
|
) {
|
||||||
|
violations.push({
|
||||||
|
rule: RULES.CLEAN_ARCHITECTURE,
|
||||||
|
message: `Layer "${file.layer}" cannot import from "${importedLayer}"`,
|
||||||
|
file: file.path.relative,
|
||||||
|
severity: VIOLATION_SEVERITY_MAP.ARCHITECTURE,
|
||||||
|
})
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return violations
|
||||||
|
}
|
||||||
|
|
||||||
|
private detectLayerFromImport(importPath: string): string | undefined {
|
||||||
|
const layers = Object.values(LAYERS)
|
||||||
|
|
||||||
|
for (const layer of layers) {
|
||||||
|
if (importPath.toLowerCase().includes(layer)) {
|
||||||
|
return layer
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return undefined
|
||||||
|
}
|
||||||
|
|
||||||
|
private detectHardcode(sourceFiles: SourceFile[]): HardcodeViolation[] {
|
||||||
|
const violations: HardcodeViolation[] = []
|
||||||
|
|
||||||
|
for (const file of sourceFiles) {
|
||||||
|
const hardcodedValues = this.hardcodeDetector.detectAll(
|
||||||
|
file.content,
|
||||||
|
file.path.relative,
|
||||||
|
)
|
||||||
|
|
||||||
|
for (const hardcoded of hardcodedValues) {
|
||||||
|
violations.push({
|
||||||
|
rule: RULES.HARDCODED_VALUE,
|
||||||
|
type: hardcoded.type,
|
||||||
|
value: hardcoded.value,
|
||||||
|
file: file.path.relative,
|
||||||
|
line: hardcoded.line,
|
||||||
|
column: hardcoded.column,
|
||||||
|
context: hardcoded.context,
|
||||||
|
suggestion: {
|
||||||
|
constantName: hardcoded.suggestConstantName(),
|
||||||
|
location: hardcoded.suggestLocation(file.layer),
|
||||||
|
},
|
||||||
|
severity: VIOLATION_SEVERITY_MAP.HARDCODE,
|
||||||
|
})
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return violations
|
||||||
|
}
|
||||||
|
|
||||||
|
private detectCircularDependencies(
|
||||||
|
dependencyGraph: DependencyGraph,
|
||||||
|
): CircularDependencyViolation[] {
|
||||||
|
const violations: CircularDependencyViolation[] = []
|
||||||
|
const cycles = dependencyGraph.findCycles()
|
||||||
|
|
||||||
|
for (const cycle of cycles) {
|
||||||
|
const cycleChain = [...cycle, cycle[0]].join(" → ")
|
||||||
|
violations.push({
|
||||||
|
rule: RULES.CIRCULAR_DEPENDENCY,
|
||||||
|
message: `Circular dependency detected: ${cycleChain}`,
|
||||||
|
cycle,
|
||||||
|
severity: VIOLATION_SEVERITY_MAP.CIRCULAR_DEPENDENCY,
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
return violations
|
||||||
|
}
|
||||||
|
|
||||||
|
private detectNamingConventions(sourceFiles: SourceFile[]): NamingConventionViolation[] {
|
||||||
|
const violations: NamingConventionViolation[] = []
|
||||||
|
|
||||||
|
for (const file of sourceFiles) {
|
||||||
|
const namingViolations = this.namingConventionDetector.detectViolations(
|
||||||
|
file.path.filename,
|
||||||
|
file.layer,
|
||||||
|
file.path.relative,
|
||||||
|
)
|
||||||
|
|
||||||
|
for (const violation of namingViolations) {
|
||||||
|
violations.push({
|
||||||
|
rule: RULES.NAMING_CONVENTION,
|
||||||
|
type: violation.violationType,
|
||||||
|
fileName: violation.fileName,
|
||||||
|
layer: violation.layer,
|
||||||
|
file: violation.filePath,
|
||||||
|
expected: violation.expected,
|
||||||
|
actual: violation.actual,
|
||||||
|
message: violation.getMessage(),
|
||||||
|
suggestion: violation.suggestion,
|
||||||
|
severity: VIOLATION_SEVERITY_MAP.NAMING_CONVENTION,
|
||||||
|
})
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return violations
|
||||||
|
}
|
||||||
|
|
||||||
|
private detectFrameworkLeaks(sourceFiles: SourceFile[]): FrameworkLeakViolation[] {
|
||||||
|
const violations: FrameworkLeakViolation[] = []
|
||||||
|
|
||||||
|
for (const file of sourceFiles) {
|
||||||
|
const leaks = this.frameworkLeakDetector.detectLeaks(
|
||||||
|
file.imports,
|
||||||
|
file.path.relative,
|
||||||
|
file.layer,
|
||||||
|
)
|
||||||
|
|
||||||
|
for (const leak of leaks) {
|
||||||
|
violations.push({
|
||||||
|
rule: RULES.FRAMEWORK_LEAK,
|
||||||
|
packageName: leak.packageName,
|
||||||
|
category: leak.category,
|
||||||
|
categoryDescription: leak.getCategoryDescription(),
|
||||||
|
file: file.path.relative,
|
||||||
|
layer: leak.layer,
|
||||||
|
line: leak.line,
|
||||||
|
message: leak.getMessage(),
|
||||||
|
suggestion: leak.getSuggestion(),
|
||||||
|
severity: VIOLATION_SEVERITY_MAP.FRAMEWORK_LEAK,
|
||||||
|
})
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return violations
|
||||||
|
}
|
||||||
|
|
||||||
|
private detectEntityExposures(sourceFiles: SourceFile[]): EntityExposureViolation[] {
|
||||||
|
const violations: EntityExposureViolation[] = []
|
||||||
|
|
||||||
|
for (const file of sourceFiles) {
|
||||||
|
const exposures = this.entityExposureDetector.detectExposures(
|
||||||
|
file.content,
|
||||||
|
file.path.relative,
|
||||||
|
file.layer,
|
||||||
|
)
|
||||||
|
|
||||||
|
for (const exposure of exposures) {
|
||||||
|
violations.push({
|
||||||
|
rule: RULES.ENTITY_EXPOSURE,
|
||||||
|
entityName: exposure.entityName,
|
||||||
|
returnType: exposure.returnType,
|
||||||
|
file: file.path.relative,
|
||||||
|
layer: exposure.layer,
|
||||||
|
line: exposure.line,
|
||||||
|
methodName: exposure.methodName,
|
||||||
|
message: exposure.getMessage(),
|
||||||
|
suggestion: exposure.getSuggestion(),
|
||||||
|
severity: VIOLATION_SEVERITY_MAP.ENTITY_EXPOSURE,
|
||||||
|
})
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return violations
|
||||||
|
}
|
||||||
|
|
||||||
|
private detectDependencyDirections(sourceFiles: SourceFile[]): DependencyDirectionViolation[] {
|
||||||
|
const violations: DependencyDirectionViolation[] = []
|
||||||
|
|
||||||
|
for (const file of sourceFiles) {
|
||||||
|
const directionViolations = this.dependencyDirectionDetector.detectViolations(
|
||||||
|
file.content,
|
||||||
|
file.path.relative,
|
||||||
|
file.layer,
|
||||||
|
)
|
||||||
|
|
||||||
|
for (const violation of directionViolations) {
|
||||||
|
violations.push({
|
||||||
|
rule: RULES.DEPENDENCY_DIRECTION,
|
||||||
|
fromLayer: violation.fromLayer,
|
||||||
|
toLayer: violation.toLayer,
|
||||||
|
importPath: violation.importPath,
|
||||||
|
file: file.path.relative,
|
||||||
|
line: violation.line,
|
||||||
|
message: violation.getMessage(),
|
||||||
|
suggestion: violation.getSuggestion(),
|
||||||
|
severity: VIOLATION_SEVERITY_MAP.DEPENDENCY_DIRECTION,
|
||||||
|
})
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return violations
|
||||||
|
}
|
||||||
|
|
||||||
|
private detectRepositoryPatternViolations(
|
||||||
|
sourceFiles: SourceFile[],
|
||||||
|
): RepositoryPatternViolation[] {
|
||||||
|
const violations: RepositoryPatternViolation[] = []
|
||||||
|
|
||||||
|
for (const file of sourceFiles) {
|
||||||
|
const patternViolations = this.repositoryPatternDetector.detectViolations(
|
||||||
|
file.content,
|
||||||
|
file.path.relative,
|
||||||
|
file.layer,
|
||||||
|
)
|
||||||
|
|
||||||
|
for (const violation of patternViolations) {
|
||||||
|
violations.push({
|
||||||
|
rule: RULES.REPOSITORY_PATTERN,
|
||||||
|
violationType: violation.violationType as
|
||||||
|
| typeof REPOSITORY_VIOLATION_TYPES.ORM_TYPE_IN_INTERFACE
|
||||||
|
| typeof REPOSITORY_VIOLATION_TYPES.CONCRETE_REPOSITORY_IN_USE_CASE
|
||||||
|
| typeof REPOSITORY_VIOLATION_TYPES.NEW_REPOSITORY_IN_USE_CASE
|
||||||
|
| typeof REPOSITORY_VIOLATION_TYPES.NON_DOMAIN_METHOD_NAME,
|
||||||
|
file: file.path.relative,
|
||||||
|
layer: violation.layer,
|
||||||
|
line: violation.line,
|
||||||
|
details: violation.details,
|
||||||
|
message: violation.getMessage(),
|
||||||
|
suggestion: violation.getSuggestion(),
|
||||||
|
severity: VIOLATION_SEVERITY_MAP.REPOSITORY_PATTERN,
|
||||||
|
})
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return violations
|
||||||
|
}
|
||||||
|
|
||||||
|
private detectAggregateBoundaryViolations(
|
||||||
|
sourceFiles: SourceFile[],
|
||||||
|
): AggregateBoundaryViolation[] {
|
||||||
|
const violations: AggregateBoundaryViolation[] = []
|
||||||
|
|
||||||
|
for (const file of sourceFiles) {
|
||||||
|
const boundaryViolations = this.aggregateBoundaryDetector.detectViolations(
|
||||||
|
file.content,
|
||||||
|
file.path.relative,
|
||||||
|
file.layer,
|
||||||
|
)
|
||||||
|
|
||||||
|
for (const violation of boundaryViolations) {
|
||||||
|
violations.push({
|
||||||
|
rule: RULES.AGGREGATE_BOUNDARY,
|
||||||
|
fromAggregate: violation.fromAggregate,
|
||||||
|
toAggregate: violation.toAggregate,
|
||||||
|
entityName: violation.entityName,
|
||||||
|
importPath: violation.importPath,
|
||||||
|
file: file.path.relative,
|
||||||
|
line: violation.line,
|
||||||
|
message: violation.getMessage(),
|
||||||
|
suggestion: violation.getSuggestion(),
|
||||||
|
severity: VIOLATION_SEVERITY_MAP.AGGREGATE_BOUNDARY,
|
||||||
|
})
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return violations
|
||||||
|
}
|
||||||
|
|
||||||
|
private async detectSecrets(sourceFiles: SourceFile[]): Promise<SecretViolation[]> {
|
||||||
|
const violations: SecretViolation[] = []
|
||||||
|
|
||||||
|
for (const file of sourceFiles) {
|
||||||
|
const secretViolations = await this.secretDetector.detectAll(
|
||||||
|
file.content,
|
||||||
|
file.path.relative,
|
||||||
|
)
|
||||||
|
|
||||||
|
for (const secret of secretViolations) {
|
||||||
|
violations.push({
|
||||||
|
rule: RULES.SECRET_EXPOSURE,
|
||||||
|
secretType: secret.secretType,
|
||||||
|
file: file.path.relative,
|
||||||
|
line: secret.line,
|
||||||
|
column: secret.column,
|
||||||
|
message: secret.getMessage(),
|
||||||
|
suggestion: secret.getSuggestion(),
|
||||||
|
severity: "critical",
|
||||||
|
})
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return violations
|
||||||
|
}
|
||||||
|
|
||||||
|
private detectAnemicModels(sourceFiles: SourceFile[]): AnemicModelViolation[] {
|
||||||
|
const violations: AnemicModelViolation[] = []
|
||||||
|
|
||||||
|
for (const file of sourceFiles) {
|
||||||
|
const anemicModels = this.anemicModelDetector.detectAnemicModels(
|
||||||
|
file.content,
|
||||||
|
file.path.relative,
|
||||||
|
file.layer,
|
||||||
|
)
|
||||||
|
|
||||||
|
for (const anemicModel of anemicModels) {
|
||||||
|
violations.push({
|
||||||
|
rule: RULES.ANEMIC_MODEL,
|
||||||
|
className: anemicModel.className,
|
||||||
|
file: file.path.relative,
|
||||||
|
layer: anemicModel.layer,
|
||||||
|
line: anemicModel.line,
|
||||||
|
methodCount: anemicModel.methodCount,
|
||||||
|
propertyCount: anemicModel.propertyCount,
|
||||||
|
hasOnlyGettersSetters: anemicModel.hasOnlyGettersSetters,
|
||||||
|
hasPublicSetters: anemicModel.hasPublicSetters,
|
||||||
|
message: anemicModel.getMessage(),
|
||||||
|
suggestion: anemicModel.getSuggestion(),
|
||||||
|
severity: VIOLATION_SEVERITY_MAP.ANEMIC_MODEL,
|
||||||
|
})
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return violations
|
||||||
|
}
|
||||||
|
|
||||||
|
private sortBySeverity<T extends { severity: SeverityLevel }>(violations: T[]): T[] {
|
||||||
|
return violations.sort((a, b) => {
|
||||||
|
return SEVERITY_ORDER[a.severity] - SEVERITY_ORDER[b.severity]
|
||||||
|
})
|
||||||
|
}
|
||||||
|
}
|
||||||
@@ -0,0 +1,51 @@
|
|||||||
|
import { ICodeParser } from "../../../domain/services/ICodeParser"
|
||||||
|
import { SourceFile } from "../../../domain/entities/SourceFile"
|
||||||
|
import { DependencyGraph } from "../../../domain/entities/DependencyGraph"
|
||||||
|
|
||||||
|
export interface ParsingRequest {
|
||||||
|
sourceFiles: SourceFile[]
|
||||||
|
rootDir: string
|
||||||
|
}
|
||||||
|
|
||||||
|
export interface ParsingResult {
|
||||||
|
dependencyGraph: DependencyGraph
|
||||||
|
totalFunctions: number
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Pipeline step responsible for AST parsing and dependency graph construction
|
||||||
|
*/
|
||||||
|
export class ParseSourceFiles {
|
||||||
|
constructor(private readonly codeParser: ICodeParser) {}
|
||||||
|
|
||||||
|
public execute(request: ParsingRequest): ParsingResult {
|
||||||
|
const dependencyGraph = new DependencyGraph()
|
||||||
|
let totalFunctions = 0
|
||||||
|
|
||||||
|
for (const sourceFile of request.sourceFiles) {
|
||||||
|
dependencyGraph.addFile(sourceFile)
|
||||||
|
|
||||||
|
if (sourceFile.path.isTypeScript()) {
|
||||||
|
const tree = this.codeParser.parseTypeScript(sourceFile.content)
|
||||||
|
const functions = this.codeParser.extractFunctions(tree)
|
||||||
|
totalFunctions += functions.length
|
||||||
|
}
|
||||||
|
|
||||||
|
for (const imp of sourceFile.imports) {
|
||||||
|
dependencyGraph.addDependency(
|
||||||
|
sourceFile.path.relative,
|
||||||
|
this.resolveImportPath(imp, sourceFile.path.relative, request.rootDir),
|
||||||
|
)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return { dependencyGraph, totalFunctions }
|
||||||
|
}
|
||||||
|
|
||||||
|
private resolveImportPath(importPath: string, _currentFile: string, _rootDir: string): string {
|
||||||
|
if (importPath.startsWith(".")) {
|
||||||
|
return importPath
|
||||||
|
}
|
||||||
|
return importPath
|
||||||
|
}
|
||||||
|
}
|
||||||
@@ -150,4 +150,30 @@ export const CLI_HELP_TEXT = {
|
|||||||
FIX_REPOSITORY:
|
FIX_REPOSITORY:
|
||||||
" Repository pattern → Create IUserRepository in domain, implement in infra\n\n",
|
" Repository pattern → Create IUserRepository in domain, implement in infra\n\n",
|
||||||
FOOTER: "Each violation includes a 💡 Suggestion with specific fix instructions.\n",
|
FOOTER: "Each violation includes a 💡 Suggestion with specific fix instructions.\n",
|
||||||
|
AI_AGENT_HEADER: "AI AGENT INSTRUCTIONS:\n",
|
||||||
|
AI_AGENT_INTRO:
|
||||||
|
" When an AI coding assistant (Claude, Copilot, Cursor, etc.) uses Guardian:\n\n",
|
||||||
|
AI_AGENT_STEP1: " STEP 1: Run initial scan\n",
|
||||||
|
AI_AGENT_STEP1_CMD: " $ guardian check ./src --only-critical --limit 5\n\n",
|
||||||
|
AI_AGENT_STEP2: " STEP 2: For each violation in output:\n",
|
||||||
|
AI_AGENT_STEP2_DETAIL:
|
||||||
|
" - Read the file at reported location (file:line:column)\n" +
|
||||||
|
" - Apply the 💡 Suggestion provided\n" +
|
||||||
|
" - The suggestion contains exact fix instructions\n\n",
|
||||||
|
AI_AGENT_STEP3: " STEP 3: After fixing, verify:\n",
|
||||||
|
AI_AGENT_STEP3_CMD: " $ guardian check ./src --only-critical\n\n",
|
||||||
|
AI_AGENT_STEP4: " STEP 4: Expand scope progressively:\n",
|
||||||
|
AI_AGENT_STEP4_CMDS:
|
||||||
|
" $ guardian check ./src --min-severity high # Fix HIGH issues\n" +
|
||||||
|
" $ guardian check ./src --min-severity medium # Fix MEDIUM issues\n" +
|
||||||
|
" $ guardian check ./src # Full scan\n\n",
|
||||||
|
AI_AGENT_OUTPUT: " OUTPUT FORMAT (parse this):\n",
|
||||||
|
AI_AGENT_OUTPUT_DETAIL:
|
||||||
|
" <index>. <file>:<line>:<column>\n" +
|
||||||
|
" Severity: <emoji> <LEVEL>\n" +
|
||||||
|
" Type: <violation-type>\n" +
|
||||||
|
" Value: <problematic-value>\n" +
|
||||||
|
" Context: <code-snippet>\n" +
|
||||||
|
" 💡 Suggestion: <exact-fix-instruction>\n\n",
|
||||||
|
AI_AGENT_PRIORITY: " PRIORITY ORDER: CRITICAL → HIGH → MEDIUM → LOW\n\n",
|
||||||
} as const
|
} as const
|
||||||
|
|||||||
235
packages/guardian/src/cli/formatters/OutputFormatter.ts
Normal file
235
packages/guardian/src/cli/formatters/OutputFormatter.ts
Normal file
@@ -0,0 +1,235 @@
|
|||||||
|
import { SEVERITY_LEVELS, type SeverityLevel } from "../../shared/constants"
|
||||||
|
import type {
|
||||||
|
AggregateBoundaryViolation,
|
||||||
|
AnemicModelViolation,
|
||||||
|
ArchitectureViolation,
|
||||||
|
CircularDependencyViolation,
|
||||||
|
DependencyDirectionViolation,
|
||||||
|
EntityExposureViolation,
|
||||||
|
FrameworkLeakViolation,
|
||||||
|
HardcodeViolation,
|
||||||
|
NamingConventionViolation,
|
||||||
|
RepositoryPatternViolation,
|
||||||
|
SecretViolation,
|
||||||
|
} from "../../application/use-cases/AnalyzeProject"
|
||||||
|
import { SEVERITY_DISPLAY_LABELS, SEVERITY_SECTION_HEADERS } from "../constants"
|
||||||
|
import { ViolationGrouper } from "../groupers/ViolationGrouper"
|
||||||
|
|
||||||
|
const SEVERITY_LABELS: Record<SeverityLevel, string> = {
|
||||||
|
[SEVERITY_LEVELS.CRITICAL]: SEVERITY_DISPLAY_LABELS.CRITICAL,
|
||||||
|
[SEVERITY_LEVELS.HIGH]: SEVERITY_DISPLAY_LABELS.HIGH,
|
||||||
|
[SEVERITY_LEVELS.MEDIUM]: SEVERITY_DISPLAY_LABELS.MEDIUM,
|
||||||
|
[SEVERITY_LEVELS.LOW]: SEVERITY_DISPLAY_LABELS.LOW,
|
||||||
|
}
|
||||||
|
|
||||||
|
const SEVERITY_HEADER: Record<SeverityLevel, string> = {
|
||||||
|
[SEVERITY_LEVELS.CRITICAL]: SEVERITY_SECTION_HEADERS.CRITICAL,
|
||||||
|
[SEVERITY_LEVELS.HIGH]: SEVERITY_SECTION_HEADERS.HIGH,
|
||||||
|
[SEVERITY_LEVELS.MEDIUM]: SEVERITY_SECTION_HEADERS.MEDIUM,
|
||||||
|
[SEVERITY_LEVELS.LOW]: SEVERITY_SECTION_HEADERS.LOW,
|
||||||
|
}
|
||||||
|
|
||||||
|
export class OutputFormatter {
|
||||||
|
private readonly grouper = new ViolationGrouper()
|
||||||
|
|
||||||
|
displayGroupedViolations<T extends { severity: SeverityLevel }>(
|
||||||
|
violations: T[],
|
||||||
|
displayFn: (v: T, index: number) => void,
|
||||||
|
limit?: number,
|
||||||
|
): void {
|
||||||
|
const grouped = this.grouper.groupBySeverity(violations)
|
||||||
|
const severities: SeverityLevel[] = [
|
||||||
|
SEVERITY_LEVELS.CRITICAL,
|
||||||
|
SEVERITY_LEVELS.HIGH,
|
||||||
|
SEVERITY_LEVELS.MEDIUM,
|
||||||
|
SEVERITY_LEVELS.LOW,
|
||||||
|
]
|
||||||
|
|
||||||
|
let totalDisplayed = 0
|
||||||
|
const totalAvailable = violations.length
|
||||||
|
|
||||||
|
for (const severity of severities) {
|
||||||
|
const items = grouped.get(severity)
|
||||||
|
if (items && items.length > 0) {
|
||||||
|
console.warn(SEVERITY_HEADER[severity])
|
||||||
|
console.warn(`Found ${String(items.length)} issue(s)\n`)
|
||||||
|
|
||||||
|
const itemsToDisplay =
|
||||||
|
limit !== undefined ? items.slice(0, limit - totalDisplayed) : items
|
||||||
|
itemsToDisplay.forEach((item, index) => {
|
||||||
|
displayFn(item, totalDisplayed + index)
|
||||||
|
})
|
||||||
|
totalDisplayed += itemsToDisplay.length
|
||||||
|
|
||||||
|
if (limit !== undefined && totalDisplayed >= limit) {
|
||||||
|
break
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
if (limit !== undefined && totalAvailable > limit) {
|
||||||
|
console.warn(
|
||||||
|
`\n⚠️ Showing first ${String(limit)} of ${String(totalAvailable)} issues (use --limit to adjust)\n`,
|
||||||
|
)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
formatArchitectureViolation(v: ArchitectureViolation, index: number): void {
|
||||||
|
console.log(`${String(index + 1)}. ${v.file}`)
|
||||||
|
console.log(` Severity: ${SEVERITY_LABELS[v.severity]}`)
|
||||||
|
console.log(` Rule: ${v.rule}`)
|
||||||
|
console.log(` ${v.message}`)
|
||||||
|
console.log("")
|
||||||
|
}
|
||||||
|
|
||||||
|
formatCircularDependency(cd: CircularDependencyViolation, index: number): void {
|
||||||
|
console.log(`${String(index + 1)}. ${cd.message}`)
|
||||||
|
console.log(` Severity: ${SEVERITY_LABELS[cd.severity]}`)
|
||||||
|
console.log(" Cycle path:")
|
||||||
|
cd.cycle.forEach((file, i) => {
|
||||||
|
console.log(` ${String(i + 1)}. ${file}`)
|
||||||
|
})
|
||||||
|
console.log(` ${String(cd.cycle.length + 1)}. ${cd.cycle[0]} (back to start)`)
|
||||||
|
console.log("")
|
||||||
|
}
|
||||||
|
|
||||||
|
formatNamingViolation(nc: NamingConventionViolation, index: number): void {
|
||||||
|
console.log(`${String(index + 1)}. ${nc.file}`)
|
||||||
|
console.log(` Severity: ${SEVERITY_LABELS[nc.severity]}`)
|
||||||
|
console.log(` File: ${nc.fileName}`)
|
||||||
|
console.log(` Layer: ${nc.layer}`)
|
||||||
|
console.log(` Type: ${nc.type}`)
|
||||||
|
console.log(` Message: ${nc.message}`)
|
||||||
|
if (nc.suggestion) {
|
||||||
|
console.log(` 💡 Suggestion: ${nc.suggestion}`)
|
||||||
|
}
|
||||||
|
console.log("")
|
||||||
|
}
|
||||||
|
|
||||||
|
formatFrameworkLeak(fl: FrameworkLeakViolation, index: number): void {
|
||||||
|
console.log(`${String(index + 1)}. ${fl.file}`)
|
||||||
|
console.log(` Severity: ${SEVERITY_LABELS[fl.severity]}`)
|
||||||
|
console.log(` Package: ${fl.packageName}`)
|
||||||
|
console.log(` Category: ${fl.categoryDescription}`)
|
||||||
|
console.log(` Layer: ${fl.layer}`)
|
||||||
|
console.log(` Rule: ${fl.rule}`)
|
||||||
|
console.log(` ${fl.message}`)
|
||||||
|
console.log(` 💡 Suggestion: ${fl.suggestion}`)
|
||||||
|
console.log("")
|
||||||
|
}
|
||||||
|
|
||||||
|
formatEntityExposure(ee: EntityExposureViolation, index: number): void {
|
||||||
|
const location = ee.line ? `${ee.file}:${String(ee.line)}` : ee.file
|
||||||
|
console.log(`${String(index + 1)}. ${location}`)
|
||||||
|
console.log(` Severity: ${SEVERITY_LABELS[ee.severity]}`)
|
||||||
|
console.log(` Entity: ${ee.entityName}`)
|
||||||
|
console.log(` Return Type: ${ee.returnType}`)
|
||||||
|
if (ee.methodName) {
|
||||||
|
console.log(` Method: ${ee.methodName}`)
|
||||||
|
}
|
||||||
|
console.log(` Layer: ${ee.layer}`)
|
||||||
|
console.log(` Rule: ${ee.rule}`)
|
||||||
|
console.log(` ${ee.message}`)
|
||||||
|
console.log(" 💡 Suggestion:")
|
||||||
|
ee.suggestion.split("\n").forEach((line) => {
|
||||||
|
if (line.trim()) {
|
||||||
|
console.log(` ${line}`)
|
||||||
|
}
|
||||||
|
})
|
||||||
|
console.log("")
|
||||||
|
}
|
||||||
|
|
||||||
|
formatDependencyDirection(dd: DependencyDirectionViolation, index: number): void {
|
||||||
|
console.log(`${String(index + 1)}. ${dd.file}`)
|
||||||
|
console.log(` Severity: ${SEVERITY_LABELS[dd.severity]}`)
|
||||||
|
console.log(` From Layer: ${dd.fromLayer}`)
|
||||||
|
console.log(` To Layer: ${dd.toLayer}`)
|
||||||
|
console.log(` Import: ${dd.importPath}`)
|
||||||
|
console.log(` ${dd.message}`)
|
||||||
|
console.log(` 💡 Suggestion: ${dd.suggestion}`)
|
||||||
|
console.log("")
|
||||||
|
}
|
||||||
|
|
||||||
|
formatRepositoryPattern(rp: RepositoryPatternViolation, index: number): void {
|
||||||
|
console.log(`${String(index + 1)}. ${rp.file}`)
|
||||||
|
console.log(` Severity: ${SEVERITY_LABELS[rp.severity]}`)
|
||||||
|
console.log(` Layer: ${rp.layer}`)
|
||||||
|
console.log(` Type: ${rp.violationType}`)
|
||||||
|
console.log(` Details: ${rp.details}`)
|
||||||
|
console.log(` ${rp.message}`)
|
||||||
|
console.log(` 💡 Suggestion: ${rp.suggestion}`)
|
||||||
|
console.log("")
|
||||||
|
}
|
||||||
|
|
||||||
|
formatAggregateBoundary(ab: AggregateBoundaryViolation, index: number): void {
|
||||||
|
const location = ab.line ? `${ab.file}:${String(ab.line)}` : ab.file
|
||||||
|
console.log(`${String(index + 1)}. ${location}`)
|
||||||
|
console.log(` Severity: ${SEVERITY_LABELS[ab.severity]}`)
|
||||||
|
console.log(` From Aggregate: ${ab.fromAggregate}`)
|
||||||
|
console.log(` To Aggregate: ${ab.toAggregate}`)
|
||||||
|
console.log(` Entity: ${ab.entityName}`)
|
||||||
|
console.log(` Import: ${ab.importPath}`)
|
||||||
|
console.log(` ${ab.message}`)
|
||||||
|
console.log(" 💡 Suggestion:")
|
||||||
|
ab.suggestion.split("\n").forEach((line) => {
|
||||||
|
if (line.trim()) {
|
||||||
|
console.log(` ${line}`)
|
||||||
|
}
|
||||||
|
})
|
||||||
|
console.log("")
|
||||||
|
}
|
||||||
|
|
||||||
|
formatSecretViolation(sv: SecretViolation, index: number): void {
|
||||||
|
const location = `${sv.file}:${String(sv.line)}:${String(sv.column)}`
|
||||||
|
console.log(`${String(index + 1)}. ${location}`)
|
||||||
|
console.log(` Severity: ${SEVERITY_LABELS[sv.severity]} ⚠️`)
|
||||||
|
console.log(` Secret Type: ${sv.secretType}`)
|
||||||
|
console.log(` ${sv.message}`)
|
||||||
|
console.log(" 🔐 CRITICAL: Rotate this secret immediately!")
|
||||||
|
console.log(" 💡 Suggestion:")
|
||||||
|
sv.suggestion.split("\n").forEach((line) => {
|
||||||
|
if (line.trim()) {
|
||||||
|
console.log(` ${line}`)
|
||||||
|
}
|
||||||
|
})
|
||||||
|
console.log("")
|
||||||
|
}
|
||||||
|
|
||||||
|
formatHardcodeViolation(hc: HardcodeViolation, index: number): void {
|
||||||
|
console.log(`${String(index + 1)}. ${hc.file}:${String(hc.line)}:${String(hc.column)}`)
|
||||||
|
console.log(` Severity: ${SEVERITY_LABELS[hc.severity]}`)
|
||||||
|
console.log(` Type: ${hc.type}`)
|
||||||
|
console.log(` Value: ${JSON.stringify(hc.value)}`)
|
||||||
|
console.log(` Context: ${hc.context.trim()}`)
|
||||||
|
console.log(` 💡 Suggested: ${hc.suggestion.constantName}`)
|
||||||
|
console.log(` 📁 Location: ${hc.suggestion.location}`)
|
||||||
|
console.log("")
|
||||||
|
}
|
||||||
|
|
||||||
|
formatAnemicModelViolation(am: AnemicModelViolation, index: number): void {
|
||||||
|
const location = am.line ? `${am.file}:${String(am.line)}` : am.file
|
||||||
|
console.log(`${String(index + 1)}. ${location}`)
|
||||||
|
console.log(` Severity: ${SEVERITY_LABELS[am.severity]}`)
|
||||||
|
console.log(` Class: ${am.className}`)
|
||||||
|
console.log(` Layer: ${am.layer}`)
|
||||||
|
console.log(
|
||||||
|
` Methods: ${String(am.methodCount)} | Properties: ${String(am.propertyCount)}`,
|
||||||
|
)
|
||||||
|
|
||||||
|
if (am.hasPublicSetters) {
|
||||||
|
console.log(" ⚠️ Has public setters (DDD anti-pattern)")
|
||||||
|
}
|
||||||
|
if (am.hasOnlyGettersSetters) {
|
||||||
|
console.log(" ⚠️ Only getters/setters (no business logic)")
|
||||||
|
}
|
||||||
|
|
||||||
|
console.log(` ${am.message}`)
|
||||||
|
console.log(" 💡 Suggestion:")
|
||||||
|
am.suggestion.split("\n").forEach((line) => {
|
||||||
|
if (line.trim()) {
|
||||||
|
console.log(` ${line}`)
|
||||||
|
}
|
||||||
|
})
|
||||||
|
console.log("")
|
||||||
|
}
|
||||||
|
}
|
||||||
59
packages/guardian/src/cli/formatters/StatisticsFormatter.ts
Normal file
59
packages/guardian/src/cli/formatters/StatisticsFormatter.ts
Normal file
@@ -0,0 +1,59 @@
|
|||||||
|
import { CLI_LABELS, CLI_MESSAGES } from "../constants"
|
||||||
|
|
||||||
|
interface ProjectMetrics {
|
||||||
|
totalFiles: number
|
||||||
|
totalFunctions: number
|
||||||
|
totalImports: number
|
||||||
|
layerDistribution: Record<string, number>
|
||||||
|
}
|
||||||
|
|
||||||
|
export class StatisticsFormatter {
|
||||||
|
displayMetrics(metrics: ProjectMetrics): void {
|
||||||
|
console.log(CLI_MESSAGES.METRICS_HEADER)
|
||||||
|
console.log(` ${CLI_LABELS.FILES_ANALYZED} ${String(metrics.totalFiles)}`)
|
||||||
|
console.log(` ${CLI_LABELS.TOTAL_FUNCTIONS} ${String(metrics.totalFunctions)}`)
|
||||||
|
console.log(` ${CLI_LABELS.TOTAL_IMPORTS} ${String(metrics.totalImports)}`)
|
||||||
|
|
||||||
|
if (Object.keys(metrics.layerDistribution).length > 0) {
|
||||||
|
console.log(CLI_MESSAGES.LAYER_DISTRIBUTION_HEADER)
|
||||||
|
for (const [layer, count] of Object.entries(metrics.layerDistribution)) {
|
||||||
|
console.log(` ${layer}: ${String(count)} ${CLI_LABELS.FILES}`)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
displaySummary(totalIssues: number, verbose: boolean): void {
|
||||||
|
if (totalIssues === 0) {
|
||||||
|
console.log(CLI_MESSAGES.NO_ISSUES)
|
||||||
|
process.exit(0)
|
||||||
|
} else {
|
||||||
|
console.log(
|
||||||
|
`${CLI_MESSAGES.ISSUES_TOTAL} ${String(totalIssues)} ${CLI_LABELS.ISSUES_TOTAL}`,
|
||||||
|
)
|
||||||
|
console.log(CLI_MESSAGES.TIP)
|
||||||
|
|
||||||
|
if (verbose) {
|
||||||
|
console.log(CLI_MESSAGES.HELP_FOOTER)
|
||||||
|
}
|
||||||
|
|
||||||
|
process.exit(1)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
displaySeverityFilterMessage(onlyCritical: boolean, minSeverity?: string): void {
|
||||||
|
if (onlyCritical) {
|
||||||
|
console.log("\n🔴 Filtering: Showing only CRITICAL severity issues\n")
|
||||||
|
} else if (minSeverity) {
|
||||||
|
console.log(
|
||||||
|
`\n⚠️ Filtering: Showing ${minSeverity.toUpperCase()} severity and above\n`,
|
||||||
|
)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
displayError(message: string): void {
|
||||||
|
console.error(`\n❌ ${CLI_MESSAGES.ERROR_PREFIX}`)
|
||||||
|
console.error(message)
|
||||||
|
console.error("")
|
||||||
|
process.exit(1)
|
||||||
|
}
|
||||||
|
}
|
||||||
29
packages/guardian/src/cli/groupers/ViolationGrouper.ts
Normal file
29
packages/guardian/src/cli/groupers/ViolationGrouper.ts
Normal file
@@ -0,0 +1,29 @@
|
|||||||
|
import { SEVERITY_ORDER, type SeverityLevel } from "../../shared/constants"
|
||||||
|
|
||||||
|
export class ViolationGrouper {
|
||||||
|
groupBySeverity<T extends { severity: SeverityLevel }>(
|
||||||
|
violations: T[],
|
||||||
|
): Map<SeverityLevel, T[]> {
|
||||||
|
const grouped = new Map<SeverityLevel, T[]>()
|
||||||
|
|
||||||
|
for (const violation of violations) {
|
||||||
|
const existing = grouped.get(violation.severity) ?? []
|
||||||
|
existing.push(violation)
|
||||||
|
grouped.set(violation.severity, existing)
|
||||||
|
}
|
||||||
|
|
||||||
|
return grouped
|
||||||
|
}
|
||||||
|
|
||||||
|
filterBySeverity<T extends { severity: SeverityLevel }>(
|
||||||
|
violations: T[],
|
||||||
|
minSeverity?: SeverityLevel,
|
||||||
|
): T[] {
|
||||||
|
if (!minSeverity) {
|
||||||
|
return violations
|
||||||
|
}
|
||||||
|
|
||||||
|
const minSeverityOrder = SEVERITY_ORDER[minSeverity]
|
||||||
|
return violations.filter((v) => SEVERITY_ORDER[v.severity] <= minSeverityOrder)
|
||||||
|
}
|
||||||
|
}
|
||||||
@@ -11,92 +11,11 @@ import {
|
|||||||
CLI_MESSAGES,
|
CLI_MESSAGES,
|
||||||
CLI_OPTIONS,
|
CLI_OPTIONS,
|
||||||
DEFAULT_EXCLUDES,
|
DEFAULT_EXCLUDES,
|
||||||
SEVERITY_DISPLAY_LABELS,
|
|
||||||
SEVERITY_SECTION_HEADERS,
|
|
||||||
} from "./constants"
|
} from "./constants"
|
||||||
import { SEVERITY_LEVELS, SEVERITY_ORDER, type SeverityLevel } from "../shared/constants"
|
import { SEVERITY_LEVELS, type SeverityLevel } from "../shared/constants"
|
||||||
|
import { ViolationGrouper } from "./groupers/ViolationGrouper"
|
||||||
const SEVERITY_LABELS: Record<SeverityLevel, string> = {
|
import { OutputFormatter } from "./formatters/OutputFormatter"
|
||||||
[SEVERITY_LEVELS.CRITICAL]: SEVERITY_DISPLAY_LABELS.CRITICAL,
|
import { StatisticsFormatter } from "./formatters/StatisticsFormatter"
|
||||||
[SEVERITY_LEVELS.HIGH]: SEVERITY_DISPLAY_LABELS.HIGH,
|
|
||||||
[SEVERITY_LEVELS.MEDIUM]: SEVERITY_DISPLAY_LABELS.MEDIUM,
|
|
||||||
[SEVERITY_LEVELS.LOW]: SEVERITY_DISPLAY_LABELS.LOW,
|
|
||||||
}
|
|
||||||
|
|
||||||
const SEVERITY_HEADER: Record<SeverityLevel, string> = {
|
|
||||||
[SEVERITY_LEVELS.CRITICAL]: SEVERITY_SECTION_HEADERS.CRITICAL,
|
|
||||||
[SEVERITY_LEVELS.HIGH]: SEVERITY_SECTION_HEADERS.HIGH,
|
|
||||||
[SEVERITY_LEVELS.MEDIUM]: SEVERITY_SECTION_HEADERS.MEDIUM,
|
|
||||||
[SEVERITY_LEVELS.LOW]: SEVERITY_SECTION_HEADERS.LOW,
|
|
||||||
}
|
|
||||||
|
|
||||||
function groupBySeverity<T extends { severity: SeverityLevel }>(
|
|
||||||
violations: T[],
|
|
||||||
): Map<SeverityLevel, T[]> {
|
|
||||||
const grouped = new Map<SeverityLevel, T[]>()
|
|
||||||
|
|
||||||
for (const violation of violations) {
|
|
||||||
const existing = grouped.get(violation.severity) ?? []
|
|
||||||
existing.push(violation)
|
|
||||||
grouped.set(violation.severity, existing)
|
|
||||||
}
|
|
||||||
|
|
||||||
return grouped
|
|
||||||
}
|
|
||||||
|
|
||||||
function filterBySeverity<T extends { severity: SeverityLevel }>(
|
|
||||||
violations: T[],
|
|
||||||
minSeverity?: SeverityLevel,
|
|
||||||
): T[] {
|
|
||||||
if (!minSeverity) {
|
|
||||||
return violations
|
|
||||||
}
|
|
||||||
|
|
||||||
const minSeverityOrder = SEVERITY_ORDER[minSeverity]
|
|
||||||
return violations.filter((v) => SEVERITY_ORDER[v.severity] <= minSeverityOrder)
|
|
||||||
}
|
|
||||||
|
|
||||||
function displayGroupedViolations<T extends { severity: SeverityLevel }>(
|
|
||||||
violations: T[],
|
|
||||||
displayFn: (v: T, index: number) => void,
|
|
||||||
limit?: number,
|
|
||||||
): void {
|
|
||||||
const grouped = groupBySeverity(violations)
|
|
||||||
const severities: SeverityLevel[] = [
|
|
||||||
SEVERITY_LEVELS.CRITICAL,
|
|
||||||
SEVERITY_LEVELS.HIGH,
|
|
||||||
SEVERITY_LEVELS.MEDIUM,
|
|
||||||
SEVERITY_LEVELS.LOW,
|
|
||||||
]
|
|
||||||
|
|
||||||
let totalDisplayed = 0
|
|
||||||
const totalAvailable = violations.length
|
|
||||||
|
|
||||||
for (const severity of severities) {
|
|
||||||
const items = grouped.get(severity)
|
|
||||||
if (items && items.length > 0) {
|
|
||||||
console.warn(SEVERITY_HEADER[severity])
|
|
||||||
console.warn(`Found ${String(items.length)} issue(s)\n`)
|
|
||||||
|
|
||||||
const itemsToDisplay =
|
|
||||||
limit !== undefined ? items.slice(0, limit - totalDisplayed) : items
|
|
||||||
itemsToDisplay.forEach((item, index) => {
|
|
||||||
displayFn(item, totalDisplayed + index)
|
|
||||||
})
|
|
||||||
totalDisplayed += itemsToDisplay.length
|
|
||||||
|
|
||||||
if (limit !== undefined && totalDisplayed >= limit) {
|
|
||||||
break
|
|
||||||
}
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
if (limit !== undefined && totalAvailable > limit) {
|
|
||||||
console.warn(
|
|
||||||
`\n⚠️ Showing first ${String(limit)} of ${String(totalAvailable)} issues (use --limit to adjust)\n`,
|
|
||||||
)
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
const program = new Command()
|
const program = new Command()
|
||||||
|
|
||||||
@@ -122,7 +41,20 @@ program
|
|||||||
CLI_HELP_TEXT.FIX_ENTITY +
|
CLI_HELP_TEXT.FIX_ENTITY +
|
||||||
CLI_HELP_TEXT.FIX_DEPENDENCY +
|
CLI_HELP_TEXT.FIX_DEPENDENCY +
|
||||||
CLI_HELP_TEXT.FIX_REPOSITORY +
|
CLI_HELP_TEXT.FIX_REPOSITORY +
|
||||||
CLI_HELP_TEXT.FOOTER,
|
CLI_HELP_TEXT.FOOTER +
|
||||||
|
CLI_HELP_TEXT.AI_AGENT_HEADER +
|
||||||
|
CLI_HELP_TEXT.AI_AGENT_INTRO +
|
||||||
|
CLI_HELP_TEXT.AI_AGENT_STEP1 +
|
||||||
|
CLI_HELP_TEXT.AI_AGENT_STEP1_CMD +
|
||||||
|
CLI_HELP_TEXT.AI_AGENT_STEP2 +
|
||||||
|
CLI_HELP_TEXT.AI_AGENT_STEP2_DETAIL +
|
||||||
|
CLI_HELP_TEXT.AI_AGENT_STEP3 +
|
||||||
|
CLI_HELP_TEXT.AI_AGENT_STEP3_CMD +
|
||||||
|
CLI_HELP_TEXT.AI_AGENT_STEP4 +
|
||||||
|
CLI_HELP_TEXT.AI_AGENT_STEP4_CMDS +
|
||||||
|
CLI_HELP_TEXT.AI_AGENT_OUTPUT +
|
||||||
|
CLI_HELP_TEXT.AI_AGENT_OUTPUT_DETAIL +
|
||||||
|
CLI_HELP_TEXT.AI_AGENT_PRIORITY,
|
||||||
)
|
)
|
||||||
|
|
||||||
program
|
program
|
||||||
@@ -137,6 +69,10 @@ program
|
|||||||
.option(CLI_OPTIONS.ONLY_CRITICAL, CLI_DESCRIPTIONS.ONLY_CRITICAL_OPTION, false)
|
.option(CLI_OPTIONS.ONLY_CRITICAL, CLI_DESCRIPTIONS.ONLY_CRITICAL_OPTION, false)
|
||||||
.option(CLI_OPTIONS.LIMIT, CLI_DESCRIPTIONS.LIMIT_OPTION)
|
.option(CLI_OPTIONS.LIMIT, CLI_DESCRIPTIONS.LIMIT_OPTION)
|
||||||
.action(async (path: string, options) => {
|
.action(async (path: string, options) => {
|
||||||
|
const grouper = new ViolationGrouper()
|
||||||
|
const outputFormatter = new OutputFormatter()
|
||||||
|
const statsFormatter = new StatisticsFormatter()
|
||||||
|
|
||||||
try {
|
try {
|
||||||
console.log(CLI_MESSAGES.ANALYZING)
|
console.log(CLI_MESSAGES.ANALYZING)
|
||||||
|
|
||||||
@@ -155,6 +91,9 @@ program
|
|||||||
entityExposureViolations,
|
entityExposureViolations,
|
||||||
dependencyDirectionViolations,
|
dependencyDirectionViolations,
|
||||||
repositoryPatternViolations,
|
repositoryPatternViolations,
|
||||||
|
aggregateBoundaryViolations,
|
||||||
|
secretViolations,
|
||||||
|
anemicModelViolations,
|
||||||
} = result
|
} = result
|
||||||
|
|
||||||
const minSeverity: SeverityLevel | undefined = options.onlyCritical
|
const minSeverity: SeverityLevel | undefined = options.onlyCritical
|
||||||
@@ -168,237 +107,187 @@ program
|
|||||||
: undefined
|
: undefined
|
||||||
|
|
||||||
if (minSeverity) {
|
if (minSeverity) {
|
||||||
violations = filterBySeverity(violations, minSeverity)
|
violations = grouper.filterBySeverity(violations, minSeverity)
|
||||||
hardcodeViolations = filterBySeverity(hardcodeViolations, minSeverity)
|
hardcodeViolations = grouper.filterBySeverity(hardcodeViolations, minSeverity)
|
||||||
circularDependencyViolations = filterBySeverity(
|
circularDependencyViolations = grouper.filterBySeverity(
|
||||||
circularDependencyViolations,
|
circularDependencyViolations,
|
||||||
minSeverity,
|
minSeverity,
|
||||||
)
|
)
|
||||||
namingViolations = filterBySeverity(namingViolations, minSeverity)
|
namingViolations = grouper.filterBySeverity(namingViolations, minSeverity)
|
||||||
frameworkLeakViolations = filterBySeverity(frameworkLeakViolations, minSeverity)
|
frameworkLeakViolations = grouper.filterBySeverity(
|
||||||
entityExposureViolations = filterBySeverity(entityExposureViolations, minSeverity)
|
frameworkLeakViolations,
|
||||||
dependencyDirectionViolations = filterBySeverity(
|
minSeverity,
|
||||||
|
)
|
||||||
|
entityExposureViolations = grouper.filterBySeverity(
|
||||||
|
entityExposureViolations,
|
||||||
|
minSeverity,
|
||||||
|
)
|
||||||
|
dependencyDirectionViolations = grouper.filterBySeverity(
|
||||||
dependencyDirectionViolations,
|
dependencyDirectionViolations,
|
||||||
minSeverity,
|
minSeverity,
|
||||||
)
|
)
|
||||||
repositoryPatternViolations = filterBySeverity(
|
repositoryPatternViolations = grouper.filterBySeverity(
|
||||||
repositoryPatternViolations,
|
repositoryPatternViolations,
|
||||||
minSeverity,
|
minSeverity,
|
||||||
)
|
)
|
||||||
|
aggregateBoundaryViolations = grouper.filterBySeverity(
|
||||||
|
aggregateBoundaryViolations,
|
||||||
|
minSeverity,
|
||||||
|
)
|
||||||
|
secretViolations = grouper.filterBySeverity(secretViolations, minSeverity)
|
||||||
|
anemicModelViolations = grouper.filterBySeverity(anemicModelViolations, minSeverity)
|
||||||
|
|
||||||
if (options.onlyCritical) {
|
statsFormatter.displaySeverityFilterMessage(
|
||||||
console.log("\n🔴 Filtering: Showing only CRITICAL severity issues\n")
|
options.onlyCritical,
|
||||||
} else {
|
options.minSeverity,
|
||||||
console.log(
|
)
|
||||||
`\n⚠️ Filtering: Showing ${minSeverity.toUpperCase()} severity and above\n`,
|
|
||||||
)
|
|
||||||
}
|
|
||||||
}
|
}
|
||||||
|
|
||||||
// Display metrics
|
statsFormatter.displayMetrics(metrics)
|
||||||
console.log(CLI_MESSAGES.METRICS_HEADER)
|
|
||||||
console.log(` ${CLI_LABELS.FILES_ANALYZED} ${String(metrics.totalFiles)}`)
|
|
||||||
console.log(` ${CLI_LABELS.TOTAL_FUNCTIONS} ${String(metrics.totalFunctions)}`)
|
|
||||||
console.log(` ${CLI_LABELS.TOTAL_IMPORTS} ${String(metrics.totalImports)}`)
|
|
||||||
|
|
||||||
if (Object.keys(metrics.layerDistribution).length > 0) {
|
|
||||||
console.log(CLI_MESSAGES.LAYER_DISTRIBUTION_HEADER)
|
|
||||||
for (const [layer, count] of Object.entries(metrics.layerDistribution)) {
|
|
||||||
console.log(` ${layer}: ${String(count)} ${CLI_LABELS.FILES}`)
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
// Architecture violations
|
|
||||||
if (options.architecture && violations.length > 0) {
|
if (options.architecture && violations.length > 0) {
|
||||||
console.log(
|
console.log(
|
||||||
`\n${CLI_MESSAGES.VIOLATIONS_HEADER} ${String(violations.length)} ${CLI_LABELS.ARCHITECTURE_VIOLATIONS}`,
|
`\n${CLI_MESSAGES.VIOLATIONS_HEADER} ${String(violations.length)} ${CLI_LABELS.ARCHITECTURE_VIOLATIONS}`,
|
||||||
)
|
)
|
||||||
|
outputFormatter.displayGroupedViolations(
|
||||||
displayGroupedViolations(
|
|
||||||
violations,
|
violations,
|
||||||
(v, index) => {
|
(v, i) => {
|
||||||
console.log(`${String(index + 1)}. ${v.file}`)
|
outputFormatter.formatArchitectureViolation(v, i)
|
||||||
console.log(` Severity: ${SEVERITY_LABELS[v.severity]}`)
|
|
||||||
console.log(` Rule: ${v.rule}`)
|
|
||||||
console.log(` ${v.message}`)
|
|
||||||
console.log("")
|
|
||||||
},
|
},
|
||||||
limit,
|
limit,
|
||||||
)
|
)
|
||||||
}
|
}
|
||||||
|
|
||||||
// Circular dependency violations
|
|
||||||
if (options.architecture && circularDependencyViolations.length > 0) {
|
if (options.architecture && circularDependencyViolations.length > 0) {
|
||||||
console.log(
|
console.log(
|
||||||
`\n${CLI_MESSAGES.CIRCULAR_DEPS_HEADER} ${String(circularDependencyViolations.length)} ${CLI_LABELS.CIRCULAR_DEPENDENCIES}`,
|
`\n${CLI_MESSAGES.CIRCULAR_DEPS_HEADER} ${String(circularDependencyViolations.length)} ${CLI_LABELS.CIRCULAR_DEPENDENCIES}`,
|
||||||
)
|
)
|
||||||
|
outputFormatter.displayGroupedViolations(
|
||||||
displayGroupedViolations(
|
|
||||||
circularDependencyViolations,
|
circularDependencyViolations,
|
||||||
(cd, index) => {
|
(cd, i) => {
|
||||||
console.log(`${String(index + 1)}. ${cd.message}`)
|
outputFormatter.formatCircularDependency(cd, i)
|
||||||
console.log(` Severity: ${SEVERITY_LABELS[cd.severity]}`)
|
|
||||||
console.log(" Cycle path:")
|
|
||||||
cd.cycle.forEach((file, i) => {
|
|
||||||
console.log(` ${String(i + 1)}. ${file}`)
|
|
||||||
})
|
|
||||||
console.log(
|
|
||||||
` ${String(cd.cycle.length + 1)}. ${cd.cycle[0]} (back to start)`,
|
|
||||||
)
|
|
||||||
console.log("")
|
|
||||||
},
|
},
|
||||||
limit,
|
limit,
|
||||||
)
|
)
|
||||||
}
|
}
|
||||||
|
|
||||||
// Naming convention violations
|
|
||||||
if (options.architecture && namingViolations.length > 0) {
|
if (options.architecture && namingViolations.length > 0) {
|
||||||
console.log(
|
console.log(
|
||||||
`\n${CLI_MESSAGES.NAMING_VIOLATIONS_HEADER} ${String(namingViolations.length)} ${CLI_LABELS.NAMING_VIOLATIONS}`,
|
`\n${CLI_MESSAGES.NAMING_VIOLATIONS_HEADER} ${String(namingViolations.length)} ${CLI_LABELS.NAMING_VIOLATIONS}`,
|
||||||
)
|
)
|
||||||
|
outputFormatter.displayGroupedViolations(
|
||||||
displayGroupedViolations(
|
|
||||||
namingViolations,
|
namingViolations,
|
||||||
(nc, index) => {
|
(nc, i) => {
|
||||||
console.log(`${String(index + 1)}. ${nc.file}`)
|
outputFormatter.formatNamingViolation(nc, i)
|
||||||
console.log(` Severity: ${SEVERITY_LABELS[nc.severity]}`)
|
|
||||||
console.log(` File: ${nc.fileName}`)
|
|
||||||
console.log(` Layer: ${nc.layer}`)
|
|
||||||
console.log(` Type: ${nc.type}`)
|
|
||||||
console.log(` Message: ${nc.message}`)
|
|
||||||
if (nc.suggestion) {
|
|
||||||
console.log(` 💡 Suggestion: ${nc.suggestion}`)
|
|
||||||
}
|
|
||||||
console.log("")
|
|
||||||
},
|
},
|
||||||
limit,
|
limit,
|
||||||
)
|
)
|
||||||
}
|
}
|
||||||
|
|
||||||
// Framework leak violations
|
|
||||||
if (options.architecture && frameworkLeakViolations.length > 0) {
|
if (options.architecture && frameworkLeakViolations.length > 0) {
|
||||||
console.log(
|
console.log(
|
||||||
`\n🏗️ Found ${String(frameworkLeakViolations.length)} framework leak(s)`,
|
`\n🏗️ Found ${String(frameworkLeakViolations.length)} framework leak(s)`,
|
||||||
)
|
)
|
||||||
|
outputFormatter.displayGroupedViolations(
|
||||||
displayGroupedViolations(
|
|
||||||
frameworkLeakViolations,
|
frameworkLeakViolations,
|
||||||
(fl, index) => {
|
(fl, i) => {
|
||||||
console.log(`${String(index + 1)}. ${fl.file}`)
|
outputFormatter.formatFrameworkLeak(fl, i)
|
||||||
console.log(` Severity: ${SEVERITY_LABELS[fl.severity]}`)
|
|
||||||
console.log(` Package: ${fl.packageName}`)
|
|
||||||
console.log(` Category: ${fl.categoryDescription}`)
|
|
||||||
console.log(` Layer: ${fl.layer}`)
|
|
||||||
console.log(` Rule: ${fl.rule}`)
|
|
||||||
console.log(` ${fl.message}`)
|
|
||||||
console.log(` 💡 Suggestion: ${fl.suggestion}`)
|
|
||||||
console.log("")
|
|
||||||
},
|
},
|
||||||
limit,
|
limit,
|
||||||
)
|
)
|
||||||
}
|
}
|
||||||
|
|
||||||
// Entity exposure violations
|
|
||||||
if (options.architecture && entityExposureViolations.length > 0) {
|
if (options.architecture && entityExposureViolations.length > 0) {
|
||||||
console.log(
|
console.log(
|
||||||
`\n🎭 Found ${String(entityExposureViolations.length)} entity exposure(s)`,
|
`\n🎭 Found ${String(entityExposureViolations.length)} entity exposure(s)`,
|
||||||
)
|
)
|
||||||
|
outputFormatter.displayGroupedViolations(
|
||||||
displayGroupedViolations(
|
|
||||||
entityExposureViolations,
|
entityExposureViolations,
|
||||||
(ee, index) => {
|
(ee, i) => {
|
||||||
const location = ee.line ? `${ee.file}:${String(ee.line)}` : ee.file
|
outputFormatter.formatEntityExposure(ee, i)
|
||||||
console.log(`${String(index + 1)}. ${location}`)
|
|
||||||
console.log(` Severity: ${SEVERITY_LABELS[ee.severity]}`)
|
|
||||||
console.log(` Entity: ${ee.entityName}`)
|
|
||||||
console.log(` Return Type: ${ee.returnType}`)
|
|
||||||
if (ee.methodName) {
|
|
||||||
console.log(` Method: ${ee.methodName}`)
|
|
||||||
}
|
|
||||||
console.log(` Layer: ${ee.layer}`)
|
|
||||||
console.log(` Rule: ${ee.rule}`)
|
|
||||||
console.log(` ${ee.message}`)
|
|
||||||
console.log(" 💡 Suggestion:")
|
|
||||||
ee.suggestion.split("\n").forEach((line) => {
|
|
||||||
if (line.trim()) {
|
|
||||||
console.log(` ${line}`)
|
|
||||||
}
|
|
||||||
})
|
|
||||||
console.log("")
|
|
||||||
},
|
},
|
||||||
limit,
|
limit,
|
||||||
)
|
)
|
||||||
}
|
}
|
||||||
|
|
||||||
// Dependency direction violations
|
|
||||||
if (options.architecture && dependencyDirectionViolations.length > 0) {
|
if (options.architecture && dependencyDirectionViolations.length > 0) {
|
||||||
console.log(
|
console.log(
|
||||||
`\n⚠️ Found ${String(dependencyDirectionViolations.length)} dependency direction violation(s)`,
|
`\n⚠️ Found ${String(dependencyDirectionViolations.length)} dependency direction violation(s)`,
|
||||||
)
|
)
|
||||||
|
outputFormatter.displayGroupedViolations(
|
||||||
displayGroupedViolations(
|
|
||||||
dependencyDirectionViolations,
|
dependencyDirectionViolations,
|
||||||
(dd, index) => {
|
(dd, i) => {
|
||||||
console.log(`${String(index + 1)}. ${dd.file}`)
|
outputFormatter.formatDependencyDirection(dd, i)
|
||||||
console.log(` Severity: ${SEVERITY_LABELS[dd.severity]}`)
|
|
||||||
console.log(` From Layer: ${dd.fromLayer}`)
|
|
||||||
console.log(` To Layer: ${dd.toLayer}`)
|
|
||||||
console.log(` Import: ${dd.importPath}`)
|
|
||||||
console.log(` ${dd.message}`)
|
|
||||||
console.log(` 💡 Suggestion: ${dd.suggestion}`)
|
|
||||||
console.log("")
|
|
||||||
},
|
},
|
||||||
limit,
|
limit,
|
||||||
)
|
)
|
||||||
}
|
}
|
||||||
|
|
||||||
// Repository pattern violations
|
|
||||||
if (options.architecture && repositoryPatternViolations.length > 0) {
|
if (options.architecture && repositoryPatternViolations.length > 0) {
|
||||||
console.log(
|
console.log(
|
||||||
`\n📦 Found ${String(repositoryPatternViolations.length)} repository pattern violation(s)`,
|
`\n📦 Found ${String(repositoryPatternViolations.length)} repository pattern violation(s)`,
|
||||||
)
|
)
|
||||||
|
outputFormatter.displayGroupedViolations(
|
||||||
displayGroupedViolations(
|
|
||||||
repositoryPatternViolations,
|
repositoryPatternViolations,
|
||||||
(rp, index) => {
|
(rp, i) => {
|
||||||
console.log(`${String(index + 1)}. ${rp.file}`)
|
outputFormatter.formatRepositoryPattern(rp, i)
|
||||||
console.log(` Severity: ${SEVERITY_LABELS[rp.severity]}`)
|
},
|
||||||
console.log(` Layer: ${rp.layer}`)
|
limit,
|
||||||
console.log(` Type: ${rp.violationType}`)
|
)
|
||||||
console.log(` Details: ${rp.details}`)
|
}
|
||||||
console.log(` ${rp.message}`)
|
|
||||||
console.log(` 💡 Suggestion: ${rp.suggestion}`)
|
if (options.architecture && aggregateBoundaryViolations.length > 0) {
|
||||||
console.log("")
|
console.log(
|
||||||
|
`\n🔒 Found ${String(aggregateBoundaryViolations.length)} aggregate boundary violation(s)`,
|
||||||
|
)
|
||||||
|
outputFormatter.displayGroupedViolations(
|
||||||
|
aggregateBoundaryViolations,
|
||||||
|
(ab, i) => {
|
||||||
|
outputFormatter.formatAggregateBoundary(ab, i)
|
||||||
|
},
|
||||||
|
limit,
|
||||||
|
)
|
||||||
|
}
|
||||||
|
|
||||||
|
if (secretViolations.length > 0) {
|
||||||
|
console.log(
|
||||||
|
`\n🔐 Found ${String(secretViolations.length)} hardcoded secret(s) - CRITICAL SECURITY RISK`,
|
||||||
|
)
|
||||||
|
outputFormatter.displayGroupedViolations(
|
||||||
|
secretViolations,
|
||||||
|
(sv, i) => {
|
||||||
|
outputFormatter.formatSecretViolation(sv, i)
|
||||||
|
},
|
||||||
|
limit,
|
||||||
|
)
|
||||||
|
}
|
||||||
|
|
||||||
|
if (anemicModelViolations.length > 0) {
|
||||||
|
console.log(
|
||||||
|
`\n🩺 Found ${String(anemicModelViolations.length)} anemic domain model(s)`,
|
||||||
|
)
|
||||||
|
outputFormatter.displayGroupedViolations(
|
||||||
|
anemicModelViolations,
|
||||||
|
(am, i) => {
|
||||||
|
outputFormatter.formatAnemicModelViolation(am, i)
|
||||||
},
|
},
|
||||||
limit,
|
limit,
|
||||||
)
|
)
|
||||||
}
|
}
|
||||||
|
|
||||||
// Hardcode violations
|
|
||||||
if (options.hardcode && hardcodeViolations.length > 0) {
|
if (options.hardcode && hardcodeViolations.length > 0) {
|
||||||
console.log(
|
console.log(
|
||||||
`\n${CLI_MESSAGES.HARDCODE_VIOLATIONS_HEADER} ${String(hardcodeViolations.length)} ${CLI_LABELS.HARDCODE_VIOLATIONS}`,
|
`\n${CLI_MESSAGES.HARDCODE_VIOLATIONS_HEADER} ${String(hardcodeViolations.length)} ${CLI_LABELS.HARDCODE_VIOLATIONS}`,
|
||||||
)
|
)
|
||||||
|
outputFormatter.displayGroupedViolations(
|
||||||
displayGroupedViolations(
|
|
||||||
hardcodeViolations,
|
hardcodeViolations,
|
||||||
(hc, index) => {
|
(hc, i) => {
|
||||||
console.log(
|
outputFormatter.formatHardcodeViolation(hc, i)
|
||||||
`${String(index + 1)}. ${hc.file}:${String(hc.line)}:${String(hc.column)}`,
|
|
||||||
)
|
|
||||||
console.log(` Severity: ${SEVERITY_LABELS[hc.severity]}`)
|
|
||||||
console.log(` Type: ${hc.type}`)
|
|
||||||
console.log(` Value: ${JSON.stringify(hc.value)}`)
|
|
||||||
console.log(` Context: ${hc.context.trim()}`)
|
|
||||||
console.log(` 💡 Suggested: ${hc.suggestion.constantName}`)
|
|
||||||
console.log(` 📁 Location: ${hc.suggestion.location}`)
|
|
||||||
console.log("")
|
|
||||||
},
|
},
|
||||||
limit,
|
limit,
|
||||||
)
|
)
|
||||||
}
|
}
|
||||||
|
|
||||||
// Summary
|
|
||||||
const totalIssues =
|
const totalIssues =
|
||||||
violations.length +
|
violations.length +
|
||||||
hardcodeViolations.length +
|
hardcodeViolations.length +
|
||||||
@@ -407,28 +296,14 @@ program
|
|||||||
frameworkLeakViolations.length +
|
frameworkLeakViolations.length +
|
||||||
entityExposureViolations.length +
|
entityExposureViolations.length +
|
||||||
dependencyDirectionViolations.length +
|
dependencyDirectionViolations.length +
|
||||||
repositoryPatternViolations.length
|
repositoryPatternViolations.length +
|
||||||
|
aggregateBoundaryViolations.length +
|
||||||
|
secretViolations.length +
|
||||||
|
anemicModelViolations.length
|
||||||
|
|
||||||
if (totalIssues === 0) {
|
statsFormatter.displaySummary(totalIssues, options.verbose)
|
||||||
console.log(CLI_MESSAGES.NO_ISSUES)
|
|
||||||
process.exit(0)
|
|
||||||
} else {
|
|
||||||
console.log(
|
|
||||||
`${CLI_MESSAGES.ISSUES_TOTAL} ${String(totalIssues)} ${CLI_LABELS.ISSUES_TOTAL}`,
|
|
||||||
)
|
|
||||||
console.log(CLI_MESSAGES.TIP)
|
|
||||||
|
|
||||||
if (options.verbose) {
|
|
||||||
console.log(CLI_MESSAGES.HELP_FOOTER)
|
|
||||||
}
|
|
||||||
|
|
||||||
process.exit(1)
|
|
||||||
}
|
|
||||||
} catch (error) {
|
} catch (error) {
|
||||||
console.error(`\n❌ ${CLI_MESSAGES.ERROR_PREFIX}`)
|
statsFormatter.displayError(error instanceof Error ? error.message : String(error))
|
||||||
console.error(error instanceof Error ? error.message : String(error))
|
|
||||||
console.error("")
|
|
||||||
process.exit(1)
|
|
||||||
}
|
}
|
||||||
})
|
})
|
||||||
|
|
||||||
|
|||||||
@@ -48,3 +48,35 @@ export const REPOSITORY_PATTERN_MESSAGES = {
|
|||||||
SUGGESTION_DELETE: "remove or delete",
|
SUGGESTION_DELETE: "remove or delete",
|
||||||
SUGGESTION_QUERY: "find or search",
|
SUGGESTION_QUERY: "find or search",
|
||||||
}
|
}
|
||||||
|
|
||||||
|
export const REPOSITORY_FALLBACK_SUGGESTIONS = {
|
||||||
|
DEFAULT: "findById() or findByEmail()",
|
||||||
|
}
|
||||||
|
|
||||||
|
export const AGGREGATE_VIOLATION_MESSAGES = {
|
||||||
|
USE_ID_REFERENCE: "1. Reference other aggregates by ID (UserId, OrderId) instead of entity",
|
||||||
|
USE_VALUE_OBJECT:
|
||||||
|
"2. Use Value Objects to store needed data from other aggregates (CustomerInfo, ProductSummary)",
|
||||||
|
AVOID_DIRECT_REFERENCE: "3. Avoid direct entity references to maintain aggregate independence",
|
||||||
|
MAINTAIN_INDEPENDENCE: "4. Each aggregate should be independently modifiable and deployable",
|
||||||
|
}
|
||||||
|
|
||||||
|
export const SECRET_VIOLATION_MESSAGES = {
|
||||||
|
USE_ENV_VARIABLES: "1. Use environment variables for sensitive data (process.env.API_KEY)",
|
||||||
|
USE_SECRET_MANAGER:
|
||||||
|
"2. Use secret management services (AWS Secrets Manager, HashiCorp Vault, etc.)",
|
||||||
|
NEVER_COMMIT_SECRETS: "3. Never commit secrets to version control",
|
||||||
|
ROTATE_IF_EXPOSED: "4. If secret was committed, rotate it immediately",
|
||||||
|
USE_GITIGNORE: "5. Add secret files to .gitignore (.env, credentials.json, etc.)",
|
||||||
|
}
|
||||||
|
|
||||||
|
export const ANEMIC_MODEL_MESSAGES = {
|
||||||
|
REMOVE_PUBLIC_SETTERS: "1. Remove public setters - they allow uncontrolled state changes",
|
||||||
|
USE_METHODS_FOR_CHANGES: "2. Use business methods instead (approve(), cancel(), addItem())",
|
||||||
|
ENCAPSULATE_INVARIANTS: "3. Encapsulate business rules and invariants in methods",
|
||||||
|
ADD_BUSINESS_METHODS: "1. Add business logic methods to the entity",
|
||||||
|
MOVE_LOGIC_FROM_SERVICES:
|
||||||
|
"2. Move business logic from services to domain entities where it belongs",
|
||||||
|
ENCAPSULATE_BUSINESS_RULES: "3. Encapsulate business rules inside entity methods",
|
||||||
|
USE_DOMAIN_EVENTS: "4. Use domain events to communicate state changes",
|
||||||
|
}
|
||||||
|
|||||||
79
packages/guardian/src/domain/constants/SecretExamples.ts
Normal file
79
packages/guardian/src/domain/constants/SecretExamples.ts
Normal file
@@ -0,0 +1,79 @@
|
|||||||
|
/**
|
||||||
|
* Secret detection constants
|
||||||
|
* All hardcoded strings related to secret detection and examples
|
||||||
|
*/
|
||||||
|
|
||||||
|
export const SECRET_KEYWORDS = {
|
||||||
|
AWS: "aws",
|
||||||
|
GITHUB: "github",
|
||||||
|
NPM: "npm",
|
||||||
|
SSH: "ssh",
|
||||||
|
PRIVATE_KEY: "private key",
|
||||||
|
SLACK: "slack",
|
||||||
|
API_KEY: "api key",
|
||||||
|
APIKEY: "apikey",
|
||||||
|
ACCESS_KEY: "access key",
|
||||||
|
SECRET: "secret",
|
||||||
|
TOKEN: "token",
|
||||||
|
PASSWORD: "password",
|
||||||
|
USER: "user",
|
||||||
|
BOT: "bot",
|
||||||
|
RSA: "rsa",
|
||||||
|
DSA: "dsa",
|
||||||
|
ECDSA: "ecdsa",
|
||||||
|
ED25519: "ed25519",
|
||||||
|
BASICAUTH: "basicauth",
|
||||||
|
GCP: "gcp",
|
||||||
|
GOOGLE: "google",
|
||||||
|
PRIVATEKEY: "privatekey",
|
||||||
|
PERSONAL_ACCESS_TOKEN: "personal access token",
|
||||||
|
OAUTH: "oauth",
|
||||||
|
} as const
|
||||||
|
|
||||||
|
export const SECRET_TYPE_NAMES = {
|
||||||
|
AWS_ACCESS_KEY: "AWS Access Key",
|
||||||
|
AWS_SECRET_KEY: "AWS Secret Key",
|
||||||
|
AWS_CREDENTIAL: "AWS Credential",
|
||||||
|
GITHUB_PERSONAL_ACCESS_TOKEN: "GitHub Personal Access Token",
|
||||||
|
GITHUB_OAUTH_TOKEN: "GitHub OAuth Token",
|
||||||
|
GITHUB_TOKEN: "GitHub Token",
|
||||||
|
NPM_TOKEN: "NPM Token",
|
||||||
|
GCP_SERVICE_ACCOUNT_KEY: "GCP Service Account Key",
|
||||||
|
SSH_RSA_PRIVATE_KEY: "SSH RSA Private Key",
|
||||||
|
SSH_DSA_PRIVATE_KEY: "SSH DSA Private Key",
|
||||||
|
SSH_ECDSA_PRIVATE_KEY: "SSH ECDSA Private Key",
|
||||||
|
SSH_ED25519_PRIVATE_KEY: "SSH Ed25519 Private Key",
|
||||||
|
SSH_PRIVATE_KEY: "SSH Private Key",
|
||||||
|
SLACK_BOT_TOKEN: "Slack Bot Token",
|
||||||
|
SLACK_USER_TOKEN: "Slack User Token",
|
||||||
|
SLACK_TOKEN: "Slack Token",
|
||||||
|
BASIC_AUTH_CREDENTIALS: "Basic Authentication Credentials",
|
||||||
|
API_KEY: "API Key",
|
||||||
|
AUTHENTICATION_TOKEN: "Authentication Token",
|
||||||
|
PASSWORD: "Password",
|
||||||
|
SECRET: "Secret",
|
||||||
|
SENSITIVE_DATA: "Sensitive Data",
|
||||||
|
} as const
|
||||||
|
|
||||||
|
export const SECRET_EXAMPLE_VALUES = {
|
||||||
|
AWS_ACCESS_KEY_ID: "AKIA1234567890ABCDEF",
|
||||||
|
AWS_SECRET_ACCESS_KEY: "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY",
|
||||||
|
GITHUB_TOKEN: "ghp_1234567890abcdefghijklmnopqrstuv",
|
||||||
|
NPM_TOKEN: "npm_abc123xyz",
|
||||||
|
SLACK_TOKEN: "xoxb-<token-here>",
|
||||||
|
API_KEY: "sk_live_XXXXXXXXXXXXXXXXXXXX_example_key",
|
||||||
|
HARDCODED_SECRET: "hardcoded-secret-value",
|
||||||
|
} as const
|
||||||
|
|
||||||
|
export const FILE_ENCODING = {
|
||||||
|
UTF8: "utf-8",
|
||||||
|
} as const
|
||||||
|
|
||||||
|
export const REGEX_ESCAPE_PATTERN = {
|
||||||
|
DOLLAR_AMPERSAND: "\\$&",
|
||||||
|
} as const
|
||||||
|
|
||||||
|
export const DYNAMIC_IMPORT_PATTERN_PARTS = {
|
||||||
|
QUOTE_START: '"`][^',
|
||||||
|
QUOTE_END: "`]+['\"",
|
||||||
|
} as const
|
||||||
@@ -0,0 +1,45 @@
|
|||||||
|
import { AggregateBoundaryViolation } from "../value-objects/AggregateBoundaryViolation"
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Interface for detecting aggregate boundary violations in DDD
|
||||||
|
*
|
||||||
|
* Aggregate boundary violations occur when an entity from one aggregate
|
||||||
|
* directly references an entity from another aggregate. In DDD, aggregates
|
||||||
|
* should reference each other only by ID or Value Objects to maintain
|
||||||
|
* loose coupling and independence.
|
||||||
|
*/
|
||||||
|
export interface IAggregateBoundaryDetector {
|
||||||
|
/**
|
||||||
|
* Detects aggregate boundary violations in the given code
|
||||||
|
*
|
||||||
|
* Analyzes import statements to identify direct entity references
|
||||||
|
* across aggregate boundaries.
|
||||||
|
*
|
||||||
|
* @param code - Source code to analyze
|
||||||
|
* @param filePath - Path to the file being analyzed
|
||||||
|
* @param layer - The architectural layer of the file (should be 'domain')
|
||||||
|
* @returns Array of detected aggregate boundary violations
|
||||||
|
*/
|
||||||
|
detectViolations(
|
||||||
|
code: string,
|
||||||
|
filePath: string,
|
||||||
|
layer: string | undefined,
|
||||||
|
): AggregateBoundaryViolation[]
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Checks if a file path belongs to an aggregate
|
||||||
|
*
|
||||||
|
* @param filePath - The file path to check
|
||||||
|
* @returns The aggregate name if found, undefined otherwise
|
||||||
|
*/
|
||||||
|
extractAggregateFromPath(filePath: string): string | undefined
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Checks if an import path references an entity from another aggregate
|
||||||
|
*
|
||||||
|
* @param importPath - The import path to analyze
|
||||||
|
* @param currentAggregate - The aggregate of the current file
|
||||||
|
* @returns True if the import crosses aggregate boundaries inappropriately
|
||||||
|
*/
|
||||||
|
isAggregateBoundaryViolation(importPath: string, currentAggregate: string): boolean
|
||||||
|
}
|
||||||
@@ -0,0 +1,29 @@
|
|||||||
|
import { AnemicModelViolation } from "../value-objects/AnemicModelViolation"
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Interface for detecting anemic domain model violations in the codebase
|
||||||
|
*
|
||||||
|
* Anemic domain models are entities that contain only getters/setters
|
||||||
|
* without business logic. This anti-pattern violates Domain-Driven Design
|
||||||
|
* principles and leads to procedural code scattered in services.
|
||||||
|
*/
|
||||||
|
export interface IAnemicModelDetector {
|
||||||
|
/**
|
||||||
|
* Detects anemic model violations in the given code
|
||||||
|
*
|
||||||
|
* Analyzes classes in domain/entities to identify:
|
||||||
|
* - Classes with only getters and setters (no business logic)
|
||||||
|
* - Classes with public setters (DDD anti-pattern)
|
||||||
|
* - Classes with low method-to-property ratio
|
||||||
|
*
|
||||||
|
* @param code - Source code to analyze
|
||||||
|
* @param filePath - Path to the file being analyzed
|
||||||
|
* @param layer - The architectural layer of the file (domain, application, infrastructure, shared)
|
||||||
|
* @returns Array of detected anemic model violations
|
||||||
|
*/
|
||||||
|
detectAnemicModels(
|
||||||
|
code: string,
|
||||||
|
filePath: string,
|
||||||
|
layer: string | undefined,
|
||||||
|
): AnemicModelViolation[]
|
||||||
|
}
|
||||||
34
packages/guardian/src/domain/services/ISecretDetector.ts
Normal file
34
packages/guardian/src/domain/services/ISecretDetector.ts
Normal file
@@ -0,0 +1,34 @@
|
|||||||
|
import { SecretViolation } from "../value-objects/SecretViolation"
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Interface for detecting hardcoded secrets in source code
|
||||||
|
*
|
||||||
|
* Detects sensitive data like API keys, tokens, passwords, and credentials
|
||||||
|
* that should never be hardcoded in source code. Uses industry-standard
|
||||||
|
* Secretlint library for pattern matching.
|
||||||
|
*
|
||||||
|
* All detected secrets are marked as CRITICAL severity violations.
|
||||||
|
*
|
||||||
|
* @example
|
||||||
|
* ```typescript
|
||||||
|
* const detector: ISecretDetector = new SecretDetector()
|
||||||
|
* const violations = await detector.detectAll(
|
||||||
|
* 'const AWS_KEY = "AKIA1234567890ABCDEF"',
|
||||||
|
* 'src/config/aws.ts'
|
||||||
|
* )
|
||||||
|
*
|
||||||
|
* violations.forEach(v => {
|
||||||
|
* console.log(v.getMessage()) // "Hardcoded AWS Access Key detected"
|
||||||
|
* })
|
||||||
|
* ```
|
||||||
|
*/
|
||||||
|
export interface ISecretDetector {
|
||||||
|
/**
|
||||||
|
* Detect all types of hardcoded secrets in the provided code
|
||||||
|
*
|
||||||
|
* @param code - Source code to analyze
|
||||||
|
* @param filePath - Path to the file being analyzed
|
||||||
|
* @returns Array of secret violations found
|
||||||
|
*/
|
||||||
|
detectAll(code: string, filePath: string): Promise<SecretViolation[]>
|
||||||
|
}
|
||||||
@@ -0,0 +1,137 @@
|
|||||||
|
import { ValueObject } from "./ValueObject"
|
||||||
|
import { AGGREGATE_VIOLATION_MESSAGES } from "../constants/Messages"
|
||||||
|
|
||||||
|
interface AggregateBoundaryViolationProps {
|
||||||
|
readonly fromAggregate: string
|
||||||
|
readonly toAggregate: string
|
||||||
|
readonly entityName: string
|
||||||
|
readonly importPath: string
|
||||||
|
readonly filePath: string
|
||||||
|
readonly line?: number
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Represents an aggregate boundary violation in the codebase
|
||||||
|
*
|
||||||
|
* Aggregate boundary violations occur when an entity from one aggregate
|
||||||
|
* directly references an entity from another aggregate, violating DDD principles:
|
||||||
|
* - Aggregates should reference each other only by ID or Value Objects
|
||||||
|
* - Direct entity references create tight coupling between aggregates
|
||||||
|
* - Changes to one aggregate should not require changes to another
|
||||||
|
*
|
||||||
|
* @example
|
||||||
|
* ```typescript
|
||||||
|
* // Bad: Direct entity reference across aggregates
|
||||||
|
* const violation = AggregateBoundaryViolation.create(
|
||||||
|
* 'order',
|
||||||
|
* 'user',
|
||||||
|
* 'User',
|
||||||
|
* '../user/User',
|
||||||
|
* 'src/domain/aggregates/order/Order.ts',
|
||||||
|
* 5
|
||||||
|
* )
|
||||||
|
*
|
||||||
|
* console.log(violation.getMessage())
|
||||||
|
* // "Order aggregate should not directly reference User entity from User aggregate"
|
||||||
|
* ```
|
||||||
|
*/
|
||||||
|
export class AggregateBoundaryViolation extends ValueObject<AggregateBoundaryViolationProps> {
|
||||||
|
private constructor(props: AggregateBoundaryViolationProps) {
|
||||||
|
super(props)
|
||||||
|
}
|
||||||
|
|
||||||
|
public static create(
|
||||||
|
fromAggregate: string,
|
||||||
|
toAggregate: string,
|
||||||
|
entityName: string,
|
||||||
|
importPath: string,
|
||||||
|
filePath: string,
|
||||||
|
line?: number,
|
||||||
|
): AggregateBoundaryViolation {
|
||||||
|
return new AggregateBoundaryViolation({
|
||||||
|
fromAggregate,
|
||||||
|
toAggregate,
|
||||||
|
entityName,
|
||||||
|
importPath,
|
||||||
|
filePath,
|
||||||
|
line,
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
public get fromAggregate(): string {
|
||||||
|
return this.props.fromAggregate
|
||||||
|
}
|
||||||
|
|
||||||
|
public get toAggregate(): string {
|
||||||
|
return this.props.toAggregate
|
||||||
|
}
|
||||||
|
|
||||||
|
public get entityName(): string {
|
||||||
|
return this.props.entityName
|
||||||
|
}
|
||||||
|
|
||||||
|
public get importPath(): string {
|
||||||
|
return this.props.importPath
|
||||||
|
}
|
||||||
|
|
||||||
|
public get filePath(): string {
|
||||||
|
return this.props.filePath
|
||||||
|
}
|
||||||
|
|
||||||
|
public get line(): number | undefined {
|
||||||
|
return this.props.line
|
||||||
|
}
|
||||||
|
|
||||||
|
public getMessage(): string {
|
||||||
|
return `${this.capitalizeFirst(this.props.fromAggregate)} aggregate should not directly reference ${this.props.entityName} entity from ${this.capitalizeFirst(this.props.toAggregate)} aggregate`
|
||||||
|
}
|
||||||
|
|
||||||
|
public getSuggestion(): string {
|
||||||
|
const suggestions: string[] = [
|
||||||
|
AGGREGATE_VIOLATION_MESSAGES.USE_ID_REFERENCE,
|
||||||
|
AGGREGATE_VIOLATION_MESSAGES.USE_VALUE_OBJECT,
|
||||||
|
AGGREGATE_VIOLATION_MESSAGES.AVOID_DIRECT_REFERENCE,
|
||||||
|
AGGREGATE_VIOLATION_MESSAGES.MAINTAIN_INDEPENDENCE,
|
||||||
|
]
|
||||||
|
|
||||||
|
return suggestions.join("\n")
|
||||||
|
}
|
||||||
|
|
||||||
|
public getExampleFix(): string {
|
||||||
|
return `
|
||||||
|
// ❌ Bad: Direct entity reference across aggregates
|
||||||
|
// domain/aggregates/order/Order.ts
|
||||||
|
import { User } from '../user/User'
|
||||||
|
|
||||||
|
class Order {
|
||||||
|
constructor(private user: User) {}
|
||||||
|
}
|
||||||
|
|
||||||
|
// ✅ Good: Reference by ID
|
||||||
|
// domain/aggregates/order/Order.ts
|
||||||
|
import { UserId } from '../user/value-objects/UserId'
|
||||||
|
|
||||||
|
class Order {
|
||||||
|
constructor(private userId: UserId) {}
|
||||||
|
}
|
||||||
|
|
||||||
|
// ✅ Good: Use Value Object for needed data
|
||||||
|
// domain/aggregates/order/value-objects/CustomerInfo.ts
|
||||||
|
class CustomerInfo {
|
||||||
|
constructor(
|
||||||
|
readonly customerId: string,
|
||||||
|
readonly customerName: string,
|
||||||
|
readonly customerEmail: string
|
||||||
|
) {}
|
||||||
|
}
|
||||||
|
|
||||||
|
// domain/aggregates/order/Order.ts
|
||||||
|
class Order {
|
||||||
|
constructor(private customerInfo: CustomerInfo) {}
|
||||||
|
}`
|
||||||
|
}
|
||||||
|
|
||||||
|
private capitalizeFirst(str: string): string {
|
||||||
|
return str.charAt(0).toUpperCase() + str.slice(1)
|
||||||
|
}
|
||||||
|
}
|
||||||
@@ -0,0 +1,240 @@
|
|||||||
|
import { ValueObject } from "./ValueObject"
|
||||||
|
import { ANEMIC_MODEL_MESSAGES } from "../constants/Messages"
|
||||||
|
import { EXAMPLE_CODE_CONSTANTS } from "../../shared/constants"
|
||||||
|
|
||||||
|
interface AnemicModelViolationProps {
|
||||||
|
readonly className: string
|
||||||
|
readonly filePath: string
|
||||||
|
readonly layer: string
|
||||||
|
readonly line?: number
|
||||||
|
readonly methodCount: number
|
||||||
|
readonly propertyCount: number
|
||||||
|
readonly hasOnlyGettersSetters: boolean
|
||||||
|
readonly hasPublicSetters: boolean
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Represents an anemic domain model violation in the codebase
|
||||||
|
*
|
||||||
|
* Anemic domain model occurs when entities have only getters/setters
|
||||||
|
* without business logic. This violates Domain-Driven Design principles
|
||||||
|
* and leads to procedural code instead of object-oriented design.
|
||||||
|
*
|
||||||
|
* @example
|
||||||
|
* ```typescript
|
||||||
|
* // Bad: Anemic model with only getters/setters
|
||||||
|
* const violation = AnemicModelViolation.create(
|
||||||
|
* 'Order',
|
||||||
|
* 'src/domain/entities/Order.ts',
|
||||||
|
* 'domain',
|
||||||
|
* 10,
|
||||||
|
* 4,
|
||||||
|
* 2,
|
||||||
|
* true,
|
||||||
|
* true
|
||||||
|
* )
|
||||||
|
*
|
||||||
|
* console.log(violation.getMessage())
|
||||||
|
* // "Class 'Order' is anemic: 4 methods (all getters/setters) for 2 properties"
|
||||||
|
* ```
|
||||||
|
*/
|
||||||
|
export class AnemicModelViolation extends ValueObject<AnemicModelViolationProps> {
|
||||||
|
private constructor(props: AnemicModelViolationProps) {
|
||||||
|
super(props)
|
||||||
|
}
|
||||||
|
|
||||||
|
public static create(
|
||||||
|
className: string,
|
||||||
|
filePath: string,
|
||||||
|
layer: string,
|
||||||
|
line: number | undefined,
|
||||||
|
methodCount: number,
|
||||||
|
propertyCount: number,
|
||||||
|
hasOnlyGettersSetters: boolean,
|
||||||
|
hasPublicSetters: boolean,
|
||||||
|
): AnemicModelViolation {
|
||||||
|
return new AnemicModelViolation({
|
||||||
|
className,
|
||||||
|
filePath,
|
||||||
|
layer,
|
||||||
|
line,
|
||||||
|
methodCount,
|
||||||
|
propertyCount,
|
||||||
|
hasOnlyGettersSetters,
|
||||||
|
hasPublicSetters,
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
public get className(): string {
|
||||||
|
return this.props.className
|
||||||
|
}
|
||||||
|
|
||||||
|
public get filePath(): string {
|
||||||
|
return this.props.filePath
|
||||||
|
}
|
||||||
|
|
||||||
|
public get layer(): string {
|
||||||
|
return this.props.layer
|
||||||
|
}
|
||||||
|
|
||||||
|
public get line(): number | undefined {
|
||||||
|
return this.props.line
|
||||||
|
}
|
||||||
|
|
||||||
|
public get methodCount(): number {
|
||||||
|
return this.props.methodCount
|
||||||
|
}
|
||||||
|
|
||||||
|
public get propertyCount(): number {
|
||||||
|
return this.props.propertyCount
|
||||||
|
}
|
||||||
|
|
||||||
|
public get hasOnlyGettersSetters(): boolean {
|
||||||
|
return this.props.hasOnlyGettersSetters
|
||||||
|
}
|
||||||
|
|
||||||
|
public get hasPublicSetters(): boolean {
|
||||||
|
return this.props.hasPublicSetters
|
||||||
|
}
|
||||||
|
|
||||||
|
public getMessage(): string {
|
||||||
|
if (this.props.hasPublicSetters) {
|
||||||
|
return `Class '${this.props.className}' has public setters (anti-pattern in DDD)`
|
||||||
|
}
|
||||||
|
|
||||||
|
if (this.props.hasOnlyGettersSetters) {
|
||||||
|
return `Class '${this.props.className}' is anemic: ${String(this.props.methodCount)} methods (all getters/setters) for ${String(this.props.propertyCount)} properties`
|
||||||
|
}
|
||||||
|
|
||||||
|
const ratio = this.props.methodCount / Math.max(this.props.propertyCount, 1)
|
||||||
|
return `Class '${this.props.className}' appears anemic: low method-to-property ratio (${ratio.toFixed(1)}:1)`
|
||||||
|
}
|
||||||
|
|
||||||
|
public getSuggestion(): string {
|
||||||
|
const suggestions: string[] = []
|
||||||
|
|
||||||
|
if (this.props.hasPublicSetters) {
|
||||||
|
suggestions.push(ANEMIC_MODEL_MESSAGES.REMOVE_PUBLIC_SETTERS)
|
||||||
|
suggestions.push(ANEMIC_MODEL_MESSAGES.USE_METHODS_FOR_CHANGES)
|
||||||
|
suggestions.push(ANEMIC_MODEL_MESSAGES.ENCAPSULATE_INVARIANTS)
|
||||||
|
}
|
||||||
|
|
||||||
|
if (this.props.hasOnlyGettersSetters || this.props.methodCount < 2) {
|
||||||
|
suggestions.push(ANEMIC_MODEL_MESSAGES.ADD_BUSINESS_METHODS)
|
||||||
|
suggestions.push(ANEMIC_MODEL_MESSAGES.MOVE_LOGIC_FROM_SERVICES)
|
||||||
|
suggestions.push(ANEMIC_MODEL_MESSAGES.ENCAPSULATE_BUSINESS_RULES)
|
||||||
|
suggestions.push(ANEMIC_MODEL_MESSAGES.USE_DOMAIN_EVENTS)
|
||||||
|
}
|
||||||
|
|
||||||
|
return suggestions.join("\n")
|
||||||
|
}
|
||||||
|
|
||||||
|
public getExampleFix(): string {
|
||||||
|
if (this.props.hasPublicSetters) {
|
||||||
|
return `
|
||||||
|
// ❌ Bad: Public setters allow uncontrolled state changes
|
||||||
|
class ${this.props.className} {
|
||||||
|
private status: string
|
||||||
|
|
||||||
|
public setStatus(status: string): void {
|
||||||
|
this.status = status // No validation!
|
||||||
|
}
|
||||||
|
|
||||||
|
public getStatus(): string {
|
||||||
|
return this.status
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// ✅ Good: Business methods with validation
|
||||||
|
class ${this.props.className} {
|
||||||
|
private status: OrderStatus
|
||||||
|
|
||||||
|
public approve(): void {
|
||||||
|
if (!this.canBeApproved()) {
|
||||||
|
throw new CannotApproveOrderError()
|
||||||
|
}
|
||||||
|
this.status = OrderStatus.APPROVED
|
||||||
|
this.events.push(new OrderApprovedEvent(this.id))
|
||||||
|
}
|
||||||
|
|
||||||
|
public reject(reason: string): void {
|
||||||
|
if (!this.canBeRejected()) {
|
||||||
|
throw new CannotRejectOrderError()
|
||||||
|
}
|
||||||
|
this.status = OrderStatus.REJECTED
|
||||||
|
this.rejectionReason = reason
|
||||||
|
this.events.push(new OrderRejectedEvent(this.id, reason))
|
||||||
|
}
|
||||||
|
|
||||||
|
public getStatus(): OrderStatus {
|
||||||
|
return this.status
|
||||||
|
}
|
||||||
|
|
||||||
|
private canBeApproved(): boolean {
|
||||||
|
return this.status === OrderStatus.PENDING && this.hasItems()
|
||||||
|
}
|
||||||
|
}`
|
||||||
|
}
|
||||||
|
|
||||||
|
return `
|
||||||
|
// ❌ Bad: Anemic model (only getters/setters)
|
||||||
|
class ${this.props.className} {
|
||||||
|
getStatus() { return this.status }
|
||||||
|
setStatus(status: string) { this.status = status }
|
||||||
|
|
||||||
|
getTotal() { return this.total }
|
||||||
|
setTotal(total: number) { this.total = total }
|
||||||
|
}
|
||||||
|
|
||||||
|
class OrderService {
|
||||||
|
approve(order: ${this.props.className}): void {
|
||||||
|
if (order.getStatus() !== '${EXAMPLE_CODE_CONSTANTS.ORDER_STATUS_PENDING}') {
|
||||||
|
throw new Error('${EXAMPLE_CODE_CONSTANTS.CANNOT_APPROVE_ERROR}')
|
||||||
|
}
|
||||||
|
order.setStatus('${EXAMPLE_CODE_CONSTANTS.ORDER_STATUS_APPROVED}')
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// ✅ Good: Rich domain model with business logic
|
||||||
|
class ${this.props.className} {
|
||||||
|
private readonly id: OrderId
|
||||||
|
private status: OrderStatus
|
||||||
|
private items: OrderItem[]
|
||||||
|
private events: DomainEvent[] = []
|
||||||
|
|
||||||
|
public approve(): void {
|
||||||
|
if (!this.isPending()) {
|
||||||
|
throw new CannotApproveOrderError()
|
||||||
|
}
|
||||||
|
this.status = OrderStatus.APPROVED
|
||||||
|
this.events.push(new OrderApprovedEvent(this.id))
|
||||||
|
}
|
||||||
|
|
||||||
|
public calculateTotal(): Money {
|
||||||
|
return this.items.reduce(
|
||||||
|
(sum, item) => sum.add(item.getPrice()),
|
||||||
|
Money.zero()
|
||||||
|
)
|
||||||
|
}
|
||||||
|
|
||||||
|
public addItem(item: OrderItem): void {
|
||||||
|
if (this.isApproved()) {
|
||||||
|
throw new CannotModifyApprovedOrderError()
|
||||||
|
}
|
||||||
|
this.items.push(item)
|
||||||
|
}
|
||||||
|
|
||||||
|
public getStatus(): OrderStatus {
|
||||||
|
return this.status
|
||||||
|
}
|
||||||
|
|
||||||
|
private isPending(): boolean {
|
||||||
|
return this.status === OrderStatus.PENDING
|
||||||
|
}
|
||||||
|
|
||||||
|
private isApproved(): boolean {
|
||||||
|
return this.status === OrderStatus.APPROVED
|
||||||
|
}
|
||||||
|
}`
|
||||||
|
}
|
||||||
|
}
|
||||||
@@ -1,6 +1,6 @@
|
|||||||
import { ValueObject } from "./ValueObject"
|
import { ValueObject } from "./ValueObject"
|
||||||
import { REPOSITORY_VIOLATION_TYPES } from "../../shared/constants/rules"
|
import { REPOSITORY_VIOLATION_TYPES } from "../../shared/constants/rules"
|
||||||
import { REPOSITORY_PATTERN_MESSAGES } from "../constants/Messages"
|
import { REPOSITORY_FALLBACK_SUGGESTIONS, REPOSITORY_PATTERN_MESSAGES } from "../constants/Messages"
|
||||||
|
|
||||||
interface RepositoryViolationProps {
|
interface RepositoryViolationProps {
|
||||||
readonly violationType:
|
readonly violationType:
|
||||||
@@ -177,6 +177,9 @@ export class RepositoryViolation extends ValueObject<RepositoryViolationProps> {
|
|||||||
}
|
}
|
||||||
|
|
||||||
private getNonDomainMethodSuggestion(): string {
|
private getNonDomainMethodSuggestion(): string {
|
||||||
|
const detailsMatch = /Consider: (.+)$/.exec(this.props.details)
|
||||||
|
const smartSuggestion = detailsMatch ? detailsMatch[1] : null
|
||||||
|
|
||||||
const technicalToDomain = {
|
const technicalToDomain = {
|
||||||
findOne: REPOSITORY_PATTERN_MESSAGES.SUGGESTION_FINDONE,
|
findOne: REPOSITORY_PATTERN_MESSAGES.SUGGESTION_FINDONE,
|
||||||
findMany: REPOSITORY_PATTERN_MESSAGES.SUGGESTION_FINDMANY,
|
findMany: REPOSITORY_PATTERN_MESSAGES.SUGGESTION_FINDMANY,
|
||||||
@@ -186,8 +189,10 @@ export class RepositoryViolation extends ValueObject<RepositoryViolationProps> {
|
|||||||
query: REPOSITORY_PATTERN_MESSAGES.SUGGESTION_QUERY,
|
query: REPOSITORY_PATTERN_MESSAGES.SUGGESTION_QUERY,
|
||||||
}
|
}
|
||||||
|
|
||||||
const suggestion =
|
const fallbackSuggestion =
|
||||||
technicalToDomain[this.props.methodName as keyof typeof technicalToDomain]
|
technicalToDomain[this.props.methodName as keyof typeof technicalToDomain]
|
||||||
|
const finalSuggestion =
|
||||||
|
smartSuggestion || fallbackSuggestion || REPOSITORY_FALLBACK_SUGGESTIONS.DEFAULT
|
||||||
|
|
||||||
return [
|
return [
|
||||||
REPOSITORY_PATTERN_MESSAGES.STEP_RENAME_METHOD,
|
REPOSITORY_PATTERN_MESSAGES.STEP_RENAME_METHOD,
|
||||||
@@ -196,7 +201,7 @@ export class RepositoryViolation extends ValueObject<RepositoryViolationProps> {
|
|||||||
"",
|
"",
|
||||||
REPOSITORY_PATTERN_MESSAGES.EXAMPLE_PREFIX,
|
REPOSITORY_PATTERN_MESSAGES.EXAMPLE_PREFIX,
|
||||||
`❌ Bad: ${this.props.methodName || "findOne"}()`,
|
`❌ Bad: ${this.props.methodName || "findOne"}()`,
|
||||||
`✅ Good: ${suggestion || "findById() or findByEmail()"}`,
|
`✅ Good: ${finalSuggestion}`,
|
||||||
].join("\n")
|
].join("\n")
|
||||||
}
|
}
|
||||||
|
|
||||||
|
|||||||
204
packages/guardian/src/domain/value-objects/SecretViolation.ts
Normal file
204
packages/guardian/src/domain/value-objects/SecretViolation.ts
Normal file
@@ -0,0 +1,204 @@
|
|||||||
|
import { ValueObject } from "./ValueObject"
|
||||||
|
import { SECRET_VIOLATION_MESSAGES } from "../constants/Messages"
|
||||||
|
import { SEVERITY_LEVELS } from "../../shared/constants"
|
||||||
|
import { FILE_ENCODING, SECRET_EXAMPLE_VALUES, SECRET_KEYWORDS } from "../constants/SecretExamples"
|
||||||
|
|
||||||
|
interface SecretViolationProps {
|
||||||
|
readonly file: string
|
||||||
|
readonly line: number
|
||||||
|
readonly column: number
|
||||||
|
readonly secretType: string
|
||||||
|
readonly matchedPattern: string
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Represents a secret exposure violation in the codebase
|
||||||
|
*
|
||||||
|
* Secret violations occur when sensitive data like API keys, tokens, passwords,
|
||||||
|
* or credentials are hardcoded in the source code instead of being stored
|
||||||
|
* in secure environment variables or secret management systems.
|
||||||
|
*
|
||||||
|
* All secret violations are marked as CRITICAL severity because they represent
|
||||||
|
* serious security risks that could lead to unauthorized access, data breaches,
|
||||||
|
* or service compromise.
|
||||||
|
*
|
||||||
|
* @example
|
||||||
|
* ```typescript
|
||||||
|
* const violation = SecretViolation.create(
|
||||||
|
* 'src/config/aws.ts',
|
||||||
|
* 10,
|
||||||
|
* 15,
|
||||||
|
* 'AWS Access Key',
|
||||||
|
* 'AKIA1234567890ABCDEF'
|
||||||
|
* )
|
||||||
|
*
|
||||||
|
* console.log(violation.getMessage())
|
||||||
|
* // "Hardcoded AWS Access Key detected"
|
||||||
|
*
|
||||||
|
* console.log(violation.getSeverity())
|
||||||
|
* // "critical"
|
||||||
|
* ```
|
||||||
|
*/
|
||||||
|
export class SecretViolation extends ValueObject<SecretViolationProps> {
|
||||||
|
private constructor(props: SecretViolationProps) {
|
||||||
|
super(props)
|
||||||
|
}
|
||||||
|
|
||||||
|
public static create(
|
||||||
|
file: string,
|
||||||
|
line: number,
|
||||||
|
column: number,
|
||||||
|
secretType: string,
|
||||||
|
matchedPattern: string,
|
||||||
|
): SecretViolation {
|
||||||
|
return new SecretViolation({
|
||||||
|
file,
|
||||||
|
line,
|
||||||
|
column,
|
||||||
|
secretType,
|
||||||
|
matchedPattern,
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
public get file(): string {
|
||||||
|
return this.props.file
|
||||||
|
}
|
||||||
|
|
||||||
|
public get line(): number {
|
||||||
|
return this.props.line
|
||||||
|
}
|
||||||
|
|
||||||
|
public get column(): number {
|
||||||
|
return this.props.column
|
||||||
|
}
|
||||||
|
|
||||||
|
public get secretType(): string {
|
||||||
|
return this.props.secretType
|
||||||
|
}
|
||||||
|
|
||||||
|
public get matchedPattern(): string {
|
||||||
|
return this.props.matchedPattern
|
||||||
|
}
|
||||||
|
|
||||||
|
public getMessage(): string {
|
||||||
|
return `Hardcoded ${this.props.secretType} detected`
|
||||||
|
}
|
||||||
|
|
||||||
|
public getSuggestion(): string {
|
||||||
|
const suggestions: string[] = [
|
||||||
|
SECRET_VIOLATION_MESSAGES.USE_ENV_VARIABLES,
|
||||||
|
SECRET_VIOLATION_MESSAGES.USE_SECRET_MANAGER,
|
||||||
|
SECRET_VIOLATION_MESSAGES.NEVER_COMMIT_SECRETS,
|
||||||
|
SECRET_VIOLATION_MESSAGES.ROTATE_IF_EXPOSED,
|
||||||
|
SECRET_VIOLATION_MESSAGES.USE_GITIGNORE,
|
||||||
|
]
|
||||||
|
|
||||||
|
return suggestions.join("\n")
|
||||||
|
}
|
||||||
|
|
||||||
|
public getExampleFix(): string {
|
||||||
|
return this.getExampleFixForSecretType(this.props.secretType)
|
||||||
|
}
|
||||||
|
|
||||||
|
public getSeverity(): typeof SEVERITY_LEVELS.CRITICAL {
|
||||||
|
return SEVERITY_LEVELS.CRITICAL
|
||||||
|
}
|
||||||
|
|
||||||
|
private getExampleFixForSecretType(secretType: string): string {
|
||||||
|
const lowerType = secretType.toLowerCase()
|
||||||
|
|
||||||
|
if (lowerType.includes(SECRET_KEYWORDS.AWS)) {
|
||||||
|
return `
|
||||||
|
// ❌ Bad: Hardcoded AWS credentials
|
||||||
|
const AWS_ACCESS_KEY_ID = "${SECRET_EXAMPLE_VALUES.AWS_ACCESS_KEY_ID}"
|
||||||
|
const AWS_SECRET_ACCESS_KEY = "${SECRET_EXAMPLE_VALUES.AWS_SECRET_ACCESS_KEY}"
|
||||||
|
|
||||||
|
// ✅ Good: Use environment variables
|
||||||
|
const AWS_ACCESS_KEY_ID = process.env.AWS_ACCESS_KEY_ID
|
||||||
|
const AWS_SECRET_ACCESS_KEY = process.env.AWS_SECRET_ACCESS_KEY
|
||||||
|
|
||||||
|
// ✅ Good: Use credentials provider (in infrastructure layer)
|
||||||
|
// Load credentials from environment or credentials file`
|
||||||
|
}
|
||||||
|
|
||||||
|
if (lowerType.includes(SECRET_KEYWORDS.GITHUB)) {
|
||||||
|
return `
|
||||||
|
// ❌ Bad: Hardcoded GitHub token
|
||||||
|
const GITHUB_TOKEN = "${SECRET_EXAMPLE_VALUES.GITHUB_TOKEN}"
|
||||||
|
|
||||||
|
// ✅ Good: Use environment variables
|
||||||
|
const GITHUB_TOKEN = process.env.GITHUB_TOKEN
|
||||||
|
|
||||||
|
// ✅ Good: GitHub Apps with temporary tokens
|
||||||
|
// Use GitHub Apps for automated workflows instead of personal access tokens`
|
||||||
|
}
|
||||||
|
|
||||||
|
if (lowerType.includes(SECRET_KEYWORDS.NPM)) {
|
||||||
|
return `
|
||||||
|
// ❌ Bad: Hardcoded NPM token in code
|
||||||
|
const NPM_TOKEN = "${SECRET_EXAMPLE_VALUES.NPM_TOKEN}"
|
||||||
|
|
||||||
|
// ✅ Good: Use .npmrc file (add to .gitignore)
|
||||||
|
// .npmrc
|
||||||
|
//registry.npmjs.org/:_authToken=\${NPM_TOKEN}
|
||||||
|
|
||||||
|
// ✅ Good: Use environment variable
|
||||||
|
const NPM_TOKEN = process.env.NPM_TOKEN`
|
||||||
|
}
|
||||||
|
|
||||||
|
if (
|
||||||
|
lowerType.includes(SECRET_KEYWORDS.SSH) ||
|
||||||
|
lowerType.includes(SECRET_KEYWORDS.PRIVATE_KEY)
|
||||||
|
) {
|
||||||
|
return `
|
||||||
|
// ❌ Bad: Hardcoded SSH private key
|
||||||
|
const privateKey = \`-----BEGIN RSA PRIVATE KEY-----
|
||||||
|
MIIEpAIBAAKCAQEA...\`
|
||||||
|
|
||||||
|
// ✅ Good: Load from secure file (not in repository)
|
||||||
|
import fs from "fs"
|
||||||
|
const privateKey = fs.readFileSync(process.env.SSH_KEY_PATH, "${FILE_ENCODING.UTF8}")
|
||||||
|
|
||||||
|
// ✅ Good: Use SSH agent
|
||||||
|
// Configure SSH agent to handle keys securely`
|
||||||
|
}
|
||||||
|
|
||||||
|
if (lowerType.includes(SECRET_KEYWORDS.SLACK)) {
|
||||||
|
return `
|
||||||
|
// ❌ Bad: Hardcoded Slack token
|
||||||
|
const SLACK_TOKEN = "${SECRET_EXAMPLE_VALUES.SLACK_TOKEN}"
|
||||||
|
|
||||||
|
// ✅ Good: Use environment variables
|
||||||
|
const SLACK_TOKEN = process.env.SLACK_BOT_TOKEN
|
||||||
|
|
||||||
|
// ✅ Good: Use OAuth flow for user tokens
|
||||||
|
// Implement OAuth 2.0 flow instead of hardcoding tokens`
|
||||||
|
}
|
||||||
|
|
||||||
|
if (
|
||||||
|
lowerType.includes(SECRET_KEYWORDS.API_KEY) ||
|
||||||
|
lowerType.includes(SECRET_KEYWORDS.APIKEY)
|
||||||
|
) {
|
||||||
|
return `
|
||||||
|
// ❌ Bad: Hardcoded API key
|
||||||
|
const API_KEY = "${SECRET_EXAMPLE_VALUES.API_KEY}"
|
||||||
|
|
||||||
|
// ✅ Good: Use environment variables
|
||||||
|
const API_KEY = process.env.API_KEY
|
||||||
|
|
||||||
|
// ✅ Good: Use secret management service (in infrastructure layer)
|
||||||
|
// AWS Secrets Manager, HashiCorp Vault, Azure Key Vault
|
||||||
|
// Implement secret retrieval in infrastructure and inject via DI`
|
||||||
|
}
|
||||||
|
|
||||||
|
return `
|
||||||
|
// ❌ Bad: Hardcoded secret
|
||||||
|
const SECRET = "${SECRET_EXAMPLE_VALUES.HARDCODED_SECRET}"
|
||||||
|
|
||||||
|
// ✅ Good: Use environment variables
|
||||||
|
const SECRET = process.env.SECRET_KEY
|
||||||
|
|
||||||
|
// ✅ Good: Use secret management
|
||||||
|
// AWS Secrets Manager, HashiCorp Vault, Azure Key Vault, etc.`
|
||||||
|
}
|
||||||
|
}
|
||||||
@@ -0,0 +1,167 @@
|
|||||||
|
import { IAggregateBoundaryDetector } from "../../domain/services/IAggregateBoundaryDetector"
|
||||||
|
import { AggregateBoundaryViolation } from "../../domain/value-objects/AggregateBoundaryViolation"
|
||||||
|
import { LAYERS } from "../../shared/constants/rules"
|
||||||
|
import { AggregatePathAnalyzer } from "../strategies/AggregatePathAnalyzer"
|
||||||
|
import { FolderRegistry } from "../strategies/FolderRegistry"
|
||||||
|
import { ImportValidator } from "../strategies/ImportValidator"
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Detects aggregate boundary violations in Domain-Driven Design
|
||||||
|
*
|
||||||
|
* This detector enforces DDD aggregate rules:
|
||||||
|
* - Aggregates should reference each other only by ID or Value Objects
|
||||||
|
* - Direct entity references across aggregates create tight coupling
|
||||||
|
* - Each aggregate should be independently modifiable
|
||||||
|
*
|
||||||
|
* Folder structure patterns detected:
|
||||||
|
* - domain/aggregates/order/Order.ts
|
||||||
|
* - domain/order/Order.ts (aggregate name from parent folder)
|
||||||
|
* - domain/entities/order/Order.ts
|
||||||
|
*
|
||||||
|
* @example
|
||||||
|
* ```typescript
|
||||||
|
* const detector = new AggregateBoundaryDetector()
|
||||||
|
*
|
||||||
|
* // Detect violations in order aggregate
|
||||||
|
* const code = `
|
||||||
|
* import { User } from '../user/User'
|
||||||
|
* import { UserId } from '../user/value-objects/UserId'
|
||||||
|
* `
|
||||||
|
* const violations = detector.detectViolations(
|
||||||
|
* code,
|
||||||
|
* 'src/domain/aggregates/order/Order.ts',
|
||||||
|
* 'domain'
|
||||||
|
* )
|
||||||
|
*
|
||||||
|
* // violations will contain 1 violation for direct User entity import
|
||||||
|
* // but not for UserId (value object is allowed)
|
||||||
|
* console.log(violations.length) // 1
|
||||||
|
* ```
|
||||||
|
*/
|
||||||
|
export class AggregateBoundaryDetector implements IAggregateBoundaryDetector {
|
||||||
|
private readonly folderRegistry: FolderRegistry
|
||||||
|
private readonly pathAnalyzer: AggregatePathAnalyzer
|
||||||
|
private readonly importValidator: ImportValidator
|
||||||
|
|
||||||
|
constructor() {
|
||||||
|
this.folderRegistry = new FolderRegistry()
|
||||||
|
this.pathAnalyzer = new AggregatePathAnalyzer(this.folderRegistry)
|
||||||
|
this.importValidator = new ImportValidator(this.folderRegistry, this.pathAnalyzer)
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Detects aggregate boundary violations in the given code
|
||||||
|
*
|
||||||
|
* Analyzes import statements to identify direct entity references
|
||||||
|
* across aggregate boundaries in the domain layer.
|
||||||
|
*
|
||||||
|
* @param code - Source code to analyze
|
||||||
|
* @param filePath - Path to the file being analyzed
|
||||||
|
* @param layer - The architectural layer of the file (should be 'domain')
|
||||||
|
* @returns Array of detected aggregate boundary violations
|
||||||
|
*/
|
||||||
|
public detectViolations(
|
||||||
|
code: string,
|
||||||
|
filePath: string,
|
||||||
|
layer: string | undefined,
|
||||||
|
): AggregateBoundaryViolation[] {
|
||||||
|
if (layer !== LAYERS.DOMAIN) {
|
||||||
|
return []
|
||||||
|
}
|
||||||
|
|
||||||
|
const currentAggregate = this.pathAnalyzer.extractAggregateFromPath(filePath)
|
||||||
|
if (!currentAggregate) {
|
||||||
|
return []
|
||||||
|
}
|
||||||
|
|
||||||
|
return this.analyzeImports(code, filePath, currentAggregate)
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Checks if a file path belongs to an aggregate
|
||||||
|
*
|
||||||
|
* Extracts aggregate name from paths like:
|
||||||
|
* - domain/aggregates/order/Order.ts → 'order'
|
||||||
|
* - domain/order/Order.ts → 'order'
|
||||||
|
* - domain/entities/order/Order.ts → 'order'
|
||||||
|
*
|
||||||
|
* @param filePath - The file path to check
|
||||||
|
* @returns The aggregate name if found, undefined otherwise
|
||||||
|
*/
|
||||||
|
public extractAggregateFromPath(filePath: string): string | undefined {
|
||||||
|
return this.pathAnalyzer.extractAggregateFromPath(filePath)
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Checks if an import path references an entity from another aggregate
|
||||||
|
*
|
||||||
|
* @param importPath - The import path to analyze
|
||||||
|
* @param currentAggregate - The aggregate of the current file
|
||||||
|
* @returns True if the import crosses aggregate boundaries inappropriately
|
||||||
|
*/
|
||||||
|
public isAggregateBoundaryViolation(importPath: string, currentAggregate: string): boolean {
|
||||||
|
return this.importValidator.isViolation(importPath, currentAggregate)
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Analyzes all imports in code and detects violations
|
||||||
|
*/
|
||||||
|
private analyzeImports(
|
||||||
|
code: string,
|
||||||
|
filePath: string,
|
||||||
|
currentAggregate: string,
|
||||||
|
): AggregateBoundaryViolation[] {
|
||||||
|
const violations: AggregateBoundaryViolation[] = []
|
||||||
|
const lines = code.split("\n")
|
||||||
|
|
||||||
|
for (let i = 0; i < lines.length; i++) {
|
||||||
|
const line = lines[i]
|
||||||
|
const lineNumber = i + 1
|
||||||
|
|
||||||
|
const imports = this.importValidator.extractImports(line)
|
||||||
|
for (const importPath of imports) {
|
||||||
|
const violation = this.checkImport(
|
||||||
|
importPath,
|
||||||
|
currentAggregate,
|
||||||
|
filePath,
|
||||||
|
lineNumber,
|
||||||
|
)
|
||||||
|
if (violation) {
|
||||||
|
violations.push(violation)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return violations
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Checks a single import for boundary violations
|
||||||
|
*/
|
||||||
|
private checkImport(
|
||||||
|
importPath: string,
|
||||||
|
currentAggregate: string,
|
||||||
|
filePath: string,
|
||||||
|
lineNumber: number,
|
||||||
|
): AggregateBoundaryViolation | undefined {
|
||||||
|
if (!this.importValidator.isViolation(importPath, currentAggregate)) {
|
||||||
|
return undefined
|
||||||
|
}
|
||||||
|
|
||||||
|
const targetAggregate = this.pathAnalyzer.extractAggregateFromImport(importPath)
|
||||||
|
const entityName = this.pathAnalyzer.extractEntityName(importPath)
|
||||||
|
|
||||||
|
if (targetAggregate && entityName) {
|
||||||
|
return AggregateBoundaryViolation.create(
|
||||||
|
currentAggregate,
|
||||||
|
targetAggregate,
|
||||||
|
entityName,
|
||||||
|
importPath,
|
||||||
|
filePath,
|
||||||
|
lineNumber,
|
||||||
|
)
|
||||||
|
}
|
||||||
|
|
||||||
|
return undefined
|
||||||
|
}
|
||||||
|
}
|
||||||
@@ -0,0 +1,318 @@
|
|||||||
|
import { IAnemicModelDetector } from "../../domain/services/IAnemicModelDetector"
|
||||||
|
import { AnemicModelViolation } from "../../domain/value-objects/AnemicModelViolation"
|
||||||
|
import { CLASS_KEYWORDS } from "../../shared/constants"
|
||||||
|
import { LAYERS } from "../../shared/constants/rules"
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Detects anemic domain model violations
|
||||||
|
*
|
||||||
|
* This detector identifies entities that lack business logic and contain
|
||||||
|
* only getters/setters. Anemic models violate Domain-Driven Design principles.
|
||||||
|
*
|
||||||
|
* @example
|
||||||
|
* ```typescript
|
||||||
|
* const detector = new AnemicModelDetector()
|
||||||
|
*
|
||||||
|
* // Detect anemic models in entity file
|
||||||
|
* const code = `
|
||||||
|
* class Order {
|
||||||
|
* getStatus() { return this.status }
|
||||||
|
* setStatus(status: string) { this.status = status }
|
||||||
|
* getTotal() { return this.total }
|
||||||
|
* setTotal(total: number) { this.total = total }
|
||||||
|
* }
|
||||||
|
* `
|
||||||
|
* const violations = detector.detectAnemicModels(
|
||||||
|
* code,
|
||||||
|
* 'src/domain/entities/Order.ts',
|
||||||
|
* 'domain'
|
||||||
|
* )
|
||||||
|
*
|
||||||
|
* // violations will contain anemic model violation
|
||||||
|
* console.log(violations.length) // 1
|
||||||
|
* console.log(violations[0].className) // 'Order'
|
||||||
|
* ```
|
||||||
|
*/
|
||||||
|
export class AnemicModelDetector implements IAnemicModelDetector {
|
||||||
|
private readonly entityPatterns = [/\/entities\//, /\/aggregates\//]
|
||||||
|
private readonly excludePatterns = [
|
||||||
|
/\.test\.ts$/,
|
||||||
|
/\.spec\.ts$/,
|
||||||
|
/Dto\.ts$/,
|
||||||
|
/Request\.ts$/,
|
||||||
|
/Response\.ts$/,
|
||||||
|
/Mapper\.ts$/,
|
||||||
|
]
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Detects anemic model violations in the given code
|
||||||
|
*/
|
||||||
|
public detectAnemicModels(
|
||||||
|
code: string,
|
||||||
|
filePath: string,
|
||||||
|
layer: string | undefined,
|
||||||
|
): AnemicModelViolation[] {
|
||||||
|
if (!this.shouldAnalyze(filePath, layer)) {
|
||||||
|
return []
|
||||||
|
}
|
||||||
|
|
||||||
|
const violations: AnemicModelViolation[] = []
|
||||||
|
const classes = this.extractClasses(code)
|
||||||
|
|
||||||
|
for (const classInfo of classes) {
|
||||||
|
const violation = this.analyzeClass(classInfo, filePath, layer || LAYERS.DOMAIN)
|
||||||
|
if (violation) {
|
||||||
|
violations.push(violation)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return violations
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Checks if file should be analyzed
|
||||||
|
*/
|
||||||
|
private shouldAnalyze(filePath: string, layer: string | undefined): boolean {
|
||||||
|
if (layer !== LAYERS.DOMAIN) {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
if (this.excludePatterns.some((pattern) => pattern.test(filePath))) {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
return this.entityPatterns.some((pattern) => pattern.test(filePath))
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Extracts class information from code
|
||||||
|
*/
|
||||||
|
private extractClasses(code: string): ClassInfo[] {
|
||||||
|
const classes: ClassInfo[] = []
|
||||||
|
const lines = code.split("\n")
|
||||||
|
let currentClass: { name: string; startLine: number; startIndex: number } | null = null
|
||||||
|
let braceCount = 0
|
||||||
|
let classBody = ""
|
||||||
|
|
||||||
|
for (let i = 0; i < lines.length; i++) {
|
||||||
|
const line = lines[i]
|
||||||
|
|
||||||
|
if (!currentClass) {
|
||||||
|
const classRegex = /^\s*(?:export\s+)?(?:abstract\s+)?class\s+(\w+)/
|
||||||
|
const classMatch = classRegex.exec(line)
|
||||||
|
if (classMatch) {
|
||||||
|
currentClass = {
|
||||||
|
name: classMatch[1],
|
||||||
|
startLine: i + 1,
|
||||||
|
startIndex: lines.slice(0, i).join("\n").length,
|
||||||
|
}
|
||||||
|
braceCount = 0
|
||||||
|
classBody = ""
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
if (currentClass) {
|
||||||
|
for (const char of line) {
|
||||||
|
if (char === "{") {
|
||||||
|
braceCount++
|
||||||
|
} else if (char === "}") {
|
||||||
|
braceCount--
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
if (braceCount > 0) {
|
||||||
|
classBody = `${classBody}${line}\n`
|
||||||
|
} else if (braceCount === 0 && classBody.length > 0) {
|
||||||
|
const properties = this.extractProperties(classBody)
|
||||||
|
const methods = this.extractMethods(classBody)
|
||||||
|
|
||||||
|
classes.push({
|
||||||
|
className: currentClass.name,
|
||||||
|
lineNumber: currentClass.startLine,
|
||||||
|
properties,
|
||||||
|
methods,
|
||||||
|
})
|
||||||
|
|
||||||
|
currentClass = null
|
||||||
|
classBody = ""
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return classes
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Extracts properties from class body
|
||||||
|
*/
|
||||||
|
private extractProperties(classBody: string): PropertyInfo[] {
|
||||||
|
const properties: PropertyInfo[] = []
|
||||||
|
const propertyRegex = /(?:private|protected|public|readonly)*\s*(\w+)(?:\?)?:\s*\w+/g
|
||||||
|
|
||||||
|
let match
|
||||||
|
while ((match = propertyRegex.exec(classBody)) !== null) {
|
||||||
|
const propertyName = match[1]
|
||||||
|
|
||||||
|
if (!this.isMethodSignature(match[0])) {
|
||||||
|
properties.push({ name: propertyName })
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return properties
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Extracts methods from class body
|
||||||
|
*/
|
||||||
|
private extractMethods(classBody: string): MethodInfo[] {
|
||||||
|
const methods: MethodInfo[] = []
|
||||||
|
const methodRegex =
|
||||||
|
/(public|private|protected)?\s*(get|set)?\s+(\w+)\s*\([^)]*\)(?:\s*:\s*\w+)?/g
|
||||||
|
|
||||||
|
let match
|
||||||
|
while ((match = methodRegex.exec(classBody)) !== null) {
|
||||||
|
const visibility = match[1] || CLASS_KEYWORDS.PUBLIC
|
||||||
|
const accessor = match[2]
|
||||||
|
const methodName = match[3]
|
||||||
|
|
||||||
|
if (methodName === CLASS_KEYWORDS.CONSTRUCTOR) {
|
||||||
|
continue
|
||||||
|
}
|
||||||
|
|
||||||
|
const isGetter = accessor === "get" || this.isGetterMethod(methodName)
|
||||||
|
const isSetter = accessor === "set" || this.isSetterMethod(methodName, classBody)
|
||||||
|
const isPublic = visibility === CLASS_KEYWORDS.PUBLIC || !visibility
|
||||||
|
|
||||||
|
methods.push({
|
||||||
|
name: methodName,
|
||||||
|
isGetter,
|
||||||
|
isSetter,
|
||||||
|
isPublic,
|
||||||
|
isBusinessLogic: !isGetter && !isSetter,
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
return methods
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Analyzes class for anemic model violations
|
||||||
|
*/
|
||||||
|
private analyzeClass(
|
||||||
|
classInfo: ClassInfo,
|
||||||
|
filePath: string,
|
||||||
|
layer: string,
|
||||||
|
): AnemicModelViolation | null {
|
||||||
|
const { className, lineNumber, properties, methods } = classInfo
|
||||||
|
|
||||||
|
if (properties.length === 0 && methods.length === 0) {
|
||||||
|
return null
|
||||||
|
}
|
||||||
|
|
||||||
|
const businessMethods = methods.filter((m) => m.isBusinessLogic)
|
||||||
|
const hasOnlyGettersSetters = businessMethods.length === 0 && methods.length > 0
|
||||||
|
const hasPublicSetters = methods.some((m) => m.isSetter && m.isPublic)
|
||||||
|
|
||||||
|
const methodCount = methods.length
|
||||||
|
const propertyCount = properties.length
|
||||||
|
|
||||||
|
if (hasPublicSetters) {
|
||||||
|
return AnemicModelViolation.create(
|
||||||
|
className,
|
||||||
|
filePath,
|
||||||
|
layer,
|
||||||
|
lineNumber,
|
||||||
|
methodCount,
|
||||||
|
propertyCount,
|
||||||
|
false,
|
||||||
|
true,
|
||||||
|
)
|
||||||
|
}
|
||||||
|
|
||||||
|
if (hasOnlyGettersSetters && methodCount >= 2 && propertyCount > 0) {
|
||||||
|
return AnemicModelViolation.create(
|
||||||
|
className,
|
||||||
|
filePath,
|
||||||
|
layer,
|
||||||
|
lineNumber,
|
||||||
|
methodCount,
|
||||||
|
propertyCount,
|
||||||
|
true,
|
||||||
|
false,
|
||||||
|
)
|
||||||
|
}
|
||||||
|
|
||||||
|
const methodToPropertyRatio = methodCount / Math.max(propertyCount, 1)
|
||||||
|
if (
|
||||||
|
propertyCount > 0 &&
|
||||||
|
businessMethods.length < 2 &&
|
||||||
|
methodToPropertyRatio < 1.0 &&
|
||||||
|
methodCount > 0
|
||||||
|
) {
|
||||||
|
return AnemicModelViolation.create(
|
||||||
|
className,
|
||||||
|
filePath,
|
||||||
|
layer,
|
||||||
|
lineNumber,
|
||||||
|
methodCount,
|
||||||
|
propertyCount,
|
||||||
|
false,
|
||||||
|
false,
|
||||||
|
)
|
||||||
|
}
|
||||||
|
|
||||||
|
return null
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Checks if method name is a getter pattern
|
||||||
|
*/
|
||||||
|
private isGetterMethod(methodName: string): boolean {
|
||||||
|
return (
|
||||||
|
methodName.startsWith("get") ||
|
||||||
|
methodName.startsWith("is") ||
|
||||||
|
methodName.startsWith("has")
|
||||||
|
)
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Checks if method is a setter pattern
|
||||||
|
*/
|
||||||
|
private isSetterMethod(methodName: string, _classBody: string): boolean {
|
||||||
|
return methodName.startsWith("set")
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Checks if property declaration is actually a method signature
|
||||||
|
*/
|
||||||
|
private isMethodSignature(propertyDeclaration: string): boolean {
|
||||||
|
return propertyDeclaration.includes("(") && propertyDeclaration.includes(")")
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Gets line number for a position in code
|
||||||
|
*/
|
||||||
|
private getLineNumber(code: string, position: number): number {
|
||||||
|
const lines = code.substring(0, position).split("\n")
|
||||||
|
return lines.length
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
interface ClassInfo {
|
||||||
|
className: string
|
||||||
|
lineNumber: number
|
||||||
|
properties: PropertyInfo[]
|
||||||
|
methods: MethodInfo[]
|
||||||
|
}
|
||||||
|
|
||||||
|
interface PropertyInfo {
|
||||||
|
name: string
|
||||||
|
}
|
||||||
|
|
||||||
|
interface MethodInfo {
|
||||||
|
name: string
|
||||||
|
isGetter: boolean
|
||||||
|
isSetter: boolean
|
||||||
|
isPublic: boolean
|
||||||
|
isBusinessLogic: boolean
|
||||||
|
}
|
||||||
@@ -1,7 +1,10 @@
|
|||||||
import { IHardcodeDetector } from "../../domain/services/IHardcodeDetector"
|
import { IHardcodeDetector } from "../../domain/services/IHardcodeDetector"
|
||||||
import { HardcodedValue } from "../../domain/value-objects/HardcodedValue"
|
import { HardcodedValue } from "../../domain/value-objects/HardcodedValue"
|
||||||
import { ALLOWED_NUMBERS, CODE_PATTERNS, DETECTION_KEYWORDS } from "../constants/defaults"
|
import { BraceTracker } from "../strategies/BraceTracker"
|
||||||
import { HARDCODE_TYPES } from "../../shared/constants"
|
import { ConstantsFileChecker } from "../strategies/ConstantsFileChecker"
|
||||||
|
import { ExportConstantAnalyzer } from "../strategies/ExportConstantAnalyzer"
|
||||||
|
import { MagicNumberMatcher } from "../strategies/MagicNumberMatcher"
|
||||||
|
import { MagicStringMatcher } from "../strategies/MagicStringMatcher"
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* Detects hardcoded values (magic numbers and strings) in TypeScript/JavaScript code
|
* Detects hardcoded values (magic numbers and strings) in TypeScript/JavaScript code
|
||||||
@@ -22,9 +25,19 @@ import { HARDCODE_TYPES } from "../../shared/constants"
|
|||||||
* ```
|
* ```
|
||||||
*/
|
*/
|
||||||
export class HardcodeDetector implements IHardcodeDetector {
|
export class HardcodeDetector implements IHardcodeDetector {
|
||||||
private readonly ALLOWED_NUMBERS = ALLOWED_NUMBERS
|
private readonly constantsChecker: ConstantsFileChecker
|
||||||
|
private readonly braceTracker: BraceTracker
|
||||||
|
private readonly exportAnalyzer: ExportConstantAnalyzer
|
||||||
|
private readonly numberMatcher: MagicNumberMatcher
|
||||||
|
private readonly stringMatcher: MagicStringMatcher
|
||||||
|
|
||||||
private readonly ALLOWED_STRING_PATTERNS = [/^[a-z]$/i, /^\/$/, /^\\$/, /^\s+$/, /^,$/, /^\.$/]
|
constructor() {
|
||||||
|
this.constantsChecker = new ConstantsFileChecker()
|
||||||
|
this.braceTracker = new BraceTracker()
|
||||||
|
this.exportAnalyzer = new ExportConstantAnalyzer(this.braceTracker)
|
||||||
|
this.numberMatcher = new MagicNumberMatcher(this.exportAnalyzer)
|
||||||
|
this.stringMatcher = new MagicStringMatcher(this.exportAnalyzer)
|
||||||
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* Detects all hardcoded values (both numbers and strings) in the given code
|
* Detects all hardcoded values (both numbers and strings) in the given code
|
||||||
@@ -34,358 +47,43 @@ export class HardcodeDetector implements IHardcodeDetector {
|
|||||||
* @returns Array of detected hardcoded values with suggestions
|
* @returns Array of detected hardcoded values with suggestions
|
||||||
*/
|
*/
|
||||||
public detectAll(code: string, filePath: string): HardcodedValue[] {
|
public detectAll(code: string, filePath: string): HardcodedValue[] {
|
||||||
if (this.isConstantsFile(filePath)) {
|
if (this.constantsChecker.isConstantsFile(filePath)) {
|
||||||
return []
|
return []
|
||||||
}
|
}
|
||||||
const magicNumbers = this.detectMagicNumbers(code, filePath)
|
|
||||||
const magicStrings = this.detectMagicStrings(code, filePath)
|
const magicNumbers = this.numberMatcher.detect(code)
|
||||||
|
const magicStrings = this.stringMatcher.detect(code)
|
||||||
|
|
||||||
return [...magicNumbers, ...magicStrings]
|
return [...magicNumbers, ...magicStrings]
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* Check if a file is a constants definition file
|
* Detects magic numbers in code
|
||||||
*/
|
|
||||||
private isConstantsFile(filePath: string): boolean {
|
|
||||||
const _fileName = filePath.split("/").pop() ?? ""
|
|
||||||
const constantsPatterns = [
|
|
||||||
/^constants?\.(ts|js)$/i,
|
|
||||||
/constants?\/.*\.(ts|js)$/i,
|
|
||||||
/\/(constants|config|settings|defaults)\.ts$/i,
|
|
||||||
]
|
|
||||||
return constantsPatterns.some((pattern) => pattern.test(filePath))
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Check if a line is inside an exported constant definition
|
|
||||||
*/
|
|
||||||
private isInExportedConstant(lines: string[], lineIndex: number): boolean {
|
|
||||||
const currentLineTrimmed = lines[lineIndex].trim()
|
|
||||||
|
|
||||||
if (this.isSingleLineExportConst(currentLineTrimmed)) {
|
|
||||||
return true
|
|
||||||
}
|
|
||||||
|
|
||||||
const exportConstStart = this.findExportConstStart(lines, lineIndex)
|
|
||||||
if (exportConstStart === -1) {
|
|
||||||
return false
|
|
||||||
}
|
|
||||||
|
|
||||||
const { braces, brackets } = this.countUnclosedBraces(lines, exportConstStart, lineIndex)
|
|
||||||
return braces > 0 || brackets > 0
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Check if a line is a single-line export const declaration
|
|
||||||
*/
|
|
||||||
private isSingleLineExportConst(line: string): boolean {
|
|
||||||
if (!line.startsWith(CODE_PATTERNS.EXPORT_CONST)) {
|
|
||||||
return false
|
|
||||||
}
|
|
||||||
|
|
||||||
const hasObjectOrArray =
|
|
||||||
line.includes(CODE_PATTERNS.OBJECT_START) || line.includes(CODE_PATTERNS.ARRAY_START)
|
|
||||||
|
|
||||||
if (hasObjectOrArray) {
|
|
||||||
const hasAsConstEnding =
|
|
||||||
line.includes(CODE_PATTERNS.AS_CONST_OBJECT) ||
|
|
||||||
line.includes(CODE_PATTERNS.AS_CONST_ARRAY) ||
|
|
||||||
line.includes(CODE_PATTERNS.AS_CONST_END_SEMICOLON_OBJECT) ||
|
|
||||||
line.includes(CODE_PATTERNS.AS_CONST_END_SEMICOLON_ARRAY)
|
|
||||||
|
|
||||||
return hasAsConstEnding
|
|
||||||
}
|
|
||||||
|
|
||||||
return line.includes(CODE_PATTERNS.AS_CONST)
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Find the starting line of an export const declaration
|
|
||||||
*/
|
|
||||||
private findExportConstStart(lines: string[], lineIndex: number): number {
|
|
||||||
for (let currentLine = lineIndex; currentLine >= 0; currentLine--) {
|
|
||||||
const trimmed = lines[currentLine].trim()
|
|
||||||
|
|
||||||
const isExportConst =
|
|
||||||
trimmed.startsWith(CODE_PATTERNS.EXPORT_CONST) &&
|
|
||||||
(trimmed.includes(CODE_PATTERNS.OBJECT_START) ||
|
|
||||||
trimmed.includes(CODE_PATTERNS.ARRAY_START))
|
|
||||||
|
|
||||||
if (isExportConst) {
|
|
||||||
return currentLine
|
|
||||||
}
|
|
||||||
|
|
||||||
const isTopLevelStatement =
|
|
||||||
currentLine < lineIndex &&
|
|
||||||
(trimmed.startsWith(CODE_PATTERNS.EXPORT) ||
|
|
||||||
trimmed.startsWith(CODE_PATTERNS.IMPORT))
|
|
||||||
|
|
||||||
if (isTopLevelStatement) {
|
|
||||||
break
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
return -1
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Count unclosed braces and brackets between two line indices
|
|
||||||
*/
|
|
||||||
private countUnclosedBraces(
|
|
||||||
lines: string[],
|
|
||||||
startLine: number,
|
|
||||||
endLine: number,
|
|
||||||
): { braces: number; brackets: number } {
|
|
||||||
let braces = 0
|
|
||||||
let brackets = 0
|
|
||||||
|
|
||||||
for (let i = startLine; i <= endLine; i++) {
|
|
||||||
const line = lines[i]
|
|
||||||
let inString = false
|
|
||||||
let stringChar = ""
|
|
||||||
|
|
||||||
for (let j = 0; j < line.length; j++) {
|
|
||||||
const char = line[j]
|
|
||||||
const prevChar = j > 0 ? line[j - 1] : ""
|
|
||||||
|
|
||||||
if ((char === "'" || char === '"' || char === "`") && prevChar !== "\\") {
|
|
||||||
if (!inString) {
|
|
||||||
inString = true
|
|
||||||
stringChar = char
|
|
||||||
} else if (char === stringChar) {
|
|
||||||
inString = false
|
|
||||||
stringChar = ""
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
if (!inString) {
|
|
||||||
if (char === "{") {
|
|
||||||
braces++
|
|
||||||
} else if (char === "}") {
|
|
||||||
braces--
|
|
||||||
} else if (char === "[") {
|
|
||||||
brackets++
|
|
||||||
} else if (char === "]") {
|
|
||||||
brackets--
|
|
||||||
}
|
|
||||||
}
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
return { braces, brackets }
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Detects magic numbers in code (timeouts, ports, limits, retries, etc.)
|
|
||||||
*
|
|
||||||
* Skips allowed numbers (-1, 0, 1, 2, 10, 100, 1000) and values in exported constants
|
|
||||||
*
|
*
|
||||||
* @param code - Source code to analyze
|
* @param code - Source code to analyze
|
||||||
* @param _filePath - File path (currently unused, reserved for future use)
|
* @param filePath - File path (used for constants file check)
|
||||||
* @returns Array of detected magic numbers
|
* @returns Array of detected magic numbers
|
||||||
*/
|
*/
|
||||||
public detectMagicNumbers(code: string, _filePath: string): HardcodedValue[] {
|
public detectMagicNumbers(code: string, filePath: string): HardcodedValue[] {
|
||||||
const results: HardcodedValue[] = []
|
if (this.constantsChecker.isConstantsFile(filePath)) {
|
||||||
const lines = code.split("\n")
|
return []
|
||||||
|
}
|
||||||
|
|
||||||
const numberPatterns = [
|
return this.numberMatcher.detect(code)
|
||||||
/(?:setTimeout|setInterval)\s*\(\s*[^,]+,\s*(\d+)/g,
|
|
||||||
/(?:maxRetries|retries|attempts)\s*[=:]\s*(\d+)/gi,
|
|
||||||
/(?:limit|max|min)\s*[=:]\s*(\d+)/gi,
|
|
||||||
/(?:port|PORT)\s*[=:]\s*(\d+)/g,
|
|
||||||
/(?:delay|timeout|TIMEOUT)\s*[=:]\s*(\d+)/gi,
|
|
||||||
]
|
|
||||||
|
|
||||||
lines.forEach((line, lineIndex) => {
|
|
||||||
if (line.trim().startsWith("//") || line.trim().startsWith("*")) {
|
|
||||||
return
|
|
||||||
}
|
|
||||||
|
|
||||||
// Skip lines inside exported constants
|
|
||||||
if (this.isInExportedConstant(lines, lineIndex)) {
|
|
||||||
return
|
|
||||||
}
|
|
||||||
|
|
||||||
numberPatterns.forEach((pattern) => {
|
|
||||||
let match
|
|
||||||
const regex = new RegExp(pattern)
|
|
||||||
|
|
||||||
while ((match = regex.exec(line)) !== null) {
|
|
||||||
const value = parseInt(match[1], 10)
|
|
||||||
|
|
||||||
if (!this.ALLOWED_NUMBERS.has(value)) {
|
|
||||||
results.push(
|
|
||||||
HardcodedValue.create(
|
|
||||||
value,
|
|
||||||
HARDCODE_TYPES.MAGIC_NUMBER,
|
|
||||||
lineIndex + 1,
|
|
||||||
match.index,
|
|
||||||
line.trim(),
|
|
||||||
),
|
|
||||||
)
|
|
||||||
}
|
|
||||||
}
|
|
||||||
})
|
|
||||||
|
|
||||||
const genericNumberRegex = /\b(\d{3,})\b/g
|
|
||||||
let match
|
|
||||||
|
|
||||||
while ((match = genericNumberRegex.exec(line)) !== null) {
|
|
||||||
const value = parseInt(match[1], 10)
|
|
||||||
|
|
||||||
if (
|
|
||||||
!this.ALLOWED_NUMBERS.has(value) &&
|
|
||||||
!this.isInComment(line, match.index) &&
|
|
||||||
!this.isInString(line, match.index)
|
|
||||||
) {
|
|
||||||
const context = this.extractContext(line, match.index)
|
|
||||||
if (this.looksLikeMagicNumber(context)) {
|
|
||||||
results.push(
|
|
||||||
HardcodedValue.create(
|
|
||||||
value,
|
|
||||||
HARDCODE_TYPES.MAGIC_NUMBER,
|
|
||||||
lineIndex + 1,
|
|
||||||
match.index,
|
|
||||||
line.trim(),
|
|
||||||
),
|
|
||||||
)
|
|
||||||
}
|
|
||||||
}
|
|
||||||
}
|
|
||||||
})
|
|
||||||
|
|
||||||
return results
|
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* Detects magic strings in code (URLs, connection strings, error messages, etc.)
|
* Detects magic strings in code
|
||||||
*
|
|
||||||
* Skips short strings (≤3 chars), console logs, test descriptions, imports,
|
|
||||||
* and values in exported constants
|
|
||||||
*
|
*
|
||||||
* @param code - Source code to analyze
|
* @param code - Source code to analyze
|
||||||
* @param _filePath - File path (currently unused, reserved for future use)
|
* @param filePath - File path (used for constants file check)
|
||||||
* @returns Array of detected magic strings
|
* @returns Array of detected magic strings
|
||||||
*/
|
*/
|
||||||
public detectMagicStrings(code: string, _filePath: string): HardcodedValue[] {
|
public detectMagicStrings(code: string, filePath: string): HardcodedValue[] {
|
||||||
const results: HardcodedValue[] = []
|
if (this.constantsChecker.isConstantsFile(filePath)) {
|
||||||
const lines = code.split("\n")
|
return []
|
||||||
|
|
||||||
const stringRegex = /(['"`])(?:(?!\1).)+\1/g
|
|
||||||
|
|
||||||
lines.forEach((line, lineIndex) => {
|
|
||||||
if (
|
|
||||||
line.trim().startsWith("//") ||
|
|
||||||
line.trim().startsWith("*") ||
|
|
||||||
line.includes("import ") ||
|
|
||||||
line.includes("from ")
|
|
||||||
) {
|
|
||||||
return
|
|
||||||
}
|
|
||||||
|
|
||||||
// Skip lines inside exported constants
|
|
||||||
if (this.isInExportedConstant(lines, lineIndex)) {
|
|
||||||
return
|
|
||||||
}
|
|
||||||
|
|
||||||
let match
|
|
||||||
const regex = new RegExp(stringRegex)
|
|
||||||
|
|
||||||
while ((match = regex.exec(line)) !== null) {
|
|
||||||
const fullMatch = match[0]
|
|
||||||
const value = fullMatch.slice(1, -1)
|
|
||||||
|
|
||||||
// Skip template literals (backtick strings with ${} interpolation)
|
|
||||||
if (fullMatch.startsWith("`") || value.includes("${")) {
|
|
||||||
continue
|
|
||||||
}
|
|
||||||
|
|
||||||
if (!this.isAllowedString(value) && this.looksLikeMagicString(line, value)) {
|
|
||||||
results.push(
|
|
||||||
HardcodedValue.create(
|
|
||||||
value,
|
|
||||||
HARDCODE_TYPES.MAGIC_STRING,
|
|
||||||
lineIndex + 1,
|
|
||||||
match.index,
|
|
||||||
line.trim(),
|
|
||||||
),
|
|
||||||
)
|
|
||||||
}
|
|
||||||
}
|
|
||||||
})
|
|
||||||
|
|
||||||
return results
|
|
||||||
}
|
|
||||||
|
|
||||||
private isAllowedString(str: string): boolean {
|
|
||||||
if (str.length <= 1) {
|
|
||||||
return true
|
|
||||||
}
|
}
|
||||||
|
|
||||||
return this.ALLOWED_STRING_PATTERNS.some((pattern) => pattern.test(str))
|
return this.stringMatcher.detect(code)
|
||||||
}
|
|
||||||
|
|
||||||
private looksLikeMagicString(line: string, value: string): boolean {
|
|
||||||
const lowerLine = line.toLowerCase()
|
|
||||||
|
|
||||||
if (
|
|
||||||
lowerLine.includes(DETECTION_KEYWORDS.TEST) ||
|
|
||||||
lowerLine.includes(DETECTION_KEYWORDS.DESCRIBE)
|
|
||||||
) {
|
|
||||||
return false
|
|
||||||
}
|
|
||||||
|
|
||||||
if (
|
|
||||||
lowerLine.includes(DETECTION_KEYWORDS.CONSOLE_LOG) ||
|
|
||||||
lowerLine.includes(DETECTION_KEYWORDS.CONSOLE_ERROR)
|
|
||||||
) {
|
|
||||||
return false
|
|
||||||
}
|
|
||||||
|
|
||||||
if (value.includes(DETECTION_KEYWORDS.HTTP) || value.includes(DETECTION_KEYWORDS.API)) {
|
|
||||||
return true
|
|
||||||
}
|
|
||||||
|
|
||||||
if (/^\d{2,}$/.test(value)) {
|
|
||||||
return false
|
|
||||||
}
|
|
||||||
|
|
||||||
return value.length > 3
|
|
||||||
}
|
|
||||||
|
|
||||||
private looksLikeMagicNumber(context: string): boolean {
|
|
||||||
const lowerContext = context.toLowerCase()
|
|
||||||
|
|
||||||
const configKeywords = [
|
|
||||||
DETECTION_KEYWORDS.TIMEOUT,
|
|
||||||
DETECTION_KEYWORDS.DELAY,
|
|
||||||
DETECTION_KEYWORDS.RETRY,
|
|
||||||
DETECTION_KEYWORDS.LIMIT,
|
|
||||||
DETECTION_KEYWORDS.MAX,
|
|
||||||
DETECTION_KEYWORDS.MIN,
|
|
||||||
DETECTION_KEYWORDS.PORT,
|
|
||||||
DETECTION_KEYWORDS.INTERVAL,
|
|
||||||
]
|
|
||||||
|
|
||||||
return configKeywords.some((keyword) => lowerContext.includes(keyword))
|
|
||||||
}
|
|
||||||
|
|
||||||
private isInComment(line: string, index: number): boolean {
|
|
||||||
const beforeIndex = line.substring(0, index)
|
|
||||||
return beforeIndex.includes("//") || beforeIndex.includes("/*")
|
|
||||||
}
|
|
||||||
|
|
||||||
private isInString(line: string, index: number): boolean {
|
|
||||||
const beforeIndex = line.substring(0, index)
|
|
||||||
const singleQuotes = (beforeIndex.match(/'/g) ?? []).length
|
|
||||||
const doubleQuotes = (beforeIndex.match(/"/g) ?? []).length
|
|
||||||
const backticks = (beforeIndex.match(/`/g) ?? []).length
|
|
||||||
|
|
||||||
return singleQuotes % 2 !== 0 || doubleQuotes % 2 !== 0 || backticks % 2 !== 0
|
|
||||||
}
|
|
||||||
|
|
||||||
private extractContext(line: string, index: number): string {
|
|
||||||
const start = Math.max(0, index - 30)
|
|
||||||
const end = Math.min(line.length, index + 30)
|
|
||||||
return line.substring(start, end)
|
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|||||||
@@ -1,8 +1,9 @@
|
|||||||
import { IRepositoryPatternDetector } from "../../domain/services/RepositoryPatternDetectorService"
|
import { IRepositoryPatternDetector } from "../../domain/services/RepositoryPatternDetectorService"
|
||||||
import { RepositoryViolation } from "../../domain/value-objects/RepositoryViolation"
|
import { RepositoryViolation } from "../../domain/value-objects/RepositoryViolation"
|
||||||
import { LAYERS, REPOSITORY_VIOLATION_TYPES } from "../../shared/constants/rules"
|
import { OrmTypeMatcher } from "../strategies/OrmTypeMatcher"
|
||||||
import { ORM_QUERY_METHODS } from "../constants/orm-methods"
|
import { MethodNameValidator } from "../strategies/MethodNameValidator"
|
||||||
import { REPOSITORY_PATTERN_MESSAGES } from "../../domain/constants/Messages"
|
import { RepositoryFileAnalyzer } from "../strategies/RepositoryFileAnalyzer"
|
||||||
|
import { RepositoryViolationDetector } from "../strategies/RepositoryViolationDetector"
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* Detects Repository Pattern violations in the codebase
|
* Detects Repository Pattern violations in the codebase
|
||||||
@@ -35,61 +36,20 @@ import { REPOSITORY_PATTERN_MESSAGES } from "../../domain/constants/Messages"
|
|||||||
* ```
|
* ```
|
||||||
*/
|
*/
|
||||||
export class RepositoryPatternDetector implements IRepositoryPatternDetector {
|
export class RepositoryPatternDetector implements IRepositoryPatternDetector {
|
||||||
private readonly ormTypePatterns = [
|
private readonly ormMatcher: OrmTypeMatcher
|
||||||
/Prisma\./,
|
private readonly methodValidator: MethodNameValidator
|
||||||
/PrismaClient/,
|
private readonly fileAnalyzer: RepositoryFileAnalyzer
|
||||||
/TypeORM/,
|
private readonly violationDetector: RepositoryViolationDetector
|
||||||
/@Entity/,
|
|
||||||
/@Column/,
|
|
||||||
/@PrimaryColumn/,
|
|
||||||
/@PrimaryGeneratedColumn/,
|
|
||||||
/@ManyToOne/,
|
|
||||||
/@OneToMany/,
|
|
||||||
/@ManyToMany/,
|
|
||||||
/@JoinColumn/,
|
|
||||||
/@JoinTable/,
|
|
||||||
/Mongoose\./,
|
|
||||||
/Schema/,
|
|
||||||
/Model</,
|
|
||||||
/Document/,
|
|
||||||
/Sequelize\./,
|
|
||||||
/DataTypes\./,
|
|
||||||
/FindOptions/,
|
|
||||||
/WhereOptions/,
|
|
||||||
/IncludeOptions/,
|
|
||||||
/QueryInterface/,
|
|
||||||
/MikroORM/,
|
|
||||||
/EntityManager/,
|
|
||||||
/EntityRepository/,
|
|
||||||
/Collection</,
|
|
||||||
]
|
|
||||||
|
|
||||||
private readonly technicalMethodNames = ORM_QUERY_METHODS
|
constructor() {
|
||||||
|
this.ormMatcher = new OrmTypeMatcher()
|
||||||
private readonly domainMethodPatterns = [
|
this.methodValidator = new MethodNameValidator(this.ormMatcher)
|
||||||
/^findBy[A-Z]/,
|
this.fileAnalyzer = new RepositoryFileAnalyzer()
|
||||||
/^findAll/,
|
this.violationDetector = new RepositoryViolationDetector(
|
||||||
/^save$/,
|
this.ormMatcher,
|
||||||
/^create$/,
|
this.methodValidator,
|
||||||
/^update$/,
|
)
|
||||||
/^delete$/,
|
}
|
||||||
/^remove$/,
|
|
||||||
/^add$/,
|
|
||||||
/^get[A-Z]/,
|
|
||||||
/^search/,
|
|
||||||
/^list/,
|
|
||||||
]
|
|
||||||
|
|
||||||
private readonly concreteRepositoryPatterns = [
|
|
||||||
/PrismaUserRepository/,
|
|
||||||
/MongoUserRepository/,
|
|
||||||
/TypeOrmUserRepository/,
|
|
||||||
/SequelizeUserRepository/,
|
|
||||||
/InMemoryUserRepository/,
|
|
||||||
/PostgresUserRepository/,
|
|
||||||
/MySqlUserRepository/,
|
|
||||||
/Repository(?!Interface)/,
|
|
||||||
]
|
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* Detects all Repository Pattern violations in the given code
|
* Detects all Repository Pattern violations in the given code
|
||||||
@@ -101,14 +61,16 @@ export class RepositoryPatternDetector implements IRepositoryPatternDetector {
|
|||||||
): RepositoryViolation[] {
|
): RepositoryViolation[] {
|
||||||
const violations: RepositoryViolation[] = []
|
const violations: RepositoryViolation[] = []
|
||||||
|
|
||||||
if (this.isRepositoryInterface(filePath, layer)) {
|
if (this.fileAnalyzer.isRepositoryInterface(filePath, layer)) {
|
||||||
violations.push(...this.detectOrmTypesInInterface(code, filePath, layer))
|
violations.push(...this.violationDetector.detectOrmTypes(code, filePath, layer))
|
||||||
violations.push(...this.detectNonDomainMethodNames(code, filePath, layer))
|
violations.push(...this.violationDetector.detectNonDomainMethods(code, filePath, layer))
|
||||||
}
|
}
|
||||||
|
|
||||||
if (this.isUseCase(filePath, layer)) {
|
if (this.fileAnalyzer.isUseCase(filePath, layer)) {
|
||||||
violations.push(...this.detectConcreteRepositoryUsage(code, filePath, layer))
|
violations.push(
|
||||||
violations.push(...this.detectNewRepositoryInstantiation(code, filePath, layer))
|
...this.violationDetector.detectConcreteRepositoryUsage(code, filePath, layer),
|
||||||
|
)
|
||||||
|
violations.push(...this.violationDetector.detectNewInstantiation(code, filePath, layer))
|
||||||
}
|
}
|
||||||
|
|
||||||
return violations
|
return violations
|
||||||
@@ -118,270 +80,27 @@ export class RepositoryPatternDetector implements IRepositoryPatternDetector {
|
|||||||
* Checks if a type is an ORM-specific type
|
* Checks if a type is an ORM-specific type
|
||||||
*/
|
*/
|
||||||
public isOrmType(typeName: string): boolean {
|
public isOrmType(typeName: string): boolean {
|
||||||
return this.ormTypePatterns.some((pattern) => pattern.test(typeName))
|
return this.ormMatcher.isOrmType(typeName)
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* Checks if a method name follows domain language conventions
|
* Checks if a method name follows domain language conventions
|
||||||
*/
|
*/
|
||||||
public isDomainMethodName(methodName: string): boolean {
|
public isDomainMethodName(methodName: string): boolean {
|
||||||
if ((this.technicalMethodNames as readonly string[]).includes(methodName)) {
|
return this.methodValidator.isDomainMethodName(methodName)
|
||||||
return false
|
|
||||||
}
|
|
||||||
|
|
||||||
return this.domainMethodPatterns.some((pattern) => pattern.test(methodName))
|
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* Checks if a file is a repository interface
|
* Checks if a file is a repository interface
|
||||||
*/
|
*/
|
||||||
public isRepositoryInterface(filePath: string, layer: string | undefined): boolean {
|
public isRepositoryInterface(filePath: string, layer: string | undefined): boolean {
|
||||||
if (layer !== LAYERS.DOMAIN) {
|
return this.fileAnalyzer.isRepositoryInterface(filePath, layer)
|
||||||
return false
|
|
||||||
}
|
|
||||||
|
|
||||||
return /I[A-Z]\w*Repository\.ts$/.test(filePath) && /repositories?\//.test(filePath)
|
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* Checks if a file is a use case
|
* Checks if a file is a use case
|
||||||
*/
|
*/
|
||||||
public isUseCase(filePath: string, layer: string | undefined): boolean {
|
public isUseCase(filePath: string, layer: string | undefined): boolean {
|
||||||
if (layer !== LAYERS.APPLICATION) {
|
return this.fileAnalyzer.isUseCase(filePath, layer)
|
||||||
return false
|
|
||||||
}
|
|
||||||
|
|
||||||
return /use-cases?\//.test(filePath) && /[A-Z][a-z]+[A-Z]\w*\.ts$/.test(filePath)
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Detects ORM-specific types in repository interfaces
|
|
||||||
*/
|
|
||||||
private detectOrmTypesInInterface(
|
|
||||||
code: string,
|
|
||||||
filePath: string,
|
|
||||||
layer: string | undefined,
|
|
||||||
): RepositoryViolation[] {
|
|
||||||
const violations: RepositoryViolation[] = []
|
|
||||||
const lines = code.split("\n")
|
|
||||||
|
|
||||||
for (let i = 0; i < lines.length; i++) {
|
|
||||||
const line = lines[i]
|
|
||||||
const lineNumber = i + 1
|
|
||||||
|
|
||||||
const methodMatch =
|
|
||||||
/(\w+)\s*\([^)]*:\s*([^)]+)\)\s*:\s*.*?(?:Promise<([^>]+)>|([A-Z]\w+))/.exec(line)
|
|
||||||
|
|
||||||
if (methodMatch) {
|
|
||||||
const params = methodMatch[2]
|
|
||||||
const returnType = methodMatch[3] || methodMatch[4]
|
|
||||||
|
|
||||||
if (this.isOrmType(params)) {
|
|
||||||
const ormType = this.extractOrmType(params)
|
|
||||||
violations.push(
|
|
||||||
RepositoryViolation.create(
|
|
||||||
REPOSITORY_VIOLATION_TYPES.ORM_TYPE_IN_INTERFACE,
|
|
||||||
filePath,
|
|
||||||
layer || LAYERS.DOMAIN,
|
|
||||||
lineNumber,
|
|
||||||
`Method parameter uses ORM type: ${ormType}`,
|
|
||||||
ormType,
|
|
||||||
),
|
|
||||||
)
|
|
||||||
}
|
|
||||||
|
|
||||||
if (returnType && this.isOrmType(returnType)) {
|
|
||||||
const ormType = this.extractOrmType(returnType)
|
|
||||||
violations.push(
|
|
||||||
RepositoryViolation.create(
|
|
||||||
REPOSITORY_VIOLATION_TYPES.ORM_TYPE_IN_INTERFACE,
|
|
||||||
filePath,
|
|
||||||
layer || LAYERS.DOMAIN,
|
|
||||||
lineNumber,
|
|
||||||
`Method return type uses ORM type: ${ormType}`,
|
|
||||||
ormType,
|
|
||||||
),
|
|
||||||
)
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
for (const pattern of this.ormTypePatterns) {
|
|
||||||
if (pattern.test(line) && !line.trim().startsWith("//")) {
|
|
||||||
const ormType = this.extractOrmType(line)
|
|
||||||
violations.push(
|
|
||||||
RepositoryViolation.create(
|
|
||||||
REPOSITORY_VIOLATION_TYPES.ORM_TYPE_IN_INTERFACE,
|
|
||||||
filePath,
|
|
||||||
layer || LAYERS.DOMAIN,
|
|
||||||
lineNumber,
|
|
||||||
`Repository interface contains ORM-specific type: ${ormType}`,
|
|
||||||
ormType,
|
|
||||||
),
|
|
||||||
)
|
|
||||||
break
|
|
||||||
}
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
return violations
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Detects non-domain method names in repository interfaces
|
|
||||||
*/
|
|
||||||
private detectNonDomainMethodNames(
|
|
||||||
code: string,
|
|
||||||
filePath: string,
|
|
||||||
layer: string | undefined,
|
|
||||||
): RepositoryViolation[] {
|
|
||||||
const violations: RepositoryViolation[] = []
|
|
||||||
const lines = code.split("\n")
|
|
||||||
|
|
||||||
for (let i = 0; i < lines.length; i++) {
|
|
||||||
const line = lines[i]
|
|
||||||
const lineNumber = i + 1
|
|
||||||
|
|
||||||
const methodMatch = /^\s*(\w+)\s*\(/.exec(line)
|
|
||||||
|
|
||||||
if (methodMatch) {
|
|
||||||
const methodName = methodMatch[1]
|
|
||||||
|
|
||||||
if (!this.isDomainMethodName(methodName) && !line.trim().startsWith("//")) {
|
|
||||||
violations.push(
|
|
||||||
RepositoryViolation.create(
|
|
||||||
REPOSITORY_VIOLATION_TYPES.NON_DOMAIN_METHOD_NAME,
|
|
||||||
filePath,
|
|
||||||
layer || LAYERS.DOMAIN,
|
|
||||||
lineNumber,
|
|
||||||
`Method '${methodName}' uses technical name instead of domain language`,
|
|
||||||
undefined,
|
|
||||||
undefined,
|
|
||||||
methodName,
|
|
||||||
),
|
|
||||||
)
|
|
||||||
}
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
return violations
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Detects concrete repository usage in use cases
|
|
||||||
*/
|
|
||||||
private detectConcreteRepositoryUsage(
|
|
||||||
code: string,
|
|
||||||
filePath: string,
|
|
||||||
layer: string | undefined,
|
|
||||||
): RepositoryViolation[] {
|
|
||||||
const violations: RepositoryViolation[] = []
|
|
||||||
const lines = code.split("\n")
|
|
||||||
|
|
||||||
for (let i = 0; i < lines.length; i++) {
|
|
||||||
const line = lines[i]
|
|
||||||
const lineNumber = i + 1
|
|
||||||
|
|
||||||
const constructorParamMatch =
|
|
||||||
/constructor\s*\([^)]*(?:private|public|protected)\s+(?:readonly\s+)?(\w+)\s*:\s*([A-Z]\w*Repository)/.exec(
|
|
||||||
line,
|
|
||||||
)
|
|
||||||
|
|
||||||
if (constructorParamMatch) {
|
|
||||||
const repositoryType = constructorParamMatch[2]
|
|
||||||
|
|
||||||
if (!repositoryType.startsWith("I")) {
|
|
||||||
violations.push(
|
|
||||||
RepositoryViolation.create(
|
|
||||||
REPOSITORY_VIOLATION_TYPES.CONCRETE_REPOSITORY_IN_USE_CASE,
|
|
||||||
filePath,
|
|
||||||
layer || LAYERS.APPLICATION,
|
|
||||||
lineNumber,
|
|
||||||
`Use case depends on concrete repository '${repositoryType}'`,
|
|
||||||
undefined,
|
|
||||||
repositoryType,
|
|
||||||
),
|
|
||||||
)
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
const fieldMatch =
|
|
||||||
/(?:private|public|protected)\s+(?:readonly\s+)?(\w+)\s*:\s*([A-Z]\w*Repository)/.exec(
|
|
||||||
line,
|
|
||||||
)
|
|
||||||
|
|
||||||
if (fieldMatch) {
|
|
||||||
const repositoryType = fieldMatch[2]
|
|
||||||
|
|
||||||
if (
|
|
||||||
!repositoryType.startsWith("I") &&
|
|
||||||
!line.includes(REPOSITORY_PATTERN_MESSAGES.CONSTRUCTOR)
|
|
||||||
) {
|
|
||||||
violations.push(
|
|
||||||
RepositoryViolation.create(
|
|
||||||
REPOSITORY_VIOLATION_TYPES.CONCRETE_REPOSITORY_IN_USE_CASE,
|
|
||||||
filePath,
|
|
||||||
layer || LAYERS.APPLICATION,
|
|
||||||
lineNumber,
|
|
||||||
`Use case field uses concrete repository '${repositoryType}'`,
|
|
||||||
undefined,
|
|
||||||
repositoryType,
|
|
||||||
),
|
|
||||||
)
|
|
||||||
}
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
return violations
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Detects 'new Repository()' instantiation in use cases
|
|
||||||
*/
|
|
||||||
private detectNewRepositoryInstantiation(
|
|
||||||
code: string,
|
|
||||||
filePath: string,
|
|
||||||
layer: string | undefined,
|
|
||||||
): RepositoryViolation[] {
|
|
||||||
const violations: RepositoryViolation[] = []
|
|
||||||
const lines = code.split("\n")
|
|
||||||
|
|
||||||
for (let i = 0; i < lines.length; i++) {
|
|
||||||
const line = lines[i]
|
|
||||||
const lineNumber = i + 1
|
|
||||||
|
|
||||||
const newRepositoryMatch = /new\s+([A-Z]\w*Repository)\s*\(/.exec(line)
|
|
||||||
|
|
||||||
if (newRepositoryMatch && !line.trim().startsWith("//")) {
|
|
||||||
const repositoryName = newRepositoryMatch[1]
|
|
||||||
violations.push(
|
|
||||||
RepositoryViolation.create(
|
|
||||||
REPOSITORY_VIOLATION_TYPES.NEW_REPOSITORY_IN_USE_CASE,
|
|
||||||
filePath,
|
|
||||||
layer || LAYERS.APPLICATION,
|
|
||||||
lineNumber,
|
|
||||||
`Use case creates repository with 'new ${repositoryName}()'`,
|
|
||||||
undefined,
|
|
||||||
repositoryName,
|
|
||||||
),
|
|
||||||
)
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
return violations
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Extracts ORM type name from a code line
|
|
||||||
*/
|
|
||||||
private extractOrmType(line: string): string {
|
|
||||||
for (const pattern of this.ormTypePatterns) {
|
|
||||||
const match = line.match(pattern)
|
|
||||||
if (match) {
|
|
||||||
const startIdx = match.index || 0
|
|
||||||
const typeMatch = /[\w.]+/.exec(line.slice(startIdx))
|
|
||||||
return typeMatch ? typeMatch[0] : REPOSITORY_PATTERN_MESSAGES.UNKNOWN_TYPE
|
|
||||||
}
|
|
||||||
}
|
|
||||||
return REPOSITORY_PATTERN_MESSAGES.UNKNOWN_TYPE
|
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|||||||
168
packages/guardian/src/infrastructure/analyzers/SecretDetector.ts
Normal file
168
packages/guardian/src/infrastructure/analyzers/SecretDetector.ts
Normal file
@@ -0,0 +1,168 @@
|
|||||||
|
import { createEngine } from "@secretlint/node"
|
||||||
|
import type { SecretLintConfigDescriptor } from "@secretlint/types"
|
||||||
|
import { ISecretDetector } from "../../domain/services/ISecretDetector"
|
||||||
|
import { SecretViolation } from "../../domain/value-objects/SecretViolation"
|
||||||
|
import { SECRET_KEYWORDS, SECRET_TYPE_NAMES } from "../../domain/constants/SecretExamples"
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Detects hardcoded secrets in TypeScript/JavaScript code
|
||||||
|
*
|
||||||
|
* Uses industry-standard Secretlint library to detect 350+ types of secrets
|
||||||
|
* including AWS keys, GitHub tokens, NPM tokens, SSH keys, API keys, and more.
|
||||||
|
*
|
||||||
|
* All detected secrets are marked as CRITICAL severity because they represent
|
||||||
|
* serious security risks that could lead to unauthorized access or data breaches.
|
||||||
|
*
|
||||||
|
* @example
|
||||||
|
* ```typescript
|
||||||
|
* const detector = new SecretDetector()
|
||||||
|
* const code = `const AWS_KEY = "AKIA1234567890ABCDEF"`
|
||||||
|
* const violations = await detector.detectAll(code, 'config.ts')
|
||||||
|
* // Returns array of SecretViolation objects with CRITICAL severity
|
||||||
|
* ```
|
||||||
|
*/
|
||||||
|
export class SecretDetector implements ISecretDetector {
|
||||||
|
private readonly secretlintConfig: SecretLintConfigDescriptor = {
|
||||||
|
rules: [
|
||||||
|
{
|
||||||
|
id: "@secretlint/secretlint-rule-preset-recommend",
|
||||||
|
},
|
||||||
|
],
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Detects all types of hardcoded secrets in the provided code
|
||||||
|
*
|
||||||
|
* @param code - Source code to analyze
|
||||||
|
* @param filePath - Path to the file being analyzed
|
||||||
|
* @returns Promise resolving to array of secret violations
|
||||||
|
*/
|
||||||
|
public async detectAll(code: string, filePath: string): Promise<SecretViolation[]> {
|
||||||
|
try {
|
||||||
|
const engine = await createEngine({
|
||||||
|
cwd: process.cwd(),
|
||||||
|
configFileJSON: this.secretlintConfig,
|
||||||
|
formatter: "stylish",
|
||||||
|
color: false,
|
||||||
|
})
|
||||||
|
|
||||||
|
const result = await engine.executeOnContent({
|
||||||
|
content: code,
|
||||||
|
filePath,
|
||||||
|
})
|
||||||
|
|
||||||
|
return this.parseOutputToViolations(result.output, filePath)
|
||||||
|
} catch (_error) {
|
||||||
|
return []
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
private parseOutputToViolations(output: string, filePath: string): SecretViolation[] {
|
||||||
|
const violations: SecretViolation[] = []
|
||||||
|
|
||||||
|
if (!output || output.trim() === "") {
|
||||||
|
return violations
|
||||||
|
}
|
||||||
|
|
||||||
|
const lines = output.split("\n")
|
||||||
|
|
||||||
|
for (const line of lines) {
|
||||||
|
const match = /^\s*(\d+):(\d+)\s+(error|warning)\s+(.+?)\s+(.+)$/.exec(line)
|
||||||
|
|
||||||
|
if (match) {
|
||||||
|
const [, lineNum, column, , message, ruleId] = match
|
||||||
|
const secretType = this.extractSecretType(message, ruleId)
|
||||||
|
|
||||||
|
const violation = SecretViolation.create(
|
||||||
|
filePath,
|
||||||
|
parseInt(lineNum, 10),
|
||||||
|
parseInt(column, 10),
|
||||||
|
secretType,
|
||||||
|
message,
|
||||||
|
)
|
||||||
|
|
||||||
|
violations.push(violation)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return violations
|
||||||
|
}
|
||||||
|
|
||||||
|
private extractSecretType(message: string, ruleId: string): string {
|
||||||
|
if (ruleId.includes(SECRET_KEYWORDS.AWS)) {
|
||||||
|
if (message.toLowerCase().includes(SECRET_KEYWORDS.ACCESS_KEY)) {
|
||||||
|
return SECRET_TYPE_NAMES.AWS_ACCESS_KEY
|
||||||
|
}
|
||||||
|
if (message.toLowerCase().includes(SECRET_KEYWORDS.SECRET)) {
|
||||||
|
return SECRET_TYPE_NAMES.AWS_SECRET_KEY
|
||||||
|
}
|
||||||
|
return SECRET_TYPE_NAMES.AWS_CREDENTIAL
|
||||||
|
}
|
||||||
|
|
||||||
|
if (ruleId.includes(SECRET_KEYWORDS.GITHUB)) {
|
||||||
|
if (message.toLowerCase().includes(SECRET_KEYWORDS.PERSONAL_ACCESS_TOKEN)) {
|
||||||
|
return SECRET_TYPE_NAMES.GITHUB_PERSONAL_ACCESS_TOKEN
|
||||||
|
}
|
||||||
|
if (message.toLowerCase().includes(SECRET_KEYWORDS.OAUTH)) {
|
||||||
|
return SECRET_TYPE_NAMES.GITHUB_OAUTH_TOKEN
|
||||||
|
}
|
||||||
|
return SECRET_TYPE_NAMES.GITHUB_TOKEN
|
||||||
|
}
|
||||||
|
|
||||||
|
if (ruleId.includes(SECRET_KEYWORDS.NPM)) {
|
||||||
|
return SECRET_TYPE_NAMES.NPM_TOKEN
|
||||||
|
}
|
||||||
|
|
||||||
|
if (ruleId.includes(SECRET_KEYWORDS.GCP) || ruleId.includes(SECRET_KEYWORDS.GOOGLE)) {
|
||||||
|
return SECRET_TYPE_NAMES.GCP_SERVICE_ACCOUNT_KEY
|
||||||
|
}
|
||||||
|
|
||||||
|
if (ruleId.includes(SECRET_KEYWORDS.PRIVATEKEY) || ruleId.includes(SECRET_KEYWORDS.SSH)) {
|
||||||
|
if (message.toLowerCase().includes(SECRET_KEYWORDS.RSA)) {
|
||||||
|
return SECRET_TYPE_NAMES.SSH_RSA_PRIVATE_KEY
|
||||||
|
}
|
||||||
|
if (message.toLowerCase().includes(SECRET_KEYWORDS.DSA)) {
|
||||||
|
return SECRET_TYPE_NAMES.SSH_DSA_PRIVATE_KEY
|
||||||
|
}
|
||||||
|
if (message.toLowerCase().includes(SECRET_KEYWORDS.ECDSA)) {
|
||||||
|
return SECRET_TYPE_NAMES.SSH_ECDSA_PRIVATE_KEY
|
||||||
|
}
|
||||||
|
if (message.toLowerCase().includes(SECRET_KEYWORDS.ED25519)) {
|
||||||
|
return SECRET_TYPE_NAMES.SSH_ED25519_PRIVATE_KEY
|
||||||
|
}
|
||||||
|
return SECRET_TYPE_NAMES.SSH_PRIVATE_KEY
|
||||||
|
}
|
||||||
|
|
||||||
|
if (ruleId.includes(SECRET_KEYWORDS.SLACK)) {
|
||||||
|
if (message.toLowerCase().includes(SECRET_KEYWORDS.BOT)) {
|
||||||
|
return SECRET_TYPE_NAMES.SLACK_BOT_TOKEN
|
||||||
|
}
|
||||||
|
if (message.toLowerCase().includes(SECRET_KEYWORDS.USER)) {
|
||||||
|
return SECRET_TYPE_NAMES.SLACK_USER_TOKEN
|
||||||
|
}
|
||||||
|
return SECRET_TYPE_NAMES.SLACK_TOKEN
|
||||||
|
}
|
||||||
|
|
||||||
|
if (ruleId.includes(SECRET_KEYWORDS.BASICAUTH)) {
|
||||||
|
return SECRET_TYPE_NAMES.BASIC_AUTH_CREDENTIALS
|
||||||
|
}
|
||||||
|
|
||||||
|
if (message.toLowerCase().includes(SECRET_KEYWORDS.API_KEY)) {
|
||||||
|
return SECRET_TYPE_NAMES.API_KEY
|
||||||
|
}
|
||||||
|
|
||||||
|
if (message.toLowerCase().includes(SECRET_KEYWORDS.TOKEN)) {
|
||||||
|
return SECRET_TYPE_NAMES.AUTHENTICATION_TOKEN
|
||||||
|
}
|
||||||
|
|
||||||
|
if (message.toLowerCase().includes(SECRET_KEYWORDS.PASSWORD)) {
|
||||||
|
return SECRET_TYPE_NAMES.PASSWORD
|
||||||
|
}
|
||||||
|
|
||||||
|
if (message.toLowerCase().includes(SECRET_KEYWORDS.SECRET)) {
|
||||||
|
return SECRET_TYPE_NAMES.SECRET
|
||||||
|
}
|
||||||
|
|
||||||
|
return SECRET_TYPE_NAMES.SENSITIVE_DATA
|
||||||
|
}
|
||||||
|
}
|
||||||
@@ -64,3 +64,47 @@ export const NAMING_ERROR_MESSAGES = {
|
|||||||
USE_VERB_NOUN: "Use verb + noun in PascalCase (e.g., CreateUser.ts, UpdateProfile.ts)",
|
USE_VERB_NOUN: "Use verb + noun in PascalCase (e.g., CreateUser.ts, UpdateProfile.ts)",
|
||||||
USE_CASE_START_VERB: "Use cases should start with a verb",
|
USE_CASE_START_VERB: "Use cases should start with a verb",
|
||||||
} as const
|
} as const
|
||||||
|
|
||||||
|
/**
|
||||||
|
* DDD folder names for aggregate boundary detection
|
||||||
|
*/
|
||||||
|
export const DDD_FOLDER_NAMES = {
|
||||||
|
ENTITIES: "entities",
|
||||||
|
AGGREGATES: "aggregates",
|
||||||
|
VALUE_OBJECTS: "value-objects",
|
||||||
|
VO: "vo",
|
||||||
|
EVENTS: "events",
|
||||||
|
DOMAIN_EVENTS: "domain-events",
|
||||||
|
REPOSITORIES: "repositories",
|
||||||
|
SERVICES: "services",
|
||||||
|
SPECIFICATIONS: "specifications",
|
||||||
|
DOMAIN: "domain",
|
||||||
|
CONSTANTS: "constants",
|
||||||
|
SHARED: "shared",
|
||||||
|
FACTORIES: "factories",
|
||||||
|
PORTS: "ports",
|
||||||
|
INTERFACES: "interfaces",
|
||||||
|
ERRORS: "errors",
|
||||||
|
EXCEPTIONS: "exceptions",
|
||||||
|
} as const
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Repository method suggestions for domain language
|
||||||
|
*/
|
||||||
|
export const REPOSITORY_METHOD_SUGGESTIONS = {
|
||||||
|
SEARCH: "search",
|
||||||
|
FIND_BY_PROPERTY: "findBy[Property]",
|
||||||
|
GET_ENTITY: "get[Entity]",
|
||||||
|
CREATE: "create",
|
||||||
|
ADD_ENTITY: "add[Entity]",
|
||||||
|
STORE_ENTITY: "store[Entity]",
|
||||||
|
UPDATE: "update",
|
||||||
|
MODIFY_ENTITY: "modify[Entity]",
|
||||||
|
SAVE: "save",
|
||||||
|
DELETE: "delete",
|
||||||
|
REMOVE_BY_PROPERTY: "removeBy[Property]",
|
||||||
|
FIND_ALL: "findAll",
|
||||||
|
LIST_ALL: "listAll",
|
||||||
|
DEFAULT_SUGGESTION:
|
||||||
|
"Use domain-specific names like: findBy[Property], save, create, delete, update, add[Entity]",
|
||||||
|
} as const
|
||||||
|
|||||||
@@ -2,7 +2,6 @@ export const ORM_QUERY_METHODS = [
|
|||||||
"findOne",
|
"findOne",
|
||||||
"findMany",
|
"findMany",
|
||||||
"findFirst",
|
"findFirst",
|
||||||
"findAll",
|
|
||||||
"findAndCountAll",
|
"findAndCountAll",
|
||||||
"insert",
|
"insert",
|
||||||
"insertMany",
|
"insertMany",
|
||||||
@@ -17,8 +16,6 @@ export const ORM_QUERY_METHODS = [
|
|||||||
"run",
|
"run",
|
||||||
"exec",
|
"exec",
|
||||||
"aggregate",
|
"aggregate",
|
||||||
"count",
|
|
||||||
"exists",
|
|
||||||
] as const
|
] as const
|
||||||
|
|
||||||
export type OrmQueryMethod = (typeof ORM_QUERY_METHODS)[number]
|
export type OrmQueryMethod = (typeof ORM_QUERY_METHODS)[number]
|
||||||
|
|||||||
@@ -0,0 +1,177 @@
|
|||||||
|
import { DDD_FOLDER_NAMES } from "../constants/detectorPatterns"
|
||||||
|
import { IMPORT_PATTERNS } from "../constants/paths"
|
||||||
|
import { FolderRegistry } from "./FolderRegistry"
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Analyzes file paths and imports to extract aggregate information
|
||||||
|
*
|
||||||
|
* Handles path normalization, aggregate extraction, and entity name detection
|
||||||
|
* for aggregate boundary validation.
|
||||||
|
*/
|
||||||
|
export class AggregatePathAnalyzer {
|
||||||
|
constructor(private readonly folderRegistry: FolderRegistry) {}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Extracts the aggregate name from a file path
|
||||||
|
*
|
||||||
|
* Handles patterns like:
|
||||||
|
* - domain/aggregates/order/Order.ts → 'order'
|
||||||
|
* - domain/order/Order.ts → 'order'
|
||||||
|
* - domain/entities/order/Order.ts → 'order'
|
||||||
|
*/
|
||||||
|
public extractAggregateFromPath(filePath: string): string | undefined {
|
||||||
|
const normalizedPath = this.normalizePath(filePath)
|
||||||
|
const segments = this.getPathSegmentsAfterDomain(normalizedPath)
|
||||||
|
|
||||||
|
if (!segments || segments.length < 2) {
|
||||||
|
return undefined
|
||||||
|
}
|
||||||
|
|
||||||
|
return this.findAggregateInSegments(segments)
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Extracts the aggregate name from an import path
|
||||||
|
*/
|
||||||
|
public extractAggregateFromImport(importPath: string): string | undefined {
|
||||||
|
const normalizedPath = importPath.replace(IMPORT_PATTERNS.QUOTE, "").toLowerCase()
|
||||||
|
const segments = normalizedPath.split("/").filter((seg) => seg !== ".." && seg !== ".")
|
||||||
|
|
||||||
|
if (segments.length === 0) {
|
||||||
|
return undefined
|
||||||
|
}
|
||||||
|
|
||||||
|
return this.findAggregateInImportSegments(segments)
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Extracts the entity name from an import path
|
||||||
|
*/
|
||||||
|
public extractEntityName(importPath: string): string | undefined {
|
||||||
|
const normalizedPath = importPath.replace(IMPORT_PATTERNS.QUOTE, "")
|
||||||
|
const segments = normalizedPath.split("/")
|
||||||
|
const lastSegment = segments[segments.length - 1]
|
||||||
|
|
||||||
|
if (lastSegment) {
|
||||||
|
return lastSegment.replace(/\.(ts|js)$/, "")
|
||||||
|
}
|
||||||
|
|
||||||
|
return undefined
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Normalizes a file path for consistent processing
|
||||||
|
*/
|
||||||
|
private normalizePath(filePath: string): string {
|
||||||
|
return filePath.toLowerCase().replace(/\\/g, "/")
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Gets path segments after the 'domain' folder
|
||||||
|
*/
|
||||||
|
private getPathSegmentsAfterDomain(normalizedPath: string): string[] | undefined {
|
||||||
|
const domainMatch = /(?:^|\/)(domain)\//.exec(normalizedPath)
|
||||||
|
if (!domainMatch) {
|
||||||
|
return undefined
|
||||||
|
}
|
||||||
|
|
||||||
|
const domainEndIndex = domainMatch.index + domainMatch[0].length
|
||||||
|
const pathAfterDomain = normalizedPath.substring(domainEndIndex)
|
||||||
|
return pathAfterDomain.split("/").filter(Boolean)
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Finds aggregate name in path segments after domain folder
|
||||||
|
*/
|
||||||
|
private findAggregateInSegments(segments: string[]): string | undefined {
|
||||||
|
if (this.folderRegistry.isEntityFolder(segments[0])) {
|
||||||
|
return this.extractFromEntityFolder(segments)
|
||||||
|
}
|
||||||
|
|
||||||
|
const aggregate = segments[0]
|
||||||
|
if (this.folderRegistry.isNonAggregateFolder(aggregate)) {
|
||||||
|
return undefined
|
||||||
|
}
|
||||||
|
|
||||||
|
return aggregate
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Extracts aggregate from entity folder structure
|
||||||
|
*/
|
||||||
|
private extractFromEntityFolder(segments: string[]): string | undefined {
|
||||||
|
if (segments.length < 3) {
|
||||||
|
return undefined
|
||||||
|
}
|
||||||
|
|
||||||
|
const aggregate = segments[1]
|
||||||
|
if (this.folderRegistry.isNonAggregateFolder(aggregate)) {
|
||||||
|
return undefined
|
||||||
|
}
|
||||||
|
|
||||||
|
return aggregate
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Finds aggregate in import path segments
|
||||||
|
*/
|
||||||
|
private findAggregateInImportSegments(segments: string[]): string | undefined {
|
||||||
|
const aggregateFromDomainFolder = this.findAggregateAfterDomainFolder(segments)
|
||||||
|
if (aggregateFromDomainFolder) {
|
||||||
|
return aggregateFromDomainFolder
|
||||||
|
}
|
||||||
|
|
||||||
|
return this.findAggregateFromSecondLastSegment(segments)
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Finds aggregate after 'domain' or 'aggregates' folder in import
|
||||||
|
*/
|
||||||
|
private findAggregateAfterDomainFolder(segments: string[]): string | undefined {
|
||||||
|
for (let i = 0; i < segments.length; i++) {
|
||||||
|
const isDomainOrAggregatesFolder =
|
||||||
|
segments[i] === DDD_FOLDER_NAMES.DOMAIN ||
|
||||||
|
segments[i] === DDD_FOLDER_NAMES.AGGREGATES
|
||||||
|
|
||||||
|
if (!isDomainOrAggregatesFolder) {
|
||||||
|
continue
|
||||||
|
}
|
||||||
|
|
||||||
|
if (i + 1 >= segments.length) {
|
||||||
|
continue
|
||||||
|
}
|
||||||
|
|
||||||
|
const nextSegment = segments[i + 1]
|
||||||
|
const isEntityOrAggregateFolder =
|
||||||
|
this.folderRegistry.isEntityFolder(nextSegment) ||
|
||||||
|
nextSegment === DDD_FOLDER_NAMES.AGGREGATES
|
||||||
|
|
||||||
|
if (isEntityOrAggregateFolder) {
|
||||||
|
return i + 2 < segments.length ? segments[i + 2] : undefined
|
||||||
|
}
|
||||||
|
|
||||||
|
return nextSegment
|
||||||
|
}
|
||||||
|
return undefined
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Extracts aggregate from second-to-last segment if applicable
|
||||||
|
*/
|
||||||
|
private findAggregateFromSecondLastSegment(segments: string[]): string | undefined {
|
||||||
|
if (segments.length >= 2) {
|
||||||
|
const secondLastSegment = segments[segments.length - 2]
|
||||||
|
|
||||||
|
if (
|
||||||
|
!this.folderRegistry.isEntityFolder(secondLastSegment) &&
|
||||||
|
!this.folderRegistry.isValueObjectFolder(secondLastSegment) &&
|
||||||
|
!this.folderRegistry.isAllowedFolder(secondLastSegment) &&
|
||||||
|
secondLastSegment !== DDD_FOLDER_NAMES.DOMAIN
|
||||||
|
) {
|
||||||
|
return secondLastSegment
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return undefined
|
||||||
|
}
|
||||||
|
}
|
||||||
@@ -0,0 +1,96 @@
|
|||||||
|
/**
|
||||||
|
* Tracks braces and brackets in code for context analysis
|
||||||
|
*
|
||||||
|
* Used to determine if a line is inside an exported constant
|
||||||
|
* by counting unclosed braces and brackets.
|
||||||
|
*/
|
||||||
|
export class BraceTracker {
|
||||||
|
/**
|
||||||
|
* Counts unclosed braces and brackets between two line indices
|
||||||
|
*/
|
||||||
|
public countUnclosed(
|
||||||
|
lines: string[],
|
||||||
|
startLine: number,
|
||||||
|
endLine: number,
|
||||||
|
): { braces: number; brackets: number } {
|
||||||
|
let braces = 0
|
||||||
|
let brackets = 0
|
||||||
|
|
||||||
|
for (let i = startLine; i <= endLine; i++) {
|
||||||
|
const counts = this.countInLine(lines[i])
|
||||||
|
braces += counts.braces
|
||||||
|
brackets += counts.brackets
|
||||||
|
}
|
||||||
|
|
||||||
|
return { braces, brackets }
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Counts braces and brackets in a single line
|
||||||
|
*/
|
||||||
|
private countInLine(line: string): { braces: number; brackets: number } {
|
||||||
|
let braces = 0
|
||||||
|
let brackets = 0
|
||||||
|
let inString = false
|
||||||
|
let stringChar = ""
|
||||||
|
|
||||||
|
for (let j = 0; j < line.length; j++) {
|
||||||
|
const char = line[j]
|
||||||
|
const prevChar = j > 0 ? line[j - 1] : ""
|
||||||
|
|
||||||
|
this.updateStringState(
|
||||||
|
char,
|
||||||
|
prevChar,
|
||||||
|
inString,
|
||||||
|
stringChar,
|
||||||
|
(newInString, newStringChar) => {
|
||||||
|
inString = newInString
|
||||||
|
stringChar = newStringChar
|
||||||
|
},
|
||||||
|
)
|
||||||
|
|
||||||
|
if (!inString) {
|
||||||
|
const counts = this.countChar(char)
|
||||||
|
braces += counts.braces
|
||||||
|
brackets += counts.brackets
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return { braces, brackets }
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Updates string tracking state
|
||||||
|
*/
|
||||||
|
private updateStringState(
|
||||||
|
char: string,
|
||||||
|
prevChar: string,
|
||||||
|
inString: boolean,
|
||||||
|
stringChar: string,
|
||||||
|
callback: (inString: boolean, stringChar: string) => void,
|
||||||
|
): void {
|
||||||
|
if ((char === "'" || char === '"' || char === "`") && prevChar !== "\\") {
|
||||||
|
if (!inString) {
|
||||||
|
callback(true, char)
|
||||||
|
} else if (char === stringChar) {
|
||||||
|
callback(false, "")
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Counts a single character
|
||||||
|
*/
|
||||||
|
private countChar(char: string): { braces: number; brackets: number } {
|
||||||
|
if (char === "{") {
|
||||||
|
return { braces: 1, brackets: 0 }
|
||||||
|
} else if (char === "}") {
|
||||||
|
return { braces: -1, brackets: 0 }
|
||||||
|
} else if (char === "[") {
|
||||||
|
return { braces: 0, brackets: 1 }
|
||||||
|
} else if (char === "]") {
|
||||||
|
return { braces: 0, brackets: -1 }
|
||||||
|
}
|
||||||
|
return { braces: 0, brackets: 0 }
|
||||||
|
}
|
||||||
|
}
|
||||||
@@ -0,0 +1,21 @@
|
|||||||
|
/**
|
||||||
|
* Checks if a file is a constants definition file
|
||||||
|
*
|
||||||
|
* Identifies files that should be skipped for hardcode detection
|
||||||
|
* since they are meant to contain constant definitions.
|
||||||
|
*/
|
||||||
|
export class ConstantsFileChecker {
|
||||||
|
private readonly constantsPatterns = [
|
||||||
|
/^constants?\.(ts|js)$/i,
|
||||||
|
/constants?\/.*\.(ts|js)$/i,
|
||||||
|
/\/(constants|config|settings|defaults|tokens)\.ts$/i,
|
||||||
|
/\/di\/tokens\.(ts|js)$/i,
|
||||||
|
]
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Checks if a file path represents a constants file
|
||||||
|
*/
|
||||||
|
public isConstantsFile(filePath: string): boolean {
|
||||||
|
return this.constantsPatterns.some((pattern) => pattern.test(filePath))
|
||||||
|
}
|
||||||
|
}
|
||||||
@@ -0,0 +1,112 @@
|
|||||||
|
import { CODE_PATTERNS } from "../constants/defaults"
|
||||||
|
import { BraceTracker } from "./BraceTracker"
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Analyzes export const declarations in code
|
||||||
|
*
|
||||||
|
* Determines if a line is inside an exported constant declaration
|
||||||
|
* to skip hardcode detection in constant definitions.
|
||||||
|
*/
|
||||||
|
export class ExportConstantAnalyzer {
|
||||||
|
constructor(private readonly braceTracker: BraceTracker) {}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Checks if a line is inside an exported constant definition
|
||||||
|
*/
|
||||||
|
public isInExportedConstant(lines: string[], lineIndex: number): boolean {
|
||||||
|
const currentLineTrimmed = lines[lineIndex].trim()
|
||||||
|
|
||||||
|
if (this.isSingleLineExportConst(currentLineTrimmed)) {
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
|
||||||
|
const exportConstStart = this.findExportConstStart(lines, lineIndex)
|
||||||
|
if (exportConstStart === -1) {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
const { braces, brackets } = this.braceTracker.countUnclosed(
|
||||||
|
lines,
|
||||||
|
exportConstStart,
|
||||||
|
lineIndex,
|
||||||
|
)
|
||||||
|
|
||||||
|
return braces > 0 || brackets > 0
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Checks if a line is a single-line export const declaration
|
||||||
|
*/
|
||||||
|
public isSingleLineExportConst(line: string): boolean {
|
||||||
|
if (!line.startsWith(CODE_PATTERNS.EXPORT_CONST)) {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
const hasObjectOrArray = this.hasObjectOrArray(line)
|
||||||
|
|
||||||
|
if (hasObjectOrArray) {
|
||||||
|
return this.hasAsConstEnding(line)
|
||||||
|
}
|
||||||
|
|
||||||
|
return line.includes(CODE_PATTERNS.AS_CONST)
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Finds the starting line of an export const declaration
|
||||||
|
*/
|
||||||
|
public findExportConstStart(lines: string[], lineIndex: number): number {
|
||||||
|
for (let currentLine = lineIndex; currentLine >= 0; currentLine--) {
|
||||||
|
const trimmed = lines[currentLine].trim()
|
||||||
|
|
||||||
|
if (this.isExportConstWithStructure(trimmed)) {
|
||||||
|
return currentLine
|
||||||
|
}
|
||||||
|
|
||||||
|
if (this.isTopLevelStatement(trimmed, currentLine, lineIndex)) {
|
||||||
|
break
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return -1
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Checks if line has object or array structure
|
||||||
|
*/
|
||||||
|
private hasObjectOrArray(line: string): boolean {
|
||||||
|
return line.includes(CODE_PATTERNS.OBJECT_START) || line.includes(CODE_PATTERNS.ARRAY_START)
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Checks if line has 'as const' ending
|
||||||
|
*/
|
||||||
|
private hasAsConstEnding(line: string): boolean {
|
||||||
|
return (
|
||||||
|
line.includes(CODE_PATTERNS.AS_CONST_OBJECT) ||
|
||||||
|
line.includes(CODE_PATTERNS.AS_CONST_ARRAY) ||
|
||||||
|
line.includes(CODE_PATTERNS.AS_CONST_END_SEMICOLON_OBJECT) ||
|
||||||
|
line.includes(CODE_PATTERNS.AS_CONST_END_SEMICOLON_ARRAY)
|
||||||
|
)
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Checks if line is export const with object or array
|
||||||
|
*/
|
||||||
|
private isExportConstWithStructure(trimmed: string): boolean {
|
||||||
|
return (
|
||||||
|
trimmed.startsWith(CODE_PATTERNS.EXPORT_CONST) &&
|
||||||
|
(trimmed.includes(CODE_PATTERNS.OBJECT_START) ||
|
||||||
|
trimmed.includes(CODE_PATTERNS.ARRAY_START))
|
||||||
|
)
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Checks if line is a top-level statement
|
||||||
|
*/
|
||||||
|
private isTopLevelStatement(trimmed: string, currentLine: number, lineIndex: number): boolean {
|
||||||
|
return (
|
||||||
|
currentLine < lineIndex &&
|
||||||
|
(trimmed.startsWith(CODE_PATTERNS.EXPORT) || trimmed.startsWith(CODE_PATTERNS.IMPORT))
|
||||||
|
)
|
||||||
|
}
|
||||||
|
}
|
||||||
@@ -0,0 +1,72 @@
|
|||||||
|
import { DDD_FOLDER_NAMES } from "../constants/detectorPatterns"
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Registry for DDD folder names used in aggregate boundary detection
|
||||||
|
*
|
||||||
|
* Centralizes folder name management for cleaner code organization
|
||||||
|
* and easier maintenance of folder name rules.
|
||||||
|
*/
|
||||||
|
export class FolderRegistry {
|
||||||
|
public readonly entityFolders: Set<string>
|
||||||
|
public readonly valueObjectFolders: Set<string>
|
||||||
|
public readonly allowedFolders: Set<string>
|
||||||
|
public readonly nonAggregateFolders: Set<string>
|
||||||
|
|
||||||
|
constructor() {
|
||||||
|
this.entityFolders = new Set<string>([
|
||||||
|
DDD_FOLDER_NAMES.ENTITIES,
|
||||||
|
DDD_FOLDER_NAMES.AGGREGATES,
|
||||||
|
])
|
||||||
|
|
||||||
|
this.valueObjectFolders = new Set<string>([
|
||||||
|
DDD_FOLDER_NAMES.VALUE_OBJECTS,
|
||||||
|
DDD_FOLDER_NAMES.VO,
|
||||||
|
])
|
||||||
|
|
||||||
|
this.allowedFolders = new Set<string>([
|
||||||
|
DDD_FOLDER_NAMES.VALUE_OBJECTS,
|
||||||
|
DDD_FOLDER_NAMES.VO,
|
||||||
|
DDD_FOLDER_NAMES.EVENTS,
|
||||||
|
DDD_FOLDER_NAMES.DOMAIN_EVENTS,
|
||||||
|
DDD_FOLDER_NAMES.REPOSITORIES,
|
||||||
|
DDD_FOLDER_NAMES.SERVICES,
|
||||||
|
DDD_FOLDER_NAMES.SPECIFICATIONS,
|
||||||
|
DDD_FOLDER_NAMES.ERRORS,
|
||||||
|
DDD_FOLDER_NAMES.EXCEPTIONS,
|
||||||
|
])
|
||||||
|
|
||||||
|
this.nonAggregateFolders = new Set<string>([
|
||||||
|
DDD_FOLDER_NAMES.VALUE_OBJECTS,
|
||||||
|
DDD_FOLDER_NAMES.VO,
|
||||||
|
DDD_FOLDER_NAMES.EVENTS,
|
||||||
|
DDD_FOLDER_NAMES.DOMAIN_EVENTS,
|
||||||
|
DDD_FOLDER_NAMES.REPOSITORIES,
|
||||||
|
DDD_FOLDER_NAMES.SERVICES,
|
||||||
|
DDD_FOLDER_NAMES.SPECIFICATIONS,
|
||||||
|
DDD_FOLDER_NAMES.ENTITIES,
|
||||||
|
DDD_FOLDER_NAMES.CONSTANTS,
|
||||||
|
DDD_FOLDER_NAMES.SHARED,
|
||||||
|
DDD_FOLDER_NAMES.FACTORIES,
|
||||||
|
DDD_FOLDER_NAMES.PORTS,
|
||||||
|
DDD_FOLDER_NAMES.INTERFACES,
|
||||||
|
DDD_FOLDER_NAMES.ERRORS,
|
||||||
|
DDD_FOLDER_NAMES.EXCEPTIONS,
|
||||||
|
])
|
||||||
|
}
|
||||||
|
|
||||||
|
public isEntityFolder(folderName: string): boolean {
|
||||||
|
return this.entityFolders.has(folderName)
|
||||||
|
}
|
||||||
|
|
||||||
|
public isValueObjectFolder(folderName: string): boolean {
|
||||||
|
return this.valueObjectFolders.has(folderName)
|
||||||
|
}
|
||||||
|
|
||||||
|
public isAllowedFolder(folderName: string): boolean {
|
||||||
|
return this.allowedFolders.has(folderName)
|
||||||
|
}
|
||||||
|
|
||||||
|
public isNonAggregateFolder(folderName: string): boolean {
|
||||||
|
return this.nonAggregateFolders.has(folderName)
|
||||||
|
}
|
||||||
|
}
|
||||||
@@ -0,0 +1,150 @@
|
|||||||
|
import { IMPORT_PATTERNS } from "../constants/paths"
|
||||||
|
import { AggregatePathAnalyzer } from "./AggregatePathAnalyzer"
|
||||||
|
import { FolderRegistry } from "./FolderRegistry"
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Validates imports for aggregate boundary violations
|
||||||
|
*
|
||||||
|
* Checks if imports cross aggregate boundaries inappropriately
|
||||||
|
* and ensures proper encapsulation in DDD architecture.
|
||||||
|
*/
|
||||||
|
export class ImportValidator {
|
||||||
|
constructor(
|
||||||
|
private readonly folderRegistry: FolderRegistry,
|
||||||
|
private readonly pathAnalyzer: AggregatePathAnalyzer,
|
||||||
|
) {}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Checks if an import violates aggregate boundaries
|
||||||
|
*/
|
||||||
|
public isViolation(importPath: string, currentAggregate: string): boolean {
|
||||||
|
const normalizedPath = this.normalizeImportPath(importPath)
|
||||||
|
|
||||||
|
if (!this.isValidImportPath(normalizedPath)) {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
if (this.isInternalBoundedContextImport(normalizedPath)) {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
const targetAggregate = this.pathAnalyzer.extractAggregateFromImport(normalizedPath)
|
||||||
|
if (!targetAggregate || targetAggregate === currentAggregate) {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
if (this.isAllowedImport(normalizedPath)) {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
return this.seemsLikeEntityImport(normalizedPath)
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Extracts all import paths from a line of code
|
||||||
|
*/
|
||||||
|
public extractImports(line: string): string[] {
|
||||||
|
const imports: string[] = []
|
||||||
|
|
||||||
|
this.extractEsImports(line, imports)
|
||||||
|
this.extractRequireImports(line, imports)
|
||||||
|
|
||||||
|
return imports
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Normalizes an import path for consistent processing
|
||||||
|
*/
|
||||||
|
private normalizeImportPath(importPath: string): string {
|
||||||
|
return importPath.replace(IMPORT_PATTERNS.QUOTE, "").toLowerCase()
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Checks if import path is valid for analysis
|
||||||
|
*/
|
||||||
|
private isValidImportPath(normalizedPath: string): boolean {
|
||||||
|
if (!normalizedPath.includes("/")) {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
if (!normalizedPath.startsWith(".") && !normalizedPath.startsWith("/")) {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Checks if import is internal to the same bounded context
|
||||||
|
*/
|
||||||
|
private isInternalBoundedContextImport(normalizedPath: string): boolean {
|
||||||
|
const parts = normalizedPath.split("/")
|
||||||
|
const dotDotCount = parts.filter((p) => p === "..").length
|
||||||
|
|
||||||
|
if (dotDotCount === 1) {
|
||||||
|
const nonDotParts = parts.filter((p) => p !== ".." && p !== ".")
|
||||||
|
if (nonDotParts.length >= 1) {
|
||||||
|
const firstFolder = nonDotParts[0]
|
||||||
|
if (this.folderRegistry.isEntityFolder(firstFolder)) {
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Checks if import is from an allowed folder
|
||||||
|
*/
|
||||||
|
private isAllowedImport(normalizedPath: string): boolean {
|
||||||
|
for (const folderName of this.folderRegistry.allowedFolders) {
|
||||||
|
if (normalizedPath.includes(`/${folderName}/`)) {
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Checks if import seems to be an entity
|
||||||
|
*/
|
||||||
|
private seemsLikeEntityImport(normalizedPath: string): boolean {
|
||||||
|
const pathParts = normalizedPath.split("/")
|
||||||
|
const lastPart = pathParts[pathParts.length - 1]
|
||||||
|
|
||||||
|
if (!lastPart) {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
const filename = lastPart.replace(/\.(ts|js)$/, "")
|
||||||
|
|
||||||
|
if (filename.length > 0 && /^[a-z][a-z]/.exec(filename)) {
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Extracts ES6 imports from a line
|
||||||
|
*/
|
||||||
|
private extractEsImports(line: string, imports: string[]): void {
|
||||||
|
let match = IMPORT_PATTERNS.ES_IMPORT.exec(line)
|
||||||
|
while (match) {
|
||||||
|
imports.push(match[1])
|
||||||
|
match = IMPORT_PATTERNS.ES_IMPORT.exec(line)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Extracts CommonJS requires from a line
|
||||||
|
*/
|
||||||
|
private extractRequireImports(line: string, imports: string[]): void {
|
||||||
|
let match = IMPORT_PATTERNS.REQUIRE.exec(line)
|
||||||
|
while (match) {
|
||||||
|
imports.push(match[1])
|
||||||
|
match = IMPORT_PATTERNS.REQUIRE.exec(line)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
@@ -0,0 +1,171 @@
|
|||||||
|
import { HardcodedValue } from "../../domain/value-objects/HardcodedValue"
|
||||||
|
import { ALLOWED_NUMBERS, DETECTION_KEYWORDS } from "../constants/defaults"
|
||||||
|
import { HARDCODE_TYPES } from "../../shared/constants"
|
||||||
|
import { ExportConstantAnalyzer } from "./ExportConstantAnalyzer"
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Detects magic numbers in code
|
||||||
|
*
|
||||||
|
* Identifies hardcoded numeric values that should be extracted
|
||||||
|
* to constants, excluding allowed values and exported constants.
|
||||||
|
*/
|
||||||
|
export class MagicNumberMatcher {
|
||||||
|
private readonly numberPatterns = [
|
||||||
|
/(?:setTimeout|setInterval)\s*\(\s*[^,]+,\s*(\d+)/g,
|
||||||
|
/(?:maxRetries|retries|attempts)\s*[=:]\s*(\d+)/gi,
|
||||||
|
/(?:limit|max|min)\s*[=:]\s*(\d+)/gi,
|
||||||
|
/(?:port|PORT)\s*[=:]\s*(\d+)/g,
|
||||||
|
/(?:delay|timeout|TIMEOUT)\s*[=:]\s*(\d+)/gi,
|
||||||
|
]
|
||||||
|
|
||||||
|
constructor(private readonly exportAnalyzer: ExportConstantAnalyzer) {}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Detects magic numbers in code
|
||||||
|
*/
|
||||||
|
public detect(code: string): HardcodedValue[] {
|
||||||
|
const results: HardcodedValue[] = []
|
||||||
|
const lines = code.split("\n")
|
||||||
|
|
||||||
|
lines.forEach((line, lineIndex) => {
|
||||||
|
if (this.shouldSkipLine(line, lines, lineIndex)) {
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
this.detectInPatterns(line, lineIndex, results)
|
||||||
|
this.detectGenericNumbers(line, lineIndex, results)
|
||||||
|
})
|
||||||
|
|
||||||
|
return results
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Checks if line should be skipped
|
||||||
|
*/
|
||||||
|
private shouldSkipLine(line: string, lines: string[], lineIndex: number): boolean {
|
||||||
|
if (line.trim().startsWith("//") || line.trim().startsWith("*")) {
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
|
||||||
|
return this.exportAnalyzer.isInExportedConstant(lines, lineIndex)
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Detects numbers in specific patterns
|
||||||
|
*/
|
||||||
|
private detectInPatterns(line: string, lineIndex: number, results: HardcodedValue[]): void {
|
||||||
|
this.numberPatterns.forEach((pattern) => {
|
||||||
|
let match
|
||||||
|
const regex = new RegExp(pattern)
|
||||||
|
|
||||||
|
while ((match = regex.exec(line)) !== null) {
|
||||||
|
const value = parseInt(match[1], 10)
|
||||||
|
|
||||||
|
if (!ALLOWED_NUMBERS.has(value)) {
|
||||||
|
results.push(
|
||||||
|
HardcodedValue.create(
|
||||||
|
value,
|
||||||
|
HARDCODE_TYPES.MAGIC_NUMBER,
|
||||||
|
lineIndex + 1,
|
||||||
|
match.index,
|
||||||
|
line.trim(),
|
||||||
|
),
|
||||||
|
)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Detects generic 3+ digit numbers
|
||||||
|
*/
|
||||||
|
private detectGenericNumbers(line: string, lineIndex: number, results: HardcodedValue[]): void {
|
||||||
|
const genericNumberRegex = /\b(\d{3,})\b/g
|
||||||
|
let match
|
||||||
|
|
||||||
|
while ((match = genericNumberRegex.exec(line)) !== null) {
|
||||||
|
const value = parseInt(match[1], 10)
|
||||||
|
|
||||||
|
if (this.shouldDetectNumber(value, line, match.index)) {
|
||||||
|
results.push(
|
||||||
|
HardcodedValue.create(
|
||||||
|
value,
|
||||||
|
HARDCODE_TYPES.MAGIC_NUMBER,
|
||||||
|
lineIndex + 1,
|
||||||
|
match.index,
|
||||||
|
line.trim(),
|
||||||
|
),
|
||||||
|
)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Checks if number should be detected
|
||||||
|
*/
|
||||||
|
private shouldDetectNumber(value: number, line: string, index: number): boolean {
|
||||||
|
if (ALLOWED_NUMBERS.has(value)) {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
if (this.isInComment(line, index)) {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
if (this.isInString(line, index)) {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
const context = this.extractContext(line, index)
|
||||||
|
return this.looksLikeMagicNumber(context)
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Checks if position is in a comment
|
||||||
|
*/
|
||||||
|
private isInComment(line: string, index: number): boolean {
|
||||||
|
const beforeIndex = line.substring(0, index)
|
||||||
|
return beforeIndex.includes("//") || beforeIndex.includes("/*")
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Checks if position is in a string
|
||||||
|
*/
|
||||||
|
private isInString(line: string, index: number): boolean {
|
||||||
|
const beforeIndex = line.substring(0, index)
|
||||||
|
const singleQuotes = (beforeIndex.match(/'/g) ?? []).length
|
||||||
|
const doubleQuotes = (beforeIndex.match(/"/g) ?? []).length
|
||||||
|
const backticks = (beforeIndex.match(/`/g) ?? []).length
|
||||||
|
|
||||||
|
return singleQuotes % 2 !== 0 || doubleQuotes % 2 !== 0 || backticks % 2 !== 0
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Extracts context around a position
|
||||||
|
*/
|
||||||
|
private extractContext(line: string, index: number): string {
|
||||||
|
const start = Math.max(0, index - 30)
|
||||||
|
const end = Math.min(line.length, index + 30)
|
||||||
|
return line.substring(start, end)
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Checks if context suggests a magic number
|
||||||
|
*/
|
||||||
|
private looksLikeMagicNumber(context: string): boolean {
|
||||||
|
const lowerContext = context.toLowerCase()
|
||||||
|
|
||||||
|
const configKeywords = [
|
||||||
|
DETECTION_KEYWORDS.TIMEOUT,
|
||||||
|
DETECTION_KEYWORDS.DELAY,
|
||||||
|
DETECTION_KEYWORDS.RETRY,
|
||||||
|
DETECTION_KEYWORDS.LIMIT,
|
||||||
|
DETECTION_KEYWORDS.MAX,
|
||||||
|
DETECTION_KEYWORDS.MIN,
|
||||||
|
DETECTION_KEYWORDS.PORT,
|
||||||
|
DETECTION_KEYWORDS.INTERVAL,
|
||||||
|
]
|
||||||
|
|
||||||
|
return configKeywords.some((keyword) => lowerContext.includes(keyword))
|
||||||
|
}
|
||||||
|
}
|
||||||
@@ -0,0 +1,220 @@
|
|||||||
|
import { HardcodedValue } from "../../domain/value-objects/HardcodedValue"
|
||||||
|
import { DETECTION_KEYWORDS } from "../constants/defaults"
|
||||||
|
import { HARDCODE_TYPES } from "../../shared/constants"
|
||||||
|
import { ExportConstantAnalyzer } from "./ExportConstantAnalyzer"
|
||||||
|
import {
|
||||||
|
DYNAMIC_IMPORT_PATTERN_PARTS,
|
||||||
|
REGEX_ESCAPE_PATTERN,
|
||||||
|
} from "../../domain/constants/SecretExamples"
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Detects magic strings in code
|
||||||
|
*
|
||||||
|
* Identifies hardcoded string values that should be extracted
|
||||||
|
* to constants, excluding test code, console logs, and type contexts.
|
||||||
|
*/
|
||||||
|
export class MagicStringMatcher {
|
||||||
|
private readonly stringRegex = /(['"`])(?:(?!\1).)+\1/g
|
||||||
|
|
||||||
|
private readonly allowedPatterns = [/^[a-z]$/i, /^\/$/, /^\\$/, /^\s+$/, /^,$/, /^\.$/]
|
||||||
|
|
||||||
|
private readonly typeContextPatterns = [
|
||||||
|
/^\s*type\s+\w+\s*=/i,
|
||||||
|
/^\s*interface\s+\w+/i,
|
||||||
|
/^\s*\w+\s*:\s*['"`]/,
|
||||||
|
/\s+as\s+['"`]/,
|
||||||
|
/Record<.*,\s*import\(/,
|
||||||
|
/typeof\s+\w+\s*===\s*['"`]/,
|
||||||
|
/['"`]\s*===\s*typeof\s+\w+/,
|
||||||
|
]
|
||||||
|
|
||||||
|
constructor(private readonly exportAnalyzer: ExportConstantAnalyzer) {}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Detects magic strings in code
|
||||||
|
*/
|
||||||
|
public detect(code: string): HardcodedValue[] {
|
||||||
|
const results: HardcodedValue[] = []
|
||||||
|
const lines = code.split("\n")
|
||||||
|
|
||||||
|
lines.forEach((line, lineIndex) => {
|
||||||
|
if (this.shouldSkipLine(line, lines, lineIndex)) {
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
this.detectStringsInLine(line, lineIndex, results)
|
||||||
|
})
|
||||||
|
|
||||||
|
return results
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Checks if line should be skipped
|
||||||
|
*/
|
||||||
|
private shouldSkipLine(line: string, lines: string[], lineIndex: number): boolean {
|
||||||
|
if (
|
||||||
|
line.trim().startsWith("//") ||
|
||||||
|
line.trim().startsWith("*") ||
|
||||||
|
line.includes("import ") ||
|
||||||
|
line.includes("from ")
|
||||||
|
) {
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
|
||||||
|
return this.exportAnalyzer.isInExportedConstant(lines, lineIndex)
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Detects strings in a single line
|
||||||
|
*/
|
||||||
|
private detectStringsInLine(line: string, lineIndex: number, results: HardcodedValue[]): void {
|
||||||
|
let match
|
||||||
|
const regex = new RegExp(this.stringRegex)
|
||||||
|
|
||||||
|
while ((match = regex.exec(line)) !== null) {
|
||||||
|
const fullMatch = match[0]
|
||||||
|
const value = fullMatch.slice(1, -1)
|
||||||
|
|
||||||
|
if (this.shouldDetectString(fullMatch, value, line)) {
|
||||||
|
results.push(
|
||||||
|
HardcodedValue.create(
|
||||||
|
value,
|
||||||
|
HARDCODE_TYPES.MAGIC_STRING,
|
||||||
|
lineIndex + 1,
|
||||||
|
match.index,
|
||||||
|
line.trim(),
|
||||||
|
),
|
||||||
|
)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Checks if string should be detected
|
||||||
|
*/
|
||||||
|
private shouldDetectString(fullMatch: string, value: string, line: string): boolean {
|
||||||
|
if (fullMatch.startsWith("`") || value.includes("${")) {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
if (this.isAllowedString(value)) {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
return this.looksLikeMagicString(line, value)
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Checks if string is allowed (short strings, single chars, etc.)
|
||||||
|
*/
|
||||||
|
private isAllowedString(str: string): boolean {
|
||||||
|
if (str.length <= 1) {
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
|
||||||
|
return this.allowedPatterns.some((pattern) => pattern.test(str))
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Checks if line context suggests a magic string
|
||||||
|
*/
|
||||||
|
private looksLikeMagicString(line: string, value: string): boolean {
|
||||||
|
const lowerLine = line.toLowerCase()
|
||||||
|
|
||||||
|
if (this.isTestCode(lowerLine)) {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
if (this.isConsoleLog(lowerLine)) {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
if (this.isInTypeContext(line)) {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
if (this.isInSymbolCall(line, value)) {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
if (this.isInImportCall(line, value)) {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
if (this.isUrlOrApi(value)) {
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
|
||||||
|
if (/^\d{2,}$/.test(value)) {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
return value.length > 3
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Checks if line is test code
|
||||||
|
*/
|
||||||
|
private isTestCode(lowerLine: string): boolean {
|
||||||
|
return (
|
||||||
|
lowerLine.includes(DETECTION_KEYWORDS.TEST) ||
|
||||||
|
lowerLine.includes(DETECTION_KEYWORDS.DESCRIBE)
|
||||||
|
)
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Checks if line is console log
|
||||||
|
*/
|
||||||
|
private isConsoleLog(lowerLine: string): boolean {
|
||||||
|
return (
|
||||||
|
lowerLine.includes(DETECTION_KEYWORDS.CONSOLE_LOG) ||
|
||||||
|
lowerLine.includes(DETECTION_KEYWORDS.CONSOLE_ERROR)
|
||||||
|
)
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Checks if line is in type context
|
||||||
|
*/
|
||||||
|
private isInTypeContext(line: string): boolean {
|
||||||
|
const trimmedLine = line.trim()
|
||||||
|
|
||||||
|
if (this.typeContextPatterns.some((pattern) => pattern.test(trimmedLine))) {
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
|
||||||
|
if (trimmedLine.includes("|") && /['"`][^'"`]+['"`]\s*\|/.test(trimmedLine)) {
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Checks if string is inside Symbol() call
|
||||||
|
*/
|
||||||
|
private isInSymbolCall(line: string, stringValue: string): boolean {
|
||||||
|
const escapedValue = stringValue.replace(
|
||||||
|
/[.*+?^${}()|[\]\\]/g,
|
||||||
|
REGEX_ESCAPE_PATTERN.DOLLAR_AMPERSAND,
|
||||||
|
)
|
||||||
|
const symbolPattern = new RegExp(`Symbol\\s*\\(\\s*['"\`]${escapedValue}['"\`]\\s*\\)`)
|
||||||
|
return symbolPattern.test(line)
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Checks if string is inside import() call
|
||||||
|
*/
|
||||||
|
private isInImportCall(line: string, stringValue: string): boolean {
|
||||||
|
const importPattern = new RegExp(
|
||||||
|
`import\\s*\\(\\s*['${DYNAMIC_IMPORT_PATTERN_PARTS.QUOTE_START}'${DYNAMIC_IMPORT_PATTERN_PARTS.QUOTE_END}"]\\s*\\)`,
|
||||||
|
)
|
||||||
|
return importPattern.test(line) && line.includes(stringValue)
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Checks if string contains URL or API reference
|
||||||
|
*/
|
||||||
|
private isUrlOrApi(value: string): boolean {
|
||||||
|
return value.includes(DETECTION_KEYWORDS.HTTP) || value.includes(DETECTION_KEYWORDS.API)
|
||||||
|
}
|
||||||
|
}
|
||||||
@@ -0,0 +1,134 @@
|
|||||||
|
import { REPOSITORY_METHOD_SUGGESTIONS } from "../constants/detectorPatterns"
|
||||||
|
import { OrmTypeMatcher } from "./OrmTypeMatcher"
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Validates repository method names for domain language compliance
|
||||||
|
*
|
||||||
|
* Ensures repository methods use domain language instead of
|
||||||
|
* technical database terminology.
|
||||||
|
*/
|
||||||
|
export class MethodNameValidator {
|
||||||
|
private readonly domainMethodPatterns = [
|
||||||
|
/^findBy[A-Z]/,
|
||||||
|
/^findAll$/,
|
||||||
|
/^find[A-Z]/,
|
||||||
|
/^save$/,
|
||||||
|
/^saveAll$/,
|
||||||
|
/^create$/,
|
||||||
|
/^update$/,
|
||||||
|
/^delete$/,
|
||||||
|
/^deleteBy[A-Z]/,
|
||||||
|
/^deleteAll$/,
|
||||||
|
/^remove$/,
|
||||||
|
/^removeBy[A-Z]/,
|
||||||
|
/^removeAll$/,
|
||||||
|
/^add$/,
|
||||||
|
/^add[A-Z]/,
|
||||||
|
/^get[A-Z]/,
|
||||||
|
/^getAll$/,
|
||||||
|
/^search/,
|
||||||
|
/^list/,
|
||||||
|
/^has[A-Z]/,
|
||||||
|
/^is[A-Z]/,
|
||||||
|
/^exists$/,
|
||||||
|
/^exists[A-Z]/,
|
||||||
|
/^existsBy[A-Z]/,
|
||||||
|
/^clear[A-Z]/,
|
||||||
|
/^clearAll$/,
|
||||||
|
/^store[A-Z]/,
|
||||||
|
/^initialize$/,
|
||||||
|
/^initializeCollection$/,
|
||||||
|
/^close$/,
|
||||||
|
/^connect$/,
|
||||||
|
/^disconnect$/,
|
||||||
|
/^count$/,
|
||||||
|
/^countBy[A-Z]/,
|
||||||
|
]
|
||||||
|
|
||||||
|
constructor(private readonly ormMatcher: OrmTypeMatcher) {}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Checks if a method name follows domain language conventions
|
||||||
|
*/
|
||||||
|
public isDomainMethodName(methodName: string): boolean {
|
||||||
|
if (this.ormMatcher.isTechnicalMethod(methodName)) {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
return this.domainMethodPatterns.some((pattern) => pattern.test(methodName))
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Suggests better domain method names
|
||||||
|
*/
|
||||||
|
public suggestDomainMethodName(methodName: string): string {
|
||||||
|
const lowerName = methodName.toLowerCase()
|
||||||
|
const suggestions: string[] = []
|
||||||
|
|
||||||
|
this.collectSuggestions(lowerName, suggestions)
|
||||||
|
|
||||||
|
if (lowerName.includes("get") && lowerName.includes("all")) {
|
||||||
|
suggestions.push(
|
||||||
|
REPOSITORY_METHOD_SUGGESTIONS.FIND_ALL,
|
||||||
|
REPOSITORY_METHOD_SUGGESTIONS.LIST_ALL,
|
||||||
|
)
|
||||||
|
}
|
||||||
|
|
||||||
|
if (suggestions.length === 0) {
|
||||||
|
return REPOSITORY_METHOD_SUGGESTIONS.DEFAULT_SUGGESTION
|
||||||
|
}
|
||||||
|
|
||||||
|
return `Consider: ${suggestions.slice(0, 3).join(", ")}`
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Collects method name suggestions based on keywords
|
||||||
|
*/
|
||||||
|
private collectSuggestions(lowerName: string, suggestions: string[]): void {
|
||||||
|
const suggestionMap: Record<string, string[]> = {
|
||||||
|
query: [
|
||||||
|
REPOSITORY_METHOD_SUGGESTIONS.SEARCH,
|
||||||
|
REPOSITORY_METHOD_SUGGESTIONS.FIND_BY_PROPERTY,
|
||||||
|
],
|
||||||
|
select: [
|
||||||
|
REPOSITORY_METHOD_SUGGESTIONS.FIND_BY_PROPERTY,
|
||||||
|
REPOSITORY_METHOD_SUGGESTIONS.GET_ENTITY,
|
||||||
|
],
|
||||||
|
insert: [
|
||||||
|
REPOSITORY_METHOD_SUGGESTIONS.CREATE,
|
||||||
|
REPOSITORY_METHOD_SUGGESTIONS.ADD_ENTITY,
|
||||||
|
REPOSITORY_METHOD_SUGGESTIONS.STORE_ENTITY,
|
||||||
|
],
|
||||||
|
update: [
|
||||||
|
REPOSITORY_METHOD_SUGGESTIONS.UPDATE,
|
||||||
|
REPOSITORY_METHOD_SUGGESTIONS.MODIFY_ENTITY,
|
||||||
|
],
|
||||||
|
upsert: [
|
||||||
|
REPOSITORY_METHOD_SUGGESTIONS.SAVE,
|
||||||
|
REPOSITORY_METHOD_SUGGESTIONS.STORE_ENTITY,
|
||||||
|
],
|
||||||
|
remove: [
|
||||||
|
REPOSITORY_METHOD_SUGGESTIONS.DELETE,
|
||||||
|
REPOSITORY_METHOD_SUGGESTIONS.REMOVE_BY_PROPERTY,
|
||||||
|
],
|
||||||
|
fetch: [
|
||||||
|
REPOSITORY_METHOD_SUGGESTIONS.FIND_BY_PROPERTY,
|
||||||
|
REPOSITORY_METHOD_SUGGESTIONS.GET_ENTITY,
|
||||||
|
],
|
||||||
|
retrieve: [
|
||||||
|
REPOSITORY_METHOD_SUGGESTIONS.FIND_BY_PROPERTY,
|
||||||
|
REPOSITORY_METHOD_SUGGESTIONS.GET_ENTITY,
|
||||||
|
],
|
||||||
|
load: [
|
||||||
|
REPOSITORY_METHOD_SUGGESTIONS.FIND_BY_PROPERTY,
|
||||||
|
REPOSITORY_METHOD_SUGGESTIONS.GET_ENTITY,
|
||||||
|
],
|
||||||
|
}
|
||||||
|
|
||||||
|
for (const [keyword, keywords] of Object.entries(suggestionMap)) {
|
||||||
|
if (lowerName.includes(keyword)) {
|
||||||
|
suggestions.push(...keywords)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
@@ -0,0 +1,68 @@
|
|||||||
|
import { ORM_QUERY_METHODS } from "../constants/orm-methods"
|
||||||
|
import { REPOSITORY_PATTERN_MESSAGES } from "../../domain/constants/Messages"
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Matches and validates ORM-specific types and patterns
|
||||||
|
*
|
||||||
|
* Identifies ORM-specific types (Prisma, TypeORM, Mongoose, etc.)
|
||||||
|
* that should not appear in domain layer repository interfaces.
|
||||||
|
*/
|
||||||
|
export class OrmTypeMatcher {
|
||||||
|
private readonly ormTypePatterns = [
|
||||||
|
/Prisma\./,
|
||||||
|
/PrismaClient/,
|
||||||
|
/TypeORM/,
|
||||||
|
/@Entity/,
|
||||||
|
/@Column/,
|
||||||
|
/@PrimaryColumn/,
|
||||||
|
/@PrimaryGeneratedColumn/,
|
||||||
|
/@ManyToOne/,
|
||||||
|
/@OneToMany/,
|
||||||
|
/@ManyToMany/,
|
||||||
|
/@JoinColumn/,
|
||||||
|
/@JoinTable/,
|
||||||
|
/Mongoose\./,
|
||||||
|
/Schema/,
|
||||||
|
/Model</,
|
||||||
|
/Document/,
|
||||||
|
/Sequelize\./,
|
||||||
|
/DataTypes\./,
|
||||||
|
/FindOptions/,
|
||||||
|
/WhereOptions/,
|
||||||
|
/IncludeOptions/,
|
||||||
|
/QueryInterface/,
|
||||||
|
/MikroORM/,
|
||||||
|
/EntityManager/,
|
||||||
|
/EntityRepository/,
|
||||||
|
/Collection</,
|
||||||
|
]
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Checks if a type name is an ORM-specific type
|
||||||
|
*/
|
||||||
|
public isOrmType(typeName: string): boolean {
|
||||||
|
return this.ormTypePatterns.some((pattern) => pattern.test(typeName))
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Extracts ORM type name from a code line
|
||||||
|
*/
|
||||||
|
public extractOrmType(line: string): string {
|
||||||
|
for (const pattern of this.ormTypePatterns) {
|
||||||
|
const match = line.match(pattern)
|
||||||
|
if (match) {
|
||||||
|
const startIdx = match.index || 0
|
||||||
|
const typeMatch = /[\w.]+/.exec(line.slice(startIdx))
|
||||||
|
return typeMatch ? typeMatch[0] : REPOSITORY_PATTERN_MESSAGES.UNKNOWN_TYPE
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return REPOSITORY_PATTERN_MESSAGES.UNKNOWN_TYPE
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Checks if a method name is a technical ORM method
|
||||||
|
*/
|
||||||
|
public isTechnicalMethod(methodName: string): boolean {
|
||||||
|
return (ORM_QUERY_METHODS as readonly string[]).includes(methodName)
|
||||||
|
}
|
||||||
|
}
|
||||||
@@ -0,0 +1,31 @@
|
|||||||
|
import { LAYERS } from "../../shared/constants/rules"
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Analyzes files to determine their role in the repository pattern
|
||||||
|
*
|
||||||
|
* Identifies repository interfaces and use cases based on file paths
|
||||||
|
* and architectural layer conventions.
|
||||||
|
*/
|
||||||
|
export class RepositoryFileAnalyzer {
|
||||||
|
/**
|
||||||
|
* Checks if a file is a repository interface
|
||||||
|
*/
|
||||||
|
public isRepositoryInterface(filePath: string, layer: string | undefined): boolean {
|
||||||
|
if (layer !== LAYERS.DOMAIN) {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
return /I[A-Z]\w*Repository\.ts$/.test(filePath) && /repositories?\//.test(filePath)
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Checks if a file is a use case
|
||||||
|
*/
|
||||||
|
public isUseCase(filePath: string, layer: string | undefined): boolean {
|
||||||
|
if (layer !== LAYERS.APPLICATION) {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
return /use-cases?\//.test(filePath) && /[A-Z][a-z]+[A-Z]\w*\.ts$/.test(filePath)
|
||||||
|
}
|
||||||
|
}
|
||||||
@@ -0,0 +1,285 @@
|
|||||||
|
import { RepositoryViolation } from "../../domain/value-objects/RepositoryViolation"
|
||||||
|
import { LAYERS, REPOSITORY_VIOLATION_TYPES } from "../../shared/constants/rules"
|
||||||
|
import { REPOSITORY_PATTERN_MESSAGES } from "../../domain/constants/Messages"
|
||||||
|
import { OrmTypeMatcher } from "./OrmTypeMatcher"
|
||||||
|
import { MethodNameValidator } from "./MethodNameValidator"
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Detects specific repository pattern violations
|
||||||
|
*
|
||||||
|
* Handles detection of ORM types, non-domain methods, concrete repositories,
|
||||||
|
* and repository instantiation violations.
|
||||||
|
*/
|
||||||
|
export class RepositoryViolationDetector {
|
||||||
|
constructor(
|
||||||
|
private readonly ormMatcher: OrmTypeMatcher,
|
||||||
|
private readonly methodValidator: MethodNameValidator,
|
||||||
|
) {}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Detects ORM types in repository interface
|
||||||
|
*/
|
||||||
|
public detectOrmTypes(
|
||||||
|
code: string,
|
||||||
|
filePath: string,
|
||||||
|
layer: string | undefined,
|
||||||
|
): RepositoryViolation[] {
|
||||||
|
const violations: RepositoryViolation[] = []
|
||||||
|
const lines = code.split("\n")
|
||||||
|
|
||||||
|
for (let i = 0; i < lines.length; i++) {
|
||||||
|
const line = lines[i]
|
||||||
|
const lineNumber = i + 1
|
||||||
|
|
||||||
|
this.detectOrmInMethod(line, lineNumber, filePath, layer, violations)
|
||||||
|
this.detectOrmInLine(line, lineNumber, filePath, layer, violations)
|
||||||
|
}
|
||||||
|
|
||||||
|
return violations
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Detects non-domain method names
|
||||||
|
*/
|
||||||
|
public detectNonDomainMethods(
|
||||||
|
code: string,
|
||||||
|
filePath: string,
|
||||||
|
layer: string | undefined,
|
||||||
|
): RepositoryViolation[] {
|
||||||
|
const violations: RepositoryViolation[] = []
|
||||||
|
const lines = code.split("\n")
|
||||||
|
|
||||||
|
for (let i = 0; i < lines.length; i++) {
|
||||||
|
const line = lines[i]
|
||||||
|
const lineNumber = i + 1
|
||||||
|
|
||||||
|
const methodMatch = /^\s*(\w+)\s*\(/.exec(line)
|
||||||
|
|
||||||
|
if (methodMatch) {
|
||||||
|
const methodName = methodMatch[1]
|
||||||
|
|
||||||
|
if (
|
||||||
|
!this.methodValidator.isDomainMethodName(methodName) &&
|
||||||
|
!line.trim().startsWith("//")
|
||||||
|
) {
|
||||||
|
const suggestion = this.methodValidator.suggestDomainMethodName(methodName)
|
||||||
|
violations.push(
|
||||||
|
RepositoryViolation.create(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.NON_DOMAIN_METHOD_NAME,
|
||||||
|
filePath,
|
||||||
|
layer || LAYERS.DOMAIN,
|
||||||
|
lineNumber,
|
||||||
|
`Method '${methodName}' uses technical name instead of domain language. ${suggestion}`,
|
||||||
|
undefined,
|
||||||
|
undefined,
|
||||||
|
methodName,
|
||||||
|
),
|
||||||
|
)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return violations
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Detects concrete repository usage
|
||||||
|
*/
|
||||||
|
public detectConcreteRepositoryUsage(
|
||||||
|
code: string,
|
||||||
|
filePath: string,
|
||||||
|
layer: string | undefined,
|
||||||
|
): RepositoryViolation[] {
|
||||||
|
const violations: RepositoryViolation[] = []
|
||||||
|
const lines = code.split("\n")
|
||||||
|
|
||||||
|
for (let i = 0; i < lines.length; i++) {
|
||||||
|
const line = lines[i]
|
||||||
|
const lineNumber = i + 1
|
||||||
|
|
||||||
|
this.detectConcreteInConstructor(line, lineNumber, filePath, layer, violations)
|
||||||
|
this.detectConcreteInField(line, lineNumber, filePath, layer, violations)
|
||||||
|
}
|
||||||
|
|
||||||
|
return violations
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Detects new Repository() instantiation
|
||||||
|
*/
|
||||||
|
public detectNewInstantiation(
|
||||||
|
code: string,
|
||||||
|
filePath: string,
|
||||||
|
layer: string | undefined,
|
||||||
|
): RepositoryViolation[] {
|
||||||
|
const violations: RepositoryViolation[] = []
|
||||||
|
const lines = code.split("\n")
|
||||||
|
|
||||||
|
for (let i = 0; i < lines.length; i++) {
|
||||||
|
const line = lines[i]
|
||||||
|
const lineNumber = i + 1
|
||||||
|
|
||||||
|
const newRepositoryMatch = /new\s+([A-Z]\w*Repository)\s*\(/.exec(line)
|
||||||
|
|
||||||
|
if (newRepositoryMatch && !line.trim().startsWith("//")) {
|
||||||
|
const repositoryName = newRepositoryMatch[1]
|
||||||
|
violations.push(
|
||||||
|
RepositoryViolation.create(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.NEW_REPOSITORY_IN_USE_CASE,
|
||||||
|
filePath,
|
||||||
|
layer || LAYERS.APPLICATION,
|
||||||
|
lineNumber,
|
||||||
|
`Use case creates repository with 'new ${repositoryName}()'`,
|
||||||
|
undefined,
|
||||||
|
repositoryName,
|
||||||
|
),
|
||||||
|
)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return violations
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Detects ORM types in method signatures
|
||||||
|
*/
|
||||||
|
private detectOrmInMethod(
|
||||||
|
line: string,
|
||||||
|
lineNumber: number,
|
||||||
|
filePath: string,
|
||||||
|
layer: string | undefined,
|
||||||
|
violations: RepositoryViolation[],
|
||||||
|
): void {
|
||||||
|
const methodMatch =
|
||||||
|
/(\w+)\s*\([^)]*:\s*([^)]+)\)\s*:\s*.*?(?:Promise<([^>]+)>|([A-Z]\w+))/.exec(line)
|
||||||
|
|
||||||
|
if (methodMatch) {
|
||||||
|
const params = methodMatch[2]
|
||||||
|
const returnType = methodMatch[3] || methodMatch[4]
|
||||||
|
|
||||||
|
if (this.ormMatcher.isOrmType(params)) {
|
||||||
|
const ormType = this.ormMatcher.extractOrmType(params)
|
||||||
|
violations.push(
|
||||||
|
RepositoryViolation.create(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.ORM_TYPE_IN_INTERFACE,
|
||||||
|
filePath,
|
||||||
|
layer || LAYERS.DOMAIN,
|
||||||
|
lineNumber,
|
||||||
|
`Method parameter uses ORM type: ${ormType}`,
|
||||||
|
ormType,
|
||||||
|
),
|
||||||
|
)
|
||||||
|
}
|
||||||
|
|
||||||
|
if (returnType && this.ormMatcher.isOrmType(returnType)) {
|
||||||
|
const ormType = this.ormMatcher.extractOrmType(returnType)
|
||||||
|
violations.push(
|
||||||
|
RepositoryViolation.create(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.ORM_TYPE_IN_INTERFACE,
|
||||||
|
filePath,
|
||||||
|
layer || LAYERS.DOMAIN,
|
||||||
|
lineNumber,
|
||||||
|
`Method return type uses ORM type: ${ormType}`,
|
||||||
|
ormType,
|
||||||
|
),
|
||||||
|
)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Detects ORM types in general code line
|
||||||
|
*/
|
||||||
|
private detectOrmInLine(
|
||||||
|
line: string,
|
||||||
|
lineNumber: number,
|
||||||
|
filePath: string,
|
||||||
|
layer: string | undefined,
|
||||||
|
violations: RepositoryViolation[],
|
||||||
|
): void {
|
||||||
|
if (this.ormMatcher.isOrmType(line) && !line.trim().startsWith("//")) {
|
||||||
|
const ormType = this.ormMatcher.extractOrmType(line)
|
||||||
|
violations.push(
|
||||||
|
RepositoryViolation.create(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.ORM_TYPE_IN_INTERFACE,
|
||||||
|
filePath,
|
||||||
|
layer || LAYERS.DOMAIN,
|
||||||
|
lineNumber,
|
||||||
|
`Repository interface contains ORM-specific type: ${ormType}`,
|
||||||
|
ormType,
|
||||||
|
),
|
||||||
|
)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Detects concrete repository in constructor
|
||||||
|
*/
|
||||||
|
private detectConcreteInConstructor(
|
||||||
|
line: string,
|
||||||
|
lineNumber: number,
|
||||||
|
filePath: string,
|
||||||
|
layer: string | undefined,
|
||||||
|
violations: RepositoryViolation[],
|
||||||
|
): void {
|
||||||
|
const constructorParamMatch =
|
||||||
|
/constructor\s*\([^)]*(?:private|public|protected)\s+(?:readonly\s+)?(\w+)\s*:\s*([A-Z]\w*Repository)/.exec(
|
||||||
|
line,
|
||||||
|
)
|
||||||
|
|
||||||
|
if (constructorParamMatch) {
|
||||||
|
const repositoryType = constructorParamMatch[2]
|
||||||
|
|
||||||
|
if (!repositoryType.startsWith("I")) {
|
||||||
|
violations.push(
|
||||||
|
RepositoryViolation.create(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.CONCRETE_REPOSITORY_IN_USE_CASE,
|
||||||
|
filePath,
|
||||||
|
layer || LAYERS.APPLICATION,
|
||||||
|
lineNumber,
|
||||||
|
`Use case depends on concrete repository '${repositoryType}'`,
|
||||||
|
undefined,
|
||||||
|
repositoryType,
|
||||||
|
),
|
||||||
|
)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Detects concrete repository in field
|
||||||
|
*/
|
||||||
|
private detectConcreteInField(
|
||||||
|
line: string,
|
||||||
|
lineNumber: number,
|
||||||
|
filePath: string,
|
||||||
|
layer: string | undefined,
|
||||||
|
violations: RepositoryViolation[],
|
||||||
|
): void {
|
||||||
|
const fieldMatch =
|
||||||
|
/(?:private|public|protected)\s+(?:readonly\s+)?(\w+)\s*:\s*([A-Z]\w*Repository)/.exec(
|
||||||
|
line,
|
||||||
|
)
|
||||||
|
|
||||||
|
if (fieldMatch) {
|
||||||
|
const repositoryType = fieldMatch[2]
|
||||||
|
|
||||||
|
if (
|
||||||
|
!repositoryType.startsWith("I") &&
|
||||||
|
!line.includes(REPOSITORY_PATTERN_MESSAGES.CONSTRUCTOR)
|
||||||
|
) {
|
||||||
|
violations.push(
|
||||||
|
RepositoryViolation.create(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.CONCRETE_REPOSITORY_IN_USE_CASE,
|
||||||
|
filePath,
|
||||||
|
layer || LAYERS.APPLICATION,
|
||||||
|
lineNumber,
|
||||||
|
`Use case field uses concrete repository '${repositoryType}'`,
|
||||||
|
undefined,
|
||||||
|
repositoryType,
|
||||||
|
),
|
||||||
|
)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
@@ -45,6 +45,25 @@ export const TYPE_NAMES = {
|
|||||||
OBJECT: "object",
|
OBJECT: "object",
|
||||||
} as const
|
} as const
|
||||||
|
|
||||||
|
/**
|
||||||
|
* TypeScript class and method keywords
|
||||||
|
*/
|
||||||
|
export const CLASS_KEYWORDS = {
|
||||||
|
CONSTRUCTOR: "constructor",
|
||||||
|
PUBLIC: "public",
|
||||||
|
PRIVATE: "private",
|
||||||
|
PROTECTED: "protected",
|
||||||
|
} as const
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Example code constants for documentation
|
||||||
|
*/
|
||||||
|
export const EXAMPLE_CODE_CONSTANTS = {
|
||||||
|
ORDER_STATUS_PENDING: "pending",
|
||||||
|
ORDER_STATUS_APPROVED: "approved",
|
||||||
|
CANNOT_APPROVE_ERROR: "Cannot approve",
|
||||||
|
} as const
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* Common regex patterns
|
* Common regex patterns
|
||||||
*/
|
*/
|
||||||
@@ -86,11 +105,14 @@ export const SEVERITY_ORDER: Record<SeverityLevel, number> = {
|
|||||||
* Violation type to severity mapping
|
* Violation type to severity mapping
|
||||||
*/
|
*/
|
||||||
export const VIOLATION_SEVERITY_MAP = {
|
export const VIOLATION_SEVERITY_MAP = {
|
||||||
|
SECRET_EXPOSURE: SEVERITY_LEVELS.CRITICAL,
|
||||||
CIRCULAR_DEPENDENCY: SEVERITY_LEVELS.CRITICAL,
|
CIRCULAR_DEPENDENCY: SEVERITY_LEVELS.CRITICAL,
|
||||||
REPOSITORY_PATTERN: SEVERITY_LEVELS.CRITICAL,
|
REPOSITORY_PATTERN: SEVERITY_LEVELS.CRITICAL,
|
||||||
|
AGGREGATE_BOUNDARY: SEVERITY_LEVELS.CRITICAL,
|
||||||
DEPENDENCY_DIRECTION: SEVERITY_LEVELS.HIGH,
|
DEPENDENCY_DIRECTION: SEVERITY_LEVELS.HIGH,
|
||||||
FRAMEWORK_LEAK: SEVERITY_LEVELS.HIGH,
|
FRAMEWORK_LEAK: SEVERITY_LEVELS.HIGH,
|
||||||
ENTITY_EXPOSURE: SEVERITY_LEVELS.HIGH,
|
ENTITY_EXPOSURE: SEVERITY_LEVELS.HIGH,
|
||||||
|
ANEMIC_MODEL: SEVERITY_LEVELS.MEDIUM,
|
||||||
NAMING_CONVENTION: SEVERITY_LEVELS.MEDIUM,
|
NAMING_CONVENTION: SEVERITY_LEVELS.MEDIUM,
|
||||||
ARCHITECTURE: SEVERITY_LEVELS.MEDIUM,
|
ARCHITECTURE: SEVERITY_LEVELS.MEDIUM,
|
||||||
HARDCODE: SEVERITY_LEVELS.LOW,
|
HARDCODE: SEVERITY_LEVELS.LOW,
|
||||||
|
|||||||
@@ -10,6 +10,9 @@ export const RULES = {
|
|||||||
ENTITY_EXPOSURE: "entity-exposure",
|
ENTITY_EXPOSURE: "entity-exposure",
|
||||||
DEPENDENCY_DIRECTION: "dependency-direction",
|
DEPENDENCY_DIRECTION: "dependency-direction",
|
||||||
REPOSITORY_PATTERN: "repository-pattern",
|
REPOSITORY_PATTERN: "repository-pattern",
|
||||||
|
AGGREGATE_BOUNDARY: "aggregate-boundary",
|
||||||
|
SECRET_EXPOSURE: "secret-exposure",
|
||||||
|
ANEMIC_MODEL: "anemic-model",
|
||||||
} as const
|
} as const
|
||||||
|
|
||||||
/**
|
/**
|
||||||
@@ -101,32 +104,35 @@ export const NAMING_PATTERNS = {
|
|||||||
* Common verbs for use cases
|
* Common verbs for use cases
|
||||||
*/
|
*/
|
||||||
export const USE_CASE_VERBS = [
|
export const USE_CASE_VERBS = [
|
||||||
|
"Aggregate",
|
||||||
"Analyze",
|
"Analyze",
|
||||||
"Create",
|
"Approve",
|
||||||
"Update",
|
|
||||||
"Delete",
|
|
||||||
"Get",
|
|
||||||
"Find",
|
|
||||||
"List",
|
|
||||||
"Search",
|
|
||||||
"Validate",
|
|
||||||
"Calculate",
|
|
||||||
"Generate",
|
|
||||||
"Send",
|
|
||||||
"Fetch",
|
|
||||||
"Process",
|
|
||||||
"Execute",
|
|
||||||
"Handle",
|
|
||||||
"Register",
|
|
||||||
"Authenticate",
|
"Authenticate",
|
||||||
"Authorize",
|
"Authorize",
|
||||||
"Import",
|
"Calculate",
|
||||||
"Export",
|
|
||||||
"Place",
|
|
||||||
"Cancel",
|
"Cancel",
|
||||||
"Approve",
|
"Collect",
|
||||||
"Reject",
|
|
||||||
"Confirm",
|
"Confirm",
|
||||||
|
"Create",
|
||||||
|
"Delete",
|
||||||
|
"Execute",
|
||||||
|
"Export",
|
||||||
|
"Fetch",
|
||||||
|
"Find",
|
||||||
|
"Generate",
|
||||||
|
"Get",
|
||||||
|
"Handle",
|
||||||
|
"Import",
|
||||||
|
"List",
|
||||||
|
"Parse",
|
||||||
|
"Place",
|
||||||
|
"Process",
|
||||||
|
"Register",
|
||||||
|
"Reject",
|
||||||
|
"Search",
|
||||||
|
"Send",
|
||||||
|
"Update",
|
||||||
|
"Validate",
|
||||||
] as const
|
] as const
|
||||||
|
|
||||||
/**
|
/**
|
||||||
|
|||||||
538
packages/guardian/tests/AggregateBoundaryDetector.test.ts
Normal file
538
packages/guardian/tests/AggregateBoundaryDetector.test.ts
Normal file
@@ -0,0 +1,538 @@
|
|||||||
|
import { describe, it, expect } from "vitest"
|
||||||
|
import { AggregateBoundaryDetector } from "../src/infrastructure/analyzers/AggregateBoundaryDetector"
|
||||||
|
import { LAYERS } from "../src/shared/constants/rules"
|
||||||
|
|
||||||
|
describe("AggregateBoundaryDetector", () => {
|
||||||
|
const detector = new AggregateBoundaryDetector()
|
||||||
|
|
||||||
|
describe("extractAggregateFromPath", () => {
|
||||||
|
it("should extract aggregate from domain/aggregates/name path", () => {
|
||||||
|
expect(detector.extractAggregateFromPath("src/domain/aggregates/order/Order.ts")).toBe(
|
||||||
|
"order",
|
||||||
|
)
|
||||||
|
expect(detector.extractAggregateFromPath("src/domain/aggregates/user/User.ts")).toBe(
|
||||||
|
"user",
|
||||||
|
)
|
||||||
|
expect(
|
||||||
|
detector.extractAggregateFromPath("src/domain/aggregates/product/Product.ts"),
|
||||||
|
).toBe("product")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should extract aggregate from domain/name path", () => {
|
||||||
|
expect(detector.extractAggregateFromPath("src/domain/order/Order.ts")).toBe("order")
|
||||||
|
expect(detector.extractAggregateFromPath("src/domain/user/User.ts")).toBe("user")
|
||||||
|
expect(detector.extractAggregateFromPath("src/domain/cart/ShoppingCart.ts")).toBe(
|
||||||
|
"cart",
|
||||||
|
)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should extract aggregate from domain/entities/name path", () => {
|
||||||
|
expect(detector.extractAggregateFromPath("src/domain/entities/order/Order.ts")).toBe(
|
||||||
|
"order",
|
||||||
|
)
|
||||||
|
expect(detector.extractAggregateFromPath("src/domain/entities/user/User.ts")).toBe(
|
||||||
|
"user",
|
||||||
|
)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return undefined for non-domain paths", () => {
|
||||||
|
expect(
|
||||||
|
detector.extractAggregateFromPath("src/application/use-cases/CreateUser.ts"),
|
||||||
|
).toBeUndefined()
|
||||||
|
expect(
|
||||||
|
detector.extractAggregateFromPath(
|
||||||
|
"src/infrastructure/repositories/UserRepository.ts",
|
||||||
|
),
|
||||||
|
).toBeUndefined()
|
||||||
|
expect(detector.extractAggregateFromPath("src/shared/types/Result.ts")).toBeUndefined()
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return undefined for paths without aggregate structure", () => {
|
||||||
|
expect(detector.extractAggregateFromPath("src/domain/User.ts")).toBeUndefined()
|
||||||
|
expect(detector.extractAggregateFromPath("src/User.ts")).toBeUndefined()
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should handle Windows-style paths", () => {
|
||||||
|
expect(
|
||||||
|
detector.extractAggregateFromPath("src\\domain\\aggregates\\order\\Order.ts"),
|
||||||
|
).toBe("order")
|
||||||
|
expect(detector.extractAggregateFromPath("src\\domain\\user\\User.ts")).toBe("user")
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("isAggregateBoundaryViolation", () => {
|
||||||
|
it("should detect direct entity import from another aggregate", () => {
|
||||||
|
expect(detector.isAggregateBoundaryViolation("../user/User", "order")).toBe(true)
|
||||||
|
expect(detector.isAggregateBoundaryViolation("../../user/User", "order")).toBe(true)
|
||||||
|
expect(
|
||||||
|
detector.isAggregateBoundaryViolation("../../../domain/user/User", "order"),
|
||||||
|
).toBe(true)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should NOT detect import from same aggregate", () => {
|
||||||
|
expect(detector.isAggregateBoundaryViolation("../order/Order", "order")).toBe(false)
|
||||||
|
expect(detector.isAggregateBoundaryViolation("./OrderItem", "order")).toBe(false)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should NOT detect value object imports", () => {
|
||||||
|
expect(
|
||||||
|
detector.isAggregateBoundaryViolation("../user/value-objects/UserId", "order"),
|
||||||
|
).toBe(false)
|
||||||
|
expect(detector.isAggregateBoundaryViolation("../user/vo/Email", "order")).toBe(false)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should NOT detect event imports", () => {
|
||||||
|
expect(
|
||||||
|
detector.isAggregateBoundaryViolation("../user/events/UserCreatedEvent", "order"),
|
||||||
|
).toBe(false)
|
||||||
|
expect(
|
||||||
|
detector.isAggregateBoundaryViolation(
|
||||||
|
"../user/domain-events/UserRegisteredEvent",
|
||||||
|
"order",
|
||||||
|
),
|
||||||
|
).toBe(false)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should NOT detect repository interface imports", () => {
|
||||||
|
expect(
|
||||||
|
detector.isAggregateBoundaryViolation(
|
||||||
|
"../user/repositories/IUserRepository",
|
||||||
|
"order",
|
||||||
|
),
|
||||||
|
).toBe(false)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should NOT detect service imports", () => {
|
||||||
|
expect(
|
||||||
|
detector.isAggregateBoundaryViolation("../user/services/UserService", "order"),
|
||||||
|
).toBe(false)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should NOT detect external package imports", () => {
|
||||||
|
expect(detector.isAggregateBoundaryViolation("express", "order")).toBe(false)
|
||||||
|
expect(detector.isAggregateBoundaryViolation("@nestjs/common", "order")).toBe(false)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should NOT detect imports without path separator", () => {
|
||||||
|
expect(detector.isAggregateBoundaryViolation("User", "order")).toBe(false)
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("detectViolations", () => {
|
||||||
|
describe("Domain layer aggregate boundary violations", () => {
|
||||||
|
it("should detect direct entity import from another aggregate", () => {
|
||||||
|
const code = `
|
||||||
|
import { User } from '../user/User'
|
||||||
|
|
||||||
|
export class Order {
|
||||||
|
constructor(private user: User) {}
|
||||||
|
}`
|
||||||
|
const violations = detector.detectViolations(
|
||||||
|
code,
|
||||||
|
"src/domain/aggregates/order/Order.ts",
|
||||||
|
LAYERS.DOMAIN,
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(1)
|
||||||
|
expect(violations[0].fromAggregate).toBe("order")
|
||||||
|
expect(violations[0].toAggregate).toBe("user")
|
||||||
|
expect(violations[0].entityName).toBe("User")
|
||||||
|
expect(violations[0].importPath).toBe("../user/User")
|
||||||
|
expect(violations[0].line).toBe(2)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should detect multiple entity imports from different aggregates", () => {
|
||||||
|
const code = `
|
||||||
|
import { User } from '../user/User'
|
||||||
|
import { Product } from '../product/Product'
|
||||||
|
import { Category } from '../catalog/Category'
|
||||||
|
|
||||||
|
export class Order {
|
||||||
|
constructor(
|
||||||
|
private user: User,
|
||||||
|
private product: Product,
|
||||||
|
private category: Category
|
||||||
|
) {}
|
||||||
|
}`
|
||||||
|
const violations = detector.detectViolations(
|
||||||
|
code,
|
||||||
|
"src/domain/aggregates/order/Order.ts",
|
||||||
|
LAYERS.DOMAIN,
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(3)
|
||||||
|
expect(violations[0].entityName).toBe("User")
|
||||||
|
expect(violations[1].entityName).toBe("Product")
|
||||||
|
expect(violations[2].entityName).toBe("Category")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should NOT detect value object imports", () => {
|
||||||
|
const code = `
|
||||||
|
import { UserId } from '../user/value-objects/UserId'
|
||||||
|
import { ProductId } from '../product/value-objects/ProductId'
|
||||||
|
|
||||||
|
export class Order {
|
||||||
|
constructor(
|
||||||
|
private userId: UserId,
|
||||||
|
private productId: ProductId
|
||||||
|
) {}
|
||||||
|
}`
|
||||||
|
const violations = detector.detectViolations(
|
||||||
|
code,
|
||||||
|
"src/domain/aggregates/order/Order.ts",
|
||||||
|
LAYERS.DOMAIN,
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(0)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should NOT detect event imports", () => {
|
||||||
|
const code = `
|
||||||
|
import { UserCreatedEvent } from '../user/events/UserCreatedEvent'
|
||||||
|
import { ProductAddedEvent } from '../product/domain-events/ProductAddedEvent'
|
||||||
|
|
||||||
|
export class Order {
|
||||||
|
handle(event: UserCreatedEvent): void {}
|
||||||
|
}`
|
||||||
|
const violations = detector.detectViolations(
|
||||||
|
code,
|
||||||
|
"src/domain/aggregates/order/Order.ts",
|
||||||
|
LAYERS.DOMAIN,
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(0)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should NOT detect repository interface imports", () => {
|
||||||
|
const code = `
|
||||||
|
import { IUserRepository } from '../user/repositories/IUserRepository'
|
||||||
|
|
||||||
|
export class OrderService {
|
||||||
|
constructor(private userRepo: IUserRepository) {}
|
||||||
|
}`
|
||||||
|
const violations = detector.detectViolations(
|
||||||
|
code,
|
||||||
|
"src/domain/aggregates/order/OrderService.ts",
|
||||||
|
LAYERS.DOMAIN,
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(0)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should NOT detect imports from same aggregate", () => {
|
||||||
|
const code = `
|
||||||
|
import { OrderItem } from './OrderItem'
|
||||||
|
import { OrderStatus } from './value-objects/OrderStatus'
|
||||||
|
|
||||||
|
export class Order {
|
||||||
|
constructor(
|
||||||
|
private items: OrderItem[],
|
||||||
|
private status: OrderStatus
|
||||||
|
) {}
|
||||||
|
}`
|
||||||
|
const violations = detector.detectViolations(
|
||||||
|
code,
|
||||||
|
"src/domain/aggregates/order/Order.ts",
|
||||||
|
LAYERS.DOMAIN,
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(0)
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("Non-domain layers", () => {
|
||||||
|
it("should return empty array for application layer", () => {
|
||||||
|
const code = `
|
||||||
|
import { User } from '../../domain/aggregates/user/User'
|
||||||
|
import { Order } from '../../domain/aggregates/order/Order'
|
||||||
|
|
||||||
|
export class CreateOrder {
|
||||||
|
constructor() {}
|
||||||
|
}`
|
||||||
|
const violations = detector.detectViolations(
|
||||||
|
code,
|
||||||
|
"src/application/use-cases/CreateOrder.ts",
|
||||||
|
LAYERS.APPLICATION,
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(0)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return empty array for infrastructure layer", () => {
|
||||||
|
const code = `
|
||||||
|
import { User } from '../../domain/aggregates/user/User'
|
||||||
|
|
||||||
|
export class UserController {
|
||||||
|
constructor() {}
|
||||||
|
}`
|
||||||
|
const violations = detector.detectViolations(
|
||||||
|
code,
|
||||||
|
"src/infrastructure/controllers/UserController.ts",
|
||||||
|
LAYERS.INFRASTRUCTURE,
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(0)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return empty array for undefined layer", () => {
|
||||||
|
const code = `import { User } from '../user/User'`
|
||||||
|
const violations = detector.detectViolations(code, "src/utils/helper.ts", undefined)
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(0)
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("Import statement formats", () => {
|
||||||
|
it("should detect violations in named imports", () => {
|
||||||
|
const code = `import { User, UserProfile } from '../user/User'`
|
||||||
|
const violations = detector.detectViolations(
|
||||||
|
code,
|
||||||
|
"src/domain/aggregates/order/Order.ts",
|
||||||
|
LAYERS.DOMAIN,
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(1)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should detect violations in default imports", () => {
|
||||||
|
const code = `import User from '../user/User'`
|
||||||
|
const violations = detector.detectViolations(
|
||||||
|
code,
|
||||||
|
"src/domain/aggregates/order/Order.ts",
|
||||||
|
LAYERS.DOMAIN,
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(1)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should detect violations in namespace imports", () => {
|
||||||
|
const code = `import * as UserAggregate from '../user/User'`
|
||||||
|
const violations = detector.detectViolations(
|
||||||
|
code,
|
||||||
|
"src/domain/aggregates/order/Order.ts",
|
||||||
|
LAYERS.DOMAIN,
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(1)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should detect violations in require statements", () => {
|
||||||
|
const code = `const User = require('../user/User')`
|
||||||
|
const violations = detector.detectViolations(
|
||||||
|
code,
|
||||||
|
"src/domain/aggregates/order/Order.ts",
|
||||||
|
LAYERS.DOMAIN,
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(1)
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("Different path structures", () => {
|
||||||
|
it("should detect violations in domain/aggregates/name structure", () => {
|
||||||
|
const code = `import { User } from '../user/User'`
|
||||||
|
const violations = detector.detectViolations(
|
||||||
|
code,
|
||||||
|
"src/domain/aggregates/order/Order.ts",
|
||||||
|
LAYERS.DOMAIN,
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(1)
|
||||||
|
expect(violations[0].fromAggregate).toBe("order")
|
||||||
|
expect(violations[0].toAggregate).toBe("user")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should detect violations in domain/name structure", () => {
|
||||||
|
const code = `import { User } from '../user/User'`
|
||||||
|
const violations = detector.detectViolations(
|
||||||
|
code,
|
||||||
|
"src/domain/order/Order.ts",
|
||||||
|
LAYERS.DOMAIN,
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(1)
|
||||||
|
expect(violations[0].fromAggregate).toBe("order")
|
||||||
|
expect(violations[0].toAggregate).toBe("user")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should detect violations in domain/entities/name structure", () => {
|
||||||
|
const code = `import { User } from '../../user/User'`
|
||||||
|
const violations = detector.detectViolations(
|
||||||
|
code,
|
||||||
|
"src/domain/entities/order/Order.ts",
|
||||||
|
LAYERS.DOMAIN,
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(1)
|
||||||
|
expect(violations[0].fromAggregate).toBe("order")
|
||||||
|
expect(violations[0].toAggregate).toBe("user")
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("Edge cases", () => {
|
||||||
|
it("should handle empty code", () => {
|
||||||
|
const violations = detector.detectViolations(
|
||||||
|
"",
|
||||||
|
"src/domain/aggregates/order/Order.ts",
|
||||||
|
LAYERS.DOMAIN,
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(0)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should handle code with no imports", () => {
|
||||||
|
const code = `
|
||||||
|
export class Order {
|
||||||
|
constructor(private id: string) {}
|
||||||
|
}`
|
||||||
|
const violations = detector.detectViolations(
|
||||||
|
code,
|
||||||
|
"src/domain/aggregates/order/Order.ts",
|
||||||
|
LAYERS.DOMAIN,
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(0)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should handle file without aggregate in path", () => {
|
||||||
|
const code = `import { User } from '../user/User'`
|
||||||
|
const violations = detector.detectViolations(
|
||||||
|
code,
|
||||||
|
"src/domain/Order.ts",
|
||||||
|
LAYERS.DOMAIN,
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(0)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should handle comments in imports", () => {
|
||||||
|
const code = `
|
||||||
|
// This is a comment
|
||||||
|
import { User } from '../user/User' // Bad import
|
||||||
|
`
|
||||||
|
const violations = detector.detectViolations(
|
||||||
|
code,
|
||||||
|
"src/domain/aggregates/order/Order.ts",
|
||||||
|
LAYERS.DOMAIN,
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(1)
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("getMessage", () => {
|
||||||
|
it("should return correct violation message", () => {
|
||||||
|
const code = `import { User } from '../user/User'`
|
||||||
|
const violations = detector.detectViolations(
|
||||||
|
code,
|
||||||
|
"src/domain/aggregates/order/Order.ts",
|
||||||
|
LAYERS.DOMAIN,
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violations[0].getMessage()).toBe(
|
||||||
|
"Order aggregate should not directly reference User entity from User aggregate",
|
||||||
|
)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should capitalize aggregate names in message", () => {
|
||||||
|
const code = `import { Product } from '../product/Product'`
|
||||||
|
const violations = detector.detectViolations(
|
||||||
|
code,
|
||||||
|
"src/domain/aggregates/cart/ShoppingCart.ts",
|
||||||
|
LAYERS.DOMAIN,
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violations[0].getMessage()).toContain("Cart aggregate")
|
||||||
|
expect(violations[0].getMessage()).toContain("Product aggregate")
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("getSuggestion", () => {
|
||||||
|
it("should return suggestions for fixing aggregate boundary violations", () => {
|
||||||
|
const code = `import { User } from '../user/User'`
|
||||||
|
const violations = detector.detectViolations(
|
||||||
|
code,
|
||||||
|
"src/domain/aggregates/order/Order.ts",
|
||||||
|
LAYERS.DOMAIN,
|
||||||
|
)
|
||||||
|
|
||||||
|
const suggestion = violations[0].getSuggestion()
|
||||||
|
expect(suggestion).toContain("Reference other aggregates by ID")
|
||||||
|
expect(suggestion).toContain("Use Value Objects")
|
||||||
|
expect(suggestion).toContain("Avoid direct entity references")
|
||||||
|
expect(suggestion).toContain("independently modifiable")
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("getExampleFix", () => {
|
||||||
|
it("should return example fix for aggregate boundary violation", () => {
|
||||||
|
const code = `import { User } from '../user/User'`
|
||||||
|
const violations = detector.detectViolations(
|
||||||
|
code,
|
||||||
|
"src/domain/aggregates/order/Order.ts",
|
||||||
|
LAYERS.DOMAIN,
|
||||||
|
)
|
||||||
|
|
||||||
|
const example = violations[0].getExampleFix()
|
||||||
|
expect(example).toContain("// ❌ Bad")
|
||||||
|
expect(example).toContain("// ✅ Good")
|
||||||
|
expect(example).toContain("UserId")
|
||||||
|
expect(example).toContain("CustomerInfo")
|
||||||
|
})
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("Complex scenarios", () => {
|
||||||
|
it("should detect mixed valid and invalid imports", () => {
|
||||||
|
const code = `
|
||||||
|
import { User } from '../user/User' // VIOLATION
|
||||||
|
import { UserId } from '../user/value-objects/UserId' // OK
|
||||||
|
import { Product } from '../product/Product' // VIOLATION
|
||||||
|
import { ProductId } from '../product/value-objects/ProductId' // OK
|
||||||
|
import { OrderItem } from './OrderItem' // OK - same aggregate
|
||||||
|
|
||||||
|
export class Order {
|
||||||
|
constructor(
|
||||||
|
private user: User,
|
||||||
|
private userId: UserId,
|
||||||
|
private product: Product,
|
||||||
|
private productId: ProductId,
|
||||||
|
private items: OrderItem[]
|
||||||
|
) {}
|
||||||
|
}`
|
||||||
|
const violations = detector.detectViolations(
|
||||||
|
code,
|
||||||
|
"src/domain/aggregates/order/Order.ts",
|
||||||
|
LAYERS.DOMAIN,
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(2)
|
||||||
|
expect(violations[0].entityName).toBe("User")
|
||||||
|
expect(violations[1].entityName).toBe("Product")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should handle deeply nested import paths", () => {
|
||||||
|
const code = `import { User } from '../../../domain/aggregates/user/entities/User'`
|
||||||
|
const violations = detector.detectViolations(
|
||||||
|
code,
|
||||||
|
"src/domain/aggregates/order/Order.ts",
|
||||||
|
LAYERS.DOMAIN,
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(1)
|
||||||
|
expect(violations[0].entityName).toBe("User")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should detect violations with .ts extension in import", () => {
|
||||||
|
const code = `import { User } from '../user/User.ts'`
|
||||||
|
const violations = detector.detectViolations(
|
||||||
|
code,
|
||||||
|
"src/domain/aggregates/order/Order.ts",
|
||||||
|
LAYERS.DOMAIN,
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(1)
|
||||||
|
expect(violations[0].entityName).toBe("User")
|
||||||
|
})
|
||||||
|
})
|
||||||
|
})
|
||||||
372
packages/guardian/tests/AnemicModelDetector.test.ts
Normal file
372
packages/guardian/tests/AnemicModelDetector.test.ts
Normal file
@@ -0,0 +1,372 @@
|
|||||||
|
import { describe, it, expect, beforeEach } from "vitest"
|
||||||
|
import { AnemicModelDetector } from "../src/infrastructure/analyzers/AnemicModelDetector"
|
||||||
|
|
||||||
|
describe("AnemicModelDetector", () => {
|
||||||
|
let detector: AnemicModelDetector
|
||||||
|
|
||||||
|
beforeEach(() => {
|
||||||
|
detector = new AnemicModelDetector()
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("detectAnemicModels", () => {
|
||||||
|
it("should detect class with only getters and setters", () => {
|
||||||
|
const code = `
|
||||||
|
class Order {
|
||||||
|
private status: string
|
||||||
|
private total: number
|
||||||
|
|
||||||
|
getStatus(): string {
|
||||||
|
return this.status
|
||||||
|
}
|
||||||
|
|
||||||
|
setStatus(status: string): void {
|
||||||
|
this.status = status
|
||||||
|
}
|
||||||
|
|
||||||
|
getTotal(): number {
|
||||||
|
return this.total
|
||||||
|
}
|
||||||
|
|
||||||
|
setTotal(total: number): void {
|
||||||
|
this.total = total
|
||||||
|
}
|
||||||
|
}
|
||||||
|
`
|
||||||
|
const violations = detector.detectAnemicModels(
|
||||||
|
code,
|
||||||
|
"src/domain/entities/Order.ts",
|
||||||
|
"domain",
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(1)
|
||||||
|
expect(violations[0].className).toBe("Order")
|
||||||
|
expect(violations[0].methodCount).toBeGreaterThan(0)
|
||||||
|
expect(violations[0].propertyCount).toBeGreaterThan(0)
|
||||||
|
expect(violations[0].getMessage()).toContain("Order")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should detect class with public setters", () => {
|
||||||
|
const code = `
|
||||||
|
class User {
|
||||||
|
private email: string
|
||||||
|
private password: string
|
||||||
|
|
||||||
|
public setEmail(email: string): void {
|
||||||
|
this.email = email
|
||||||
|
}
|
||||||
|
|
||||||
|
public getEmail(): string {
|
||||||
|
return this.email
|
||||||
|
}
|
||||||
|
|
||||||
|
public setPassword(password: string): void {
|
||||||
|
this.password = password
|
||||||
|
}
|
||||||
|
}
|
||||||
|
`
|
||||||
|
const violations = detector.detectAnemicModels(
|
||||||
|
code,
|
||||||
|
"src/domain/entities/User.ts",
|
||||||
|
"domain",
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(1)
|
||||||
|
expect(violations[0].className).toBe("User")
|
||||||
|
expect(violations[0].hasPublicSetters).toBe(true)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should not detect rich domain model with business logic", () => {
|
||||||
|
const code = `
|
||||||
|
class Order {
|
||||||
|
private readonly id: string
|
||||||
|
private status: OrderStatus
|
||||||
|
private items: OrderItem[]
|
||||||
|
|
||||||
|
public approve(): void {
|
||||||
|
if (!this.canBeApproved()) {
|
||||||
|
throw new Error("Cannot approve")
|
||||||
|
}
|
||||||
|
this.status = OrderStatus.APPROVED
|
||||||
|
}
|
||||||
|
|
||||||
|
public reject(reason: string): void {
|
||||||
|
if (!this.canBeRejected()) {
|
||||||
|
throw new Error("Cannot reject")
|
||||||
|
}
|
||||||
|
this.status = OrderStatus.REJECTED
|
||||||
|
}
|
||||||
|
|
||||||
|
public addItem(item: OrderItem): void {
|
||||||
|
if (this.isApproved()) {
|
||||||
|
throw new Error("Cannot modify approved order")
|
||||||
|
}
|
||||||
|
this.items.push(item)
|
||||||
|
}
|
||||||
|
|
||||||
|
public calculateTotal(): Money {
|
||||||
|
return this.items.reduce((sum, item) => sum.add(item.getPrice()), Money.zero())
|
||||||
|
}
|
||||||
|
|
||||||
|
public getStatus(): OrderStatus {
|
||||||
|
return this.status
|
||||||
|
}
|
||||||
|
|
||||||
|
private canBeApproved(): boolean {
|
||||||
|
return this.status === OrderStatus.PENDING
|
||||||
|
}
|
||||||
|
|
||||||
|
private canBeRejected(): boolean {
|
||||||
|
return this.status === OrderStatus.PENDING
|
||||||
|
}
|
||||||
|
|
||||||
|
private isApproved(): boolean {
|
||||||
|
return this.status === OrderStatus.APPROVED
|
||||||
|
}
|
||||||
|
}
|
||||||
|
`
|
||||||
|
const violations = detector.detectAnemicModels(
|
||||||
|
code,
|
||||||
|
"src/domain/entities/Order.ts",
|
||||||
|
"domain",
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(0)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should not analyze files outside domain layer", () => {
|
||||||
|
const code = `
|
||||||
|
class OrderDto {
|
||||||
|
getStatus(): string {
|
||||||
|
return this.status
|
||||||
|
}
|
||||||
|
|
||||||
|
setStatus(status: string): void {
|
||||||
|
this.status = status
|
||||||
|
}
|
||||||
|
}
|
||||||
|
`
|
||||||
|
const violations = detector.detectAnemicModels(
|
||||||
|
code,
|
||||||
|
"src/application/dtos/OrderDto.ts",
|
||||||
|
"application",
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(0)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should not analyze DTO files", () => {
|
||||||
|
const code = `
|
||||||
|
class UserDto {
|
||||||
|
private email: string
|
||||||
|
|
||||||
|
getEmail(): string {
|
||||||
|
return this.email
|
||||||
|
}
|
||||||
|
|
||||||
|
setEmail(email: string): void {
|
||||||
|
this.email = email
|
||||||
|
}
|
||||||
|
}
|
||||||
|
`
|
||||||
|
const violations = detector.detectAnemicModels(
|
||||||
|
code,
|
||||||
|
"src/domain/dtos/UserDto.ts",
|
||||||
|
"domain",
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(0)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should not analyze test files", () => {
|
||||||
|
const code = `
|
||||||
|
class Order {
|
||||||
|
getStatus(): string {
|
||||||
|
return this.status
|
||||||
|
}
|
||||||
|
|
||||||
|
setStatus(status: string): void {
|
||||||
|
this.status = status
|
||||||
|
}
|
||||||
|
}
|
||||||
|
`
|
||||||
|
const violations = detector.detectAnemicModels(
|
||||||
|
code,
|
||||||
|
"src/domain/entities/Order.test.ts",
|
||||||
|
"domain",
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(0)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should detect anemic model in entities folder", () => {
|
||||||
|
const code = `
|
||||||
|
class Product {
|
||||||
|
private name: string
|
||||||
|
private price: number
|
||||||
|
|
||||||
|
getName(): string {
|
||||||
|
return this.name
|
||||||
|
}
|
||||||
|
|
||||||
|
setName(name: string): void {
|
||||||
|
this.name = name
|
||||||
|
}
|
||||||
|
|
||||||
|
getPrice(): number {
|
||||||
|
return this.price
|
||||||
|
}
|
||||||
|
|
||||||
|
setPrice(price: number): void {
|
||||||
|
this.price = price
|
||||||
|
}
|
||||||
|
}
|
||||||
|
`
|
||||||
|
const violations = detector.detectAnemicModels(
|
||||||
|
code,
|
||||||
|
"src/domain/entities/Product.ts",
|
||||||
|
"domain",
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(1)
|
||||||
|
expect(violations[0].className).toBe("Product")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should detect anemic model in aggregates folder", () => {
|
||||||
|
const code = `
|
||||||
|
class Customer {
|
||||||
|
private email: string
|
||||||
|
|
||||||
|
getEmail(): string {
|
||||||
|
return this.email
|
||||||
|
}
|
||||||
|
|
||||||
|
setEmail(email: string): void {
|
||||||
|
this.email = email
|
||||||
|
}
|
||||||
|
}
|
||||||
|
`
|
||||||
|
const violations = detector.detectAnemicModels(
|
||||||
|
code,
|
||||||
|
"src/domain/aggregates/customer/Customer.ts",
|
||||||
|
"domain",
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(1)
|
||||||
|
expect(violations[0].className).toBe("Customer")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should not detect class with good method-to-property ratio", () => {
|
||||||
|
const code = `
|
||||||
|
class Account {
|
||||||
|
private balance: number
|
||||||
|
private isActive: boolean
|
||||||
|
|
||||||
|
public deposit(amount: number): void {
|
||||||
|
if (amount <= 0) throw new Error("Invalid amount")
|
||||||
|
this.balance += amount
|
||||||
|
}
|
||||||
|
|
||||||
|
public withdraw(amount: number): void {
|
||||||
|
if (amount > this.balance) throw new Error("Insufficient funds")
|
||||||
|
this.balance -= amount
|
||||||
|
}
|
||||||
|
|
||||||
|
public activate(): void {
|
||||||
|
this.isActive = true
|
||||||
|
}
|
||||||
|
|
||||||
|
public deactivate(): void {
|
||||||
|
this.isActive = false
|
||||||
|
}
|
||||||
|
|
||||||
|
public getBalance(): number {
|
||||||
|
return this.balance
|
||||||
|
}
|
||||||
|
}
|
||||||
|
`
|
||||||
|
const violations = detector.detectAnemicModels(
|
||||||
|
code,
|
||||||
|
"src/domain/entities/Account.ts",
|
||||||
|
"domain",
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(0)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should handle class with no properties or methods", () => {
|
||||||
|
const code = `
|
||||||
|
class EmptyEntity {
|
||||||
|
}
|
||||||
|
`
|
||||||
|
const violations = detector.detectAnemicModels(
|
||||||
|
code,
|
||||||
|
"src/domain/entities/EmptyEntity.ts",
|
||||||
|
"domain",
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(0)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should detect multiple anemic classes in one file", () => {
|
||||||
|
const code = `
|
||||||
|
class Order {
|
||||||
|
getStatus() { return this.status }
|
||||||
|
setStatus(status: string) { this.status = status }
|
||||||
|
}
|
||||||
|
|
||||||
|
class Item {
|
||||||
|
getPrice() { return this.price }
|
||||||
|
setPrice(price: number) { this.price = price }
|
||||||
|
}
|
||||||
|
`
|
||||||
|
const violations = detector.detectAnemicModels(
|
||||||
|
code,
|
||||||
|
"src/domain/entities/Models.ts",
|
||||||
|
"domain",
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(2)
|
||||||
|
expect(violations[0].className).toBe("Order")
|
||||||
|
expect(violations[1].className).toBe("Item")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should provide correct violation details", () => {
|
||||||
|
const code = `
|
||||||
|
class Payment {
|
||||||
|
private amount: number
|
||||||
|
private currency: string
|
||||||
|
|
||||||
|
getAmount(): number {
|
||||||
|
return this.amount
|
||||||
|
}
|
||||||
|
|
||||||
|
setAmount(amount: number): void {
|
||||||
|
this.amount = amount
|
||||||
|
}
|
||||||
|
|
||||||
|
getCurrency(): string {
|
||||||
|
return this.currency
|
||||||
|
}
|
||||||
|
|
||||||
|
setCurrency(currency: string): void {
|
||||||
|
this.currency = currency
|
||||||
|
}
|
||||||
|
}
|
||||||
|
`
|
||||||
|
const violations = detector.detectAnemicModels(
|
||||||
|
code,
|
||||||
|
"src/domain/entities/Payment.ts",
|
||||||
|
"domain",
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(1)
|
||||||
|
const violation = violations[0]
|
||||||
|
expect(violation.className).toBe("Payment")
|
||||||
|
expect(violation.filePath).toBe("src/domain/entities/Payment.ts")
|
||||||
|
expect(violation.layer).toBe("domain")
|
||||||
|
expect(violation.line).toBeGreaterThan(0)
|
||||||
|
expect(violation.getMessage()).toContain("Payment")
|
||||||
|
expect(violation.getSuggestion()).toContain("business")
|
||||||
|
})
|
||||||
|
})
|
||||||
|
})
|
||||||
285
packages/guardian/tests/e2e/AnalyzeProject.e2e.test.ts
Normal file
285
packages/guardian/tests/e2e/AnalyzeProject.e2e.test.ts
Normal file
@@ -0,0 +1,285 @@
|
|||||||
|
import { describe, it, expect } from "vitest"
|
||||||
|
import { analyzeProject } from "../../src/api"
|
||||||
|
import path from "path"
|
||||||
|
|
||||||
|
describe("AnalyzeProject E2E", () => {
|
||||||
|
const EXAMPLES_DIR = path.join(__dirname, "../../examples")
|
||||||
|
|
||||||
|
describe("Full Pipeline", () => {
|
||||||
|
it("should analyze project and return complete results", async () => {
|
||||||
|
const rootDir = path.join(EXAMPLES_DIR, "good-architecture")
|
||||||
|
|
||||||
|
const result = await analyzeProject({ rootDir })
|
||||||
|
|
||||||
|
expect(result).toBeDefined()
|
||||||
|
expect(result.metrics).toBeDefined()
|
||||||
|
expect(result.metrics.totalFiles).toBeGreaterThan(0)
|
||||||
|
expect(result.metrics.totalFunctions).toBeGreaterThanOrEqual(0)
|
||||||
|
expect(result.metrics.totalImports).toBeGreaterThanOrEqual(0)
|
||||||
|
expect(result.dependencyGraph).toBeDefined()
|
||||||
|
|
||||||
|
expect(Array.isArray(result.hardcodeViolations)).toBe(true)
|
||||||
|
expect(Array.isArray(result.violations)).toBe(true)
|
||||||
|
expect(Array.isArray(result.circularDependencyViolations)).toBe(true)
|
||||||
|
expect(Array.isArray(result.namingViolations)).toBe(true)
|
||||||
|
expect(Array.isArray(result.frameworkLeakViolations)).toBe(true)
|
||||||
|
expect(Array.isArray(result.entityExposureViolations)).toBe(true)
|
||||||
|
expect(Array.isArray(result.dependencyDirectionViolations)).toBe(true)
|
||||||
|
expect(Array.isArray(result.repositoryPatternViolations)).toBe(true)
|
||||||
|
expect(Array.isArray(result.aggregateBoundaryViolations)).toBe(true)
|
||||||
|
expect(Array.isArray(result.anemicModelViolations)).toBe(true)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should respect exclude patterns", async () => {
|
||||||
|
const rootDir = path.join(EXAMPLES_DIR, "good-architecture")
|
||||||
|
|
||||||
|
const result = await analyzeProject({
|
||||||
|
rootDir,
|
||||||
|
exclude: ["**/dtos/**", "**/mappers/**"],
|
||||||
|
})
|
||||||
|
|
||||||
|
expect(result.metrics.totalFiles).toBeGreaterThan(0)
|
||||||
|
|
||||||
|
const allFiles = [
|
||||||
|
...result.hardcodeViolations.map((v) => v.file),
|
||||||
|
...result.violations.map((v) => v.file),
|
||||||
|
...result.namingViolations.map((v) => v.file),
|
||||||
|
]
|
||||||
|
|
||||||
|
allFiles.forEach((file) => {
|
||||||
|
expect(file).not.toContain("/dtos/")
|
||||||
|
expect(file).not.toContain("/mappers/")
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should detect violations across all detectors", async () => {
|
||||||
|
const rootDir = path.join(EXAMPLES_DIR, "bad-architecture")
|
||||||
|
|
||||||
|
const result = await analyzeProject({ rootDir })
|
||||||
|
|
||||||
|
const totalViolations =
|
||||||
|
result.hardcodeViolations.length +
|
||||||
|
result.violations.length +
|
||||||
|
result.circularDependencyViolations.length +
|
||||||
|
result.namingViolations.length +
|
||||||
|
result.frameworkLeakViolations.length +
|
||||||
|
result.entityExposureViolations.length +
|
||||||
|
result.dependencyDirectionViolations.length +
|
||||||
|
result.repositoryPatternViolations.length +
|
||||||
|
result.aggregateBoundaryViolations.length +
|
||||||
|
result.anemicModelViolations.length
|
||||||
|
|
||||||
|
expect(totalViolations).toBeGreaterThan(0)
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("Good Architecture Examples", () => {
|
||||||
|
it("should find zero violations in good-architecture/", async () => {
|
||||||
|
const rootDir = path.join(EXAMPLES_DIR, "good-architecture")
|
||||||
|
|
||||||
|
const result = await analyzeProject({ rootDir })
|
||||||
|
|
||||||
|
expect(result.violations.length).toBe(0)
|
||||||
|
expect(result.frameworkLeakViolations.length).toBe(0)
|
||||||
|
expect(result.entityExposureViolations.length).toBe(0)
|
||||||
|
expect(result.dependencyDirectionViolations.length).toBe(0)
|
||||||
|
expect(result.circularDependencyViolations.length).toBe(0)
|
||||||
|
expect(result.anemicModelViolations.length).toBe(0)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should have no dependency direction violations", async () => {
|
||||||
|
const rootDir = path.join(EXAMPLES_DIR, "good-architecture/dependency-direction")
|
||||||
|
|
||||||
|
const result = await analyzeProject({ rootDir })
|
||||||
|
|
||||||
|
const goodFiles = result.dependencyDirectionViolations.filter((v) =>
|
||||||
|
v.file.includes("Good"),
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(goodFiles.length).toBe(0)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should have no entity exposure in good controller", async () => {
|
||||||
|
const rootDir = path.join(EXAMPLES_DIR, "good-architecture/entity-exposure")
|
||||||
|
|
||||||
|
const result = await analyzeProject({ rootDir })
|
||||||
|
|
||||||
|
expect(result.entityExposureViolations.length).toBe(0)
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("Bad Architecture Examples", () => {
|
||||||
|
it("should detect hardcoded values in bad examples", async () => {
|
||||||
|
const rootDir = path.join(EXAMPLES_DIR, "bad-architecture/hardcoded")
|
||||||
|
|
||||||
|
const result = await analyzeProject({ rootDir })
|
||||||
|
|
||||||
|
expect(result.hardcodeViolations.length).toBeGreaterThan(0)
|
||||||
|
|
||||||
|
const magicNumbers = result.hardcodeViolations.filter((v) => v.type === "magic-number")
|
||||||
|
expect(magicNumbers.length).toBeGreaterThan(0)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should detect circular dependencies", async () => {
|
||||||
|
const rootDir = path.join(EXAMPLES_DIR, "bad-architecture/circular")
|
||||||
|
|
||||||
|
const result = await analyzeProject({ rootDir })
|
||||||
|
|
||||||
|
if (result.circularDependencyViolations.length > 0) {
|
||||||
|
const violation = result.circularDependencyViolations[0]
|
||||||
|
expect(violation.cycle).toBeDefined()
|
||||||
|
expect(violation.cycle.length).toBeGreaterThanOrEqual(2)
|
||||||
|
expect(violation.severity).toBe("critical")
|
||||||
|
}
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should detect framework leaks in domain", async () => {
|
||||||
|
const rootDir = path.join(EXAMPLES_DIR, "bad-architecture/framework-leaks")
|
||||||
|
|
||||||
|
const result = await analyzeProject({ rootDir })
|
||||||
|
|
||||||
|
if (result.frameworkLeakViolations.length > 0) {
|
||||||
|
const violation = result.frameworkLeakViolations[0]
|
||||||
|
expect(violation.packageName).toBeDefined()
|
||||||
|
expect(violation.severity).toBe("high")
|
||||||
|
}
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should detect naming convention violations", async () => {
|
||||||
|
const rootDir = path.join(EXAMPLES_DIR, "bad-architecture/naming")
|
||||||
|
|
||||||
|
const result = await analyzeProject({ rootDir })
|
||||||
|
|
||||||
|
if (result.namingViolations.length > 0) {
|
||||||
|
const violation = result.namingViolations[0]
|
||||||
|
expect(violation.expected).toBeDefined()
|
||||||
|
expect(violation.severity).toBe("medium")
|
||||||
|
}
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should detect entity exposure violations", async () => {
|
||||||
|
const rootDir = path.join(EXAMPLES_DIR, "bad-architecture/entity-exposure")
|
||||||
|
|
||||||
|
const result = await analyzeProject({ rootDir })
|
||||||
|
|
||||||
|
if (result.entityExposureViolations.length > 0) {
|
||||||
|
const violation = result.entityExposureViolations[0]
|
||||||
|
expect(violation.entityName).toBeDefined()
|
||||||
|
expect(violation.severity).toBe("high")
|
||||||
|
}
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should detect dependency direction violations", async () => {
|
||||||
|
const rootDir = path.join(EXAMPLES_DIR, "bad-architecture/dependency-direction")
|
||||||
|
|
||||||
|
const result = await analyzeProject({ rootDir })
|
||||||
|
|
||||||
|
if (result.dependencyDirectionViolations.length > 0) {
|
||||||
|
const violation = result.dependencyDirectionViolations[0]
|
||||||
|
expect(violation.fromLayer).toBeDefined()
|
||||||
|
expect(violation.toLayer).toBeDefined()
|
||||||
|
expect(violation.severity).toBe("high")
|
||||||
|
}
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should detect repository pattern violations", async () => {
|
||||||
|
const rootDir = path.join(EXAMPLES_DIR, "repository-pattern")
|
||||||
|
|
||||||
|
const result = await analyzeProject({ rootDir })
|
||||||
|
|
||||||
|
const badViolations = result.repositoryPatternViolations.filter((v) =>
|
||||||
|
v.file.includes("bad"),
|
||||||
|
)
|
||||||
|
|
||||||
|
if (badViolations.length > 0) {
|
||||||
|
const violation = badViolations[0]
|
||||||
|
expect(violation.violationType).toBeDefined()
|
||||||
|
expect(violation.severity).toBe("critical")
|
||||||
|
}
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should detect aggregate boundary violations", async () => {
|
||||||
|
const rootDir = path.join(EXAMPLES_DIR, "aggregate-boundary/bad")
|
||||||
|
|
||||||
|
const result = await analyzeProject({ rootDir })
|
||||||
|
|
||||||
|
if (result.aggregateBoundaryViolations.length > 0) {
|
||||||
|
const violation = result.aggregateBoundaryViolations[0]
|
||||||
|
expect(violation.fromAggregate).toBeDefined()
|
||||||
|
expect(violation.toAggregate).toBeDefined()
|
||||||
|
expect(violation.severity).toBe("critical")
|
||||||
|
}
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("Metrics", () => {
|
||||||
|
it("should provide accurate file counts", async () => {
|
||||||
|
const rootDir = path.join(EXAMPLES_DIR, "good-architecture")
|
||||||
|
|
||||||
|
const result = await analyzeProject({ rootDir })
|
||||||
|
|
||||||
|
expect(result.metrics.totalFiles).toBeGreaterThan(0)
|
||||||
|
expect(result.metrics.totalFunctions).toBeGreaterThanOrEqual(0)
|
||||||
|
expect(result.metrics.totalImports).toBeGreaterThanOrEqual(0)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should track layer distribution", async () => {
|
||||||
|
const rootDir = path.join(EXAMPLES_DIR, "good-architecture")
|
||||||
|
|
||||||
|
const result = await analyzeProject({ rootDir })
|
||||||
|
|
||||||
|
expect(result.metrics.layerDistribution).toBeDefined()
|
||||||
|
expect(typeof result.metrics.layerDistribution).toBe("object")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should calculate correct metrics for bad architecture", async () => {
|
||||||
|
const rootDir = path.join(EXAMPLES_DIR, "bad-architecture")
|
||||||
|
|
||||||
|
const result = await analyzeProject({ rootDir })
|
||||||
|
|
||||||
|
expect(result.metrics.totalFiles).toBeGreaterThan(0)
|
||||||
|
expect(result.metrics.totalFunctions).toBeGreaterThanOrEqual(0)
|
||||||
|
expect(result.metrics.totalImports).toBeGreaterThanOrEqual(0)
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("Dependency Graph", () => {
|
||||||
|
it("should build dependency graph for analyzed files", async () => {
|
||||||
|
const rootDir = path.join(EXAMPLES_DIR, "good-architecture")
|
||||||
|
|
||||||
|
const result = await analyzeProject({ rootDir })
|
||||||
|
|
||||||
|
expect(result.dependencyGraph).toBeDefined()
|
||||||
|
expect(result.files).toBeDefined()
|
||||||
|
expect(Array.isArray(result.files)).toBe(true)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should track file metadata", async () => {
|
||||||
|
const rootDir = path.join(EXAMPLES_DIR, "good-architecture")
|
||||||
|
|
||||||
|
const result = await analyzeProject({ rootDir })
|
||||||
|
|
||||||
|
if (result.files.length > 0) {
|
||||||
|
const file = result.files[0]
|
||||||
|
expect(file).toHaveProperty("path")
|
||||||
|
}
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("Error Handling", () => {
|
||||||
|
it("should handle non-existent directory", async () => {
|
||||||
|
const rootDir = path.join(EXAMPLES_DIR, "non-existent-directory")
|
||||||
|
|
||||||
|
await expect(analyzeProject({ rootDir })).rejects.toThrow()
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should handle empty directory gracefully", async () => {
|
||||||
|
const rootDir = path.join(__dirname, "../../dist")
|
||||||
|
|
||||||
|
const result = await analyzeProject({ rootDir })
|
||||||
|
|
||||||
|
expect(result).toBeDefined()
|
||||||
|
expect(result.metrics.totalFiles).toBeGreaterThanOrEqual(0)
|
||||||
|
})
|
||||||
|
})
|
||||||
|
})
|
||||||
278
packages/guardian/tests/e2e/CLI.e2e.test.ts
Normal file
278
packages/guardian/tests/e2e/CLI.e2e.test.ts
Normal file
@@ -0,0 +1,278 @@
|
|||||||
|
import { describe, it, expect, beforeAll } from "vitest"
|
||||||
|
import { spawn } from "child_process"
|
||||||
|
import path from "path"
|
||||||
|
import { promisify } from "util"
|
||||||
|
import { exec } from "child_process"
|
||||||
|
|
||||||
|
const execAsync = promisify(exec)
|
||||||
|
|
||||||
|
describe("CLI E2E", () => {
|
||||||
|
const CLI_PATH = path.join(__dirname, "../../bin/guardian.js")
|
||||||
|
const EXAMPLES_DIR = path.join(__dirname, "../../examples")
|
||||||
|
|
||||||
|
beforeAll(async () => {
|
||||||
|
await execAsync("pnpm build", {
|
||||||
|
cwd: path.join(__dirname, "../../"),
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
const runCLI = async (
|
||||||
|
args: string,
|
||||||
|
): Promise<{ stdout: string; stderr: string; exitCode: number }> => {
|
||||||
|
try {
|
||||||
|
const { stdout, stderr } = await execAsync(`node ${CLI_PATH} ${args}`)
|
||||||
|
return { stdout, stderr, exitCode: 0 }
|
||||||
|
} catch (error: unknown) {
|
||||||
|
const err = error as { stdout?: string; stderr?: string; code?: number }
|
||||||
|
return {
|
||||||
|
stdout: err.stdout || "",
|
||||||
|
stderr: err.stderr || "",
|
||||||
|
exitCode: err.code || 1,
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
describe("Smoke Tests", () => {
|
||||||
|
it("should display version", async () => {
|
||||||
|
const { stdout } = await execAsync(`node ${CLI_PATH} --version`)
|
||||||
|
|
||||||
|
expect(stdout).toMatch(/\d+\.\d+\.\d+/)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should display help", async () => {
|
||||||
|
const { stdout } = await execAsync(`node ${CLI_PATH} --help`)
|
||||||
|
|
||||||
|
expect(stdout).toContain("Usage:")
|
||||||
|
expect(stdout).toContain("check")
|
||||||
|
expect(stdout).toContain("Options:")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should run check command successfully", async () => {
|
||||||
|
const goodArchDir = path.join(EXAMPLES_DIR, "good-architecture")
|
||||||
|
|
||||||
|
const { stdout } = await runCLI(`check ${goodArchDir}`)
|
||||||
|
|
||||||
|
expect(stdout).toContain("Analyzing")
|
||||||
|
}, 30000)
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("Output Format", () => {
|
||||||
|
it("should display violation counts", async () => {
|
||||||
|
const badArchDir = path.join(EXAMPLES_DIR, "bad-architecture")
|
||||||
|
|
||||||
|
const { stdout } = await runCLI(`check ${badArchDir}`)
|
||||||
|
|
||||||
|
expect(stdout).toContain("Analyzing")
|
||||||
|
const hasViolationCount = stdout.includes("Found") || stdout.includes("issue")
|
||||||
|
expect(hasViolationCount).toBe(true)
|
||||||
|
}, 30000)
|
||||||
|
|
||||||
|
it("should display file paths with violations", async () => {
|
||||||
|
const badArchDir = path.join(EXAMPLES_DIR, "bad-architecture/hardcoded")
|
||||||
|
|
||||||
|
const { stdout } = await runCLI(`check ${badArchDir}`)
|
||||||
|
|
||||||
|
expect(stdout).toMatch(/\.ts/)
|
||||||
|
}, 30000)
|
||||||
|
|
||||||
|
it("should display severity levels", async () => {
|
||||||
|
const badArchDir = path.join(EXAMPLES_DIR, "bad-architecture")
|
||||||
|
|
||||||
|
const { stdout } = await runCLI(`check ${badArchDir}`)
|
||||||
|
|
||||||
|
const hasSeverity =
|
||||||
|
stdout.includes("🔴") ||
|
||||||
|
stdout.includes("🟠") ||
|
||||||
|
stdout.includes("🟡") ||
|
||||||
|
stdout.includes("🟢") ||
|
||||||
|
stdout.includes("CRITICAL") ||
|
||||||
|
stdout.includes("HIGH") ||
|
||||||
|
stdout.includes("MEDIUM") ||
|
||||||
|
stdout.includes("LOW")
|
||||||
|
|
||||||
|
expect(hasSeverity).toBe(true)
|
||||||
|
}, 30000)
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("CLI Options", () => {
|
||||||
|
it("should respect --limit option", async () => {
|
||||||
|
const badArchDir = path.join(EXAMPLES_DIR, "bad-architecture")
|
||||||
|
|
||||||
|
const { stdout } = await runCLI(`check ${badArchDir} --limit 5`)
|
||||||
|
|
||||||
|
expect(stdout).toContain("Analyzing")
|
||||||
|
}, 30000)
|
||||||
|
|
||||||
|
it("should respect --only-critical option", async () => {
|
||||||
|
const badArchDir = path.join(EXAMPLES_DIR, "bad-architecture")
|
||||||
|
|
||||||
|
const { stdout } = await runCLI(`check ${badArchDir} --only-critical`)
|
||||||
|
|
||||||
|
expect(stdout).toContain("Analyzing")
|
||||||
|
|
||||||
|
if (stdout.includes("🔴") || stdout.includes("CRITICAL")) {
|
||||||
|
const hasNonCritical =
|
||||||
|
stdout.includes("🟠") ||
|
||||||
|
stdout.includes("🟡") ||
|
||||||
|
stdout.includes("🟢") ||
|
||||||
|
(stdout.includes("HIGH") && !stdout.includes("CRITICAL")) ||
|
||||||
|
stdout.includes("MEDIUM") ||
|
||||||
|
stdout.includes("LOW")
|
||||||
|
|
||||||
|
expect(hasNonCritical).toBe(false)
|
||||||
|
}
|
||||||
|
}, 30000)
|
||||||
|
|
||||||
|
it("should respect --min-severity option", async () => {
|
||||||
|
const badArchDir = path.join(EXAMPLES_DIR, "bad-architecture")
|
||||||
|
|
||||||
|
const { stdout } = await runCLI(`check ${badArchDir} --min-severity high`)
|
||||||
|
|
||||||
|
expect(stdout).toContain("Analyzing")
|
||||||
|
}, 30000)
|
||||||
|
|
||||||
|
it("should respect --exclude option", async () => {
|
||||||
|
const goodArchDir = path.join(EXAMPLES_DIR, "good-architecture")
|
||||||
|
|
||||||
|
const { stdout } = await runCLI(`check ${goodArchDir} --exclude "**/dtos/**"`)
|
||||||
|
|
||||||
|
expect(stdout).not.toContain("/dtos/")
|
||||||
|
}, 30000)
|
||||||
|
|
||||||
|
it("should respect --no-hardcode option", async () => {
|
||||||
|
const badArchDir = path.join(EXAMPLES_DIR, "bad-architecture")
|
||||||
|
|
||||||
|
const { stdout } = await runCLI(`check ${badArchDir} --no-hardcode`)
|
||||||
|
|
||||||
|
expect(stdout).not.toContain("Magic Number")
|
||||||
|
expect(stdout).not.toContain("Magic String")
|
||||||
|
}, 30000)
|
||||||
|
|
||||||
|
it("should respect --no-architecture option", async () => {
|
||||||
|
const badArchDir = path.join(EXAMPLES_DIR, "bad-architecture")
|
||||||
|
|
||||||
|
const { stdout } = await runCLI(`check ${badArchDir} --no-architecture`)
|
||||||
|
|
||||||
|
expect(stdout).not.toContain("Architecture Violation")
|
||||||
|
}, 30000)
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("Good Architecture Examples", () => {
|
||||||
|
it("should show success message for clean code", async () => {
|
||||||
|
const goodArchDir = path.join(EXAMPLES_DIR, "good-architecture")
|
||||||
|
|
||||||
|
const { stdout } = await runCLI(`check ${goodArchDir}`)
|
||||||
|
|
||||||
|
expect(stdout).toContain("Analyzing")
|
||||||
|
}, 30000)
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("Bad Architecture Examples", () => {
|
||||||
|
it("should detect and report hardcoded values", async () => {
|
||||||
|
const hardcodedDir = path.join(EXAMPLES_DIR, "bad-architecture/hardcoded")
|
||||||
|
|
||||||
|
const { stdout } = await runCLI(`check ${hardcodedDir}`)
|
||||||
|
|
||||||
|
expect(stdout).toContain("ServerWithMagicNumbers.ts")
|
||||||
|
}, 30000)
|
||||||
|
|
||||||
|
it("should detect and report circular dependencies", async () => {
|
||||||
|
const circularDir = path.join(EXAMPLES_DIR, "bad-architecture/circular")
|
||||||
|
|
||||||
|
const { stdout } = await runCLI(`check ${circularDir}`)
|
||||||
|
|
||||||
|
expect(stdout).toContain("Analyzing")
|
||||||
|
}, 30000)
|
||||||
|
|
||||||
|
it("should detect and report framework leaks", async () => {
|
||||||
|
const frameworkDir = path.join(EXAMPLES_DIR, "bad-architecture/framework-leaks")
|
||||||
|
|
||||||
|
const { stdout } = await runCLI(`check ${frameworkDir}`)
|
||||||
|
|
||||||
|
expect(stdout).toContain("Analyzing")
|
||||||
|
}, 30000)
|
||||||
|
|
||||||
|
it("should detect and report naming violations", async () => {
|
||||||
|
const namingDir = path.join(EXAMPLES_DIR, "bad-architecture/naming")
|
||||||
|
|
||||||
|
const { stdout } = await runCLI(`check ${namingDir}`)
|
||||||
|
|
||||||
|
expect(stdout).toContain("Analyzing")
|
||||||
|
}, 30000)
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("Error Handling", () => {
|
||||||
|
it("should show error for non-existent path", async () => {
|
||||||
|
const nonExistentPath = path.join(EXAMPLES_DIR, "non-existent-directory")
|
||||||
|
|
||||||
|
try {
|
||||||
|
await execAsync(`node ${CLI_PATH} check ${nonExistentPath}`)
|
||||||
|
expect.fail("Should have thrown an error")
|
||||||
|
} catch (error: unknown) {
|
||||||
|
const err = error as { stderr: string }
|
||||||
|
expect(err.stderr).toBeTruthy()
|
||||||
|
}
|
||||||
|
}, 30000)
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("Exit Codes", () => {
|
||||||
|
it("should run for clean code", async () => {
|
||||||
|
const goodArchDir = path.join(EXAMPLES_DIR, "good-architecture")
|
||||||
|
|
||||||
|
const { stdout, exitCode } = await runCLI(`check ${goodArchDir}`)
|
||||||
|
|
||||||
|
expect(stdout).toContain("Analyzing")
|
||||||
|
expect(exitCode).toBeGreaterThanOrEqual(0)
|
||||||
|
}, 30000)
|
||||||
|
|
||||||
|
it("should handle violations gracefully", async () => {
|
||||||
|
const badArchDir = path.join(EXAMPLES_DIR, "bad-architecture")
|
||||||
|
|
||||||
|
const { stdout, exitCode } = await runCLI(`check ${badArchDir}`)
|
||||||
|
|
||||||
|
expect(stdout).toContain("Analyzing")
|
||||||
|
expect(exitCode).toBeGreaterThanOrEqual(0)
|
||||||
|
}, 30000)
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("Spawn Process Tests", () => {
|
||||||
|
it("should spawn CLI process and capture output", (done) => {
|
||||||
|
const goodArchDir = path.join(EXAMPLES_DIR, "good-architecture")
|
||||||
|
const child = spawn("node", [CLI_PATH, "check", goodArchDir])
|
||||||
|
|
||||||
|
let stdout = ""
|
||||||
|
let stderr = ""
|
||||||
|
|
||||||
|
child.stdout.on("data", (data) => {
|
||||||
|
stdout += data.toString()
|
||||||
|
})
|
||||||
|
|
||||||
|
child.stderr.on("data", (data) => {
|
||||||
|
stderr += data.toString()
|
||||||
|
})
|
||||||
|
|
||||||
|
child.on("close", (code) => {
|
||||||
|
expect(code).toBe(0)
|
||||||
|
expect(stdout).toContain("Analyzing")
|
||||||
|
done()
|
||||||
|
})
|
||||||
|
}, 30000)
|
||||||
|
|
||||||
|
it("should handle large output without buffering issues", (done) => {
|
||||||
|
const badArchDir = path.join(EXAMPLES_DIR, "bad-architecture")
|
||||||
|
const child = spawn("node", [CLI_PATH, "check", badArchDir])
|
||||||
|
|
||||||
|
let stdout = ""
|
||||||
|
|
||||||
|
child.stdout.on("data", (data) => {
|
||||||
|
stdout += data.toString()
|
||||||
|
})
|
||||||
|
|
||||||
|
child.on("close", (code) => {
|
||||||
|
expect(code).toBe(0)
|
||||||
|
expect(stdout.length).toBeGreaterThan(0)
|
||||||
|
done()
|
||||||
|
})
|
||||||
|
}, 30000)
|
||||||
|
})
|
||||||
|
})
|
||||||
412
packages/guardian/tests/e2e/JSONOutput.e2e.test.ts
Normal file
412
packages/guardian/tests/e2e/JSONOutput.e2e.test.ts
Normal file
@@ -0,0 +1,412 @@
|
|||||||
|
import { describe, it, expect } from "vitest"
|
||||||
|
import { analyzeProject } from "../../src/api"
|
||||||
|
import path from "path"
|
||||||
|
import type {
|
||||||
|
AnalyzeProjectResponse,
|
||||||
|
HardcodeViolation,
|
||||||
|
CircularDependencyViolation,
|
||||||
|
NamingConventionViolation,
|
||||||
|
FrameworkLeakViolation,
|
||||||
|
EntityExposureViolation,
|
||||||
|
DependencyDirectionViolation,
|
||||||
|
RepositoryPatternViolation,
|
||||||
|
AggregateBoundaryViolation,
|
||||||
|
} from "../../src/api"
|
||||||
|
|
||||||
|
describe("JSON Output Format E2E", () => {
|
||||||
|
const EXAMPLES_DIR = path.join(__dirname, "../../examples")
|
||||||
|
|
||||||
|
describe("Response Structure", () => {
|
||||||
|
it("should return valid JSON structure", async () => {
|
||||||
|
const rootDir = path.join(EXAMPLES_DIR, "good-architecture")
|
||||||
|
|
||||||
|
const result = await analyzeProject({ rootDir })
|
||||||
|
|
||||||
|
expect(result).toBeDefined()
|
||||||
|
expect(typeof result).toBe("object")
|
||||||
|
|
||||||
|
const json = JSON.stringify(result)
|
||||||
|
expect(() => JSON.parse(json)).not.toThrow()
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should include all required top-level fields", async () => {
|
||||||
|
const rootDir = path.join(EXAMPLES_DIR, "good-architecture")
|
||||||
|
|
||||||
|
const result: AnalyzeProjectResponse = await analyzeProject({ rootDir })
|
||||||
|
|
||||||
|
expect(result).toHaveProperty("hardcodeViolations")
|
||||||
|
expect(result).toHaveProperty("violations")
|
||||||
|
expect(result).toHaveProperty("circularDependencyViolations")
|
||||||
|
expect(result).toHaveProperty("namingViolations")
|
||||||
|
expect(result).toHaveProperty("frameworkLeakViolations")
|
||||||
|
expect(result).toHaveProperty("entityExposureViolations")
|
||||||
|
expect(result).toHaveProperty("dependencyDirectionViolations")
|
||||||
|
expect(result).toHaveProperty("repositoryPatternViolations")
|
||||||
|
expect(result).toHaveProperty("aggregateBoundaryViolations")
|
||||||
|
expect(result).toHaveProperty("metrics")
|
||||||
|
expect(result).toHaveProperty("dependencyGraph")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should have correct types for all fields", async () => {
|
||||||
|
const rootDir = path.join(EXAMPLES_DIR, "good-architecture")
|
||||||
|
|
||||||
|
const result = await analyzeProject({ rootDir })
|
||||||
|
|
||||||
|
expect(Array.isArray(result.hardcodeViolations)).toBe(true)
|
||||||
|
expect(Array.isArray(result.violations)).toBe(true)
|
||||||
|
expect(Array.isArray(result.circularDependencyViolations)).toBe(true)
|
||||||
|
expect(Array.isArray(result.namingViolations)).toBe(true)
|
||||||
|
expect(Array.isArray(result.frameworkLeakViolations)).toBe(true)
|
||||||
|
expect(Array.isArray(result.entityExposureViolations)).toBe(true)
|
||||||
|
expect(Array.isArray(result.dependencyDirectionViolations)).toBe(true)
|
||||||
|
expect(Array.isArray(result.repositoryPatternViolations)).toBe(true)
|
||||||
|
expect(Array.isArray(result.aggregateBoundaryViolations)).toBe(true)
|
||||||
|
expect(typeof result.metrics).toBe("object")
|
||||||
|
expect(typeof result.dependencyGraph).toBe("object")
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("Metrics Structure", () => {
|
||||||
|
it("should include all metric fields", async () => {
|
||||||
|
const rootDir = path.join(EXAMPLES_DIR, "good-architecture")
|
||||||
|
|
||||||
|
const result = await analyzeProject({ rootDir })
|
||||||
|
const { metrics } = result
|
||||||
|
|
||||||
|
expect(metrics).toHaveProperty("totalFiles")
|
||||||
|
expect(metrics).toHaveProperty("totalFunctions")
|
||||||
|
expect(metrics).toHaveProperty("totalImports")
|
||||||
|
expect(metrics).toHaveProperty("layerDistribution")
|
||||||
|
|
||||||
|
expect(typeof metrics.totalFiles).toBe("number")
|
||||||
|
expect(typeof metrics.totalFunctions).toBe("number")
|
||||||
|
expect(typeof metrics.totalImports).toBe("number")
|
||||||
|
expect(typeof metrics.layerDistribution).toBe("object")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should have non-negative metric values", async () => {
|
||||||
|
const rootDir = path.join(EXAMPLES_DIR, "good-architecture")
|
||||||
|
|
||||||
|
const result = await analyzeProject({ rootDir })
|
||||||
|
const { metrics } = result
|
||||||
|
|
||||||
|
expect(metrics.totalFiles).toBeGreaterThanOrEqual(0)
|
||||||
|
expect(metrics.totalFunctions).toBeGreaterThanOrEqual(0)
|
||||||
|
expect(metrics.totalImports).toBeGreaterThanOrEqual(0)
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("Hardcode Violation Structure", () => {
|
||||||
|
it("should have correct structure for hardcode violations", async () => {
|
||||||
|
const rootDir = path.join(EXAMPLES_DIR, "bad-architecture/hardcoded")
|
||||||
|
|
||||||
|
const result = await analyzeProject({ rootDir })
|
||||||
|
|
||||||
|
if (result.hardcodeViolations.length > 0) {
|
||||||
|
const violation: HardcodeViolation = result.hardcodeViolations[0]
|
||||||
|
|
||||||
|
expect(violation).toHaveProperty("file")
|
||||||
|
expect(violation).toHaveProperty("line")
|
||||||
|
expect(violation).toHaveProperty("column")
|
||||||
|
expect(violation).toHaveProperty("type")
|
||||||
|
expect(violation).toHaveProperty("value")
|
||||||
|
expect(violation).toHaveProperty("context")
|
||||||
|
expect(violation).toHaveProperty("suggestion")
|
||||||
|
expect(violation).toHaveProperty("severity")
|
||||||
|
|
||||||
|
expect(typeof violation.file).toBe("string")
|
||||||
|
expect(typeof violation.line).toBe("number")
|
||||||
|
expect(typeof violation.column).toBe("number")
|
||||||
|
expect(typeof violation.type).toBe("string")
|
||||||
|
expect(typeof violation.context).toBe("string")
|
||||||
|
expect(typeof violation.severity).toBe("string")
|
||||||
|
}
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("Circular Dependency Violation Structure", () => {
|
||||||
|
it("should have correct structure for circular dependency violations", async () => {
|
||||||
|
const rootDir = path.join(EXAMPLES_DIR, "bad-architecture/circular")
|
||||||
|
|
||||||
|
const result = await analyzeProject({ rootDir })
|
||||||
|
|
||||||
|
if (result.circularDependencyViolations.length > 0) {
|
||||||
|
const violation: CircularDependencyViolation =
|
||||||
|
result.circularDependencyViolations[0]
|
||||||
|
|
||||||
|
expect(violation).toHaveProperty("file")
|
||||||
|
expect(violation).toHaveProperty("cycle")
|
||||||
|
expect(violation).toHaveProperty("severity")
|
||||||
|
|
||||||
|
expect(typeof violation.file).toBe("string")
|
||||||
|
expect(Array.isArray(violation.cycle)).toBe(true)
|
||||||
|
expect(violation.cycle.length).toBeGreaterThanOrEqual(2)
|
||||||
|
expect(typeof violation.severity).toBe("string")
|
||||||
|
expect(violation.severity).toBe("critical")
|
||||||
|
}
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("Naming Convention Violation Structure", () => {
|
||||||
|
it("should have correct structure for naming violations", async () => {
|
||||||
|
const rootDir = path.join(EXAMPLES_DIR, "bad-architecture/naming")
|
||||||
|
|
||||||
|
const result = await analyzeProject({ rootDir })
|
||||||
|
|
||||||
|
if (result.namingViolations.length > 0) {
|
||||||
|
const violation: NamingConventionViolation = result.namingViolations[0]
|
||||||
|
|
||||||
|
expect(violation).toHaveProperty("file")
|
||||||
|
expect(violation).toHaveProperty("fileName")
|
||||||
|
expect(violation).toHaveProperty("expected")
|
||||||
|
expect(violation).toHaveProperty("actual")
|
||||||
|
expect(violation).toHaveProperty("layer")
|
||||||
|
expect(violation).toHaveProperty("message")
|
||||||
|
expect(violation).toHaveProperty("severity")
|
||||||
|
|
||||||
|
expect(typeof violation.file).toBe("string")
|
||||||
|
expect(typeof violation.fileName).toBe("string")
|
||||||
|
expect(typeof violation.expected).toBe("string")
|
||||||
|
expect(typeof violation.actual).toBe("string")
|
||||||
|
expect(typeof violation.layer).toBe("string")
|
||||||
|
expect(typeof violation.message).toBe("string")
|
||||||
|
expect(typeof violation.severity).toBe("string")
|
||||||
|
}
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("Framework Leak Violation Structure", () => {
|
||||||
|
it("should have correct structure for framework leak violations", async () => {
|
||||||
|
const rootDir = path.join(EXAMPLES_DIR, "bad-architecture/framework-leaks")
|
||||||
|
|
||||||
|
const result = await analyzeProject({ rootDir })
|
||||||
|
|
||||||
|
if (result.frameworkLeakViolations.length > 0) {
|
||||||
|
const violation: FrameworkLeakViolation = result.frameworkLeakViolations[0]
|
||||||
|
|
||||||
|
expect(violation).toHaveProperty("file")
|
||||||
|
expect(violation).toHaveProperty("packageName")
|
||||||
|
expect(violation).toHaveProperty("category")
|
||||||
|
expect(violation).toHaveProperty("categoryDescription")
|
||||||
|
expect(violation).toHaveProperty("layer")
|
||||||
|
expect(violation).toHaveProperty("message")
|
||||||
|
expect(violation).toHaveProperty("suggestion")
|
||||||
|
expect(violation).toHaveProperty("severity")
|
||||||
|
|
||||||
|
expect(typeof violation.file).toBe("string")
|
||||||
|
expect(typeof violation.packageName).toBe("string")
|
||||||
|
expect(typeof violation.category).toBe("string")
|
||||||
|
expect(typeof violation.categoryDescription).toBe("string")
|
||||||
|
expect(typeof violation.layer).toBe("string")
|
||||||
|
expect(typeof violation.message).toBe("string")
|
||||||
|
expect(typeof violation.suggestion).toBe("string")
|
||||||
|
expect(typeof violation.severity).toBe("string")
|
||||||
|
}
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("Entity Exposure Violation Structure", () => {
|
||||||
|
it("should have correct structure for entity exposure violations", async () => {
|
||||||
|
const rootDir = path.join(EXAMPLES_DIR, "bad-architecture/entity-exposure")
|
||||||
|
|
||||||
|
const result = await analyzeProject({ rootDir })
|
||||||
|
|
||||||
|
if (result.entityExposureViolations.length > 0) {
|
||||||
|
const violation: EntityExposureViolation = result.entityExposureViolations[0]
|
||||||
|
|
||||||
|
expect(violation).toHaveProperty("file")
|
||||||
|
expect(violation).toHaveProperty("entityName")
|
||||||
|
expect(violation).toHaveProperty("returnType")
|
||||||
|
expect(violation).toHaveProperty("suggestion")
|
||||||
|
expect(violation).toHaveProperty("severity")
|
||||||
|
|
||||||
|
expect(typeof violation.file).toBe("string")
|
||||||
|
expect(typeof violation.entityName).toBe("string")
|
||||||
|
expect(typeof violation.returnType).toBe("string")
|
||||||
|
expect(typeof violation.suggestion).toBe("string")
|
||||||
|
expect(typeof violation.severity).toBe("string")
|
||||||
|
}
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("Dependency Direction Violation Structure", () => {
|
||||||
|
it("should have correct structure for dependency direction violations", async () => {
|
||||||
|
const rootDir = path.join(EXAMPLES_DIR, "bad-architecture/dependency-direction")
|
||||||
|
|
||||||
|
const result = await analyzeProject({ rootDir })
|
||||||
|
|
||||||
|
if (result.dependencyDirectionViolations.length > 0) {
|
||||||
|
const violation: DependencyDirectionViolation =
|
||||||
|
result.dependencyDirectionViolations[0]
|
||||||
|
|
||||||
|
expect(violation).toHaveProperty("file")
|
||||||
|
expect(violation).toHaveProperty("fromLayer")
|
||||||
|
expect(violation).toHaveProperty("toLayer")
|
||||||
|
expect(violation).toHaveProperty("importPath")
|
||||||
|
expect(violation).toHaveProperty("suggestion")
|
||||||
|
expect(violation).toHaveProperty("severity")
|
||||||
|
|
||||||
|
expect(typeof violation.file).toBe("string")
|
||||||
|
expect(typeof violation.fromLayer).toBe("string")
|
||||||
|
expect(typeof violation.toLayer).toBe("string")
|
||||||
|
expect(typeof violation.importPath).toBe("string")
|
||||||
|
expect(typeof violation.suggestion).toBe("string")
|
||||||
|
expect(typeof violation.severity).toBe("string")
|
||||||
|
}
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("Repository Pattern Violation Structure", () => {
|
||||||
|
it("should have correct structure for repository pattern violations", async () => {
|
||||||
|
const rootDir = path.join(EXAMPLES_DIR, "repository-pattern")
|
||||||
|
|
||||||
|
const result = await analyzeProject({ rootDir })
|
||||||
|
|
||||||
|
const badViolations = result.repositoryPatternViolations.filter((v) =>
|
||||||
|
v.file.includes("bad"),
|
||||||
|
)
|
||||||
|
|
||||||
|
if (badViolations.length > 0) {
|
||||||
|
const violation: RepositoryPatternViolation = badViolations[0]
|
||||||
|
|
||||||
|
expect(violation).toHaveProperty("file")
|
||||||
|
expect(violation).toHaveProperty("line")
|
||||||
|
expect(violation).toHaveProperty("violationType")
|
||||||
|
expect(violation).toHaveProperty("details")
|
||||||
|
expect(violation).toHaveProperty("suggestion")
|
||||||
|
expect(violation).toHaveProperty("severity")
|
||||||
|
|
||||||
|
expect(typeof violation.file).toBe("string")
|
||||||
|
expect(typeof violation.line).toBe("number")
|
||||||
|
expect(typeof violation.violationType).toBe("string")
|
||||||
|
expect(typeof violation.details).toBe("string")
|
||||||
|
expect(typeof violation.suggestion).toBe("string")
|
||||||
|
expect(typeof violation.severity).toBe("string")
|
||||||
|
}
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("Aggregate Boundary Violation Structure", () => {
|
||||||
|
it("should have correct structure for aggregate boundary violations", async () => {
|
||||||
|
const rootDir = path.join(EXAMPLES_DIR, "aggregate-boundary/bad")
|
||||||
|
|
||||||
|
const result = await analyzeProject({ rootDir })
|
||||||
|
|
||||||
|
if (result.aggregateBoundaryViolations.length > 0) {
|
||||||
|
const violation: AggregateBoundaryViolation = result.aggregateBoundaryViolations[0]
|
||||||
|
|
||||||
|
expect(violation).toHaveProperty("file")
|
||||||
|
expect(violation).toHaveProperty("fromAggregate")
|
||||||
|
expect(violation).toHaveProperty("toAggregate")
|
||||||
|
expect(violation).toHaveProperty("entityName")
|
||||||
|
expect(violation).toHaveProperty("importPath")
|
||||||
|
expect(violation).toHaveProperty("suggestion")
|
||||||
|
expect(violation).toHaveProperty("severity")
|
||||||
|
|
||||||
|
expect(typeof violation.file).toBe("string")
|
||||||
|
expect(typeof violation.fromAggregate).toBe("string")
|
||||||
|
expect(typeof violation.toAggregate).toBe("string")
|
||||||
|
expect(typeof violation.entityName).toBe("string")
|
||||||
|
expect(typeof violation.importPath).toBe("string")
|
||||||
|
expect(typeof violation.suggestion).toBe("string")
|
||||||
|
expect(typeof violation.severity).toBe("string")
|
||||||
|
}
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("Dependency Graph Structure", () => {
|
||||||
|
it("should have dependency graph object", async () => {
|
||||||
|
const rootDir = path.join(EXAMPLES_DIR, "good-architecture")
|
||||||
|
|
||||||
|
const result = await analyzeProject({ rootDir })
|
||||||
|
const { dependencyGraph } = result
|
||||||
|
|
||||||
|
expect(dependencyGraph).toBeDefined()
|
||||||
|
expect(typeof dependencyGraph).toBe("object")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should have getAllNodes method on dependency graph", async () => {
|
||||||
|
const rootDir = path.join(EXAMPLES_DIR, "good-architecture")
|
||||||
|
|
||||||
|
const result = await analyzeProject({ rootDir })
|
||||||
|
const { dependencyGraph } = result
|
||||||
|
|
||||||
|
expect(typeof dependencyGraph.getAllNodes).toBe("function")
|
||||||
|
const nodes = dependencyGraph.getAllNodes()
|
||||||
|
expect(Array.isArray(nodes)).toBe(true)
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("JSON Serialization", () => {
|
||||||
|
it("should serialize metrics without data loss", async () => {
|
||||||
|
const rootDir = path.join(EXAMPLES_DIR, "good-architecture")
|
||||||
|
|
||||||
|
const result = await analyzeProject({ rootDir })
|
||||||
|
|
||||||
|
const json = JSON.stringify(result.metrics)
|
||||||
|
const parsed = JSON.parse(json)
|
||||||
|
|
||||||
|
expect(parsed.totalFiles).toBe(result.metrics.totalFiles)
|
||||||
|
expect(parsed.totalFunctions).toBe(result.metrics.totalFunctions)
|
||||||
|
expect(parsed.totalImports).toBe(result.metrics.totalImports)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should serialize violations without data loss", async () => {
|
||||||
|
const rootDir = path.join(EXAMPLES_DIR, "good-architecture")
|
||||||
|
|
||||||
|
const result = await analyzeProject({ rootDir })
|
||||||
|
|
||||||
|
const json = JSON.stringify({
|
||||||
|
hardcodeViolations: result.hardcodeViolations,
|
||||||
|
violations: result.violations,
|
||||||
|
})
|
||||||
|
const parsed = JSON.parse(json)
|
||||||
|
|
||||||
|
expect(Array.isArray(parsed.violations)).toBe(true)
|
||||||
|
expect(Array.isArray(parsed.hardcodeViolations)).toBe(true)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should serialize violation arrays for large results", async () => {
|
||||||
|
const rootDir = path.join(EXAMPLES_DIR, "bad-architecture")
|
||||||
|
|
||||||
|
const result = await analyzeProject({ rootDir })
|
||||||
|
|
||||||
|
const json = JSON.stringify({
|
||||||
|
hardcodeViolations: result.hardcodeViolations,
|
||||||
|
violations: result.violations,
|
||||||
|
namingViolations: result.namingViolations,
|
||||||
|
})
|
||||||
|
|
||||||
|
expect(json.length).toBeGreaterThan(0)
|
||||||
|
expect(() => JSON.parse(json)).not.toThrow()
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("Severity Levels", () => {
|
||||||
|
it("should only contain valid severity levels", async () => {
|
||||||
|
const rootDir = path.join(EXAMPLES_DIR, "bad-architecture")
|
||||||
|
|
||||||
|
const result = await analyzeProject({ rootDir })
|
||||||
|
|
||||||
|
const validSeverities = ["critical", "high", "medium", "low"]
|
||||||
|
|
||||||
|
const allViolations = [
|
||||||
|
...result.hardcodeViolations,
|
||||||
|
...result.violations,
|
||||||
|
...result.circularDependencyViolations,
|
||||||
|
...result.namingViolations,
|
||||||
|
...result.frameworkLeakViolations,
|
||||||
|
...result.entityExposureViolations,
|
||||||
|
...result.dependencyDirectionViolations,
|
||||||
|
...result.repositoryPatternViolations,
|
||||||
|
...result.aggregateBoundaryViolations,
|
||||||
|
]
|
||||||
|
|
||||||
|
allViolations.forEach((violation) => {
|
||||||
|
if ("severity" in violation) {
|
||||||
|
expect(validSeverities).toContain(violation.severity)
|
||||||
|
}
|
||||||
|
})
|
||||||
|
})
|
||||||
|
})
|
||||||
|
})
|
||||||
308
packages/guardian/tests/unit/domain/ProjectPath.test.ts
Normal file
308
packages/guardian/tests/unit/domain/ProjectPath.test.ts
Normal file
@@ -0,0 +1,308 @@
|
|||||||
|
import { describe, it, expect } from "vitest"
|
||||||
|
import { ProjectPath } from "../../../src/domain/value-objects/ProjectPath"
|
||||||
|
|
||||||
|
describe("ProjectPath", () => {
|
||||||
|
describe("create", () => {
|
||||||
|
it("should create a ProjectPath with absolute and relative paths", () => {
|
||||||
|
const absolutePath = "/Users/dev/project/src/domain/User.ts"
|
||||||
|
const projectRoot = "/Users/dev/project"
|
||||||
|
|
||||||
|
const projectPath = ProjectPath.create(absolutePath, projectRoot)
|
||||||
|
|
||||||
|
expect(projectPath.absolute).toBe(absolutePath)
|
||||||
|
expect(projectPath.relative).toBe("src/domain/User.ts")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should handle paths with same directory", () => {
|
||||||
|
const absolutePath = "/Users/dev/project/User.ts"
|
||||||
|
const projectRoot = "/Users/dev/project"
|
||||||
|
|
||||||
|
const projectPath = ProjectPath.create(absolutePath, projectRoot)
|
||||||
|
|
||||||
|
expect(projectPath.absolute).toBe(absolutePath)
|
||||||
|
expect(projectPath.relative).toBe("User.ts")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should handle nested directory structures", () => {
|
||||||
|
const absolutePath = "/Users/dev/project/src/domain/entities/user/User.ts"
|
||||||
|
const projectRoot = "/Users/dev/project"
|
||||||
|
|
||||||
|
const projectPath = ProjectPath.create(absolutePath, projectRoot)
|
||||||
|
|
||||||
|
expect(projectPath.absolute).toBe(absolutePath)
|
||||||
|
expect(projectPath.relative).toBe("src/domain/entities/user/User.ts")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should handle Windows-style paths", () => {
|
||||||
|
const absolutePath = "C:\\Users\\dev\\project\\src\\domain\\User.ts"
|
||||||
|
const projectRoot = "C:\\Users\\dev\\project"
|
||||||
|
|
||||||
|
const projectPath = ProjectPath.create(absolutePath, projectRoot)
|
||||||
|
|
||||||
|
expect(projectPath.absolute).toBe(absolutePath)
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("absolute getter", () => {
|
||||||
|
it("should return the absolute path", () => {
|
||||||
|
const absolutePath = "/Users/dev/project/src/domain/User.ts"
|
||||||
|
const projectRoot = "/Users/dev/project"
|
||||||
|
|
||||||
|
const projectPath = ProjectPath.create(absolutePath, projectRoot)
|
||||||
|
|
||||||
|
expect(projectPath.absolute).toBe(absolutePath)
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("relative getter", () => {
|
||||||
|
it("should return the relative path", () => {
|
||||||
|
const absolutePath = "/Users/dev/project/src/domain/User.ts"
|
||||||
|
const projectRoot = "/Users/dev/project"
|
||||||
|
|
||||||
|
const projectPath = ProjectPath.create(absolutePath, projectRoot)
|
||||||
|
|
||||||
|
expect(projectPath.relative).toBe("src/domain/User.ts")
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("extension getter", () => {
|
||||||
|
it("should return .ts for TypeScript files", () => {
|
||||||
|
const absolutePath = "/Users/dev/project/src/domain/User.ts"
|
||||||
|
const projectRoot = "/Users/dev/project"
|
||||||
|
|
||||||
|
const projectPath = ProjectPath.create(absolutePath, projectRoot)
|
||||||
|
|
||||||
|
expect(projectPath.extension).toBe(".ts")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return .tsx for TypeScript JSX files", () => {
|
||||||
|
const absolutePath = "/Users/dev/project/src/components/Button.tsx"
|
||||||
|
const projectRoot = "/Users/dev/project"
|
||||||
|
|
||||||
|
const projectPath = ProjectPath.create(absolutePath, projectRoot)
|
||||||
|
|
||||||
|
expect(projectPath.extension).toBe(".tsx")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return .js for JavaScript files", () => {
|
||||||
|
const absolutePath = "/Users/dev/project/src/utils/helper.js"
|
||||||
|
const projectRoot = "/Users/dev/project"
|
||||||
|
|
||||||
|
const projectPath = ProjectPath.create(absolutePath, projectRoot)
|
||||||
|
|
||||||
|
expect(projectPath.extension).toBe(".js")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return .jsx for JavaScript JSX files", () => {
|
||||||
|
const absolutePath = "/Users/dev/project/src/components/Button.jsx"
|
||||||
|
const projectRoot = "/Users/dev/project"
|
||||||
|
|
||||||
|
const projectPath = ProjectPath.create(absolutePath, projectRoot)
|
||||||
|
|
||||||
|
expect(projectPath.extension).toBe(".jsx")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return empty string for files without extension", () => {
|
||||||
|
const absolutePath = "/Users/dev/project/README"
|
||||||
|
const projectRoot = "/Users/dev/project"
|
||||||
|
|
||||||
|
const projectPath = ProjectPath.create(absolutePath, projectRoot)
|
||||||
|
|
||||||
|
expect(projectPath.extension).toBe("")
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("filename getter", () => {
|
||||||
|
it("should return the filename with extension", () => {
|
||||||
|
const absolutePath = "/Users/dev/project/src/domain/User.ts"
|
||||||
|
const projectRoot = "/Users/dev/project"
|
||||||
|
|
||||||
|
const projectPath = ProjectPath.create(absolutePath, projectRoot)
|
||||||
|
|
||||||
|
expect(projectPath.filename).toBe("User.ts")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should handle filenames with multiple dots", () => {
|
||||||
|
const absolutePath = "/Users/dev/project/src/domain/User.test.ts"
|
||||||
|
const projectRoot = "/Users/dev/project"
|
||||||
|
|
||||||
|
const projectPath = ProjectPath.create(absolutePath, projectRoot)
|
||||||
|
|
||||||
|
expect(projectPath.filename).toBe("User.test.ts")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should handle filenames without extension", () => {
|
||||||
|
const absolutePath = "/Users/dev/project/README"
|
||||||
|
const projectRoot = "/Users/dev/project"
|
||||||
|
|
||||||
|
const projectPath = ProjectPath.create(absolutePath, projectRoot)
|
||||||
|
|
||||||
|
expect(projectPath.filename).toBe("README")
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("directory getter", () => {
|
||||||
|
it("should return the directory path relative to project root", () => {
|
||||||
|
const absolutePath = "/Users/dev/project/src/domain/entities/User.ts"
|
||||||
|
const projectRoot = "/Users/dev/project"
|
||||||
|
|
||||||
|
const projectPath = ProjectPath.create(absolutePath, projectRoot)
|
||||||
|
|
||||||
|
expect(projectPath.directory).toBe("src/domain/entities")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return dot for files in project root", () => {
|
||||||
|
const absolutePath = "/Users/dev/project/README.md"
|
||||||
|
const projectRoot = "/Users/dev/project"
|
||||||
|
|
||||||
|
const projectPath = ProjectPath.create(absolutePath, projectRoot)
|
||||||
|
|
||||||
|
expect(projectPath.directory).toBe(".")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should handle single-level directories", () => {
|
||||||
|
const absolutePath = "/Users/dev/project/src/User.ts"
|
||||||
|
const projectRoot = "/Users/dev/project"
|
||||||
|
|
||||||
|
const projectPath = ProjectPath.create(absolutePath, projectRoot)
|
||||||
|
|
||||||
|
expect(projectPath.directory).toBe("src")
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("isTypeScript", () => {
|
||||||
|
it("should return true for .ts files", () => {
|
||||||
|
const absolutePath = "/Users/dev/project/src/domain/User.ts"
|
||||||
|
const projectRoot = "/Users/dev/project"
|
||||||
|
|
||||||
|
const projectPath = ProjectPath.create(absolutePath, projectRoot)
|
||||||
|
|
||||||
|
expect(projectPath.isTypeScript()).toBe(true)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return true for .tsx files", () => {
|
||||||
|
const absolutePath = "/Users/dev/project/src/components/Button.tsx"
|
||||||
|
const projectRoot = "/Users/dev/project"
|
||||||
|
|
||||||
|
const projectPath = ProjectPath.create(absolutePath, projectRoot)
|
||||||
|
|
||||||
|
expect(projectPath.isTypeScript()).toBe(true)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return false for .js files", () => {
|
||||||
|
const absolutePath = "/Users/dev/project/src/utils/helper.js"
|
||||||
|
const projectRoot = "/Users/dev/project"
|
||||||
|
|
||||||
|
const projectPath = ProjectPath.create(absolutePath, projectRoot)
|
||||||
|
|
||||||
|
expect(projectPath.isTypeScript()).toBe(false)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return false for .jsx files", () => {
|
||||||
|
const absolutePath = "/Users/dev/project/src/components/Button.jsx"
|
||||||
|
const projectRoot = "/Users/dev/project"
|
||||||
|
|
||||||
|
const projectPath = ProjectPath.create(absolutePath, projectRoot)
|
||||||
|
|
||||||
|
expect(projectPath.isTypeScript()).toBe(false)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return false for other file types", () => {
|
||||||
|
const absolutePath = "/Users/dev/project/README.md"
|
||||||
|
const projectRoot = "/Users/dev/project"
|
||||||
|
|
||||||
|
const projectPath = ProjectPath.create(absolutePath, projectRoot)
|
||||||
|
|
||||||
|
expect(projectPath.isTypeScript()).toBe(false)
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("isJavaScript", () => {
|
||||||
|
it("should return true for .js files", () => {
|
||||||
|
const absolutePath = "/Users/dev/project/src/utils/helper.js"
|
||||||
|
const projectRoot = "/Users/dev/project"
|
||||||
|
|
||||||
|
const projectPath = ProjectPath.create(absolutePath, projectRoot)
|
||||||
|
|
||||||
|
expect(projectPath.isJavaScript()).toBe(true)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return true for .jsx files", () => {
|
||||||
|
const absolutePath = "/Users/dev/project/src/components/Button.jsx"
|
||||||
|
const projectRoot = "/Users/dev/project"
|
||||||
|
|
||||||
|
const projectPath = ProjectPath.create(absolutePath, projectRoot)
|
||||||
|
|
||||||
|
expect(projectPath.isJavaScript()).toBe(true)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return false for .ts files", () => {
|
||||||
|
const absolutePath = "/Users/dev/project/src/domain/User.ts"
|
||||||
|
const projectRoot = "/Users/dev/project"
|
||||||
|
|
||||||
|
const projectPath = ProjectPath.create(absolutePath, projectRoot)
|
||||||
|
|
||||||
|
expect(projectPath.isJavaScript()).toBe(false)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return false for .tsx files", () => {
|
||||||
|
const absolutePath = "/Users/dev/project/src/components/Button.tsx"
|
||||||
|
const projectRoot = "/Users/dev/project"
|
||||||
|
|
||||||
|
const projectPath = ProjectPath.create(absolutePath, projectRoot)
|
||||||
|
|
||||||
|
expect(projectPath.isJavaScript()).toBe(false)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return false for other file types", () => {
|
||||||
|
const absolutePath = "/Users/dev/project/README.md"
|
||||||
|
const projectRoot = "/Users/dev/project"
|
||||||
|
|
||||||
|
const projectPath = ProjectPath.create(absolutePath, projectRoot)
|
||||||
|
|
||||||
|
expect(projectPath.isJavaScript()).toBe(false)
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("equals", () => {
|
||||||
|
it("should return true for identical paths", () => {
|
||||||
|
const absolutePath = "/Users/dev/project/src/domain/User.ts"
|
||||||
|
const projectRoot = "/Users/dev/project"
|
||||||
|
|
||||||
|
const path1 = ProjectPath.create(absolutePath, projectRoot)
|
||||||
|
const path2 = ProjectPath.create(absolutePath, projectRoot)
|
||||||
|
|
||||||
|
expect(path1.equals(path2)).toBe(true)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return false for different absolute paths", () => {
|
||||||
|
const projectRoot = "/Users/dev/project"
|
||||||
|
const path1 = ProjectPath.create("/Users/dev/project/src/domain/User.ts", projectRoot)
|
||||||
|
const path2 = ProjectPath.create("/Users/dev/project/src/domain/Order.ts", projectRoot)
|
||||||
|
|
||||||
|
expect(path1.equals(path2)).toBe(false)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return false for different relative paths", () => {
|
||||||
|
const path1 = ProjectPath.create(
|
||||||
|
"/Users/dev/project1/src/User.ts",
|
||||||
|
"/Users/dev/project1",
|
||||||
|
)
|
||||||
|
const path2 = ProjectPath.create(
|
||||||
|
"/Users/dev/project2/src/User.ts",
|
||||||
|
"/Users/dev/project2",
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(path1.equals(path2)).toBe(false)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return false when comparing with undefined", () => {
|
||||||
|
const absolutePath = "/Users/dev/project/src/domain/User.ts"
|
||||||
|
const projectRoot = "/Users/dev/project"
|
||||||
|
|
||||||
|
const path1 = ProjectPath.create(absolutePath, projectRoot)
|
||||||
|
|
||||||
|
expect(path1.equals(undefined)).toBe(false)
|
||||||
|
})
|
||||||
|
})
|
||||||
|
})
|
||||||
521
packages/guardian/tests/unit/domain/RepositoryViolation.test.ts
Normal file
521
packages/guardian/tests/unit/domain/RepositoryViolation.test.ts
Normal file
@@ -0,0 +1,521 @@
|
|||||||
|
import { describe, it, expect } from "vitest"
|
||||||
|
import { RepositoryViolation } from "../../../src/domain/value-objects/RepositoryViolation"
|
||||||
|
import { REPOSITORY_VIOLATION_TYPES } from "../../../src/shared/constants/rules"
|
||||||
|
|
||||||
|
describe("RepositoryViolation", () => {
|
||||||
|
describe("create", () => {
|
||||||
|
it("should create a repository violation for ORM type in interface", () => {
|
||||||
|
const violation = RepositoryViolation.create(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.ORM_TYPE_IN_INTERFACE,
|
||||||
|
"src/domain/repositories/IUserRepository.ts",
|
||||||
|
"domain",
|
||||||
|
15,
|
||||||
|
"Repository uses Prisma type",
|
||||||
|
"Prisma.UserWhereInput",
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violation.violationType).toBe(REPOSITORY_VIOLATION_TYPES.ORM_TYPE_IN_INTERFACE)
|
||||||
|
expect(violation.filePath).toBe("src/domain/repositories/IUserRepository.ts")
|
||||||
|
expect(violation.layer).toBe("domain")
|
||||||
|
expect(violation.line).toBe(15)
|
||||||
|
expect(violation.details).toBe("Repository uses Prisma type")
|
||||||
|
expect(violation.ormType).toBe("Prisma.UserWhereInput")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should create a repository violation for concrete repository in use case", () => {
|
||||||
|
const violation = RepositoryViolation.create(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.CONCRETE_REPOSITORY_IN_USE_CASE,
|
||||||
|
"src/application/use-cases/CreateUser.ts",
|
||||||
|
"application",
|
||||||
|
10,
|
||||||
|
"Use case depends on concrete repository",
|
||||||
|
undefined,
|
||||||
|
"UserRepository",
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violation.violationType).toBe(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.CONCRETE_REPOSITORY_IN_USE_CASE,
|
||||||
|
)
|
||||||
|
expect(violation.repositoryName).toBe("UserRepository")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should create a repository violation for new repository in use case", () => {
|
||||||
|
const violation = RepositoryViolation.create(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.NEW_REPOSITORY_IN_USE_CASE,
|
||||||
|
"src/application/use-cases/CreateUser.ts",
|
||||||
|
"application",
|
||||||
|
12,
|
||||||
|
"Use case creates repository with new",
|
||||||
|
undefined,
|
||||||
|
"UserRepository",
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violation.violationType).toBe(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.NEW_REPOSITORY_IN_USE_CASE,
|
||||||
|
)
|
||||||
|
expect(violation.repositoryName).toBe("UserRepository")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should create a repository violation for non-domain method name", () => {
|
||||||
|
const violation = RepositoryViolation.create(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.NON_DOMAIN_METHOD_NAME,
|
||||||
|
"src/domain/repositories/IUserRepository.ts",
|
||||||
|
"domain",
|
||||||
|
8,
|
||||||
|
"Method uses technical name. Consider: findById()",
|
||||||
|
undefined,
|
||||||
|
undefined,
|
||||||
|
"findOne",
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violation.violationType).toBe(REPOSITORY_VIOLATION_TYPES.NON_DOMAIN_METHOD_NAME)
|
||||||
|
expect(violation.methodName).toBe("findOne")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should handle optional line parameter", () => {
|
||||||
|
const violation = RepositoryViolation.create(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.ORM_TYPE_IN_INTERFACE,
|
||||||
|
"src/domain/repositories/IUserRepository.ts",
|
||||||
|
"domain",
|
||||||
|
undefined,
|
||||||
|
"Repository uses Prisma type",
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violation.line).toBeUndefined()
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("getters", () => {
|
||||||
|
it("should return violation type", () => {
|
||||||
|
const violation = RepositoryViolation.create(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.ORM_TYPE_IN_INTERFACE,
|
||||||
|
"src/domain/repositories/IUserRepository.ts",
|
||||||
|
"domain",
|
||||||
|
15,
|
||||||
|
"Test",
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violation.violationType).toBe(REPOSITORY_VIOLATION_TYPES.ORM_TYPE_IN_INTERFACE)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return file path", () => {
|
||||||
|
const violation = RepositoryViolation.create(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.ORM_TYPE_IN_INTERFACE,
|
||||||
|
"src/domain/repositories/IUserRepository.ts",
|
||||||
|
"domain",
|
||||||
|
15,
|
||||||
|
"Test",
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violation.filePath).toBe("src/domain/repositories/IUserRepository.ts")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return layer", () => {
|
||||||
|
const violation = RepositoryViolation.create(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.ORM_TYPE_IN_INTERFACE,
|
||||||
|
"src/domain/repositories/IUserRepository.ts",
|
||||||
|
"domain",
|
||||||
|
15,
|
||||||
|
"Test",
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violation.layer).toBe("domain")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return line number", () => {
|
||||||
|
const violation = RepositoryViolation.create(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.ORM_TYPE_IN_INTERFACE,
|
||||||
|
"src/domain/repositories/IUserRepository.ts",
|
||||||
|
"domain",
|
||||||
|
15,
|
||||||
|
"Test",
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violation.line).toBe(15)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return details", () => {
|
||||||
|
const violation = RepositoryViolation.create(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.ORM_TYPE_IN_INTERFACE,
|
||||||
|
"src/domain/repositories/IUserRepository.ts",
|
||||||
|
"domain",
|
||||||
|
15,
|
||||||
|
"Repository uses Prisma type",
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violation.details).toBe("Repository uses Prisma type")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return ORM type", () => {
|
||||||
|
const violation = RepositoryViolation.create(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.ORM_TYPE_IN_INTERFACE,
|
||||||
|
"src/domain/repositories/IUserRepository.ts",
|
||||||
|
"domain",
|
||||||
|
15,
|
||||||
|
"Test",
|
||||||
|
"Prisma.UserWhereInput",
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violation.ormType).toBe("Prisma.UserWhereInput")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return repository name", () => {
|
||||||
|
const violation = RepositoryViolation.create(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.CONCRETE_REPOSITORY_IN_USE_CASE,
|
||||||
|
"src/application/use-cases/CreateUser.ts",
|
||||||
|
"application",
|
||||||
|
10,
|
||||||
|
"Test",
|
||||||
|
undefined,
|
||||||
|
"UserRepository",
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violation.repositoryName).toBe("UserRepository")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return method name", () => {
|
||||||
|
const violation = RepositoryViolation.create(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.NON_DOMAIN_METHOD_NAME,
|
||||||
|
"src/domain/repositories/IUserRepository.ts",
|
||||||
|
"domain",
|
||||||
|
8,
|
||||||
|
"Test",
|
||||||
|
undefined,
|
||||||
|
undefined,
|
||||||
|
"findOne",
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violation.methodName).toBe("findOne")
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("getMessage", () => {
|
||||||
|
it("should return message for ORM type in interface", () => {
|
||||||
|
const violation = RepositoryViolation.create(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.ORM_TYPE_IN_INTERFACE,
|
||||||
|
"src/domain/repositories/IUserRepository.ts",
|
||||||
|
"domain",
|
||||||
|
15,
|
||||||
|
"Test",
|
||||||
|
"Prisma.UserWhereInput",
|
||||||
|
)
|
||||||
|
|
||||||
|
const message = violation.getMessage()
|
||||||
|
|
||||||
|
expect(message).toContain("ORM-specific type")
|
||||||
|
expect(message).toContain("Prisma.UserWhereInput")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return message for concrete repository in use case", () => {
|
||||||
|
const violation = RepositoryViolation.create(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.CONCRETE_REPOSITORY_IN_USE_CASE,
|
||||||
|
"src/application/use-cases/CreateUser.ts",
|
||||||
|
"application",
|
||||||
|
10,
|
||||||
|
"Test",
|
||||||
|
undefined,
|
||||||
|
"UserRepository",
|
||||||
|
)
|
||||||
|
|
||||||
|
const message = violation.getMessage()
|
||||||
|
|
||||||
|
expect(message).toContain("depends on concrete repository")
|
||||||
|
expect(message).toContain("UserRepository")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return message for new repository in use case", () => {
|
||||||
|
const violation = RepositoryViolation.create(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.NEW_REPOSITORY_IN_USE_CASE,
|
||||||
|
"src/application/use-cases/CreateUser.ts",
|
||||||
|
"application",
|
||||||
|
12,
|
||||||
|
"Test",
|
||||||
|
undefined,
|
||||||
|
"UserRepository",
|
||||||
|
)
|
||||||
|
|
||||||
|
const message = violation.getMessage()
|
||||||
|
|
||||||
|
expect(message).toContain("creates repository with 'new")
|
||||||
|
expect(message).toContain("UserRepository")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return message for non-domain method name", () => {
|
||||||
|
const violation = RepositoryViolation.create(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.NON_DOMAIN_METHOD_NAME,
|
||||||
|
"src/domain/repositories/IUserRepository.ts",
|
||||||
|
"domain",
|
||||||
|
8,
|
||||||
|
"Test",
|
||||||
|
undefined,
|
||||||
|
undefined,
|
||||||
|
"findOne",
|
||||||
|
)
|
||||||
|
|
||||||
|
const message = violation.getMessage()
|
||||||
|
|
||||||
|
expect(message).toContain("uses technical name")
|
||||||
|
expect(message).toContain("findOne")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should handle unknown ORM type gracefully", () => {
|
||||||
|
const violation = RepositoryViolation.create(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.ORM_TYPE_IN_INTERFACE,
|
||||||
|
"src/domain/repositories/IUserRepository.ts",
|
||||||
|
"domain",
|
||||||
|
15,
|
||||||
|
"Test",
|
||||||
|
)
|
||||||
|
|
||||||
|
const message = violation.getMessage()
|
||||||
|
|
||||||
|
expect(message).toContain("unknown")
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("getSuggestion", () => {
|
||||||
|
it("should return suggestion for ORM type in interface", () => {
|
||||||
|
const violation = RepositoryViolation.create(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.ORM_TYPE_IN_INTERFACE,
|
||||||
|
"src/domain/repositories/IUserRepository.ts",
|
||||||
|
"domain",
|
||||||
|
15,
|
||||||
|
"Test",
|
||||||
|
"Prisma.UserWhereInput",
|
||||||
|
)
|
||||||
|
|
||||||
|
const suggestion = violation.getSuggestion()
|
||||||
|
|
||||||
|
expect(suggestion).toContain("Remove ORM-specific types")
|
||||||
|
expect(suggestion).toContain("Use domain types")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return suggestion for concrete repository in use case", () => {
|
||||||
|
const violation = RepositoryViolation.create(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.CONCRETE_REPOSITORY_IN_USE_CASE,
|
||||||
|
"src/application/use-cases/CreateUser.ts",
|
||||||
|
"application",
|
||||||
|
10,
|
||||||
|
"Test",
|
||||||
|
undefined,
|
||||||
|
"UserRepository",
|
||||||
|
)
|
||||||
|
|
||||||
|
const suggestion = violation.getSuggestion()
|
||||||
|
|
||||||
|
expect(suggestion).toContain("Depend on repository interface")
|
||||||
|
expect(suggestion).toContain("IUserRepository")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return suggestion for new repository in use case", () => {
|
||||||
|
const violation = RepositoryViolation.create(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.NEW_REPOSITORY_IN_USE_CASE,
|
||||||
|
"src/application/use-cases/CreateUser.ts",
|
||||||
|
"application",
|
||||||
|
12,
|
||||||
|
"Test",
|
||||||
|
undefined,
|
||||||
|
"UserRepository",
|
||||||
|
)
|
||||||
|
|
||||||
|
const suggestion = violation.getSuggestion()
|
||||||
|
|
||||||
|
expect(suggestion).toContain("Remove 'new Repository()'")
|
||||||
|
expect(suggestion).toContain("dependency injection")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return suggestion for non-domain method name with smart suggestion", () => {
|
||||||
|
const violation = RepositoryViolation.create(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.NON_DOMAIN_METHOD_NAME,
|
||||||
|
"src/domain/repositories/IUserRepository.ts",
|
||||||
|
"domain",
|
||||||
|
8,
|
||||||
|
"Method uses technical name. Consider: findById()",
|
||||||
|
undefined,
|
||||||
|
undefined,
|
||||||
|
"findOne",
|
||||||
|
)
|
||||||
|
|
||||||
|
const suggestion = violation.getSuggestion()
|
||||||
|
|
||||||
|
expect(suggestion).toContain("findById()")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return fallback suggestion for known technical method", () => {
|
||||||
|
const violation = RepositoryViolation.create(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.NON_DOMAIN_METHOD_NAME,
|
||||||
|
"src/domain/repositories/IUserRepository.ts",
|
||||||
|
"domain",
|
||||||
|
8,
|
||||||
|
"Method uses technical name",
|
||||||
|
undefined,
|
||||||
|
undefined,
|
||||||
|
"insert",
|
||||||
|
)
|
||||||
|
|
||||||
|
const suggestion = violation.getSuggestion()
|
||||||
|
|
||||||
|
expect(suggestion).toContain("save or create")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return default suggestion for unknown method", () => {
|
||||||
|
const violation = RepositoryViolation.create(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.NON_DOMAIN_METHOD_NAME,
|
||||||
|
"src/domain/repositories/IUserRepository.ts",
|
||||||
|
"domain",
|
||||||
|
8,
|
||||||
|
"Method uses technical name",
|
||||||
|
undefined,
|
||||||
|
undefined,
|
||||||
|
"unknownMethod",
|
||||||
|
)
|
||||||
|
|
||||||
|
const suggestion = violation.getSuggestion()
|
||||||
|
|
||||||
|
expect(suggestion).toBeDefined()
|
||||||
|
expect(suggestion.length).toBeGreaterThan(0)
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("getExampleFix", () => {
|
||||||
|
it("should return example fix for ORM type in interface", () => {
|
||||||
|
const violation = RepositoryViolation.create(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.ORM_TYPE_IN_INTERFACE,
|
||||||
|
"src/domain/repositories/IUserRepository.ts",
|
||||||
|
"domain",
|
||||||
|
15,
|
||||||
|
"Test",
|
||||||
|
)
|
||||||
|
|
||||||
|
const example = violation.getExampleFix()
|
||||||
|
|
||||||
|
expect(example).toContain("BAD")
|
||||||
|
expect(example).toContain("GOOD")
|
||||||
|
expect(example).toContain("IUserRepository")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return example fix for concrete repository in use case", () => {
|
||||||
|
const violation = RepositoryViolation.create(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.CONCRETE_REPOSITORY_IN_USE_CASE,
|
||||||
|
"src/application/use-cases/CreateUser.ts",
|
||||||
|
"application",
|
||||||
|
10,
|
||||||
|
"Test",
|
||||||
|
)
|
||||||
|
|
||||||
|
const example = violation.getExampleFix()
|
||||||
|
|
||||||
|
expect(example).toContain("BAD")
|
||||||
|
expect(example).toContain("GOOD")
|
||||||
|
expect(example).toContain("CreateUser")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return example fix for new repository in use case", () => {
|
||||||
|
const violation = RepositoryViolation.create(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.NEW_REPOSITORY_IN_USE_CASE,
|
||||||
|
"src/application/use-cases/CreateUser.ts",
|
||||||
|
"application",
|
||||||
|
12,
|
||||||
|
"Test",
|
||||||
|
)
|
||||||
|
|
||||||
|
const example = violation.getExampleFix()
|
||||||
|
|
||||||
|
expect(example).toContain("BAD")
|
||||||
|
expect(example).toContain("GOOD")
|
||||||
|
expect(example).toContain("new UserRepository")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return example fix for non-domain method name", () => {
|
||||||
|
const violation = RepositoryViolation.create(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.NON_DOMAIN_METHOD_NAME,
|
||||||
|
"src/domain/repositories/IUserRepository.ts",
|
||||||
|
"domain",
|
||||||
|
8,
|
||||||
|
"Test",
|
||||||
|
)
|
||||||
|
|
||||||
|
const example = violation.getExampleFix()
|
||||||
|
|
||||||
|
expect(example).toContain("BAD")
|
||||||
|
expect(example).toContain("GOOD")
|
||||||
|
expect(example).toContain("findOne")
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("equals", () => {
|
||||||
|
it("should return true for violations with identical properties", () => {
|
||||||
|
const violation1 = RepositoryViolation.create(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.ORM_TYPE_IN_INTERFACE,
|
||||||
|
"src/domain/repositories/IUserRepository.ts",
|
||||||
|
"domain",
|
||||||
|
15,
|
||||||
|
"Test",
|
||||||
|
"Prisma.UserWhereInput",
|
||||||
|
)
|
||||||
|
|
||||||
|
const violation2 = RepositoryViolation.create(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.ORM_TYPE_IN_INTERFACE,
|
||||||
|
"src/domain/repositories/IUserRepository.ts",
|
||||||
|
"domain",
|
||||||
|
15,
|
||||||
|
"Test",
|
||||||
|
"Prisma.UserWhereInput",
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violation1.equals(violation2)).toBe(true)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return false for violations with different types", () => {
|
||||||
|
const violation1 = RepositoryViolation.create(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.ORM_TYPE_IN_INTERFACE,
|
||||||
|
"src/domain/repositories/IUserRepository.ts",
|
||||||
|
"domain",
|
||||||
|
15,
|
||||||
|
"Test",
|
||||||
|
)
|
||||||
|
|
||||||
|
const violation2 = RepositoryViolation.create(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.CONCRETE_REPOSITORY_IN_USE_CASE,
|
||||||
|
"src/domain/repositories/IUserRepository.ts",
|
||||||
|
"domain",
|
||||||
|
15,
|
||||||
|
"Test",
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violation1.equals(violation2)).toBe(false)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return false for violations with different file paths", () => {
|
||||||
|
const violation1 = RepositoryViolation.create(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.ORM_TYPE_IN_INTERFACE,
|
||||||
|
"src/domain/repositories/IUserRepository.ts",
|
||||||
|
"domain",
|
||||||
|
15,
|
||||||
|
"Test",
|
||||||
|
)
|
||||||
|
|
||||||
|
const violation2 = RepositoryViolation.create(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.ORM_TYPE_IN_INTERFACE,
|
||||||
|
"src/domain/repositories/IOrderRepository.ts",
|
||||||
|
"domain",
|
||||||
|
15,
|
||||||
|
"Test",
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violation1.equals(violation2)).toBe(false)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return false when comparing with undefined", () => {
|
||||||
|
const violation = RepositoryViolation.create(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.ORM_TYPE_IN_INTERFACE,
|
||||||
|
"src/domain/repositories/IUserRepository.ts",
|
||||||
|
"domain",
|
||||||
|
15,
|
||||||
|
"Test",
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violation.equals(undefined)).toBe(false)
|
||||||
|
})
|
||||||
|
})
|
||||||
|
})
|
||||||
320
packages/guardian/tests/unit/domain/SecretViolation.test.ts
Normal file
320
packages/guardian/tests/unit/domain/SecretViolation.test.ts
Normal file
@@ -0,0 +1,320 @@
|
|||||||
|
import { describe, it, expect } from "vitest"
|
||||||
|
import { SecretViolation } from "../../../src/domain/value-objects/SecretViolation"
|
||||||
|
|
||||||
|
describe("SecretViolation", () => {
|
||||||
|
describe("create", () => {
|
||||||
|
it("should create a secret violation with all properties", () => {
|
||||||
|
const violation = SecretViolation.create(
|
||||||
|
"src/config/aws.ts",
|
||||||
|
10,
|
||||||
|
15,
|
||||||
|
"AWS Access Key",
|
||||||
|
"AKIA1234567890ABCDEF",
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violation.file).toBe("src/config/aws.ts")
|
||||||
|
expect(violation.line).toBe(10)
|
||||||
|
expect(violation.column).toBe(15)
|
||||||
|
expect(violation.secretType).toBe("AWS Access Key")
|
||||||
|
expect(violation.matchedPattern).toBe("AKIA1234567890ABCDEF")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should create a secret violation with GitHub token", () => {
|
||||||
|
const violation = SecretViolation.create(
|
||||||
|
"src/config/github.ts",
|
||||||
|
5,
|
||||||
|
20,
|
||||||
|
"GitHub Personal Access Token",
|
||||||
|
"ghp_1234567890abcdefghijklmnopqrstuv",
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violation.secretType).toBe("GitHub Personal Access Token")
|
||||||
|
expect(violation.file).toBe("src/config/github.ts")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should create a secret violation with NPM token", () => {
|
||||||
|
const violation = SecretViolation.create(".npmrc", 1, 1, "NPM Token", "npm_abc123xyz")
|
||||||
|
|
||||||
|
expect(violation.secretType).toBe("NPM Token")
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("getters", () => {
|
||||||
|
it("should return file path", () => {
|
||||||
|
const violation = SecretViolation.create(
|
||||||
|
"src/config/aws.ts",
|
||||||
|
10,
|
||||||
|
15,
|
||||||
|
"AWS Access Key",
|
||||||
|
"test",
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violation.file).toBe("src/config/aws.ts")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return line number", () => {
|
||||||
|
const violation = SecretViolation.create(
|
||||||
|
"src/config/aws.ts",
|
||||||
|
10,
|
||||||
|
15,
|
||||||
|
"AWS Access Key",
|
||||||
|
"test",
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violation.line).toBe(10)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return column number", () => {
|
||||||
|
const violation = SecretViolation.create(
|
||||||
|
"src/config/aws.ts",
|
||||||
|
10,
|
||||||
|
15,
|
||||||
|
"AWS Access Key",
|
||||||
|
"test",
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violation.column).toBe(15)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return secret type", () => {
|
||||||
|
const violation = SecretViolation.create(
|
||||||
|
"src/config/aws.ts",
|
||||||
|
10,
|
||||||
|
15,
|
||||||
|
"AWS Access Key",
|
||||||
|
"test",
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violation.secretType).toBe("AWS Access Key")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return matched pattern", () => {
|
||||||
|
const violation = SecretViolation.create(
|
||||||
|
"src/config/aws.ts",
|
||||||
|
10,
|
||||||
|
15,
|
||||||
|
"AWS Access Key",
|
||||||
|
"AKIA1234567890ABCDEF",
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violation.matchedPattern).toBe("AKIA1234567890ABCDEF")
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("getMessage", () => {
|
||||||
|
it("should return formatted message for AWS Access Key", () => {
|
||||||
|
const violation = SecretViolation.create(
|
||||||
|
"src/config/aws.ts",
|
||||||
|
10,
|
||||||
|
15,
|
||||||
|
"AWS Access Key",
|
||||||
|
"test",
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violation.getMessage()).toBe("Hardcoded AWS Access Key detected")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return formatted message for GitHub token", () => {
|
||||||
|
const violation = SecretViolation.create(
|
||||||
|
"src/config/github.ts",
|
||||||
|
5,
|
||||||
|
20,
|
||||||
|
"GitHub Token",
|
||||||
|
"test",
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violation.getMessage()).toBe("Hardcoded GitHub Token detected")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return formatted message for NPM token", () => {
|
||||||
|
const violation = SecretViolation.create(".npmrc", 1, 1, "NPM Token", "test")
|
||||||
|
|
||||||
|
expect(violation.getMessage()).toBe("Hardcoded NPM Token detected")
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("getSuggestion", () => {
|
||||||
|
it("should return multi-line suggestion", () => {
|
||||||
|
const violation = SecretViolation.create(
|
||||||
|
"src/config/aws.ts",
|
||||||
|
10,
|
||||||
|
15,
|
||||||
|
"AWS Access Key",
|
||||||
|
"test",
|
||||||
|
)
|
||||||
|
|
||||||
|
const suggestion = violation.getSuggestion()
|
||||||
|
|
||||||
|
expect(suggestion).toContain("1. Use environment variables")
|
||||||
|
expect(suggestion).toContain("2. Use secret management services")
|
||||||
|
expect(suggestion).toContain("3. Never commit secrets")
|
||||||
|
expect(suggestion).toContain("4. If secret was committed, rotate it immediately")
|
||||||
|
expect(suggestion).toContain("5. Add secret files to .gitignore")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return the same suggestion for all secret types", () => {
|
||||||
|
const awsViolation = SecretViolation.create(
|
||||||
|
"src/config/aws.ts",
|
||||||
|
10,
|
||||||
|
15,
|
||||||
|
"AWS Access Key",
|
||||||
|
"test",
|
||||||
|
)
|
||||||
|
|
||||||
|
const githubViolation = SecretViolation.create(
|
||||||
|
"src/config/github.ts",
|
||||||
|
5,
|
||||||
|
20,
|
||||||
|
"GitHub Token",
|
||||||
|
"test",
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(awsViolation.getSuggestion()).toBe(githubViolation.getSuggestion())
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("getExampleFix", () => {
|
||||||
|
it("should return AWS-specific example for AWS Access Key", () => {
|
||||||
|
const violation = SecretViolation.create(
|
||||||
|
"src/config/aws.ts",
|
||||||
|
10,
|
||||||
|
15,
|
||||||
|
"AWS Access Key",
|
||||||
|
"test",
|
||||||
|
)
|
||||||
|
|
||||||
|
const example = violation.getExampleFix()
|
||||||
|
|
||||||
|
expect(example).toContain("AWS")
|
||||||
|
expect(example).toContain("process.env.AWS_ACCESS_KEY_ID")
|
||||||
|
expect(example).toContain("credentials provider")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return GitHub-specific example for GitHub token", () => {
|
||||||
|
const violation = SecretViolation.create(
|
||||||
|
"src/config/github.ts",
|
||||||
|
5,
|
||||||
|
20,
|
||||||
|
"GitHub Token",
|
||||||
|
"test",
|
||||||
|
)
|
||||||
|
|
||||||
|
const example = violation.getExampleFix()
|
||||||
|
|
||||||
|
expect(example).toContain("GitHub")
|
||||||
|
expect(example).toContain("process.env.GITHUB_TOKEN")
|
||||||
|
expect(example).toContain("GitHub Apps")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return NPM-specific example for NPM token", () => {
|
||||||
|
const violation = SecretViolation.create(".npmrc", 1, 1, "NPM Token", "test")
|
||||||
|
|
||||||
|
const example = violation.getExampleFix()
|
||||||
|
|
||||||
|
expect(example).toContain("NPM")
|
||||||
|
expect(example).toContain(".npmrc")
|
||||||
|
expect(example).toContain("process.env.NPM_TOKEN")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return SSH-specific example for SSH Private Key", () => {
|
||||||
|
const violation = SecretViolation.create(
|
||||||
|
"src/config/ssh.ts",
|
||||||
|
1,
|
||||||
|
1,
|
||||||
|
"SSH Private Key",
|
||||||
|
"test",
|
||||||
|
)
|
||||||
|
|
||||||
|
const example = violation.getExampleFix()
|
||||||
|
|
||||||
|
expect(example).toContain("SSH")
|
||||||
|
expect(example).toContain("readFileSync")
|
||||||
|
expect(example).toContain("SSH_KEY_PATH")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return SSH RSA-specific example for SSH RSA Private Key", () => {
|
||||||
|
const violation = SecretViolation.create(
|
||||||
|
"src/config/ssh.ts",
|
||||||
|
1,
|
||||||
|
1,
|
||||||
|
"SSH RSA Private Key",
|
||||||
|
"test",
|
||||||
|
)
|
||||||
|
|
||||||
|
const example = violation.getExampleFix()
|
||||||
|
|
||||||
|
expect(example).toContain("SSH")
|
||||||
|
expect(example).toContain("RSA PRIVATE KEY")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return Slack-specific example for Slack token", () => {
|
||||||
|
const violation = SecretViolation.create(
|
||||||
|
"src/config/slack.ts",
|
||||||
|
1,
|
||||||
|
1,
|
||||||
|
"Slack Bot Token",
|
||||||
|
"test",
|
||||||
|
)
|
||||||
|
|
||||||
|
const example = violation.getExampleFix()
|
||||||
|
|
||||||
|
expect(example).toContain("Slack")
|
||||||
|
expect(example).toContain("process.env.SLACK_BOT_TOKEN")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return API Key example for generic API key", () => {
|
||||||
|
const violation = SecretViolation.create("src/config/api.ts", 1, 1, "API Key", "test")
|
||||||
|
|
||||||
|
const example = violation.getExampleFix()
|
||||||
|
|
||||||
|
expect(example).toContain("API")
|
||||||
|
expect(example).toContain("process.env.API_KEY")
|
||||||
|
expect(example).toContain("secret management service")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return generic example for unknown secret type", () => {
|
||||||
|
const violation = SecretViolation.create(
|
||||||
|
"src/config/unknown.ts",
|
||||||
|
1,
|
||||||
|
1,
|
||||||
|
"Unknown Secret",
|
||||||
|
"test",
|
||||||
|
)
|
||||||
|
|
||||||
|
const example = violation.getExampleFix()
|
||||||
|
|
||||||
|
expect(example).toContain("process.env.SECRET_KEY")
|
||||||
|
expect(example).toContain("secret management")
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("getSeverity", () => {
|
||||||
|
it("should always return critical severity", () => {
|
||||||
|
const violation = SecretViolation.create(
|
||||||
|
"src/config/aws.ts",
|
||||||
|
10,
|
||||||
|
15,
|
||||||
|
"AWS Access Key",
|
||||||
|
"test",
|
||||||
|
)
|
||||||
|
|
||||||
|
expect(violation.getSeverity()).toBe("critical")
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return critical severity for all secret types", () => {
|
||||||
|
const types = [
|
||||||
|
"AWS Access Key",
|
||||||
|
"GitHub Token",
|
||||||
|
"NPM Token",
|
||||||
|
"SSH Private Key",
|
||||||
|
"Slack Token",
|
||||||
|
"API Key",
|
||||||
|
]
|
||||||
|
|
||||||
|
types.forEach((type) => {
|
||||||
|
const violation = SecretViolation.create("test.ts", 1, 1, type, "test")
|
||||||
|
expect(violation.getSeverity()).toBe("critical")
|
||||||
|
})
|
||||||
|
})
|
||||||
|
})
|
||||||
|
})
|
||||||
329
packages/guardian/tests/unit/domain/SourceFile.test.ts
Normal file
329
packages/guardian/tests/unit/domain/SourceFile.test.ts
Normal file
@@ -0,0 +1,329 @@
|
|||||||
|
import { describe, it, expect } from "vitest"
|
||||||
|
import { SourceFile } from "../../../src/domain/entities/SourceFile"
|
||||||
|
import { ProjectPath } from "../../../src/domain/value-objects/ProjectPath"
|
||||||
|
import { LAYERS } from "../../../src/shared/constants/rules"
|
||||||
|
|
||||||
|
describe("SourceFile", () => {
|
||||||
|
describe("constructor", () => {
|
||||||
|
it("should create a SourceFile instance with all properties", () => {
|
||||||
|
const path = ProjectPath.create("/project/src/domain/User.ts", "/project")
|
||||||
|
const content = "class User {}"
|
||||||
|
const imports = ["./BaseEntity"]
|
||||||
|
const exports = ["User"]
|
||||||
|
const id = "test-id"
|
||||||
|
|
||||||
|
const sourceFile = new SourceFile(path, content, imports, exports, id)
|
||||||
|
|
||||||
|
expect(sourceFile.path).toBe(path)
|
||||||
|
expect(sourceFile.content).toBe(content)
|
||||||
|
expect(sourceFile.imports).toEqual(imports)
|
||||||
|
expect(sourceFile.exports).toEqual(exports)
|
||||||
|
expect(sourceFile.id).toBe(id)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should create a SourceFile with empty imports and exports by default", () => {
|
||||||
|
const path = ProjectPath.create("/project/src/domain/User.ts", "/project")
|
||||||
|
const content = "class User {}"
|
||||||
|
|
||||||
|
const sourceFile = new SourceFile(path, content)
|
||||||
|
|
||||||
|
expect(sourceFile.imports).toEqual([])
|
||||||
|
expect(sourceFile.exports).toEqual([])
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should generate an id if not provided", () => {
|
||||||
|
const path = ProjectPath.create("/project/src/domain/User.ts", "/project")
|
||||||
|
const content = "class User {}"
|
||||||
|
|
||||||
|
const sourceFile = new SourceFile(path, content)
|
||||||
|
|
||||||
|
expect(sourceFile.id).toBeDefined()
|
||||||
|
expect(typeof sourceFile.id).toBe("string")
|
||||||
|
expect(sourceFile.id.length).toBeGreaterThan(0)
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("layer detection", () => {
|
||||||
|
it("should detect domain layer from path", () => {
|
||||||
|
const path = ProjectPath.create("/project/src/domain/entities/User.ts", "/project")
|
||||||
|
const sourceFile = new SourceFile(path, "")
|
||||||
|
|
||||||
|
expect(sourceFile.layer).toBe(LAYERS.DOMAIN)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should detect application layer from path", () => {
|
||||||
|
const path = ProjectPath.create(
|
||||||
|
"/project/src/application/use-cases/CreateUser.ts",
|
||||||
|
"/project",
|
||||||
|
)
|
||||||
|
const sourceFile = new SourceFile(path, "")
|
||||||
|
|
||||||
|
expect(sourceFile.layer).toBe(LAYERS.APPLICATION)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should detect infrastructure layer from path", () => {
|
||||||
|
const path = ProjectPath.create(
|
||||||
|
"/project/src/infrastructure/database/UserRepository.ts",
|
||||||
|
"/project",
|
||||||
|
)
|
||||||
|
const sourceFile = new SourceFile(path, "")
|
||||||
|
|
||||||
|
expect(sourceFile.layer).toBe(LAYERS.INFRASTRUCTURE)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should detect shared layer from path", () => {
|
||||||
|
const path = ProjectPath.create("/project/src/shared/utils/helpers.ts", "/project")
|
||||||
|
const sourceFile = new SourceFile(path, "")
|
||||||
|
|
||||||
|
expect(sourceFile.layer).toBe(LAYERS.SHARED)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return undefined for unknown layer", () => {
|
||||||
|
const path = ProjectPath.create("/project/src/unknown/Test.ts", "/project")
|
||||||
|
const sourceFile = new SourceFile(path, "")
|
||||||
|
|
||||||
|
expect(sourceFile.layer).toBeUndefined()
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should handle uppercase layer names in path", () => {
|
||||||
|
const path = ProjectPath.create("/project/src/DOMAIN/User.ts", "/project")
|
||||||
|
const sourceFile = new SourceFile(path, "")
|
||||||
|
|
||||||
|
expect(sourceFile.layer).toBe(LAYERS.DOMAIN)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should handle mixed case layer names in path", () => {
|
||||||
|
const path = ProjectPath.create("/project/src/Application/UseCase.ts", "/project")
|
||||||
|
const sourceFile = new SourceFile(path, "")
|
||||||
|
|
||||||
|
expect(sourceFile.layer).toBe(LAYERS.APPLICATION)
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("path getter", () => {
|
||||||
|
it("should return the project path", () => {
|
||||||
|
const path = ProjectPath.create("/project/src/domain/User.ts", "/project")
|
||||||
|
const sourceFile = new SourceFile(path, "")
|
||||||
|
|
||||||
|
expect(sourceFile.path).toBe(path)
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("content getter", () => {
|
||||||
|
it("should return the file content", () => {
|
||||||
|
const path = ProjectPath.create("/project/src/domain/User.ts", "/project")
|
||||||
|
const content = "class User { constructor(public name: string) {} }"
|
||||||
|
const sourceFile = new SourceFile(path, content)
|
||||||
|
|
||||||
|
expect(sourceFile.content).toBe(content)
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("imports getter", () => {
|
||||||
|
it("should return a copy of imports array", () => {
|
||||||
|
const path = ProjectPath.create("/project/src/domain/User.ts", "/project")
|
||||||
|
const imports = ["./BaseEntity", "./ValueObject"]
|
||||||
|
const sourceFile = new SourceFile(path, "", imports)
|
||||||
|
|
||||||
|
const returnedImports = sourceFile.imports
|
||||||
|
|
||||||
|
expect(returnedImports).toEqual(imports)
|
||||||
|
expect(returnedImports).not.toBe(imports)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should not allow mutations of internal imports array", () => {
|
||||||
|
const path = ProjectPath.create("/project/src/domain/User.ts", "/project")
|
||||||
|
const imports = ["./BaseEntity"]
|
||||||
|
const sourceFile = new SourceFile(path, "", imports)
|
||||||
|
|
||||||
|
const returnedImports = sourceFile.imports
|
||||||
|
returnedImports.push("./NewImport")
|
||||||
|
|
||||||
|
expect(sourceFile.imports).toEqual(["./BaseEntity"])
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("exports getter", () => {
|
||||||
|
it("should return a copy of exports array", () => {
|
||||||
|
const path = ProjectPath.create("/project/src/domain/User.ts", "/project")
|
||||||
|
const exports = ["User", "UserProps"]
|
||||||
|
const sourceFile = new SourceFile(path, "", [], exports)
|
||||||
|
|
||||||
|
const returnedExports = sourceFile.exports
|
||||||
|
|
||||||
|
expect(returnedExports).toEqual(exports)
|
||||||
|
expect(returnedExports).not.toBe(exports)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should not allow mutations of internal exports array", () => {
|
||||||
|
const path = ProjectPath.create("/project/src/domain/User.ts", "/project")
|
||||||
|
const exports = ["User"]
|
||||||
|
const sourceFile = new SourceFile(path, "", [], exports)
|
||||||
|
|
||||||
|
const returnedExports = sourceFile.exports
|
||||||
|
returnedExports.push("NewExport")
|
||||||
|
|
||||||
|
expect(sourceFile.exports).toEqual(["User"])
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("addImport", () => {
|
||||||
|
it("should add a new import to the list", () => {
|
||||||
|
const path = ProjectPath.create("/project/src/domain/User.ts", "/project")
|
||||||
|
const sourceFile = new SourceFile(path, "")
|
||||||
|
|
||||||
|
sourceFile.addImport("./BaseEntity")
|
||||||
|
|
||||||
|
expect(sourceFile.imports).toEqual(["./BaseEntity"])
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should not add duplicate imports", () => {
|
||||||
|
const path = ProjectPath.create("/project/src/domain/User.ts", "/project")
|
||||||
|
const sourceFile = new SourceFile(path, "", ["./BaseEntity"])
|
||||||
|
|
||||||
|
sourceFile.addImport("./BaseEntity")
|
||||||
|
|
||||||
|
expect(sourceFile.imports).toEqual(["./BaseEntity"])
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should update updatedAt timestamp when adding new import", () => {
|
||||||
|
const path = ProjectPath.create("/project/src/domain/User.ts", "/project")
|
||||||
|
const sourceFile = new SourceFile(path, "")
|
||||||
|
|
||||||
|
const originalUpdatedAt = sourceFile.updatedAt
|
||||||
|
|
||||||
|
setTimeout(() => {
|
||||||
|
sourceFile.addImport("./BaseEntity")
|
||||||
|
|
||||||
|
expect(sourceFile.updatedAt.getTime()).toBeGreaterThanOrEqual(
|
||||||
|
originalUpdatedAt.getTime(),
|
||||||
|
)
|
||||||
|
}, 10)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should not update timestamp when adding duplicate import", () => {
|
||||||
|
const path = ProjectPath.create("/project/src/domain/User.ts", "/project")
|
||||||
|
const sourceFile = new SourceFile(path, "", ["./BaseEntity"])
|
||||||
|
|
||||||
|
const originalUpdatedAt = sourceFile.updatedAt
|
||||||
|
|
||||||
|
setTimeout(() => {
|
||||||
|
sourceFile.addImport("./BaseEntity")
|
||||||
|
|
||||||
|
expect(sourceFile.updatedAt).toBe(originalUpdatedAt)
|
||||||
|
}, 10)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should add multiple different imports", () => {
|
||||||
|
const path = ProjectPath.create("/project/src/domain/User.ts", "/project")
|
||||||
|
const sourceFile = new SourceFile(path, "")
|
||||||
|
|
||||||
|
sourceFile.addImport("./BaseEntity")
|
||||||
|
sourceFile.addImport("./ValueObject")
|
||||||
|
sourceFile.addImport("./DomainEvent")
|
||||||
|
|
||||||
|
expect(sourceFile.imports).toEqual(["./BaseEntity", "./ValueObject", "./DomainEvent"])
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("addExport", () => {
|
||||||
|
it("should add a new export to the list", () => {
|
||||||
|
const path = ProjectPath.create("/project/src/domain/User.ts", "/project")
|
||||||
|
const sourceFile = new SourceFile(path, "")
|
||||||
|
|
||||||
|
sourceFile.addExport("User")
|
||||||
|
|
||||||
|
expect(sourceFile.exports).toEqual(["User"])
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should not add duplicate exports", () => {
|
||||||
|
const path = ProjectPath.create("/project/src/domain/User.ts", "/project")
|
||||||
|
const sourceFile = new SourceFile(path, "", [], ["User"])
|
||||||
|
|
||||||
|
sourceFile.addExport("User")
|
||||||
|
|
||||||
|
expect(sourceFile.exports).toEqual(["User"])
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should update updatedAt timestamp when adding new export", () => {
|
||||||
|
const path = ProjectPath.create("/project/src/domain/User.ts", "/project")
|
||||||
|
const sourceFile = new SourceFile(path, "")
|
||||||
|
|
||||||
|
const originalUpdatedAt = sourceFile.updatedAt
|
||||||
|
|
||||||
|
setTimeout(() => {
|
||||||
|
sourceFile.addExport("User")
|
||||||
|
|
||||||
|
expect(sourceFile.updatedAt.getTime()).toBeGreaterThanOrEqual(
|
||||||
|
originalUpdatedAt.getTime(),
|
||||||
|
)
|
||||||
|
}, 10)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should not update timestamp when adding duplicate export", () => {
|
||||||
|
const path = ProjectPath.create("/project/src/domain/User.ts", "/project")
|
||||||
|
const sourceFile = new SourceFile(path, "", [], ["User"])
|
||||||
|
|
||||||
|
const originalUpdatedAt = sourceFile.updatedAt
|
||||||
|
|
||||||
|
setTimeout(() => {
|
||||||
|
sourceFile.addExport("User")
|
||||||
|
|
||||||
|
expect(sourceFile.updatedAt).toBe(originalUpdatedAt)
|
||||||
|
}, 10)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should add multiple different exports", () => {
|
||||||
|
const path = ProjectPath.create("/project/src/domain/User.ts", "/project")
|
||||||
|
const sourceFile = new SourceFile(path, "")
|
||||||
|
|
||||||
|
sourceFile.addExport("User")
|
||||||
|
sourceFile.addExport("UserProps")
|
||||||
|
sourceFile.addExport("UserFactory")
|
||||||
|
|
||||||
|
expect(sourceFile.exports).toEqual(["User", "UserProps", "UserFactory"])
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("importsFrom", () => {
|
||||||
|
it("should return true if imports contain the specified layer", () => {
|
||||||
|
const path = ProjectPath.create("/project/src/application/User.ts", "/project")
|
||||||
|
const imports = ["../../domain/entities/User", "../use-cases/CreateUser"]
|
||||||
|
const sourceFile = new SourceFile(path, "", imports)
|
||||||
|
|
||||||
|
expect(sourceFile.importsFrom("domain")).toBe(true)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return false if imports do not contain the specified layer", () => {
|
||||||
|
const path = ProjectPath.create("/project/src/application/User.ts", "/project")
|
||||||
|
const imports = ["../use-cases/CreateUser", "../dtos/UserDto"]
|
||||||
|
const sourceFile = new SourceFile(path, "", imports)
|
||||||
|
|
||||||
|
expect(sourceFile.importsFrom("domain")).toBe(false)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should be case-insensitive", () => {
|
||||||
|
const path = ProjectPath.create("/project/src/application/User.ts", "/project")
|
||||||
|
const imports = ["../../DOMAIN/entities/User"]
|
||||||
|
const sourceFile = new SourceFile(path, "", imports)
|
||||||
|
|
||||||
|
expect(sourceFile.importsFrom("domain")).toBe(true)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return false for empty imports", () => {
|
||||||
|
const path = ProjectPath.create("/project/src/application/User.ts", "/project")
|
||||||
|
const sourceFile = new SourceFile(path, "")
|
||||||
|
|
||||||
|
expect(sourceFile.importsFrom("domain")).toBe(false)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should handle partial matches in import paths", () => {
|
||||||
|
const path = ProjectPath.create("/project/src/application/User.ts", "/project")
|
||||||
|
const imports = ["../../infrastructure/database/UserRepository"]
|
||||||
|
const sourceFile = new SourceFile(path, "", imports)
|
||||||
|
|
||||||
|
expect(sourceFile.importsFrom("infrastructure")).toBe(true)
|
||||||
|
expect(sourceFile.importsFrom("domain")).toBe(false)
|
||||||
|
})
|
||||||
|
})
|
||||||
|
})
|
||||||
199
packages/guardian/tests/unit/domain/ValueObject.test.ts
Normal file
199
packages/guardian/tests/unit/domain/ValueObject.test.ts
Normal file
@@ -0,0 +1,199 @@
|
|||||||
|
import { describe, it, expect } from "vitest"
|
||||||
|
import { ValueObject } from "../../../src/domain/value-objects/ValueObject"
|
||||||
|
|
||||||
|
interface TestProps {
|
||||||
|
readonly value: string
|
||||||
|
readonly count: number
|
||||||
|
}
|
||||||
|
|
||||||
|
class TestValueObject extends ValueObject<TestProps> {
|
||||||
|
constructor(value: string, count: number) {
|
||||||
|
super({ value, count })
|
||||||
|
}
|
||||||
|
|
||||||
|
public get value(): string {
|
||||||
|
return this.props.value
|
||||||
|
}
|
||||||
|
|
||||||
|
public get count(): number {
|
||||||
|
return this.props.count
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
interface ComplexProps {
|
||||||
|
readonly name: string
|
||||||
|
readonly items: string[]
|
||||||
|
readonly metadata: { key: string; value: number }
|
||||||
|
}
|
||||||
|
|
||||||
|
class ComplexValueObject extends ValueObject<ComplexProps> {
|
||||||
|
constructor(name: string, items: string[], metadata: { key: string; value: number }) {
|
||||||
|
super({ name, items, metadata })
|
||||||
|
}
|
||||||
|
|
||||||
|
public get name(): string {
|
||||||
|
return this.props.name
|
||||||
|
}
|
||||||
|
|
||||||
|
public get items(): string[] {
|
||||||
|
return this.props.items
|
||||||
|
}
|
||||||
|
|
||||||
|
public get metadata(): { key: string; value: number } {
|
||||||
|
return this.props.metadata
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
describe("ValueObject", () => {
|
||||||
|
describe("constructor", () => {
|
||||||
|
it("should create a value object with provided properties", () => {
|
||||||
|
const vo = new TestValueObject("test", 42)
|
||||||
|
|
||||||
|
expect(vo.value).toBe("test")
|
||||||
|
expect(vo.count).toBe(42)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should freeze the properties object", () => {
|
||||||
|
const vo = new TestValueObject("test", 42)
|
||||||
|
|
||||||
|
expect(Object.isFrozen(vo["props"])).toBe(true)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should prevent modification of properties", () => {
|
||||||
|
const vo = new TestValueObject("test", 42)
|
||||||
|
|
||||||
|
expect(() => {
|
||||||
|
;(vo["props"] as any).value = "modified"
|
||||||
|
}).toThrow()
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should handle complex nested properties", () => {
|
||||||
|
const vo = new ComplexValueObject("test", ["item1", "item2"], {
|
||||||
|
key: "key1",
|
||||||
|
value: 100,
|
||||||
|
})
|
||||||
|
|
||||||
|
expect(vo.name).toBe("test")
|
||||||
|
expect(vo.items).toEqual(["item1", "item2"])
|
||||||
|
expect(vo.metadata).toEqual({ key: "key1", value: 100 })
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("equals", () => {
|
||||||
|
it("should return true for value objects with identical properties", () => {
|
||||||
|
const vo1 = new TestValueObject("test", 42)
|
||||||
|
const vo2 = new TestValueObject("test", 42)
|
||||||
|
|
||||||
|
expect(vo1.equals(vo2)).toBe(true)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return false for value objects with different values", () => {
|
||||||
|
const vo1 = new TestValueObject("test1", 42)
|
||||||
|
const vo2 = new TestValueObject("test2", 42)
|
||||||
|
|
||||||
|
expect(vo1.equals(vo2)).toBe(false)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return false for value objects with different counts", () => {
|
||||||
|
const vo1 = new TestValueObject("test", 42)
|
||||||
|
const vo2 = new TestValueObject("test", 43)
|
||||||
|
|
||||||
|
expect(vo1.equals(vo2)).toBe(false)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return false when comparing with undefined", () => {
|
||||||
|
const vo1 = new TestValueObject("test", 42)
|
||||||
|
|
||||||
|
expect(vo1.equals(undefined)).toBe(false)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return false when comparing with null", () => {
|
||||||
|
const vo1 = new TestValueObject("test", 42)
|
||||||
|
|
||||||
|
expect(vo1.equals(null as any)).toBe(false)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should handle complex nested property comparisons", () => {
|
||||||
|
const vo1 = new ComplexValueObject("test", ["item1", "item2"], {
|
||||||
|
key: "key1",
|
||||||
|
value: 100,
|
||||||
|
})
|
||||||
|
const vo2 = new ComplexValueObject("test", ["item1", "item2"], {
|
||||||
|
key: "key1",
|
||||||
|
value: 100,
|
||||||
|
})
|
||||||
|
|
||||||
|
expect(vo1.equals(vo2)).toBe(true)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should detect differences in nested arrays", () => {
|
||||||
|
const vo1 = new ComplexValueObject("test", ["item1", "item2"], {
|
||||||
|
key: "key1",
|
||||||
|
value: 100,
|
||||||
|
})
|
||||||
|
const vo2 = new ComplexValueObject("test", ["item1", "item3"], {
|
||||||
|
key: "key1",
|
||||||
|
value: 100,
|
||||||
|
})
|
||||||
|
|
||||||
|
expect(vo1.equals(vo2)).toBe(false)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should detect differences in nested objects", () => {
|
||||||
|
const vo1 = new ComplexValueObject("test", ["item1", "item2"], {
|
||||||
|
key: "key1",
|
||||||
|
value: 100,
|
||||||
|
})
|
||||||
|
const vo2 = new ComplexValueObject("test", ["item1", "item2"], {
|
||||||
|
key: "key2",
|
||||||
|
value: 100,
|
||||||
|
})
|
||||||
|
|
||||||
|
expect(vo1.equals(vo2)).toBe(false)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return true for same instance", () => {
|
||||||
|
const vo1 = new TestValueObject("test", 42)
|
||||||
|
|
||||||
|
expect(vo1.equals(vo1)).toBe(true)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should handle empty string values", () => {
|
||||||
|
const vo1 = new TestValueObject("", 0)
|
||||||
|
const vo2 = new TestValueObject("", 0)
|
||||||
|
|
||||||
|
expect(vo1.equals(vo2)).toBe(true)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should distinguish between zero and undefined in comparisons", () => {
|
||||||
|
const vo1 = new TestValueObject("test", 0)
|
||||||
|
const vo2 = new TestValueObject("test", 0)
|
||||||
|
|
||||||
|
expect(vo1.equals(vo2)).toBe(true)
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("immutability", () => {
|
||||||
|
it("should freeze props object after creation", () => {
|
||||||
|
const vo = new TestValueObject("original", 42)
|
||||||
|
|
||||||
|
expect(Object.isFrozen(vo["props"])).toBe(true)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should not allow adding new properties", () => {
|
||||||
|
const vo = new TestValueObject("test", 42)
|
||||||
|
|
||||||
|
expect(() => {
|
||||||
|
;(vo["props"] as any).newProp = "new"
|
||||||
|
}).toThrow()
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should not allow deleting properties", () => {
|
||||||
|
const vo = new TestValueObject("test", 42)
|
||||||
|
|
||||||
|
expect(() => {
|
||||||
|
delete (vo["props"] as any).value
|
||||||
|
}).toThrow()
|
||||||
|
})
|
||||||
|
})
|
||||||
|
})
|
||||||
@@ -468,4 +468,102 @@ const b = 2`
|
|||||||
expect(result[0].context).toContain("5000")
|
expect(result[0].context).toContain("5000")
|
||||||
})
|
})
|
||||||
})
|
})
|
||||||
|
|
||||||
|
describe("TypeScript type contexts (false positive reduction)", () => {
|
||||||
|
it("should NOT detect strings in union types", () => {
|
||||||
|
const code = `type Status = 'active' | 'inactive' | 'pending'`
|
||||||
|
const result = detector.detectMagicStrings(code, "test.ts")
|
||||||
|
|
||||||
|
expect(result).toHaveLength(0)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should NOT detect strings in interface property types", () => {
|
||||||
|
const code = `interface Config { mode: 'development' | 'production' }`
|
||||||
|
const result = detector.detectMagicStrings(code, "test.ts")
|
||||||
|
|
||||||
|
expect(result).toHaveLength(0)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should NOT detect strings in type aliases", () => {
|
||||||
|
const code = `type Theme = 'light' | 'dark'`
|
||||||
|
const result = detector.detectMagicStrings(code, "test.ts")
|
||||||
|
|
||||||
|
expect(result).toHaveLength(0)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should NOT detect strings in type assertions", () => {
|
||||||
|
const code = `const mode = getMode() as 'read' | 'write'`
|
||||||
|
const result = detector.detectMagicStrings(code, "test.ts")
|
||||||
|
|
||||||
|
expect(result).toHaveLength(0)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should NOT detect strings in Symbol() calls", () => {
|
||||||
|
const code = `const TOKEN = Symbol('MY_TOKEN')`
|
||||||
|
const result = detector.detectMagicStrings(code, "test.ts")
|
||||||
|
|
||||||
|
expect(result).toHaveLength(0)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should NOT detect strings in multiple Symbol() calls", () => {
|
||||||
|
const code = `
|
||||||
|
export const LOGGER = Symbol('LOGGER')
|
||||||
|
export const DATABASE = Symbol('DATABASE')
|
||||||
|
export const CACHE = Symbol('CACHE')
|
||||||
|
`
|
||||||
|
const result = detector.detectMagicStrings(code, "test.ts")
|
||||||
|
|
||||||
|
expect(result).toHaveLength(0)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should NOT detect strings in import() calls", () => {
|
||||||
|
const code = `const module = import('../../path/to/module.js')`
|
||||||
|
const result = detector.detectMagicStrings(code, "test.ts")
|
||||||
|
|
||||||
|
expect(result).toHaveLength(0)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should NOT detect strings in typeof checks", () => {
|
||||||
|
const code = `if (typeof x === 'string') { }`
|
||||||
|
const result = detector.detectMagicStrings(code, "test.ts")
|
||||||
|
|
||||||
|
expect(result).toHaveLength(0)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should NOT detect strings in reverse typeof checks", () => {
|
||||||
|
const code = `if ('number' === typeof count) { }`
|
||||||
|
const result = detector.detectMagicStrings(code, "test.ts")
|
||||||
|
|
||||||
|
expect(result).toHaveLength(0)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should skip tokens.ts files completely", () => {
|
||||||
|
const code = `
|
||||||
|
export const LOGGER = Symbol('LOGGER')
|
||||||
|
export const DATABASE = Symbol('DATABASE')
|
||||||
|
const url = "http://localhost:8080"
|
||||||
|
`
|
||||||
|
const result = detector.detectAll(code, "src/di/tokens.ts")
|
||||||
|
|
||||||
|
expect(result).toHaveLength(0)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should skip tokens.js files completely", () => {
|
||||||
|
const code = `const TOKEN = Symbol('TOKEN')`
|
||||||
|
const result = detector.detectAll(code, "src/di/tokens.js")
|
||||||
|
|
||||||
|
expect(result).toHaveLength(0)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should detect real magic strings even with type contexts nearby", () => {
|
||||||
|
const code = `
|
||||||
|
type Mode = 'read' | 'write'
|
||||||
|
const apiKey = "secret-key-12345"
|
||||||
|
`
|
||||||
|
const result = detector.detectMagicStrings(code, "test.ts")
|
||||||
|
|
||||||
|
expect(result.length).toBeGreaterThan(0)
|
||||||
|
expect(result.some((r) => r.value === "secret-key-12345")).toBe(true)
|
||||||
|
})
|
||||||
|
})
|
||||||
})
|
})
|
||||||
|
|||||||
@@ -0,0 +1,277 @@
|
|||||||
|
import { describe, it, expect, beforeEach } from "vitest"
|
||||||
|
import { SecretDetector } from "../../../src/infrastructure/analyzers/SecretDetector"
|
||||||
|
|
||||||
|
describe("SecretDetector", () => {
|
||||||
|
let detector: SecretDetector
|
||||||
|
|
||||||
|
beforeEach(() => {
|
||||||
|
detector = new SecretDetector()
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("detectAll", () => {
|
||||||
|
it("should return empty array for code without secrets", async () => {
|
||||||
|
const code = `
|
||||||
|
const greeting = "Hello World"
|
||||||
|
const count = 42
|
||||||
|
function test() {
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
`
|
||||||
|
|
||||||
|
const violations = await detector.detectAll(code, "test.ts")
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(0)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return empty array for normal environment variable usage", async () => {
|
||||||
|
const code = `
|
||||||
|
const apiKey = process.env.API_KEY
|
||||||
|
const dbUrl = process.env.DATABASE_URL
|
||||||
|
`
|
||||||
|
|
||||||
|
const violations = await detector.detectAll(code, "config.ts")
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(0)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should handle empty code", async () => {
|
||||||
|
const violations = await detector.detectAll("", "empty.ts")
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(0)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should handle code with only comments", async () => {
|
||||||
|
const code = `
|
||||||
|
// This is a comment
|
||||||
|
/* Multi-line
|
||||||
|
comment */
|
||||||
|
`
|
||||||
|
|
||||||
|
const violations = await detector.detectAll(code, "comments.ts")
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(0)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should handle multiline strings without secrets", async () => {
|
||||||
|
const code = `
|
||||||
|
const template = \`
|
||||||
|
Hello World
|
||||||
|
This is a test
|
||||||
|
No secrets here
|
||||||
|
\`
|
||||||
|
`
|
||||||
|
|
||||||
|
const violations = await detector.detectAll(code, "template.ts")
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(0)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should handle code with URLs", async () => {
|
||||||
|
const code = `
|
||||||
|
const apiUrl = "https://api.example.com/v1"
|
||||||
|
const websiteUrl = "http://localhost:3000"
|
||||||
|
`
|
||||||
|
|
||||||
|
const violations = await detector.detectAll(code, "urls.ts")
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(0)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should handle imports and requires", async () => {
|
||||||
|
const code = `
|
||||||
|
import { something } from "some-package"
|
||||||
|
const fs = require('fs')
|
||||||
|
`
|
||||||
|
|
||||||
|
const violations = await detector.detectAll(code, "imports.ts")
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(0)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should return violations with correct file path", async () => {
|
||||||
|
const code = `const secret = "test-secret-value"`
|
||||||
|
const filePath = "src/config/secrets.ts"
|
||||||
|
|
||||||
|
const violations = await detector.detectAll(code, filePath)
|
||||||
|
|
||||||
|
violations.forEach((v) => {
|
||||||
|
expect(v.file).toBe(filePath)
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should handle .js files", async () => {
|
||||||
|
const code = `const test = "value"`
|
||||||
|
|
||||||
|
const violations = await detector.detectAll(code, "test.js")
|
||||||
|
|
||||||
|
expect(violations).toBeInstanceOf(Array)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should handle .jsx files", async () => {
|
||||||
|
const code = `const Component = () => <div>Test</div>`
|
||||||
|
|
||||||
|
const violations = await detector.detectAll(code, "Component.jsx")
|
||||||
|
|
||||||
|
expect(violations).toBeInstanceOf(Array)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should handle .tsx files", async () => {
|
||||||
|
const code = `const Component: React.FC = () => <div>Test</div>`
|
||||||
|
|
||||||
|
const violations = await detector.detectAll(code, "Component.tsx")
|
||||||
|
|
||||||
|
expect(violations).toBeInstanceOf(Array)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should handle errors gracefully", async () => {
|
||||||
|
const code = null as unknown as string
|
||||||
|
|
||||||
|
const violations = await detector.detectAll(code, "test.ts")
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(0)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should handle malformed code gracefully", async () => {
|
||||||
|
const code = "const = = ="
|
||||||
|
|
||||||
|
const violations = await detector.detectAll(code, "malformed.ts")
|
||||||
|
|
||||||
|
expect(violations).toBeInstanceOf(Array)
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("parseOutputToViolations", () => {
|
||||||
|
it("should parse empty output", async () => {
|
||||||
|
const code = ""
|
||||||
|
|
||||||
|
const violations = await detector.detectAll(code, "test.ts")
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(0)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should handle whitespace-only output", async () => {
|
||||||
|
const code = " \n \n "
|
||||||
|
|
||||||
|
const violations = await detector.detectAll(code, "test.ts")
|
||||||
|
|
||||||
|
expect(violations).toHaveLength(0)
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("extractSecretType", () => {
|
||||||
|
it("should handle various secret types correctly", async () => {
|
||||||
|
const code = `const value = "test"`
|
||||||
|
|
||||||
|
const violations = await detector.detectAll(code, "test.ts")
|
||||||
|
|
||||||
|
violations.forEach((v) => {
|
||||||
|
expect(v.secretType).toBeTruthy()
|
||||||
|
expect(typeof v.secretType).toBe("string")
|
||||||
|
expect(v.secretType.length).toBeGreaterThan(0)
|
||||||
|
})
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("integration", () => {
|
||||||
|
it("should work with TypeScript code", async () => {
|
||||||
|
const code = `
|
||||||
|
interface Config {
|
||||||
|
apiKey: string
|
||||||
|
}
|
||||||
|
|
||||||
|
const config: Config = {
|
||||||
|
apiKey: process.env.API_KEY || "default"
|
||||||
|
}
|
||||||
|
`
|
||||||
|
|
||||||
|
const violations = await detector.detectAll(code, "config.ts")
|
||||||
|
|
||||||
|
expect(violations).toBeInstanceOf(Array)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should work with ES6+ syntax", async () => {
|
||||||
|
const code = `
|
||||||
|
const fetchData = async () => {
|
||||||
|
const response = await fetch(url)
|
||||||
|
return response.json()
|
||||||
|
}
|
||||||
|
|
||||||
|
const [data, setData] = useState(null)
|
||||||
|
`
|
||||||
|
|
||||||
|
const violations = await detector.detectAll(code, "hooks.ts")
|
||||||
|
|
||||||
|
expect(violations).toBeInstanceOf(Array)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should work with JSX/TSX", async () => {
|
||||||
|
const code = `
|
||||||
|
export const Button = ({ onClick }: Props) => {
|
||||||
|
return <button onClick={onClick}>Click me</button>
|
||||||
|
}
|
||||||
|
`
|
||||||
|
|
||||||
|
const violations = await detector.detectAll(code, "Button.tsx")
|
||||||
|
|
||||||
|
expect(violations).toBeInstanceOf(Array)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should handle concurrent detections", async () => {
|
||||||
|
const code1 = "const test1 = 'value1'"
|
||||||
|
const code2 = "const test2 = 'value2'"
|
||||||
|
const code3 = "const test3 = 'value3'"
|
||||||
|
|
||||||
|
const [result1, result2, result3] = await Promise.all([
|
||||||
|
detector.detectAll(code1, "file1.ts"),
|
||||||
|
detector.detectAll(code2, "file2.ts"),
|
||||||
|
detector.detectAll(code3, "file3.ts"),
|
||||||
|
])
|
||||||
|
|
||||||
|
expect(result1).toBeInstanceOf(Array)
|
||||||
|
expect(result2).toBeInstanceOf(Array)
|
||||||
|
expect(result3).toBeInstanceOf(Array)
|
||||||
|
})
|
||||||
|
})
|
||||||
|
|
||||||
|
describe("edge cases", () => {
|
||||||
|
it("should handle very long code", async () => {
|
||||||
|
const longCode = "const value = 'test'\n".repeat(1000)
|
||||||
|
|
||||||
|
const violations = await detector.detectAll(longCode, "long.ts")
|
||||||
|
|
||||||
|
expect(violations).toBeInstanceOf(Array)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should handle special characters in code", async () => {
|
||||||
|
const code = `
|
||||||
|
const special = "!@#$%^&*()_+-=[]{}|;:',.<>?"
|
||||||
|
const unicode = "日本語 🚀"
|
||||||
|
`
|
||||||
|
|
||||||
|
const violations = await detector.detectAll(code, "special.ts")
|
||||||
|
|
||||||
|
expect(violations).toBeInstanceOf(Array)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should handle code with regex patterns", async () => {
|
||||||
|
const code = `
|
||||||
|
const pattern = /^[A-Z0-9._%+-]+@[A-Z0-9.-]+\\.[A-Z]{2,}$/i
|
||||||
|
const matches = text.match(pattern)
|
||||||
|
`
|
||||||
|
|
||||||
|
const violations = await detector.detectAll(code, "regex.ts")
|
||||||
|
|
||||||
|
expect(violations).toBeInstanceOf(Array)
|
||||||
|
})
|
||||||
|
|
||||||
|
it("should handle code with template literals", async () => {
|
||||||
|
const code = `
|
||||||
|
const message = \`Hello \${name}, your balance is \${balance}\`
|
||||||
|
`
|
||||||
|
|
||||||
|
const violations = await detector.detectAll(code, "template.ts")
|
||||||
|
|
||||||
|
expect(violations).toBeInstanceOf(Array)
|
||||||
|
})
|
||||||
|
})
|
||||||
|
})
|
||||||
315
pnpm-lock.yaml
generated
315
pnpm-lock.yaml
generated
@@ -80,6 +80,18 @@ importers:
|
|||||||
|
|
||||||
packages/guardian:
|
packages/guardian:
|
||||||
dependencies:
|
dependencies:
|
||||||
|
'@secretlint/core':
|
||||||
|
specifier: ^11.2.5
|
||||||
|
version: 11.2.5
|
||||||
|
'@secretlint/node':
|
||||||
|
specifier: ^11.2.5
|
||||||
|
version: 11.2.5
|
||||||
|
'@secretlint/secretlint-rule-preset-recommend':
|
||||||
|
specifier: ^11.2.5
|
||||||
|
version: 11.2.5
|
||||||
|
'@secretlint/types':
|
||||||
|
specifier: ^11.2.5
|
||||||
|
version: 11.2.5
|
||||||
commander:
|
commander:
|
||||||
specifier: ^12.1.0
|
specifier: ^12.1.0
|
||||||
version: 12.1.0
|
version: 12.1.0
|
||||||
@@ -154,6 +166,12 @@ packages:
|
|||||||
resolution: {integrity: sha512-J4Jarr0SohdrHcb40gTL4wGPCQ952IMWF1G/MSAQfBAPvA9ZKApYhpxcY7PmehVePve+ujpus1dGsJ7dPxz8Kg==}
|
resolution: {integrity: sha512-J4Jarr0SohdrHcb40gTL4wGPCQ952IMWF1G/MSAQfBAPvA9ZKApYhpxcY7PmehVePve+ujpus1dGsJ7dPxz8Kg==}
|
||||||
engines: {node: ^18.19.1 || ^20.11.1 || >=22.0.0, npm: ^6.11.0 || ^7.5.6 || >=8.0.0, yarn: '>= 1.13.0'}
|
engines: {node: ^18.19.1 || ^20.11.1 || >=22.0.0, npm: ^6.11.0 || ^7.5.6 || >=8.0.0, yarn: '>= 1.13.0'}
|
||||||
|
|
||||||
|
'@azu/format-text@1.0.2':
|
||||||
|
resolution: {integrity: sha512-Swi4N7Edy1Eqq82GxgEECXSSLyn6GOb5htRFPzBDdUkECGXtlf12ynO5oJSpWKPwCaUssOu7NfhDcCWpIC6Ywg==}
|
||||||
|
|
||||||
|
'@azu/style-format@1.0.1':
|
||||||
|
resolution: {integrity: sha512-AHcTojlNBdD/3/KxIKlg8sxIWHfOtQszLvOpagLTO+bjC3u7SAszu1lf//u7JJC50aUSH+BVWDD/KvaA6Gfn5g==}
|
||||||
|
|
||||||
'@babel/code-frame@7.27.1':
|
'@babel/code-frame@7.27.1':
|
||||||
resolution: {integrity: sha512-cjQ7ZlQ0Mv3b47hABuTevyTuYN4i+loJKGeV9flcCgIK37cCXRh+L1bd3iBHlynerhQ7BhCkn2BPbQUL+rGqFg==}
|
resolution: {integrity: sha512-cjQ7ZlQ0Mv3b47hABuTevyTuYN4i+loJKGeV9flcCgIK37cCXRh+L1bd3iBHlynerhQ7BhCkn2BPbQUL+rGqFg==}
|
||||||
engines: {node: '>=6.9.0'}
|
engines: {node: '>=6.9.0'}
|
||||||
@@ -1040,6 +1058,40 @@ packages:
|
|||||||
cpu: [x64]
|
cpu: [x64]
|
||||||
os: [win32]
|
os: [win32]
|
||||||
|
|
||||||
|
'@secretlint/config-loader@11.2.5':
|
||||||
|
resolution: {integrity: sha512-pUiH5xc3x8RLEDq+0dCz65v4kohtfp68I7qmYPuymTwHodzjyJ089ZbNdN1ZX8SZV4xZLQsFIrRLn1lJ55QyyQ==}
|
||||||
|
engines: {node: '>=20.0.0'}
|
||||||
|
|
||||||
|
'@secretlint/core@11.2.5':
|
||||||
|
resolution: {integrity: sha512-PZNpBd6+KVya2tA3o1oC2kTWYKju8lZG9phXyQY7geWKf+a+fInN4/HSYfCQS495oyTSjhc9qI0mNQEw83PY2Q==}
|
||||||
|
engines: {node: '>=20.0.0'}
|
||||||
|
|
||||||
|
'@secretlint/formatter@11.2.5':
|
||||||
|
resolution: {integrity: sha512-9XBMeveo1eKXMC9zLjA6nd2lb5JjUgjj8NUpCo1Il8jO4YJ12k7qXZk3T/QJup+Kh0ThpHO03D9C1xLDIPIEPQ==}
|
||||||
|
engines: {node: '>=20.0.0'}
|
||||||
|
|
||||||
|
'@secretlint/node@11.2.5':
|
||||||
|
resolution: {integrity: sha512-nPdtUsTzDzBJzFiKh80/H5+2ZRRogtDuHhnNiGtF7LSHp8YsQHU5piAVbESdV0AmUjbWijAjscIsWqvtU+2JUQ==}
|
||||||
|
engines: {node: '>=20.0.0'}
|
||||||
|
|
||||||
|
'@secretlint/profiler@11.2.5':
|
||||||
|
resolution: {integrity: sha512-evQ2PeO3Ub0apWIPaXJy8lMDO1OFgvgQhZd+MhYLcLHgR559EtJ9V02Sh5c10wTLkLAtJ+czlJg2kmlt0nm8fw==}
|
||||||
|
|
||||||
|
'@secretlint/resolver@11.2.5':
|
||||||
|
resolution: {integrity: sha512-Zn9+Gj7cRNjEDX8d1NYZNjTG9/Wjlc8N+JvARFYYYu6JxfbtkabhFxzwxBLkRZ2ZCkPCCnuXJwepcgfVXSPsng==}
|
||||||
|
|
||||||
|
'@secretlint/secretlint-rule-preset-recommend@11.2.5':
|
||||||
|
resolution: {integrity: sha512-FAnp/dPdbvHEw50aF9JMPF/OwW58ULvVXEsk+mXTtBD09VJZhG0vFum8WzxMbB98Eo4xDddGzYtE3g27pBOaQA==}
|
||||||
|
engines: {node: '>=20.0.0'}
|
||||||
|
|
||||||
|
'@secretlint/source-creator@11.2.5':
|
||||||
|
resolution: {integrity: sha512-+ApoNDS4uIaLb2PG9PPEP9Zu1HDBWpxSd/+Qlb3MzKTwp2BG9sbUhvpGgxuIHFn7pMWQU60DhzYJJUBpbXZEHQ==}
|
||||||
|
engines: {node: '>=20.0.0'}
|
||||||
|
|
||||||
|
'@secretlint/types@11.2.5':
|
||||||
|
resolution: {integrity: sha512-iA7E+uXuiEydOwv8glEYM4tCHnl8C7wTgLxg+3upHhH/iSSnefWfoRqrJwVBhwxPg4MDoypVI7Oal7bX7/ne+w==}
|
||||||
|
engines: {node: '>=20.0.0'}
|
||||||
|
|
||||||
'@sinclair/typebox@0.34.41':
|
'@sinclair/typebox@0.34.41':
|
||||||
resolution: {integrity: sha512-6gS8pZzSXdyRHTIqoqSVknxolr1kzfy4/CeDnrzsVz8TTIWUbOBr6gnzOmTYJ3eXQNh4IYHIGi5aIL7sOZ2G/g==}
|
resolution: {integrity: sha512-6gS8pZzSXdyRHTIqoqSVknxolr1kzfy4/CeDnrzsVz8TTIWUbOBr6gnzOmTYJ3eXQNh4IYHIGi5aIL7sOZ2G/g==}
|
||||||
|
|
||||||
@@ -1052,6 +1104,21 @@ packages:
|
|||||||
'@standard-schema/spec@1.0.0':
|
'@standard-schema/spec@1.0.0':
|
||||||
resolution: {integrity: sha512-m2bOd0f2RT9k8QJx1JN85cZYyH1RqFBdlwtkSlf4tBDYLCiiZnv1fIIwacK6cqwXavOydf0NPToMQgpKq+dVlA==}
|
resolution: {integrity: sha512-m2bOd0f2RT9k8QJx1JN85cZYyH1RqFBdlwtkSlf4tBDYLCiiZnv1fIIwacK6cqwXavOydf0NPToMQgpKq+dVlA==}
|
||||||
|
|
||||||
|
'@textlint/ast-node-types@15.4.0':
|
||||||
|
resolution: {integrity: sha512-IqY8i7IOGuvy05wZxISB7Me1ZyrvhaQGgx6DavfQjH3cfwpPFdDbDYmMXMuSv2xLS1kDB1kYKBV7fL2Vi16lRA==}
|
||||||
|
|
||||||
|
'@textlint/linter-formatter@15.4.0':
|
||||||
|
resolution: {integrity: sha512-rfqOZmnI1Wwc/Pa4LK+vagvVPmvxf9oRsBRqIOB04DwhucingZyAIJI/TyG18DIDYbP2aFXBZ3oOvyAxHe/8PQ==}
|
||||||
|
|
||||||
|
'@textlint/module-interop@15.4.0':
|
||||||
|
resolution: {integrity: sha512-uGf+SFIfzOLCbZI0gp+2NLsrkSArsvEWulPP6lJuKp7yRHadmy7Xf/YHORe46qhNyyxc8PiAfiixHJSaHGUrGg==}
|
||||||
|
|
||||||
|
'@textlint/resolver@15.4.0':
|
||||||
|
resolution: {integrity: sha512-Vh/QceKZQHFJFG4GxxIsKM1Xhwv93mbtKHmFE5/ybal1mIKHdqF03Z9Guaqt6Sx/AeNUshq0hkMOEhEyEWnehQ==}
|
||||||
|
|
||||||
|
'@textlint/types@15.4.0':
|
||||||
|
resolution: {integrity: sha512-ZMwJgw/xjxJufOD+IB7I2Enl9Si4Hxo04B76RwUZ5cKBKzOPcmd6WvGe2F7jqdgmTdGnfMU+Bo/joQrjPNIWqg==}
|
||||||
|
|
||||||
'@tokenizer/inflate@0.3.1':
|
'@tokenizer/inflate@0.3.1':
|
||||||
resolution: {integrity: sha512-4oeoZEBQdLdt5WmP/hx1KZ6D3/Oid/0cUb2nk4F0pTDAWy+KCH3/EnAkZF/bvckWo8I33EqBm01lIPgmgc8rCA==}
|
resolution: {integrity: sha512-4oeoZEBQdLdt5WmP/hx1KZ6D3/Oid/0cUb2nk4F0pTDAWy+KCH3/EnAkZF/bvckWo8I33EqBm01lIPgmgc8rCA==}
|
||||||
engines: {node: '>=18'}
|
engines: {node: '>=18'}
|
||||||
@@ -1488,6 +1555,10 @@ packages:
|
|||||||
resolution: {integrity: sha512-gKXj5ALrKWQLsYG9jlTRmR/xKluxHV+Z9QEwNIgCfM1/uwPMCuzVVnh5mwTd+OuBZcwSIMbqssNWRm1lE51QaQ==}
|
resolution: {integrity: sha512-gKXj5ALrKWQLsYG9jlTRmR/xKluxHV+Z9QEwNIgCfM1/uwPMCuzVVnh5mwTd+OuBZcwSIMbqssNWRm1lE51QaQ==}
|
||||||
engines: {node: '>=8'}
|
engines: {node: '>=8'}
|
||||||
|
|
||||||
|
ansi-escapes@7.2.0:
|
||||||
|
resolution: {integrity: sha512-g6LhBsl+GBPRWGWsBtutpzBYuIIdBkLEvad5C/va/74Db018+5TZiyA26cZJAr3Rft5lprVqOIPxf5Vid6tqAw==}
|
||||||
|
engines: {node: '>=18'}
|
||||||
|
|
||||||
ansi-regex@5.0.1:
|
ansi-regex@5.0.1:
|
||||||
resolution: {integrity: sha512-quJQXlTSUGL2LH9SUXo8VwsY4soanhgo6LNSm84E1LBcE8s3O0wpdiRzyR9z/ZZJMlMWv37qOOb9pdJlMUEKFQ==}
|
resolution: {integrity: sha512-quJQXlTSUGL2LH9SUXo8VwsY4soanhgo6LNSm84E1LBcE8s3O0wpdiRzyR9z/ZZJMlMWv37qOOb9pdJlMUEKFQ==}
|
||||||
engines: {node: '>=8'}
|
engines: {node: '>=8'}
|
||||||
@@ -1538,6 +1609,10 @@ packages:
|
|||||||
ast-v8-to-istanbul@0.3.8:
|
ast-v8-to-istanbul@0.3.8:
|
||||||
resolution: {integrity: sha512-szgSZqUxI5T8mLKvS7WTjF9is+MVbOeLADU73IseOcrqhxr/VAvy6wfoVE39KnKzA7JRhjF5eUagNlHwvZPlKQ==}
|
resolution: {integrity: sha512-szgSZqUxI5T8mLKvS7WTjF9is+MVbOeLADU73IseOcrqhxr/VAvy6wfoVE39KnKzA7JRhjF5eUagNlHwvZPlKQ==}
|
||||||
|
|
||||||
|
astral-regex@2.0.0:
|
||||||
|
resolution: {integrity: sha512-Z7tMw1ytTXt5jqMcOP+OQteU1VuNK9Y02uuJtKQ1Sv69jXQKKg5cibLwGJow8yzZP+eAc18EmLGPal0bp36rvQ==}
|
||||||
|
engines: {node: '>=8'}
|
||||||
|
|
||||||
asynckit@0.4.0:
|
asynckit@0.4.0:
|
||||||
resolution: {integrity: sha512-Oei9OH4tRh0YqU3GxhX79dM/mwVgvbZJaSNaRk+bshkj0S5cfHcgYakreBjrHwatXKbz+IoIdYLxrKim2MjW0Q==}
|
resolution: {integrity: sha512-Oei9OH4tRh0YqU3GxhX79dM/mwVgvbZJaSNaRk+bshkj0S5cfHcgYakreBjrHwatXKbz+IoIdYLxrKim2MjW0Q==}
|
||||||
|
|
||||||
@@ -1576,9 +1651,16 @@ packages:
|
|||||||
resolution: {integrity: sha512-a28v2eWrrRWPpJSzxc+mKwm0ZtVx/G8SepdQZDArnXYU/XS+IF6mp8aB/4E+hH1tyGCoDo3KlUCdlSxGDsRkAw==}
|
resolution: {integrity: sha512-a28v2eWrrRWPpJSzxc+mKwm0ZtVx/G8SepdQZDArnXYU/XS+IF6mp8aB/4E+hH1tyGCoDo3KlUCdlSxGDsRkAw==}
|
||||||
hasBin: true
|
hasBin: true
|
||||||
|
|
||||||
|
binaryextensions@6.11.0:
|
||||||
|
resolution: {integrity: sha512-sXnYK/Ij80TO3lcqZVV2YgfKN5QjUWIRk/XSm2J/4bd/lPko3lvk0O4ZppH6m+6hB2/GTu+ptNwVFe1xh+QLQw==}
|
||||||
|
engines: {node: '>=4'}
|
||||||
|
|
||||||
bl@4.1.0:
|
bl@4.1.0:
|
||||||
resolution: {integrity: sha512-1W07cM9gS6DcLperZfFSj+bWLtaPGSOHWhPiGzXmvVJbRLdG82sH/Kn8EtW1VqWVA54AKf2h5k5BbnIbwF3h6w==}
|
resolution: {integrity: sha512-1W07cM9gS6DcLperZfFSj+bWLtaPGSOHWhPiGzXmvVJbRLdG82sH/Kn8EtW1VqWVA54AKf2h5k5BbnIbwF3h6w==}
|
||||||
|
|
||||||
|
boundary@2.0.0:
|
||||||
|
resolution: {integrity: sha512-rJKn5ooC9u8q13IMCrW0RSp31pxBCHE3y9V/tp3TdWSLf8Em3p6Di4NBpfzbJge9YjjFEsD0RtFEjtvHL5VyEA==}
|
||||||
|
|
||||||
brace-expansion@1.1.12:
|
brace-expansion@1.1.12:
|
||||||
resolution: {integrity: sha512-9T9UjW3r0UW5c1Q7GTwllptXwhvYmEzFhzMfZ9H7FQWt+uZePjZPjBP/W1ZEyZ1twGWom5/56TF4lPcqjnDHcg==}
|
resolution: {integrity: sha512-9T9UjW3r0UW5c1Q7GTwllptXwhvYmEzFhzMfZ9H7FQWt+uZePjZPjBP/W1ZEyZ1twGWom5/56TF4lPcqjnDHcg==}
|
||||||
|
|
||||||
@@ -1638,6 +1720,10 @@ packages:
|
|||||||
resolution: {integrity: sha512-oKnbhFyRIXpUuez8iBMmyEa4nbj4IOQyuhc/wy9kY7/WVPcwIO9VA668Pu8RkO7+0G76SLROeyw9CpQ061i4mA==}
|
resolution: {integrity: sha512-oKnbhFyRIXpUuez8iBMmyEa4nbj4IOQyuhc/wy9kY7/WVPcwIO9VA668Pu8RkO7+0G76SLROeyw9CpQ061i4mA==}
|
||||||
engines: {node: '>=10'}
|
engines: {node: '>=10'}
|
||||||
|
|
||||||
|
chalk@5.6.2:
|
||||||
|
resolution: {integrity: sha512-7NzBL0rN6fMUW+f7A6Io4h40qQlG+xGmtMxfbnH/K7TAtt8JQWVQK+6g0UXKMeVJoyV5EkkNsErQ8pVD3bLHbA==}
|
||||||
|
engines: {node: ^12.17.0 || ^14.13 || >=16.0.0}
|
||||||
|
|
||||||
char-regex@1.0.2:
|
char-regex@1.0.2:
|
||||||
resolution: {integrity: sha512-kWWXztvZ5SBQV+eRgKFeh8q5sLuZY2+8WUIzlxWVTg+oGwY14qylx1KbKzHd8P6ZYkAg0xyIDU9JMHhyJMZ1jw==}
|
resolution: {integrity: sha512-kWWXztvZ5SBQV+eRgKFeh8q5sLuZY2+8WUIzlxWVTg+oGwY14qylx1KbKzHd8P6ZYkAg0xyIDU9JMHhyJMZ1jw==}
|
||||||
engines: {node: '>=10'}
|
engines: {node: '>=10'}
|
||||||
@@ -1801,6 +1887,10 @@ packages:
|
|||||||
eastasianwidth@0.2.0:
|
eastasianwidth@0.2.0:
|
||||||
resolution: {integrity: sha512-I88TYZWc9XiYHRQ4/3c5rjjfgkjhLyW2luGIheGERbNQ6OY7yTybanSpDXZa8y7VUP9YmDcYa+eyq4ca7iLqWA==}
|
resolution: {integrity: sha512-I88TYZWc9XiYHRQ4/3c5rjjfgkjhLyW2luGIheGERbNQ6OY7yTybanSpDXZa8y7VUP9YmDcYa+eyq4ca7iLqWA==}
|
||||||
|
|
||||||
|
editions@6.22.0:
|
||||||
|
resolution: {integrity: sha512-UgGlf8IW75je7HZjNDpJdCv4cGJWIi6yumFdZ0R7A8/CIhQiWUjyGLCxdHpd8bmyD1gnkfUNK0oeOXqUS2cpfQ==}
|
||||||
|
engines: {ecmascript: '>= es5', node: '>=4'}
|
||||||
|
|
||||||
electron-to-chromium@1.5.259:
|
electron-to-chromium@1.5.259:
|
||||||
resolution: {integrity: sha512-I+oLXgpEJzD6Cwuwt1gYjxsDmu/S/Kd41mmLA3O+/uH2pFRO/DvOjUyGozL8j3KeLV6WyZ7ssPwELMsXCcsJAQ==}
|
resolution: {integrity: sha512-I+oLXgpEJzD6Cwuwt1gYjxsDmu/S/Kd41mmLA3O+/uH2pFRO/DvOjUyGozL8j3KeLV6WyZ7ssPwELMsXCcsJAQ==}
|
||||||
|
|
||||||
@@ -1818,6 +1908,10 @@ packages:
|
|||||||
resolution: {integrity: sha512-d4lC8xfavMeBjzGr2vECC3fsGXziXZQyJxD868h2M/mBI3PwAuODxAkLkq5HYuvrPYcUtiLzsTo8U3PgX3Ocww==}
|
resolution: {integrity: sha512-d4lC8xfavMeBjzGr2vECC3fsGXziXZQyJxD868h2M/mBI3PwAuODxAkLkq5HYuvrPYcUtiLzsTo8U3PgX3Ocww==}
|
||||||
engines: {node: '>=10.13.0'}
|
engines: {node: '>=10.13.0'}
|
||||||
|
|
||||||
|
environment@1.1.0:
|
||||||
|
resolution: {integrity: sha512-xUtoPkMggbz0MPyPiIWr1Kp4aeWJjDZ6SMvURhimjdZgsRuDplF5/s9hcgGhyXMhs+6vpnuoiZ2kFiu3FMnS8Q==}
|
||||||
|
engines: {node: '>=18'}
|
||||||
|
|
||||||
error-ex@1.3.4:
|
error-ex@1.3.4:
|
||||||
resolution: {integrity: sha512-sqQamAnR14VgCr1A618A3sGrygcpK+HEbenA/HiEAkkUwcZIIB/tgWqHFxWgOyDh4nB4JCRimh79dR5Ywc9MDQ==}
|
resolution: {integrity: sha512-sqQamAnR14VgCr1A618A3sGrygcpK+HEbenA/HiEAkkUwcZIIB/tgWqHFxWgOyDh4nB4JCRimh79dR5Ywc9MDQ==}
|
||||||
|
|
||||||
@@ -2249,6 +2343,10 @@ packages:
|
|||||||
resolution: {integrity: sha512-HGYWWS/ehqTV3xN10i23tkPkpH46MLCIMFNCaaKNavAXTF1RkqxawEPtnjnGZ6XKSInBKkiOA5BKS+aZiY3AvA==}
|
resolution: {integrity: sha512-HGYWWS/ehqTV3xN10i23tkPkpH46MLCIMFNCaaKNavAXTF1RkqxawEPtnjnGZ6XKSInBKkiOA5BKS+aZiY3AvA==}
|
||||||
engines: {node: '>=8'}
|
engines: {node: '>=8'}
|
||||||
|
|
||||||
|
istextorbinary@9.5.0:
|
||||||
|
resolution: {integrity: sha512-5mbUj3SiZXCuRf9fT3ibzbSSEWiy63gFfksmGfdOzujPjW3k+z8WvIBxcJHBoQNlaZaiyB25deviif2+osLmLw==}
|
||||||
|
engines: {node: '>=4'}
|
||||||
|
|
||||||
iterare@1.2.1:
|
iterare@1.2.1:
|
||||||
resolution: {integrity: sha512-RKYVTCjAnRthyJes037NX/IiqeidgN1xc3j1RjFfECFp28A1GVwK9nA+i0rJPaHqSZwygLzRnFlzUuHFoWWy+Q==}
|
resolution: {integrity: sha512-RKYVTCjAnRthyJes037NX/IiqeidgN1xc3j1RjFfECFp28A1GVwK9nA+i0rJPaHqSZwygLzRnFlzUuHFoWWy+Q==}
|
||||||
engines: {node: '>=6'}
|
engines: {node: '>=6'}
|
||||||
@@ -2473,6 +2571,9 @@ packages:
|
|||||||
lodash.merge@4.6.2:
|
lodash.merge@4.6.2:
|
||||||
resolution: {integrity: sha512-0KpjqXRVvrYyCsX1swR/XTK0va6VQkQM6MNo7PqW77ByjAhoARA8EfrP1N4+KlKj8YS0ZUCtRT/YUuhyYDujIQ==}
|
resolution: {integrity: sha512-0KpjqXRVvrYyCsX1swR/XTK0va6VQkQM6MNo7PqW77ByjAhoARA8EfrP1N4+KlKj8YS0ZUCtRT/YUuhyYDujIQ==}
|
||||||
|
|
||||||
|
lodash.truncate@4.4.2:
|
||||||
|
resolution: {integrity: sha512-jttmRe7bRse52OsWIMDLaXxWqRAmtIUccAQ3garviCqJjafXOfNMO0yMfNpdD6zbGaTU0P5Nz7e7gAT6cKmJRw==}
|
||||||
|
|
||||||
lodash@4.17.21:
|
lodash@4.17.21:
|
||||||
resolution: {integrity: sha512-v2kDEe57lecTulaDIuNTPy3Ry4gLGJ6Z1O3vE1krgXZNrsQ+LFTGHVxVjcXPs17LhbZVGedAJv8XZ1tvj5FvSg==}
|
resolution: {integrity: sha512-v2kDEe57lecTulaDIuNTPy3Ry4gLGJ6Z1O3vE1krgXZNrsQ+LFTGHVxVjcXPs17LhbZVGedAJv8XZ1tvj5FvSg==}
|
||||||
|
|
||||||
@@ -2657,6 +2758,10 @@ packages:
|
|||||||
resolution: {integrity: sha512-LaNjtRWUBY++zB5nE/NwcaoMylSPk+S+ZHNB1TzdbMJMny6dynpAGt7X/tl/QYq3TIeE6nxHppbo2LGymrG5Pw==}
|
resolution: {integrity: sha512-LaNjtRWUBY++zB5nE/NwcaoMylSPk+S+ZHNB1TzdbMJMny6dynpAGt7X/tl/QYq3TIeE6nxHppbo2LGymrG5Pw==}
|
||||||
engines: {node: '>=10'}
|
engines: {node: '>=10'}
|
||||||
|
|
||||||
|
p-map@7.0.4:
|
||||||
|
resolution: {integrity: sha512-tkAQEw8ysMzmkhgw8k+1U/iPhWNhykKnSk4Rd5zLoPJCuJaGRPo6YposrZgaxHKzDHdDWWZvE/Sk7hsL2X/CpQ==}
|
||||||
|
engines: {node: '>=18'}
|
||||||
|
|
||||||
p-try@2.2.0:
|
p-try@2.2.0:
|
||||||
resolution: {integrity: sha512-R4nPAVTAU0B9D35/Gk3uJf/7XYbQcyohSKdvAxIRSNghFl4e71hVoGnBNQz9cWaXxO2I10KTC+3jMdvvoKw6dQ==}
|
resolution: {integrity: sha512-R4nPAVTAU0B9D35/Gk3uJf/7XYbQcyohSKdvAxIRSNghFl4e71hVoGnBNQz9cWaXxO2I10KTC+3jMdvvoKw6dQ==}
|
||||||
engines: {node: '>=6'}
|
engines: {node: '>=6'}
|
||||||
@@ -2725,6 +2830,9 @@ packages:
|
|||||||
resolution: {integrity: sha512-HRDzbaKjC+AOWVXxAU/x54COGeIv9eb+6CkDSQoNTt4XyWoIJvuPsXizxu/Fr23EiekbtZwmh1IcIG/l/a10GQ==}
|
resolution: {integrity: sha512-HRDzbaKjC+AOWVXxAU/x54COGeIv9eb+6CkDSQoNTt4XyWoIJvuPsXizxu/Fr23EiekbtZwmh1IcIG/l/a10GQ==}
|
||||||
engines: {node: '>=8'}
|
engines: {node: '>=8'}
|
||||||
|
|
||||||
|
pluralize@2.0.0:
|
||||||
|
resolution: {integrity: sha512-TqNZzQCD4S42De9IfnnBvILN7HAW7riLqsCyp8lgjXeysyPlX5HhqKAcJHHHb9XskE4/a+7VGC9zzx8Ls0jOAw==}
|
||||||
|
|
||||||
pluralize@8.0.0:
|
pluralize@8.0.0:
|
||||||
resolution: {integrity: sha512-Nc3IT5yHzflTfbjgqWcCPpo7DaKy4FnpB0l/zCAW0Tc7jxAiuqSxHasntB3D7887LSrA93kDJ9IXovxJYxyLCA==}
|
resolution: {integrity: sha512-Nc3IT5yHzflTfbjgqWcCPpo7DaKy4FnpB0l/zCAW0Tc7jxAiuqSxHasntB3D7887LSrA93kDJ9IXovxJYxyLCA==}
|
||||||
engines: {node: '>=4'}
|
engines: {node: '>=4'}
|
||||||
@@ -2767,6 +2875,9 @@ packages:
|
|||||||
randombytes@2.1.0:
|
randombytes@2.1.0:
|
||||||
resolution: {integrity: sha512-vYl3iOX+4CKUWuxGi9Ukhie6fsqXqS9FE2Zaic4tNFD2N2QQaXOMFbuKK4QmDHC0JO6B1Zp41J0LpT0oR68amQ==}
|
resolution: {integrity: sha512-vYl3iOX+4CKUWuxGi9Ukhie6fsqXqS9FE2Zaic4tNFD2N2QQaXOMFbuKK4QmDHC0JO6B1Zp41J0LpT0oR68amQ==}
|
||||||
|
|
||||||
|
rc-config-loader@4.1.3:
|
||||||
|
resolution: {integrity: sha512-kD7FqML7l800i6pS6pvLyIE2ncbk9Du8Q0gp/4hMPhJU6ZxApkoLcGD8ZeqgiAlfwZ6BlETq6qqe+12DUL207w==}
|
||||||
|
|
||||||
react-is@18.3.1:
|
react-is@18.3.1:
|
||||||
resolution: {integrity: sha512-/LLMVyas0ljjAtoYiPqYiL8VWXzUUdThrmU5+n20DZv+a+ClRoevUzw5JxU+Ieh5/c87ytoTBV9G1FiKfNJdmg==}
|
resolution: {integrity: sha512-/LLMVyas0ljjAtoYiPqYiL8VWXzUUdThrmU5+n20DZv+a+ClRoevUzw5JxU+Ieh5/c87ytoTBV9G1FiKfNJdmg==}
|
||||||
|
|
||||||
@@ -2894,6 +3005,10 @@ packages:
|
|||||||
resolution: {integrity: sha512-g9Q1haeby36OSStwb4ntCGGGaKsaVSjQ68fBxoQcutl5fS1vuY18H3wSt3jFyFtrkx+Kz0V1G85A4MyAdDMi2Q==}
|
resolution: {integrity: sha512-g9Q1haeby36OSStwb4ntCGGGaKsaVSjQ68fBxoQcutl5fS1vuY18H3wSt3jFyFtrkx+Kz0V1G85A4MyAdDMi2Q==}
|
||||||
engines: {node: '>=8'}
|
engines: {node: '>=8'}
|
||||||
|
|
||||||
|
slice-ansi@4.0.0:
|
||||||
|
resolution: {integrity: sha512-qMCMfhY040cVHT43K9BFygqYbUPFZKHOg7K73mtTWJRb8pyP3fzf4Ixd5SzdEJQ6MRUg/WBnOLxghZtKKurENQ==}
|
||||||
|
engines: {node: '>=10'}
|
||||||
|
|
||||||
source-map-js@1.2.1:
|
source-map-js@1.2.1:
|
||||||
resolution: {integrity: sha512-UXWMKhLOwVKb728IUtQPXxfYU+usdybtUrK/8uGE8CQMvrhOpwvzDBwj0QhSL7MQc7vIsISBG8VQ8+IDQxpfQA==}
|
resolution: {integrity: sha512-UXWMKhLOwVKb728IUtQPXxfYU+usdybtUrK/8uGE8CQMvrhOpwvzDBwj0QhSL7MQc7vIsISBG8VQ8+IDQxpfQA==}
|
||||||
engines: {node: '>=0.10.0'}
|
engines: {node: '>=0.10.0'}
|
||||||
@@ -2972,6 +3087,9 @@ packages:
|
|||||||
resolution: {integrity: sha512-KIy5nylvC5le1OdaaoCJ07L+8iQzJHGH6pWDuzS+d07Cu7n1MZ2x26P8ZKIWfbK02+XIL8Mp4RkWeqdUCrDMfg==}
|
resolution: {integrity: sha512-KIy5nylvC5le1OdaaoCJ07L+8iQzJHGH6pWDuzS+d07Cu7n1MZ2x26P8ZKIWfbK02+XIL8Mp4RkWeqdUCrDMfg==}
|
||||||
engines: {node: '>=18'}
|
engines: {node: '>=18'}
|
||||||
|
|
||||||
|
structured-source@4.0.0:
|
||||||
|
resolution: {integrity: sha512-qGzRFNJDjFieQkl/sVOI2dUjHKRyL9dAJi2gCPGJLbJHBIkyOHxjuocpIEfbLioX+qSJpvbYdT49/YCdMznKxA==}
|
||||||
|
|
||||||
superagent@10.2.3:
|
superagent@10.2.3:
|
||||||
resolution: {integrity: sha512-y/hkYGeXAj7wUMjxRbB21g/l6aAEituGXM9Rwl4o20+SX3e8YOSV6BxFXl+dL3Uk0mjSL3kCbNkwURm8/gEDig==}
|
resolution: {integrity: sha512-y/hkYGeXAj7wUMjxRbB21g/l6aAEituGXM9Rwl4o20+SX3e8YOSV6BxFXl+dL3Uk0mjSL3kCbNkwURm8/gEDig==}
|
||||||
engines: {node: '>=14.18.0'}
|
engines: {node: '>=14.18.0'}
|
||||||
@@ -2988,6 +3106,10 @@ packages:
|
|||||||
resolution: {integrity: sha512-MpUEN2OodtUzxvKQl72cUF7RQ5EiHsGvSsVG0ia9c5RbWGL2CI4C7EpPS8UTBIplnlzZiNuV56w+FuNxy3ty2Q==}
|
resolution: {integrity: sha512-MpUEN2OodtUzxvKQl72cUF7RQ5EiHsGvSsVG0ia9c5RbWGL2CI4C7EpPS8UTBIplnlzZiNuV56w+FuNxy3ty2Q==}
|
||||||
engines: {node: '>=10'}
|
engines: {node: '>=10'}
|
||||||
|
|
||||||
|
supports-hyperlinks@3.2.0:
|
||||||
|
resolution: {integrity: sha512-zFObLMyZeEwzAoKCyu1B91U79K2t7ApXuQfo8OuxwXLDgcKxuwM+YvcbIhm6QWqz7mHUH1TVytR1PwVVjEuMig==}
|
||||||
|
engines: {node: '>=14.18'}
|
||||||
|
|
||||||
symbol-observable@4.0.0:
|
symbol-observable@4.0.0:
|
||||||
resolution: {integrity: sha512-b19dMThMV4HVFynSAM1++gBHAbk2Tc/osgLIBZMKsyqh34jb2e8Os7T6ZW/Bt3pJFdBTd2JwAnAAEQV7rSNvcQ==}
|
resolution: {integrity: sha512-b19dMThMV4HVFynSAM1++gBHAbk2Tc/osgLIBZMKsyqh34jb2e8Os7T6ZW/Bt3pJFdBTd2JwAnAAEQV7rSNvcQ==}
|
||||||
engines: {node: '>=0.10'}
|
engines: {node: '>=0.10'}
|
||||||
@@ -2996,10 +3118,18 @@ packages:
|
|||||||
resolution: {integrity: sha512-MeQTA1r0litLUf0Rp/iisCaL8761lKAZHaimlbGK4j0HysC4PLfqygQj9srcs0m2RdtDYnF8UuYyKpbjHYp7Jw==}
|
resolution: {integrity: sha512-MeQTA1r0litLUf0Rp/iisCaL8761lKAZHaimlbGK4j0HysC4PLfqygQj9srcs0m2RdtDYnF8UuYyKpbjHYp7Jw==}
|
||||||
engines: {node: ^14.18.0 || >=16.0.0}
|
engines: {node: ^14.18.0 || >=16.0.0}
|
||||||
|
|
||||||
|
table@6.9.0:
|
||||||
|
resolution: {integrity: sha512-9kY+CygyYM6j02t5YFHbNz2FN5QmYGv9zAjVp4lCDjlCw7amdckXlEt/bjMhUIfj4ThGRE4gCUH5+yGnNuPo5A==}
|
||||||
|
engines: {node: '>=10.0.0'}
|
||||||
|
|
||||||
tapable@2.3.0:
|
tapable@2.3.0:
|
||||||
resolution: {integrity: sha512-g9ljZiwki/LfxmQADO3dEY1CbpmXT5Hm2fJ+QaGKwSXUylMybePR7/67YW7jOrrvjEgL1Fmz5kzyAjWVWLlucg==}
|
resolution: {integrity: sha512-g9ljZiwki/LfxmQADO3dEY1CbpmXT5Hm2fJ+QaGKwSXUylMybePR7/67YW7jOrrvjEgL1Fmz5kzyAjWVWLlucg==}
|
||||||
engines: {node: '>=6'}
|
engines: {node: '>=6'}
|
||||||
|
|
||||||
|
terminal-link@4.0.0:
|
||||||
|
resolution: {integrity: sha512-lk+vH+MccxNqgVqSnkMVKx4VLJfnLjDBGzH16JVZjKE2DoxP57s6/vt6JmXV5I3jBcfGrxNrYtC+mPtU7WJztA==}
|
||||||
|
engines: {node: '>=18'}
|
||||||
|
|
||||||
terser-webpack-plugin@5.3.14:
|
terser-webpack-plugin@5.3.14:
|
||||||
resolution: {integrity: sha512-vkZjpUjb6OMS7dhV+tILUW6BhpDR7P2L/aQSAv+Uwk+m8KATX9EccViHTJR2qDtACKPIYndLGCyl3FMo+r2LMw==}
|
resolution: {integrity: sha512-vkZjpUjb6OMS7dhV+tILUW6BhpDR7P2L/aQSAv+Uwk+m8KATX9EccViHTJR2qDtACKPIYndLGCyl3FMo+r2LMw==}
|
||||||
engines: {node: '>= 10.13.0'}
|
engines: {node: '>= 10.13.0'}
|
||||||
@@ -3025,6 +3155,13 @@ packages:
|
|||||||
resolution: {integrity: sha512-cAGWPIyOHU6zlmg88jwm7VRyXnMN7iV68OGAbYDk/Mh/xC/pzVPlQtY6ngoIH/5/tciuhGfvESU8GrHrcxD56w==}
|
resolution: {integrity: sha512-cAGWPIyOHU6zlmg88jwm7VRyXnMN7iV68OGAbYDk/Mh/xC/pzVPlQtY6ngoIH/5/tciuhGfvESU8GrHrcxD56w==}
|
||||||
engines: {node: '>=8'}
|
engines: {node: '>=8'}
|
||||||
|
|
||||||
|
text-table@0.2.0:
|
||||||
|
resolution: {integrity: sha512-N+8UisAXDGk8PFXP4HAzVR9nbfmVJ3zYLAWiTIoqC5v5isinhr+r5uaO8+7r3BMfuNIufIsA7RdpVgacC2cSpw==}
|
||||||
|
|
||||||
|
textextensions@6.11.0:
|
||||||
|
resolution: {integrity: sha512-tXJwSr9355kFJI3lbCkPpUH5cP8/M0GGy2xLO34aZCjMXBaK3SoPnZwr/oWmo1FdCnELcs4npdCIOFtq9W3ruQ==}
|
||||||
|
engines: {node: '>=4'}
|
||||||
|
|
||||||
tinybench@2.9.0:
|
tinybench@2.9.0:
|
||||||
resolution: {integrity: sha512-0+DUvqWMValLmha6lr4kD8iAMK1HzV0/aKnCtWb9v9641TnP/MFb7Pc2bxoxQjTXAErryXVgUOfv2YqNllqGeg==}
|
resolution: {integrity: sha512-0+DUvqWMValLmha6lr4kD8iAMK1HzV0/aKnCtWb9v9641TnP/MFb7Pc2bxoxQjTXAErryXVgUOfv2YqNllqGeg==}
|
||||||
|
|
||||||
@@ -3217,6 +3354,10 @@ packages:
|
|||||||
resolution: {integrity: sha512-kiGUalWN+rgBJ/1OHZsBtU4rXZOfj/7rKQxULKlIzwzQSvMJUUNgPwJEEh7gU6xEVxC0ahoOBvN2YI8GH6FNgA==}
|
resolution: {integrity: sha512-kiGUalWN+rgBJ/1OHZsBtU4rXZOfj/7rKQxULKlIzwzQSvMJUUNgPwJEEh7gU6xEVxC0ahoOBvN2YI8GH6FNgA==}
|
||||||
engines: {node: '>=10.12.0'}
|
engines: {node: '>=10.12.0'}
|
||||||
|
|
||||||
|
version-range@4.15.0:
|
||||||
|
resolution: {integrity: sha512-Ck0EJbAGxHwprkzFO966t4/5QkRuzh+/I1RxhLgUKKwEn+Cd8NwM60mE3AqBZg5gYODoXW0EFsQvbZjRlvdqbg==}
|
||||||
|
engines: {node: '>=4'}
|
||||||
|
|
||||||
vite@7.2.4:
|
vite@7.2.4:
|
||||||
resolution: {integrity: sha512-NL8jTlbo0Tn4dUEXEsUg8KeyG/Lkmc4Fnzb8JXN/Ykm9G4HNImjtABMJgkQoVjOBN/j2WAwDTRytdqJbZsah7w==}
|
resolution: {integrity: sha512-NL8jTlbo0Tn4dUEXEsUg8KeyG/Lkmc4Fnzb8JXN/Ykm9G4HNImjtABMJgkQoVjOBN/j2WAwDTRytdqJbZsah7w==}
|
||||||
engines: {node: ^20.19.0 || >=22.12.0}
|
engines: {node: ^20.19.0 || >=22.12.0}
|
||||||
@@ -3441,6 +3582,12 @@ snapshots:
|
|||||||
transitivePeerDependencies:
|
transitivePeerDependencies:
|
||||||
- chokidar
|
- chokidar
|
||||||
|
|
||||||
|
'@azu/format-text@1.0.2': {}
|
||||||
|
|
||||||
|
'@azu/style-format@1.0.1':
|
||||||
|
dependencies:
|
||||||
|
'@azu/format-text': 1.0.2
|
||||||
|
|
||||||
'@babel/code-frame@7.27.1':
|
'@babel/code-frame@7.27.1':
|
||||||
dependencies:
|
dependencies:
|
||||||
'@babel/helper-validator-identifier': 7.28.5
|
'@babel/helper-validator-identifier': 7.28.5
|
||||||
@@ -4344,6 +4491,68 @@ snapshots:
|
|||||||
'@rollup/rollup-win32-x64-msvc@4.53.3':
|
'@rollup/rollup-win32-x64-msvc@4.53.3':
|
||||||
optional: true
|
optional: true
|
||||||
|
|
||||||
|
'@secretlint/config-loader@11.2.5':
|
||||||
|
dependencies:
|
||||||
|
'@secretlint/profiler': 11.2.5
|
||||||
|
'@secretlint/resolver': 11.2.5
|
||||||
|
'@secretlint/types': 11.2.5
|
||||||
|
ajv: 8.17.1
|
||||||
|
debug: 4.4.3
|
||||||
|
rc-config-loader: 4.1.3
|
||||||
|
transitivePeerDependencies:
|
||||||
|
- supports-color
|
||||||
|
|
||||||
|
'@secretlint/core@11.2.5':
|
||||||
|
dependencies:
|
||||||
|
'@secretlint/profiler': 11.2.5
|
||||||
|
'@secretlint/types': 11.2.5
|
||||||
|
debug: 4.4.3
|
||||||
|
structured-source: 4.0.0
|
||||||
|
transitivePeerDependencies:
|
||||||
|
- supports-color
|
||||||
|
|
||||||
|
'@secretlint/formatter@11.2.5':
|
||||||
|
dependencies:
|
||||||
|
'@secretlint/resolver': 11.2.5
|
||||||
|
'@secretlint/types': 11.2.5
|
||||||
|
'@textlint/linter-formatter': 15.4.0
|
||||||
|
'@textlint/module-interop': 15.4.0
|
||||||
|
'@textlint/types': 15.4.0
|
||||||
|
chalk: 5.6.2
|
||||||
|
debug: 4.4.3
|
||||||
|
pluralize: 8.0.0
|
||||||
|
strip-ansi: 7.1.2
|
||||||
|
table: 6.9.0
|
||||||
|
terminal-link: 4.0.0
|
||||||
|
transitivePeerDependencies:
|
||||||
|
- supports-color
|
||||||
|
|
||||||
|
'@secretlint/node@11.2.5':
|
||||||
|
dependencies:
|
||||||
|
'@secretlint/config-loader': 11.2.5
|
||||||
|
'@secretlint/core': 11.2.5
|
||||||
|
'@secretlint/formatter': 11.2.5
|
||||||
|
'@secretlint/profiler': 11.2.5
|
||||||
|
'@secretlint/source-creator': 11.2.5
|
||||||
|
'@secretlint/types': 11.2.5
|
||||||
|
debug: 4.4.3
|
||||||
|
p-map: 7.0.4
|
||||||
|
transitivePeerDependencies:
|
||||||
|
- supports-color
|
||||||
|
|
||||||
|
'@secretlint/profiler@11.2.5': {}
|
||||||
|
|
||||||
|
'@secretlint/resolver@11.2.5': {}
|
||||||
|
|
||||||
|
'@secretlint/secretlint-rule-preset-recommend@11.2.5': {}
|
||||||
|
|
||||||
|
'@secretlint/source-creator@11.2.5':
|
||||||
|
dependencies:
|
||||||
|
'@secretlint/types': 11.2.5
|
||||||
|
istextorbinary: 9.5.0
|
||||||
|
|
||||||
|
'@secretlint/types@11.2.5': {}
|
||||||
|
|
||||||
'@sinclair/typebox@0.34.41': {}
|
'@sinclair/typebox@0.34.41': {}
|
||||||
|
|
||||||
'@sinonjs/commons@3.0.1':
|
'@sinonjs/commons@3.0.1':
|
||||||
@@ -4356,6 +4565,35 @@ snapshots:
|
|||||||
|
|
||||||
'@standard-schema/spec@1.0.0': {}
|
'@standard-schema/spec@1.0.0': {}
|
||||||
|
|
||||||
|
'@textlint/ast-node-types@15.4.0': {}
|
||||||
|
|
||||||
|
'@textlint/linter-formatter@15.4.0':
|
||||||
|
dependencies:
|
||||||
|
'@azu/format-text': 1.0.2
|
||||||
|
'@azu/style-format': 1.0.1
|
||||||
|
'@textlint/module-interop': 15.4.0
|
||||||
|
'@textlint/resolver': 15.4.0
|
||||||
|
'@textlint/types': 15.4.0
|
||||||
|
chalk: 4.1.2
|
||||||
|
debug: 4.4.3
|
||||||
|
js-yaml: 3.14.2
|
||||||
|
lodash: 4.17.21
|
||||||
|
pluralize: 2.0.0
|
||||||
|
string-width: 4.2.3
|
||||||
|
strip-ansi: 6.0.1
|
||||||
|
table: 6.9.0
|
||||||
|
text-table: 0.2.0
|
||||||
|
transitivePeerDependencies:
|
||||||
|
- supports-color
|
||||||
|
|
||||||
|
'@textlint/module-interop@15.4.0': {}
|
||||||
|
|
||||||
|
'@textlint/resolver@15.4.0': {}
|
||||||
|
|
||||||
|
'@textlint/types@15.4.0':
|
||||||
|
dependencies:
|
||||||
|
'@textlint/ast-node-types': 15.4.0
|
||||||
|
|
||||||
'@tokenizer/inflate@0.3.1':
|
'@tokenizer/inflate@0.3.1':
|
||||||
dependencies:
|
dependencies:
|
||||||
debug: 4.4.3
|
debug: 4.4.3
|
||||||
@@ -4865,6 +5103,10 @@ snapshots:
|
|||||||
dependencies:
|
dependencies:
|
||||||
type-fest: 0.21.3
|
type-fest: 0.21.3
|
||||||
|
|
||||||
|
ansi-escapes@7.2.0:
|
||||||
|
dependencies:
|
||||||
|
environment: 1.1.0
|
||||||
|
|
||||||
ansi-regex@5.0.1: {}
|
ansi-regex@5.0.1: {}
|
||||||
|
|
||||||
ansi-regex@6.2.2: {}
|
ansi-regex@6.2.2: {}
|
||||||
@@ -4904,6 +5146,8 @@ snapshots:
|
|||||||
estree-walker: 3.0.3
|
estree-walker: 3.0.3
|
||||||
js-tokens: 9.0.1
|
js-tokens: 9.0.1
|
||||||
|
|
||||||
|
astral-regex@2.0.0: {}
|
||||||
|
|
||||||
asynckit@0.4.0: {}
|
asynckit@0.4.0: {}
|
||||||
|
|
||||||
babel-jest@30.2.0(@babel/core@7.28.5):
|
babel-jest@30.2.0(@babel/core@7.28.5):
|
||||||
@@ -4964,12 +5208,18 @@ snapshots:
|
|||||||
|
|
||||||
baseline-browser-mapping@2.8.31: {}
|
baseline-browser-mapping@2.8.31: {}
|
||||||
|
|
||||||
|
binaryextensions@6.11.0:
|
||||||
|
dependencies:
|
||||||
|
editions: 6.22.0
|
||||||
|
|
||||||
bl@4.1.0:
|
bl@4.1.0:
|
||||||
dependencies:
|
dependencies:
|
||||||
buffer: 5.7.1
|
buffer: 5.7.1
|
||||||
inherits: 2.0.4
|
inherits: 2.0.4
|
||||||
readable-stream: 3.6.2
|
readable-stream: 3.6.2
|
||||||
|
|
||||||
|
boundary@2.0.0: {}
|
||||||
|
|
||||||
brace-expansion@1.1.12:
|
brace-expansion@1.1.12:
|
||||||
dependencies:
|
dependencies:
|
||||||
balanced-match: 1.0.2
|
balanced-match: 1.0.2
|
||||||
@@ -5031,6 +5281,8 @@ snapshots:
|
|||||||
ansi-styles: 4.3.0
|
ansi-styles: 4.3.0
|
||||||
supports-color: 7.2.0
|
supports-color: 7.2.0
|
||||||
|
|
||||||
|
chalk@5.6.2: {}
|
||||||
|
|
||||||
char-regex@1.0.2: {}
|
char-regex@1.0.2: {}
|
||||||
|
|
||||||
chardet@2.1.1: {}
|
chardet@2.1.1: {}
|
||||||
@@ -5155,6 +5407,10 @@ snapshots:
|
|||||||
|
|
||||||
eastasianwidth@0.2.0: {}
|
eastasianwidth@0.2.0: {}
|
||||||
|
|
||||||
|
editions@6.22.0:
|
||||||
|
dependencies:
|
||||||
|
version-range: 4.15.0
|
||||||
|
|
||||||
electron-to-chromium@1.5.259: {}
|
electron-to-chromium@1.5.259: {}
|
||||||
|
|
||||||
emittery@0.13.1: {}
|
emittery@0.13.1: {}
|
||||||
@@ -5168,6 +5424,8 @@ snapshots:
|
|||||||
graceful-fs: 4.2.11
|
graceful-fs: 4.2.11
|
||||||
tapable: 2.3.0
|
tapable: 2.3.0
|
||||||
|
|
||||||
|
environment@1.1.0: {}
|
||||||
|
|
||||||
error-ex@1.3.4:
|
error-ex@1.3.4:
|
||||||
dependencies:
|
dependencies:
|
||||||
is-arrayish: 0.2.1
|
is-arrayish: 0.2.1
|
||||||
@@ -5647,6 +5905,12 @@ snapshots:
|
|||||||
html-escaper: 2.0.2
|
html-escaper: 2.0.2
|
||||||
istanbul-lib-report: 3.0.1
|
istanbul-lib-report: 3.0.1
|
||||||
|
|
||||||
|
istextorbinary@9.5.0:
|
||||||
|
dependencies:
|
||||||
|
binaryextensions: 6.11.0
|
||||||
|
editions: 6.22.0
|
||||||
|
textextensions: 6.11.0
|
||||||
|
|
||||||
iterare@1.2.1: {}
|
iterare@1.2.1: {}
|
||||||
|
|
||||||
jackspeak@3.4.3:
|
jackspeak@3.4.3:
|
||||||
@@ -6041,6 +6305,8 @@ snapshots:
|
|||||||
|
|
||||||
lodash.merge@4.6.2: {}
|
lodash.merge@4.6.2: {}
|
||||||
|
|
||||||
|
lodash.truncate@4.4.2: {}
|
||||||
|
|
||||||
lodash@4.17.21: {}
|
lodash@4.17.21: {}
|
||||||
|
|
||||||
log-symbols@4.1.0:
|
log-symbols@4.1.0:
|
||||||
@@ -6204,6 +6470,8 @@ snapshots:
|
|||||||
dependencies:
|
dependencies:
|
||||||
p-limit: 3.1.0
|
p-limit: 3.1.0
|
||||||
|
|
||||||
|
p-map@7.0.4: {}
|
||||||
|
|
||||||
p-try@2.2.0: {}
|
p-try@2.2.0: {}
|
||||||
|
|
||||||
package-json-from-dist@1.0.1: {}
|
package-json-from-dist@1.0.1: {}
|
||||||
@@ -6255,6 +6523,8 @@ snapshots:
|
|||||||
dependencies:
|
dependencies:
|
||||||
find-up: 4.1.0
|
find-up: 4.1.0
|
||||||
|
|
||||||
|
pluralize@2.0.0: {}
|
||||||
|
|
||||||
pluralize@8.0.0: {}
|
pluralize@8.0.0: {}
|
||||||
|
|
||||||
postcss@8.5.6:
|
postcss@8.5.6:
|
||||||
@@ -6291,6 +6561,15 @@ snapshots:
|
|||||||
dependencies:
|
dependencies:
|
||||||
safe-buffer: 5.2.1
|
safe-buffer: 5.2.1
|
||||||
|
|
||||||
|
rc-config-loader@4.1.3:
|
||||||
|
dependencies:
|
||||||
|
debug: 4.4.3
|
||||||
|
js-yaml: 4.1.1
|
||||||
|
json5: 2.2.3
|
||||||
|
require-from-string: 2.0.2
|
||||||
|
transitivePeerDependencies:
|
||||||
|
- supports-color
|
||||||
|
|
||||||
react-is@18.3.1: {}
|
react-is@18.3.1: {}
|
||||||
|
|
||||||
readable-stream@3.6.2:
|
readable-stream@3.6.2:
|
||||||
@@ -6441,6 +6720,12 @@ snapshots:
|
|||||||
|
|
||||||
slash@3.0.0: {}
|
slash@3.0.0: {}
|
||||||
|
|
||||||
|
slice-ansi@4.0.0:
|
||||||
|
dependencies:
|
||||||
|
ansi-styles: 4.3.0
|
||||||
|
astral-regex: 2.0.0
|
||||||
|
is-fullwidth-code-point: 3.0.0
|
||||||
|
|
||||||
source-map-js@1.2.1: {}
|
source-map-js@1.2.1: {}
|
||||||
|
|
||||||
source-map-support@0.5.13:
|
source-map-support@0.5.13:
|
||||||
@@ -6510,6 +6795,10 @@ snapshots:
|
|||||||
dependencies:
|
dependencies:
|
||||||
'@tokenizer/token': 0.3.0
|
'@tokenizer/token': 0.3.0
|
||||||
|
|
||||||
|
structured-source@4.0.0:
|
||||||
|
dependencies:
|
||||||
|
boundary: 2.0.0
|
||||||
|
|
||||||
superagent@10.2.3:
|
superagent@10.2.3:
|
||||||
dependencies:
|
dependencies:
|
||||||
component-emitter: 1.3.1
|
component-emitter: 1.3.1
|
||||||
@@ -6539,14 +6828,32 @@ snapshots:
|
|||||||
dependencies:
|
dependencies:
|
||||||
has-flag: 4.0.0
|
has-flag: 4.0.0
|
||||||
|
|
||||||
|
supports-hyperlinks@3.2.0:
|
||||||
|
dependencies:
|
||||||
|
has-flag: 4.0.0
|
||||||
|
supports-color: 7.2.0
|
||||||
|
|
||||||
symbol-observable@4.0.0: {}
|
symbol-observable@4.0.0: {}
|
||||||
|
|
||||||
synckit@0.11.11:
|
synckit@0.11.11:
|
||||||
dependencies:
|
dependencies:
|
||||||
'@pkgr/core': 0.2.9
|
'@pkgr/core': 0.2.9
|
||||||
|
|
||||||
|
table@6.9.0:
|
||||||
|
dependencies:
|
||||||
|
ajv: 8.17.1
|
||||||
|
lodash.truncate: 4.4.2
|
||||||
|
slice-ansi: 4.0.0
|
||||||
|
string-width: 4.2.3
|
||||||
|
strip-ansi: 6.0.1
|
||||||
|
|
||||||
tapable@2.3.0: {}
|
tapable@2.3.0: {}
|
||||||
|
|
||||||
|
terminal-link@4.0.0:
|
||||||
|
dependencies:
|
||||||
|
ansi-escapes: 7.2.0
|
||||||
|
supports-hyperlinks: 3.2.0
|
||||||
|
|
||||||
terser-webpack-plugin@5.3.14(webpack@5.100.2):
|
terser-webpack-plugin@5.3.14(webpack@5.100.2):
|
||||||
dependencies:
|
dependencies:
|
||||||
'@jridgewell/trace-mapping': 0.3.31
|
'@jridgewell/trace-mapping': 0.3.31
|
||||||
@@ -6569,6 +6876,12 @@ snapshots:
|
|||||||
glob: 7.2.3
|
glob: 7.2.3
|
||||||
minimatch: 3.1.2
|
minimatch: 3.1.2
|
||||||
|
|
||||||
|
text-table@0.2.0: {}
|
||||||
|
|
||||||
|
textextensions@6.11.0:
|
||||||
|
dependencies:
|
||||||
|
editions: 6.22.0
|
||||||
|
|
||||||
tinybench@2.9.0: {}
|
tinybench@2.9.0: {}
|
||||||
|
|
||||||
tinyexec@0.3.2: {}
|
tinyexec@0.3.2: {}
|
||||||
@@ -6770,6 +7083,8 @@ snapshots:
|
|||||||
'@types/istanbul-lib-coverage': 2.0.6
|
'@types/istanbul-lib-coverage': 2.0.6
|
||||||
convert-source-map: 2.0.0
|
convert-source-map: 2.0.0
|
||||||
|
|
||||||
|
version-range@4.15.0: {}
|
||||||
|
|
||||||
vite@7.2.4(@types/node@22.19.1)(terser@5.44.1)(tsx@4.20.6):
|
vite@7.2.4(@types/node@22.19.1)(terser@5.44.1)(tsx@4.20.6):
|
||||||
dependencies:
|
dependencies:
|
||||||
esbuild: 0.25.12
|
esbuild: 0.25.12
|
||||||
|
|||||||
Reference in New Issue
Block a user