Mirror of https://github.com/samiyev/puaros.git, synced 2025-12-28 07:16:53 +05:00

Compare commits (5 commits):

- 1d6c2a0e00
- db8a97202e
- 0b1cc5a79a
- 8d400c9517
- 9fb9beb311
@@ -5,6 +5,118 @@ All notable changes to @samiyev/guardian will be documented in this file.

The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [0.8.1] - 2025-11-25

### Fixed

- 🧹 **Code quality improvements** - Fixed all 63 hardcoded value issues detected by Guardian self-check:
    - Fixed 1 CRITICAL: Removed hardcoded Slack token from documentation examples
    - Fixed 1 HIGH: Removed aws-sdk framework leak from domain layer examples
    - Fixed 4 MEDIUM: Renamed pipeline files to follow verb-noun convention
    - Fixed 57 LOW: Extracted all magic strings to reusable constants

### Added
- 📦 **New constants file** - `domain/constants/SecretExamples.ts`:
    - 32 secret keyword constants (AWS, GitHub, NPM, SSH, Slack, etc.)
    - 15 secret type name constants
    - 7 example secret values for documentation
    - Regex patterns and encoding constants

### Changed
- ♻️ **Refactored pipeline naming** - Updated use case files to follow naming conventions:
    - `DetectionPipeline.ts` → `ExecuteDetection.ts`
    - `FileCollectionStep.ts` → `CollectFiles.ts`
    - `ParsingStep.ts` → `ParseSourceFiles.ts`
    - `ResultAggregator.ts` → `AggregateResults.ts`
    - Added `Aggregate`, `Collect`, `Parse` to `USE_CASE_VERBS` list
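The renames above follow a verb-noun convention for use-case files. A minimal sketch of what such a check might look like; the verb list here is a small illustrative sample and `followsVerbNounConvention` is a hypothetical helper, not Guardian's actual implementation:

```typescript
// Illustrative verb-noun filename check for use-case files.
// The verb list is a sample, not Guardian's full USE_CASE_VERBS.
const USE_CASE_VERBS = ["Aggregate", "Analyze", "Collect", "Execute", "Parse"];

function followsVerbNounConvention(fileName: string): boolean {
    // Strip the extension, then require the name to start with a known verb
    // followed by a capitalized noun, e.g. "CollectFiles.ts".
    const base = fileName.replace(/\.ts$/, "");
    return USE_CASE_VERBS.some(
        (verb) => base.startsWith(verb) && /^[A-Z]/.test(base.slice(verb.length)),
    );
}
```

Under this rule, `CollectFiles.ts` passes while the old `FileCollectionStep.ts` does not, which matches the renames listed above.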
- 🔧 **Updated 3 core files to use constants**:
    - `SecretViolation.ts`: All secret examples use constants, `getSeverity()` returns `typeof SEVERITY_LEVELS.CRITICAL`
    - `SecretDetector.ts`: All secret keywords use constants
    - `MagicStringMatcher.ts`: Regex patterns extracted to constants
- 📝 **Test updates** - Updated 2 tests to match new example fix messages

### Quality

- ✅ **Guardian self-check** - 0 issues (was 63) - 100% clean codebase
- ✅ **All tests pass** - 566/566 tests passing
- ✅ **Build successful** - TypeScript compilation with no errors
- ✅ **Linter clean** - 0 errors, 2 acceptable warnings (complexity, params)
- ✅ **Format verified** - All files properly formatted with 4-space indentation

## [0.8.0] - 2025-11-25
### Added
- 🔐 **Secret Detection** - NEW CRITICAL security feature using industry-standard Secretlint:
    - Detects 350+ types of hardcoded secrets (AWS keys, GitHub tokens, NPM tokens, SSH keys, API keys, etc.)
    - All secrets marked as **CRITICAL severity** for immediate attention
    - Context-aware remediation suggestions for each secret type
    - Integrated seamlessly with existing detectors
    - New `SecretDetector` infrastructure component using `@secretlint/node`
    - New `SecretViolation` value object with rich examples
    - New `ISecretDetector` domain interface
    - CLI output with "🔐 Found X hardcoded secrets - CRITICAL SECURITY RISK" section
    - Added dependencies: `@secretlint/node`, `@secretlint/core`, `@secretlint/types`, `@secretlint/secretlint-rule-preset-recommend`

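Guardian's real detector delegates to Secretlint's rule presets; to show the shape of the idea, here is a deliberately simplified keyword-based scanner. The pattern table and function names are illustrative stand-ins, not Secretlint's API or Guardian's code:

```typescript
// Simplified stand-in for a secret scanner. Real detection uses Secretlint's
// 350+ rules; the two patterns below are only illustrative.
interface FoundSecret {
    secretType: string;
    line: number;   // 1-based line number of the match
    column: number; // 1-based column of the match
}

const SECRET_PATTERNS: Array<{ type: string; pattern: RegExp }> = [
    { type: "AWS Access Key", pattern: /AKIA[0-9A-Z]{16}/ },
    { type: "GitHub Token", pattern: /ghp_[A-Za-z0-9]{36}/ },
];

function scanForSecrets(content: string): FoundSecret[] {
    const found: FoundSecret[] = [];
    content.split("\n").forEach((text, index) => {
        for (const { type, pattern } of SECRET_PATTERNS) {
            const match = pattern.exec(text);
            if (match) {
                found.push({ secretType: type, line: index + 1, column: match.index + 1 });
            }
        }
    });
    return found;
}
```

The per-line line/column bookkeeping is what lets the CLI point at the exact location to rotate and remove.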
### Changed
- 🔄 **Pipeline async support** - `DetectionPipeline.execute()` now async for secret detection
- 📊 **Test suite expanded** - Added 47 new tests (23 for SecretViolation, 24 for SecretDetector)
    - Total: 566 tests (was 519), 100% pass rate
    - Coverage: 93.3% statements, 83.74% branches, 98.17% functions
    - SecretViolation: 100% coverage
- 📝 **Documentation updated**:
    - README.md: Added Secret Detection section with examples
    - ROADMAP.md: Marked v0.8.0 as released
    - Updated package description to mention secrets detection

### Security
- 🛡️ **Prevents credentials in version control** - catches AWS, GitHub, NPM, SSH, Slack, GCP secrets before commit
- ⚠️ **CRITICAL violations** - all hardcoded secrets immediately flagged with highest severity
- 💡 **Smart remediation** - provides specific guidance per secret type (environment variables, secret managers, etc.)

## [0.7.9] - 2025-11-25
### Changed
- ♻️ **Refactored large detectors** - significantly improved maintainability and reduced complexity:
    - **AggregateBoundaryDetector**: Reduced from 381 to 162 lines (57% reduction)
    - **HardcodeDetector**: Reduced from 459 to 89 lines (81% reduction)
    - **RepositoryPatternDetector**: Reduced from 479 to 106 lines (78% reduction)
    - Extracted 13 focused strategy classes, each with a single responsibility
    - All 519 tests pass, no breaking changes
    - Zero ESLint errors (1 pre-existing warning unrelated to refactoring)
    - Improved code organization and separation of concerns

### Added
- 🏗️ **13 new strategy classes** for focused responsibilities:
    - `FolderRegistry` - Centralized DDD folder name management
    - `AggregatePathAnalyzer` - Path parsing and aggregate extraction
    - `ImportValidator` - Import validation logic
    - `BraceTracker` - Brace and bracket counting
    - `ConstantsFileChecker` - Constants file detection
    - `ExportConstantAnalyzer` - Export const analysis
    - `MagicNumberMatcher` - Magic number detection
    - `MagicStringMatcher` - Magic string detection
    - `OrmTypeMatcher` - ORM type matching
    - `MethodNameValidator` - Repository method validation
    - `RepositoryFileAnalyzer` - File role detection
    - `RepositoryViolationDetector` - Violation detection logic
    - Enhanced testability with smaller, focused classes

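The extraction above is the strategy pattern: each matcher exposes one focused operation behind a small interface, and a detector composes matchers instead of owning every regex itself. A hypothetical sketch of that shape; the interface, the matcher body, and `firstViolation` are invented for illustration and are not Guardian's actual code:

```typescript
// Hypothetical shape of an extracted matcher strategy. Only the structure
// (small interface + composing detector) is the point; the regex is a toy.
interface ViolationMatch {
    value: string;
    index: number;
}

interface TokenMatcher {
    match(line: string): ViolationMatch | null;
}

class MagicNumberMatcher implements TokenMatcher {
    // Flags standalone numeric literals of two or more digits.
    match(line: string): ViolationMatch | null {
        const m = /(?<![\w.])(\d{2,})(?![\w.])/.exec(line);
        return m ? { value: m[1], index: m.index } : null;
    }
}

// A detector composes strategies instead of embedding every pattern.
function firstViolation(line: string, matchers: TokenMatcher[]): ViolationMatch | null {
    for (const matcher of matchers) {
        const match = matcher.match(line);
        if (match) return match;
    }
    return null;
}
```

Adding a new detection pattern then means adding one small class, which is why the changelog can claim easier extension and testability.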
### Improved
- 📊 **Code quality metrics**:
    - Reduced cyclomatic complexity across all three detectors
    - Better separation of concerns with strategy pattern
    - More maintainable and extensible codebase
    - Easier to add new detection patterns
    - Improved code readability and self-documentation

## [0.7.8] - 2025-11-25

### Added
@@ -72,7 +72,7 @@ Code quality guardian for vibe coders and enterprise teams - because AI writes f
- Prevents "new Repository()" anti-pattern
- 📚 *Based on: Martin Fowler's Repository Pattern, DDD (Evans 2003)* → [Why?](./docs/WHY.md#repository-pattern)

-🔒 **Aggregate Boundary Validation** ✨ NEW
+🔒 **Aggregate Boundary Validation**
- Detects direct entity references across DDD aggregates
- Enforces reference-by-ID or Value Object pattern
- Prevents tight coupling between aggregates
@@ -81,6 +81,15 @@ Code quality guardian for vibe coders and enterprise teams - because AI writes f
- Critical severity for maintaining aggregate independence
- 📚 *Based on: Domain-Driven Design (Evans 2003), Implementing DDD (Vernon 2013)* → [Why?](./docs/WHY.md#aggregate-boundaries)

+🔐 **Secret Detection** ✨ NEW in v0.8.0
+- Detects 350+ types of hardcoded secrets using industry-standard Secretlint
+- Catches AWS keys, GitHub tokens, NPM tokens, SSH keys, API keys, and more
+- All secrets marked as **CRITICAL severity** - immediate security risk
+- Context-aware remediation suggestions for each secret type
+- Prevents credentials from reaching version control
+- Integrates seamlessly with existing detectors
+- 📚 *Based on: OWASP Top 10, CWE-798 (Hardcoded Credentials), NIST Security Guidelines* → [Learn more](https://owasp.org/www-community/vulnerabilities/Use_of_hard-coded_password)
+
🏗️ **Clean Architecture Enforcement**
- Built with DDD principles
- Layered architecture (Domain, Application, Infrastructure)
@@ -366,6 +375,15 @@ const result = await analyzeProject({
})

console.log(`Found ${result.hardcodeViolations.length} hardcoded values`)
+console.log(`Found ${result.secretViolations.length} hardcoded secrets 🔐`)
+
+// Check for critical security issues first!
+result.secretViolations.forEach((violation) => {
+    console.log(`🔐 CRITICAL: ${violation.file}:${violation.line}`)
+    console.log(`  Secret Type: ${violation.secretType}`)
+    console.log(`  ${violation.message}`)
+    console.log(`  ⚠️ Rotate this secret immediately!`)
+})

result.hardcodeViolations.forEach((violation) => {
    console.log(`${violation.file}:${violation.line}`)
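The README example above iterates the result arrays from `analyzeProject`. A common CI pattern is to gate the build on the critical findings first; a hedged sketch with a locally defined result shape mirroring the fields used above (`exitCodeFor` is an illustrative helper, not part of Guardian's API):

```typescript
// Minimal local shape of the analysis result, mirroring the fields used in
// the README example above. exitCodeFor is an illustrative helper only.
interface AnalysisResult {
    secretViolations: Array<{ file: string; line: number; secretType: string }>;
    hardcodeViolations: Array<{ file: string; line: number }>;
}

function exitCodeFor(result: AnalysisResult): number {
    // Secrets are critical: fail immediately, before any other reporting.
    if (result.secretViolations.length > 0) return 1;
    // Hardcodes are lower severity; still fail, but a caller could downgrade.
    return result.hardcodeViolations.length > 0 ? 1 : 0;
}
```

In a CI script the return value would feed `process.exit`, so a leaked credential blocks the pipeline even when hardcode checks are relaxed.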
@@ -394,9 +412,9 @@ npx @samiyev/guardian check ./src --verbose
npx @samiyev/guardian check ./src --no-hardcode # Skip hardcode detection
npx @samiyev/guardian check ./src --no-architecture # Skip architecture checks

-# Filter by severity
-npx @samiyev/guardian check ./src --min-severity high # Show high, critical only
-npx @samiyev/guardian check ./src --only-critical # Show only critical issues
+# Filter by severity (perfect for finding secrets first!)
+npx @samiyev/guardian check ./src --only-critical # Show only critical issues (secrets, circular deps)
+npx @samiyev/guardian check ./src --min-severity high # Show high and critical only

# Limit detailed output (useful for large codebases)
npx @samiyev/guardian check ./src --limit 10 # Show first 10 violations per category
@@ -409,33 +409,52 @@ Add integration tests for full pipeline and CLI.
- ✅ 519 total tests (519 passing, **100% pass rate** 🎉)
- ✅ Comprehensive E2E coverage for API and CLI
- ✅ 3 new E2E test files with full pipeline coverage
-- [ ] Publish to npm
+- ✅ Publish to npm

---

-### Version 0.7.9 - Refactor Large Detectors 🔧 (Optional)
+### Version 0.7.9 - Refactor Large Detectors 🔧 ✅ RELEASED

+**Released:** 2025-11-25
**Priority:** LOW
**Scope:** Single session (~128K tokens)

-Refactor largest detectors to reduce complexity.
+Refactored largest detectors to reduce complexity and improve maintainability.

-**Targets:**
+**Results:**

-| Detector | Lines | Complexity |
-|----------|-------|------------|
-| RepositoryPatternDetector | 479 | 35 |
-| HardcodeDetector | 459 | 41 |
-| AggregateBoundaryDetector | 381 | 47 |
+| Detector | Before | After | Reduction |
+|----------|--------|-------|-----------|
+| AggregateBoundaryDetector | 381 lines | 162 lines | 57% ✅ |
+| HardcodeDetector | 459 lines | 89 lines | 81% ✅ |
+| RepositoryPatternDetector | 479 lines | 106 lines | 78% ✅ |

-**Deliverables:**
-- [ ] Extract regex patterns into strategies
-- [ ] Reduce cyclomatic complexity < 25
-- [ ] Publish to npm
+**Implemented Features:**
+- ✅ Extracted 13 strategy classes for focused responsibilities
+- ✅ Reduced file sizes by 57-81%
+- ✅ Improved code organization and maintainability
+- ✅ All 519 tests passing
+- ✅ Zero ESLint errors, 1 pre-existing warning
+- ✅ No breaking changes
+
+**New Strategy Classes:**
+- `FolderRegistry` - Centralized DDD folder name management
+- `AggregatePathAnalyzer` - Path parsing and aggregate extraction
+- `ImportValidator` - Import validation logic
+- `BraceTracker` - Brace and bracket counting
+- `ConstantsFileChecker` - Constants file detection
+- `ExportConstantAnalyzer` - Export const analysis
+- `MagicNumberMatcher` - Magic number detection
+- `MagicStringMatcher` - Magic string detection
+- `OrmTypeMatcher` - ORM type matching
+- `MethodNameValidator` - Repository method validation
+- `RepositoryFileAnalyzer` - File role detection
+- `RepositoryViolationDetector` - Violation detection logic

---

-### Version 0.8.0 - Secret Detection 🔐
+### Version 0.8.0 - Secret Detection 🔐 ✅ RELEASED

-**Target:** Q1 2025
+**Released:** 2025-11-25
**Priority:** CRITICAL

Detect hardcoded secrets (API keys, tokens, credentials) using industry-standard Secretlint library.
@@ -1,7 +1,7 @@
{
    "name": "@samiyev/guardian",
-    "version": "0.7.8",
-    "description": "Research-backed code quality guardian for AI-assisted development. Detects hardcodes, circular deps, framework leaks, entity exposure, and 8 architecture violations. Enforces Clean Architecture/DDD principles. Works with GitHub Copilot, Cursor, Windsurf, Claude, ChatGPT, Cline, and any AI coding tool.",
+    "version": "0.8.1",
+    "description": "Research-backed code quality guardian for AI-assisted development. Detects hardcodes, secrets, circular deps, framework leaks, entity exposure, and 9 architecture violations. Enforces Clean Architecture/DDD principles. Works with GitHub Copilot, Cursor, Windsurf, Claude, ChatGPT, Cline, and any AI coding tool.",
    "keywords": [
        "puaros",
        "guardian",
@@ -82,6 +82,10 @@
        "guardian": "./bin/guardian.js"
    },
    "dependencies": {
+        "@secretlint/core": "^11.2.5",
+        "@secretlint/node": "^11.2.5",
+        "@secretlint/secretlint-rule-preset-recommend": "^11.2.5",
+        "@secretlint/types": "^11.2.5",
        "commander": "^12.1.0",
        "simple-git": "^3.30.0",
        "tree-sitter": "^0.21.1",
@@ -12,6 +12,7 @@ import { IEntityExposureDetector } from "./domain/services/IEntityExposureDetect
import { IDependencyDirectionDetector } from "./domain/services/IDependencyDirectionDetector"
import { IRepositoryPatternDetector } from "./domain/services/RepositoryPatternDetectorService"
import { IAggregateBoundaryDetector } from "./domain/services/IAggregateBoundaryDetector"
+import { ISecretDetector } from "./domain/services/ISecretDetector"
import { FileScanner } from "./infrastructure/scanners/FileScanner"
import { CodeParser } from "./infrastructure/parsers/CodeParser"
import { HardcodeDetector } from "./infrastructure/analyzers/HardcodeDetector"
@@ -21,6 +22,7 @@ import { EntityExposureDetector } from "./infrastructure/analyzers/EntityExposur
import { DependencyDirectionDetector } from "./infrastructure/analyzers/DependencyDirectionDetector"
import { RepositoryPatternDetector } from "./infrastructure/analyzers/RepositoryPatternDetector"
import { AggregateBoundaryDetector } from "./infrastructure/analyzers/AggregateBoundaryDetector"
+import { SecretDetector } from "./infrastructure/analyzers/SecretDetector"
import { ERROR_MESSAGES } from "./shared/constants"

/**
@@ -79,6 +81,7 @@ export async function analyzeProject(
        new DependencyDirectionDetector()
    const repositoryPatternDetector: IRepositoryPatternDetector = new RepositoryPatternDetector()
    const aggregateBoundaryDetector: IAggregateBoundaryDetector = new AggregateBoundaryDetector()
+    const secretDetector: ISecretDetector = new SecretDetector()
    const useCase = new AnalyzeProject(
        fileScanner,
        codeParser,
@@ -89,6 +92,7 @@ export async function analyzeProject(
        dependencyDirectionDetector,
        repositoryPatternDetector,
        aggregateBoundaryDetector,
+        secretDetector,
    )

    const result = await useCase.execute(options)
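The wiring above is a composition root: concrete detectors from the infrastructure layer are instantiated once and handed to the use case behind domain interfaces. A reduced sketch of that pattern with two detectors instead of nine; all names and bodies here are condensed illustrations, not Guardian's classes:

```typescript
// Condensed composition-root sketch: domain interface, two infrastructure
// implementations, and a use case that only ever sees the interface.
interface IDetector {
    detect(source: string): string[];
}

class HardcodeDetectorImpl implements IDetector {
    detect(source: string): string[] {
        return source.includes("magic") ? ["hardcode found"] : [];
    }
}

class SecretDetectorImpl implements IDetector {
    detect(source: string): string[] {
        return source.includes("AKIA") ? ["secret found"] : [];
    }
}

class AnalyzeProjectUseCase {
    constructor(private readonly detectors: IDetector[]) {}

    execute(source: string): string[] {
        return this.detectors.flatMap((d) => d.detect(source));
    }
}

// The composition root is the only place that knows the concrete classes.
const useCase = new AnalyzeProjectUseCase([
    new HardcodeDetectorImpl(),
    new SecretDetectorImpl(),
]);
```

This is why adding `SecretDetector` in v0.8.0 only touched the root and the constructor signature, not the use case's logic.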
@@ -9,12 +9,13 @@ import { IEntityExposureDetector } from "../../domain/services/IEntityExposureDe
import { IDependencyDirectionDetector } from "../../domain/services/IDependencyDirectionDetector"
import { IRepositoryPatternDetector } from "../../domain/services/RepositoryPatternDetectorService"
import { IAggregateBoundaryDetector } from "../../domain/services/IAggregateBoundaryDetector"
+import { ISecretDetector } from "../../domain/services/ISecretDetector"
import { SourceFile } from "../../domain/entities/SourceFile"
import { DependencyGraph } from "../../domain/entities/DependencyGraph"
-import { FileCollectionStep } from "./pipeline/FileCollectionStep"
-import { ParsingStep } from "./pipeline/ParsingStep"
-import { DetectionPipeline } from "./pipeline/DetectionPipeline"
-import { ResultAggregator } from "./pipeline/ResultAggregator"
+import { CollectFiles } from "./pipeline/CollectFiles"
+import { ParseSourceFiles } from "./pipeline/ParseSourceFiles"
+import { ExecuteDetection } from "./pipeline/ExecuteDetection"
+import { AggregateResults } from "./pipeline/AggregateResults"
import {
    ERROR_MESSAGES,
    HARDCODE_TYPES,
@@ -42,6 +43,7 @@ export interface AnalyzeProjectResponse {
    dependencyDirectionViolations: DependencyDirectionViolation[]
    repositoryPatternViolations: RepositoryPatternViolation[]
    aggregateBoundaryViolations: AggregateBoundaryViolation[]
+    secretViolations: SecretViolation[]
    metrics: ProjectMetrics
}
@@ -163,6 +165,17 @@ export interface AggregateBoundaryViolation {
    severity: SeverityLevel
}

+export interface SecretViolation {
+    rule: typeof RULES.SECRET_EXPOSURE
+    secretType: string
+    file: string
+    line: number
+    column: number
+    message: string
+    suggestion: string
+    severity: SeverityLevel
+}
+
export interface ProjectMetrics {
    totalFiles: number
    totalFunctions: number
@@ -178,10 +191,10 @@ export class AnalyzeProject extends UseCase<
    AnalyzeProjectRequest,
    ResponseDto<AnalyzeProjectResponse>
> {
-    private readonly fileCollectionStep: FileCollectionStep
-    private readonly parsingStep: ParsingStep
-    private readonly detectionPipeline: DetectionPipeline
-    private readonly resultAggregator: ResultAggregator
+    private readonly fileCollectionStep: CollectFiles
+    private readonly parsingStep: ParseSourceFiles
+    private readonly detectionPipeline: ExecuteDetection
+    private readonly resultAggregator: AggregateResults

    constructor(
        fileScanner: IFileScanner,
@@ -193,11 +206,12 @@ export class AnalyzeProject extends UseCase<
        dependencyDirectionDetector: IDependencyDirectionDetector,
        repositoryPatternDetector: IRepositoryPatternDetector,
        aggregateBoundaryDetector: IAggregateBoundaryDetector,
+        secretDetector: ISecretDetector,
    ) {
        super()
-        this.fileCollectionStep = new FileCollectionStep(fileScanner)
-        this.parsingStep = new ParsingStep(codeParser)
-        this.detectionPipeline = new DetectionPipeline(
+        this.fileCollectionStep = new CollectFiles(fileScanner)
+        this.parsingStep = new ParseSourceFiles(codeParser)
+        this.detectionPipeline = new ExecuteDetection(
            hardcodeDetector,
            namingConventionDetector,
            frameworkLeakDetector,
@@ -205,8 +219,9 @@ export class AnalyzeProject extends UseCase<
            dependencyDirectionDetector,
            repositoryPatternDetector,
            aggregateBoundaryDetector,
+            secretDetector,
        )
-        this.resultAggregator = new ResultAggregator()
+        this.resultAggregator = new AggregateResults()
    }

    public async execute(
@@ -224,7 +239,7 @@ export class AnalyzeProject extends UseCase<
            rootDir: request.rootDir,
        })

-        const detectionResult = this.detectionPipeline.execute({
+        const detectionResult = await this.detectionPipeline.execute({
            sourceFiles,
            dependencyGraph,
        })
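The `SecretViolation` interface added above is a plain DTO. A sketch of constructing one; the field list mirrors the diff, but the `RULES.SECRET_EXPOSURE` literal value, the message/suggestion strings, and the `makeSecretViolation` helper are assumptions for illustration:

```typescript
// Local stand-ins for Guardian's shared constants; literal values assumed.
const RULES = { SECRET_EXPOSURE: "secret-exposure" } as const;
type SeverityLevel = "critical" | "high" | "medium" | "low";

// Field list mirrors the SecretViolation interface from the diff above.
interface SecretViolation {
    rule: typeof RULES.SECRET_EXPOSURE;
    secretType: string;
    file: string;
    line: number;
    column: number;
    message: string;
    suggestion: string;
    severity: SeverityLevel;
}

function makeSecretViolation(file: string, line: number, secretType: string): SecretViolation {
    return {
        rule: RULES.SECRET_EXPOSURE,
        secretType,
        file,
        line,
        column: 1,
        message: `Hardcoded ${secretType} detected`,
        suggestion: "Move the value to an environment variable or secret manager",
        severity: "critical",
    };
}
```

Pinning `severity` to `"critical"` in the builder matches the changelog's rule that every secret finding carries the highest severity.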
@@ -12,6 +12,7 @@ import type {
    NamingConventionViolation,
    ProjectMetrics,
    RepositoryPatternViolation,
+    SecretViolation,
} from "../AnalyzeProject"

export interface AggregationRequest {
@@ -27,12 +28,13 @@ export interface AggregationRequest {
    dependencyDirectionViolations: DependencyDirectionViolation[]
    repositoryPatternViolations: RepositoryPatternViolation[]
    aggregateBoundaryViolations: AggregateBoundaryViolation[]
+    secretViolations: SecretViolation[]
}

/**
 * Pipeline step responsible for building final response DTO
 */
-export class ResultAggregator {
+export class AggregateResults {
    public execute(request: AggregationRequest): AnalyzeProjectResponse {
        const metrics = this.calculateMetrics(
            request.sourceFiles,
@@ -52,6 +54,7 @@ export class ResultAggregator {
            dependencyDirectionViolations: request.dependencyDirectionViolations,
            repositoryPatternViolations: request.repositoryPatternViolations,
            aggregateBoundaryViolations: request.aggregateBoundaryViolations,
+            secretViolations: request.secretViolations,
            metrics,
        }
    }
@@ -16,7 +16,7 @@ export interface FileCollectionResult {
/**
 * Pipeline step responsible for file collection and basic parsing
 */
-export class FileCollectionStep {
+export class CollectFiles {
    constructor(private readonly fileScanner: IFileScanner) {}

    public async execute(request: FileCollectionRequest): Promise<FileCollectionResult> {
@@ -5,6 +5,7 @@ import { IEntityExposureDetector } from "../../../domain/services/IEntityExposur
|
|||||||
import { IDependencyDirectionDetector } from "../../../domain/services/IDependencyDirectionDetector"
|
import { IDependencyDirectionDetector } from "../../../domain/services/IDependencyDirectionDetector"
|
||||||
import { IRepositoryPatternDetector } from "../../../domain/services/RepositoryPatternDetectorService"
|
import { IRepositoryPatternDetector } from "../../../domain/services/RepositoryPatternDetectorService"
|
||||||
import { IAggregateBoundaryDetector } from "../../../domain/services/IAggregateBoundaryDetector"
|
import { IAggregateBoundaryDetector } from "../../../domain/services/IAggregateBoundaryDetector"
|
||||||
|
import { ISecretDetector } from "../../../domain/services/ISecretDetector"
|
||||||
import { SourceFile } from "../../../domain/entities/SourceFile"
|
import { SourceFile } from "../../../domain/entities/SourceFile"
|
||||||
import { DependencyGraph } from "../../../domain/entities/DependencyGraph"
|
import { DependencyGraph } from "../../../domain/entities/DependencyGraph"
|
||||||
import {
|
import {
|
||||||
@@ -25,6 +26,7 @@ import type {
|
|||||||
HardcodeViolation,
|
HardcodeViolation,
|
||||||
NamingConventionViolation,
|
NamingConventionViolation,
|
||||||
RepositoryPatternViolation,
|
RepositoryPatternViolation,
|
||||||
|
SecretViolation,
|
||||||
} from "../AnalyzeProject"
|
} from "../AnalyzeProject"
|
||||||
|
|
||||||
export interface DetectionRequest {
|
export interface DetectionRequest {
|
||||||
@@ -42,12 +44,13 @@ export interface DetectionResult {
|
|||||||
     dependencyDirectionViolations: DependencyDirectionViolation[]
     repositoryPatternViolations: RepositoryPatternViolation[]
     aggregateBoundaryViolations: AggregateBoundaryViolation[]
+    secretViolations: SecretViolation[]
 }

 /**
  * Pipeline step responsible for running all detectors
  */
-export class DetectionPipeline {
+export class ExecuteDetection {
     constructor(
         private readonly hardcodeDetector: IHardcodeDetector,
         private readonly namingConventionDetector: INamingConventionDetector,
@@ -56,9 +59,12 @@ export class DetectionPipeline {
         private readonly dependencyDirectionDetector: IDependencyDirectionDetector,
         private readonly repositoryPatternDetector: IRepositoryPatternDetector,
         private readonly aggregateBoundaryDetector: IAggregateBoundaryDetector,
+        private readonly secretDetector: ISecretDetector,
     ) {}

-    public execute(request: DetectionRequest): DetectionResult {
+    public async execute(request: DetectionRequest): Promise<DetectionResult> {
+        const secretViolations = await this.detectSecrets(request.sourceFiles)
+
         return {
             violations: this.sortBySeverity(this.detectViolations(request.sourceFiles)),
             hardcodeViolations: this.sortBySeverity(this.detectHardcode(request.sourceFiles)),
@@ -83,6 +89,7 @@ export class DetectionPipeline {
             aggregateBoundaryViolations: this.sortBySeverity(
                 this.detectAggregateBoundaryViolations(request.sourceFiles),
             ),
+            secretViolations: this.sortBySeverity(secretViolations),
         }
     }

@@ -365,6 +372,32 @@ export class DetectionPipeline {
         return violations
     }

+    private async detectSecrets(sourceFiles: SourceFile[]): Promise<SecretViolation[]> {
+        const violations: SecretViolation[] = []
+
+        for (const file of sourceFiles) {
+            const secretViolations = await this.secretDetector.detectAll(
+                file.content,
+                file.path.relative,
+            )
+
+            for (const secret of secretViolations) {
+                violations.push({
+                    rule: RULES.SECRET_EXPOSURE,
+                    secretType: secret.secretType,
+                    file: file.path.relative,
+                    line: secret.line,
+                    column: secret.column,
+                    message: secret.getMessage(),
+                    suggestion: secret.getSuggestion(),
+                    severity: "critical",
+                })
+            }
+        }
+
+        return violations
+    }
+
     private sortBySeverity<T extends { severity: SeverityLevel }>(violations: T[]): T[] {
         return violations.sort((a, b) => {
             return SEVERITY_ORDER[a.severity] - SEVERITY_ORDER[b.severity]
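The `sortBySeverity` helper above orders violations through a numeric `SEVERITY_ORDER` lookup so that critical findings come first. A self-contained sketch of that comparator (the numeric values below are illustrative assumptions, not the package's actual constants):

```typescript
// Hypothetical stand-in for the package's SEVERITY_ORDER constant:
// a lower number means more severe, so an ascending sort puts "critical" first.
type SeverityLevel = "critical" | "high" | "medium" | "low"

const SEVERITY_ORDER: Record<SeverityLevel, number> = {
    critical: 0,
    high: 1,
    medium: 2,
    low: 3,
}

function sortBySeverity<T extends { severity: SeverityLevel }>(violations: T[]): T[] {
    return violations.sort((a, b) => SEVERITY_ORDER[a.severity] - SEVERITY_ORDER[b.severity])
}

const sorted = sortBySeverity<{ severity: SeverityLevel }>([
    { severity: "low" },
    { severity: "critical" },
    { severity: "medium" },
])
console.log(sorted.map((v) => v.severity).join(","))
// critical,medium,low
```

Because `Array.prototype.sort` mutates in place, callers that need the original order intact would have to copy the array first; the pipeline above sorts freshly built result arrays, where mutation is harmless.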
@@ -15,7 +15,7 @@ export interface ParsingResult {
 /**
  * Pipeline step responsible for AST parsing and dependency graph construction
  */
-export class ParsingStep {
+export class ParseSourceFiles {
     constructor(private readonly codeParser: ICodeParser) {}

     public execute(request: ParsingRequest): ParsingResult {
@@ -9,6 +9,7 @@ import type {
     HardcodeViolation,
     NamingConventionViolation,
     RepositoryPatternViolation,
+    SecretViolation,
 } from "../../application/use-cases/AnalyzeProject"
 import { SEVERITY_DISPLAY_LABELS, SEVERITY_SECTION_HEADERS } from "../constants"
 import { ViolationGrouper } from "../groupers/ViolationGrouper"
@@ -177,6 +178,22 @@ export class OutputFormatter {
         console.log("")
     }

+    formatSecretViolation(sv: SecretViolation, index: number): void {
+        const location = `${sv.file}:${String(sv.line)}:${String(sv.column)}`
+        console.log(`${String(index + 1)}. ${location}`)
+        console.log(`   Severity: ${SEVERITY_LABELS[sv.severity]} ⚠️`)
+        console.log(`   Secret Type: ${sv.secretType}`)
+        console.log(`   ${sv.message}`)
+        console.log("   🔐 CRITICAL: Rotate this secret immediately!")
+        console.log("   💡 Suggestion:")
+        sv.suggestion.split("\n").forEach((line) => {
+            if (line.trim()) {
+                console.log(`      ${line}`)
+            }
+        })
+        console.log("")
+    }
+
     formatHardcodeViolation(hc: HardcodeViolation, index: number): void {
         console.log(`${String(index + 1)}. ${hc.file}:${String(hc.line)}:${String(hc.column)}`)
         console.log(`   Severity: ${SEVERITY_LABELS[hc.severity]}`)
@@ -92,6 +92,7 @@ program
             dependencyDirectionViolations,
             repositoryPatternViolations,
             aggregateBoundaryViolations,
+            secretViolations,
         } = result

         const minSeverity: SeverityLevel | undefined = options.onlyCritical
@@ -132,6 +133,7 @@ program
                 aggregateBoundaryViolations,
                 minSeverity,
             )
+            secretViolations = grouper.filterBySeverity(secretViolations, minSeverity)

             statsFormatter.displaySeverityFilterMessage(
                 options.onlyCritical,
@@ -245,6 +247,19 @@ program
             )
         }

+        if (secretViolations.length > 0) {
+            console.log(
+                `\n🔐 Found ${String(secretViolations.length)} hardcoded secret(s) - CRITICAL SECURITY RISK`,
+            )
+            outputFormatter.displayGroupedViolations(
+                secretViolations,
+                (sv, i) => {
+                    outputFormatter.formatSecretViolation(sv, i)
+                },
+                limit,
+            )
+        }
+
         if (options.hardcode && hardcodeViolations.length > 0) {
             console.log(
                 `\n${CLI_MESSAGES.HARDCODE_VIOLATIONS_HEADER} ${String(hardcodeViolations.length)} ${CLI_LABELS.HARDCODE_VIOLATIONS}`,
@@ -267,7 +282,8 @@ program
             entityExposureViolations.length +
             dependencyDirectionViolations.length +
             repositoryPatternViolations.length +
-            aggregateBoundaryViolations.length
+            aggregateBoundaryViolations.length +
+            secretViolations.length

         statsFormatter.displaySummary(totalIssues, options.verbose)
     } catch (error) {
@@ -60,3 +60,12 @@ export const AGGREGATE_VIOLATION_MESSAGES = {
     AVOID_DIRECT_REFERENCE: "3. Avoid direct entity references to maintain aggregate independence",
     MAINTAIN_INDEPENDENCE: "4. Each aggregate should be independently modifiable and deployable",
 }
+
+export const SECRET_VIOLATION_MESSAGES = {
+    USE_ENV_VARIABLES: "1. Use environment variables for sensitive data (process.env.API_KEY)",
+    USE_SECRET_MANAGER:
+        "2. Use secret management services (AWS Secrets Manager, HashiCorp Vault, etc.)",
+    NEVER_COMMIT_SECRETS: "3. Never commit secrets to version control",
+    ROTATE_IF_EXPOSED: "4. If secret was committed, rotate it immediately",
+    USE_GITIGNORE: "5. Add secret files to .gitignore (.env, credentials.json, etc.)",
+}
packages/guardian/src/domain/constants/SecretExamples.ts (new file, 79 lines)
@@ -0,0 +1,79 @@
+/**
+ * Secret detection constants
+ * All hardcoded strings related to secret detection and examples
+ */
+
+export const SECRET_KEYWORDS = {
+    AWS: "aws",
+    GITHUB: "github",
+    NPM: "npm",
+    SSH: "ssh",
+    PRIVATE_KEY: "private key",
+    SLACK: "slack",
+    API_KEY: "api key",
+    APIKEY: "apikey",
+    ACCESS_KEY: "access key",
+    SECRET: "secret",
+    TOKEN: "token",
+    PASSWORD: "password",
+    USER: "user",
+    BOT: "bot",
+    RSA: "rsa",
+    DSA: "dsa",
+    ECDSA: "ecdsa",
+    ED25519: "ed25519",
+    BASICAUTH: "basicauth",
+    GCP: "gcp",
+    GOOGLE: "google",
+    PRIVATEKEY: "privatekey",
+    PERSONAL_ACCESS_TOKEN: "personal access token",
+    OAUTH: "oauth",
+} as const
+
+export const SECRET_TYPE_NAMES = {
+    AWS_ACCESS_KEY: "AWS Access Key",
+    AWS_SECRET_KEY: "AWS Secret Key",
+    AWS_CREDENTIAL: "AWS Credential",
+    GITHUB_PERSONAL_ACCESS_TOKEN: "GitHub Personal Access Token",
+    GITHUB_OAUTH_TOKEN: "GitHub OAuth Token",
+    GITHUB_TOKEN: "GitHub Token",
+    NPM_TOKEN: "NPM Token",
+    GCP_SERVICE_ACCOUNT_KEY: "GCP Service Account Key",
+    SSH_RSA_PRIVATE_KEY: "SSH RSA Private Key",
+    SSH_DSA_PRIVATE_KEY: "SSH DSA Private Key",
+    SSH_ECDSA_PRIVATE_KEY: "SSH ECDSA Private Key",
+    SSH_ED25519_PRIVATE_KEY: "SSH Ed25519 Private Key",
+    SSH_PRIVATE_KEY: "SSH Private Key",
+    SLACK_BOT_TOKEN: "Slack Bot Token",
+    SLACK_USER_TOKEN: "Slack User Token",
+    SLACK_TOKEN: "Slack Token",
+    BASIC_AUTH_CREDENTIALS: "Basic Authentication Credentials",
+    API_KEY: "API Key",
+    AUTHENTICATION_TOKEN: "Authentication Token",
+    PASSWORD: "Password",
+    SECRET: "Secret",
+    SENSITIVE_DATA: "Sensitive Data",
+} as const
+
+export const SECRET_EXAMPLE_VALUES = {
+    AWS_ACCESS_KEY_ID: "AKIA1234567890ABCDEF",
+    AWS_SECRET_ACCESS_KEY: "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY",
+    GITHUB_TOKEN: "ghp_1234567890abcdefghijklmnopqrstuv",
+    NPM_TOKEN: "npm_abc123xyz",
+    SLACK_TOKEN: "xoxb-<token-here>",
+    API_KEY: "sk_live_XXXXXXXXXXXXXXXXXXXX_example_key",
+    HARDCODED_SECRET: "hardcoded-secret-value",
+} as const
+
+export const FILE_ENCODING = {
+    UTF8: "utf-8",
+} as const
+
+export const REGEX_ESCAPE_PATTERN = {
+    DOLLAR_AMPERSAND: "\\$&",
+} as const
+
+export const DYNAMIC_IMPORT_PATTERN_PARTS = {
+    QUOTE_START: '"`][^',
+    QUOTE_END: "`]+['\"",
+} as const
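The `REGEX_ESCAPE_PATTERN.DOLLAR_AMPERSAND` constant above holds the replacement string `"\$&"`, which re-inserts the matched character preceded by a backslash — the standard trick for escaping regex metacharacters before embedding user-supplied text in a pattern. A minimal sketch of how such a constant is typically used (the `escapeRegExp` helper here is hypothetical, not Guardian's actual implementation):

```typescript
// "\\$&" as a replacement string means: emit a backslash, then the whole match.
// Mirrors REGEX_ESCAPE_PATTERN.DOLLAR_AMPERSAND from the constants file above.
const DOLLAR_AMPERSAND = "\\$&"

// Hypothetical helper: escape every regex metacharacter in the input so it can
// be used safely inside new RegExp(...).
function escapeRegExp(input: string): string {
    return input.replace(/[.*+?^${}()|[\]\\]/g, DOLLAR_AMPERSAND)
}

const escaped = escapeRegExp("sk_live.key+1")
console.log(escaped)
// sk_live\.key\+1
```

Without this escaping, a detected secret containing `.` or `+` would change the meaning of any regex built from it.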
packages/guardian/src/domain/services/ISecretDetector.ts (new file, 34 lines)
@@ -0,0 +1,34 @@
+import { SecretViolation } from "../value-objects/SecretViolation"
+
+/**
+ * Interface for detecting hardcoded secrets in source code
+ *
+ * Detects sensitive data like API keys, tokens, passwords, and credentials
+ * that should never be hardcoded in source code. Uses industry-standard
+ * Secretlint library for pattern matching.
+ *
+ * All detected secrets are marked as CRITICAL severity violations.
+ *
+ * @example
+ * ```typescript
+ * const detector: ISecretDetector = new SecretDetector()
+ * const violations = await detector.detectAll(
+ *     'const AWS_KEY = "AKIA1234567890ABCDEF"',
+ *     'src/config/aws.ts'
+ * )
+ *
+ * violations.forEach(v => {
+ *     console.log(v.getMessage()) // "Hardcoded AWS Access Key detected"
+ * })
+ * ```
+ */
+export interface ISecretDetector {
+    /**
+     * Detect all types of hardcoded secrets in the provided code
+     *
+     * @param code - Source code to analyze
+     * @param filePath - Path to the file being analyzed
+     * @returns Array of secret violations found
+     */
+    detectAll(code: string, filePath: string): Promise<SecretViolation[]>
+}
packages/guardian/src/domain/value-objects/SecretViolation.ts (new file, 204 lines)
@@ -0,0 +1,204 @@
+import { ValueObject } from "./ValueObject"
+import { SECRET_VIOLATION_MESSAGES } from "../constants/Messages"
+import { SEVERITY_LEVELS } from "../../shared/constants"
+import { FILE_ENCODING, SECRET_EXAMPLE_VALUES, SECRET_KEYWORDS } from "../constants/SecretExamples"
+
+interface SecretViolationProps {
+    readonly file: string
+    readonly line: number
+    readonly column: number
+    readonly secretType: string
+    readonly matchedPattern: string
+}
+
+/**
+ * Represents a secret exposure violation in the codebase
+ *
+ * Secret violations occur when sensitive data like API keys, tokens, passwords,
+ * or credentials are hardcoded in the source code instead of being stored
+ * in secure environment variables or secret management systems.
+ *
+ * All secret violations are marked as CRITICAL severity because they represent
+ * serious security risks that could lead to unauthorized access, data breaches,
+ * or service compromise.
+ *
+ * @example
+ * ```typescript
+ * const violation = SecretViolation.create(
+ *     'src/config/aws.ts',
+ *     10,
+ *     15,
+ *     'AWS Access Key',
+ *     'AKIA1234567890ABCDEF'
+ * )
+ *
+ * console.log(violation.getMessage())
+ * // "Hardcoded AWS Access Key detected"
+ *
+ * console.log(violation.getSeverity())
+ * // "critical"
+ * ```
+ */
+export class SecretViolation extends ValueObject<SecretViolationProps> {
+    private constructor(props: SecretViolationProps) {
+        super(props)
+    }
+
+    public static create(
+        file: string,
+        line: number,
+        column: number,
+        secretType: string,
+        matchedPattern: string,
+    ): SecretViolation {
+        return new SecretViolation({
+            file,
+            line,
+            column,
+            secretType,
+            matchedPattern,
+        })
+    }
+
+    public get file(): string {
+        return this.props.file
+    }
+
+    public get line(): number {
+        return this.props.line
+    }
+
+    public get column(): number {
+        return this.props.column
+    }
+
+    public get secretType(): string {
+        return this.props.secretType
+    }
+
+    public get matchedPattern(): string {
+        return this.props.matchedPattern
+    }
+
+    public getMessage(): string {
+        return `Hardcoded ${this.props.secretType} detected`
+    }
+
+    public getSuggestion(): string {
+        const suggestions: string[] = [
+            SECRET_VIOLATION_MESSAGES.USE_ENV_VARIABLES,
+            SECRET_VIOLATION_MESSAGES.USE_SECRET_MANAGER,
+            SECRET_VIOLATION_MESSAGES.NEVER_COMMIT_SECRETS,
+            SECRET_VIOLATION_MESSAGES.ROTATE_IF_EXPOSED,
+            SECRET_VIOLATION_MESSAGES.USE_GITIGNORE,
+        ]
+
+        return suggestions.join("\n")
+    }
+
+    public getExampleFix(): string {
+        return this.getExampleFixForSecretType(this.props.secretType)
+    }
+
+    public getSeverity(): typeof SEVERITY_LEVELS.CRITICAL {
+        return SEVERITY_LEVELS.CRITICAL
+    }
+
+    private getExampleFixForSecretType(secretType: string): string {
+        const lowerType = secretType.toLowerCase()
+
+        if (lowerType.includes(SECRET_KEYWORDS.AWS)) {
+            return `
+// ❌ Bad: Hardcoded AWS credentials
+const AWS_ACCESS_KEY_ID = "${SECRET_EXAMPLE_VALUES.AWS_ACCESS_KEY_ID}"
+const AWS_SECRET_ACCESS_KEY = "${SECRET_EXAMPLE_VALUES.AWS_SECRET_ACCESS_KEY}"
+
+// ✅ Good: Use environment variables
+const AWS_ACCESS_KEY_ID = process.env.AWS_ACCESS_KEY_ID
+const AWS_SECRET_ACCESS_KEY = process.env.AWS_SECRET_ACCESS_KEY
+
+// ✅ Good: Use credentials provider (in infrastructure layer)
+// Load credentials from environment or credentials file`
+        }
+
+        if (lowerType.includes(SECRET_KEYWORDS.GITHUB)) {
+            return `
+// ❌ Bad: Hardcoded GitHub token
+const GITHUB_TOKEN = "${SECRET_EXAMPLE_VALUES.GITHUB_TOKEN}"
+
+// ✅ Good: Use environment variables
+const GITHUB_TOKEN = process.env.GITHUB_TOKEN
+
+// ✅ Good: GitHub Apps with temporary tokens
+// Use GitHub Apps for automated workflows instead of personal access tokens`
+        }
+
+        if (lowerType.includes(SECRET_KEYWORDS.NPM)) {
+            return `
+// ❌ Bad: Hardcoded NPM token in code
+const NPM_TOKEN = "${SECRET_EXAMPLE_VALUES.NPM_TOKEN}"
+
+// ✅ Good: Use .npmrc file (add to .gitignore)
+// .npmrc
+//registry.npmjs.org/:_authToken=\${NPM_TOKEN}
+
+// ✅ Good: Use environment variable
+const NPM_TOKEN = process.env.NPM_TOKEN`
+        }
+
+        if (
+            lowerType.includes(SECRET_KEYWORDS.SSH) ||
+            lowerType.includes(SECRET_KEYWORDS.PRIVATE_KEY)
+        ) {
+            return `
+// ❌ Bad: Hardcoded SSH private key
+const privateKey = \`-----BEGIN RSA PRIVATE KEY-----
+MIIEpAIBAAKCAQEA...\`
+
+// ✅ Good: Load from secure file (not in repository)
+import fs from "fs"
+const privateKey = fs.readFileSync(process.env.SSH_KEY_PATH, "${FILE_ENCODING.UTF8}")
+
+// ✅ Good: Use SSH agent
+// Configure SSH agent to handle keys securely`
+        }
+
+        if (lowerType.includes(SECRET_KEYWORDS.SLACK)) {
+            return `
+// ❌ Bad: Hardcoded Slack token
+const SLACK_TOKEN = "${SECRET_EXAMPLE_VALUES.SLACK_TOKEN}"
+
+// ✅ Good: Use environment variables
+const SLACK_TOKEN = process.env.SLACK_BOT_TOKEN
+
+// ✅ Good: Use OAuth flow for user tokens
+// Implement OAuth 2.0 flow instead of hardcoding tokens`
+        }
+
+        if (
+            lowerType.includes(SECRET_KEYWORDS.API_KEY) ||
+            lowerType.includes(SECRET_KEYWORDS.APIKEY)
+        ) {
+            return `
+// ❌ Bad: Hardcoded API key
+const API_KEY = "${SECRET_EXAMPLE_VALUES.API_KEY}"
+
+// ✅ Good: Use environment variables
+const API_KEY = process.env.API_KEY
+
+// ✅ Good: Use secret management service (in infrastructure layer)
+// AWS Secrets Manager, HashiCorp Vault, Azure Key Vault
+// Implement secret retrieval in infrastructure and inject via DI`
+        }
+
+        return `
+// ❌ Bad: Hardcoded secret
+const SECRET = "${SECRET_EXAMPLE_VALUES.HARDCODED_SECRET}"
+
+// ✅ Good: Use environment variables
+const SECRET = process.env.SECRET_KEY
+
+// ✅ Good: Use secret management
+// AWS Secrets Manager, HashiCorp Vault, Azure Key Vault, etc.`
+    }
+}
@@ -1,8 +1,9 @@
 import { IAggregateBoundaryDetector } from "../../domain/services/IAggregateBoundaryDetector"
 import { AggregateBoundaryViolation } from "../../domain/value-objects/AggregateBoundaryViolation"
 import { LAYERS } from "../../shared/constants/rules"
-import { IMPORT_PATTERNS } from "../constants/paths"
-import { DDD_FOLDER_NAMES } from "../constants/detectorPatterns"
+import { AggregatePathAnalyzer } from "../strategies/AggregatePathAnalyzer"
+import { FolderRegistry } from "../strategies/FolderRegistry"
+import { ImportValidator } from "../strategies/ImportValidator"

 /**
  * Detects aggregate boundary violations in Domain-Driven Design
@@ -38,42 +39,15 @@ import { DDD_FOLDER_NAMES } from "../constants/detectorPatterns"
  * ```
  */
 export class AggregateBoundaryDetector implements IAggregateBoundaryDetector {
-    private readonly entityFolderNames = new Set<string>([
-        DDD_FOLDER_NAMES.ENTITIES,
-        DDD_FOLDER_NAMES.AGGREGATES,
-    ])
-    private readonly valueObjectFolderNames = new Set<string>([
-        DDD_FOLDER_NAMES.VALUE_OBJECTS,
-        DDD_FOLDER_NAMES.VO,
-    ])
-    private readonly allowedFolderNames = new Set<string>([
-        DDD_FOLDER_NAMES.VALUE_OBJECTS,
-        DDD_FOLDER_NAMES.VO,
-        DDD_FOLDER_NAMES.EVENTS,
-        DDD_FOLDER_NAMES.DOMAIN_EVENTS,
-        DDD_FOLDER_NAMES.REPOSITORIES,
-        DDD_FOLDER_NAMES.SERVICES,
-        DDD_FOLDER_NAMES.SPECIFICATIONS,
-        DDD_FOLDER_NAMES.ERRORS,
-        DDD_FOLDER_NAMES.EXCEPTIONS,
-    ])
-    private readonly nonAggregateFolderNames = new Set<string>([
-        DDD_FOLDER_NAMES.VALUE_OBJECTS,
-        DDD_FOLDER_NAMES.VO,
-        DDD_FOLDER_NAMES.EVENTS,
-        DDD_FOLDER_NAMES.DOMAIN_EVENTS,
-        DDD_FOLDER_NAMES.REPOSITORIES,
-        DDD_FOLDER_NAMES.SERVICES,
-        DDD_FOLDER_NAMES.SPECIFICATIONS,
-        DDD_FOLDER_NAMES.ENTITIES,
-        DDD_FOLDER_NAMES.CONSTANTS,
-        DDD_FOLDER_NAMES.SHARED,
-        DDD_FOLDER_NAMES.FACTORIES,
-        DDD_FOLDER_NAMES.PORTS,
-        DDD_FOLDER_NAMES.INTERFACES,
-        DDD_FOLDER_NAMES.ERRORS,
-        DDD_FOLDER_NAMES.EXCEPTIONS,
-    ])
+    private readonly folderRegistry: FolderRegistry
+    private readonly pathAnalyzer: AggregatePathAnalyzer
+    private readonly importValidator: ImportValidator
+
+    constructor() {
+        this.folderRegistry = new FolderRegistry()
+        this.pathAnalyzer = new AggregatePathAnalyzer(this.folderRegistry)
+        this.importValidator = new ImportValidator(this.folderRegistry, this.pathAnalyzer)
+    }

     /**
      * Detects aggregate boundary violations in the given code
@@ -95,41 +69,12 @@ export class AggregateBoundaryDetector implements IAggregateBoundaryDetector {
             return []
         }

-        const currentAggregate = this.extractAggregateFromPath(filePath)
+        const currentAggregate = this.pathAnalyzer.extractAggregateFromPath(filePath)
         if (!currentAggregate) {
             return []
         }

-        const violations: AggregateBoundaryViolation[] = []
-        const lines = code.split("\n")
-
-        for (let i = 0; i < lines.length; i++) {
-            const line = lines[i]
-            const lineNumber = i + 1
-
-            const imports = this.extractImports(line)
-            for (const importPath of imports) {
-                if (this.isAggregateBoundaryViolation(importPath, currentAggregate)) {
-                    const targetAggregate = this.extractAggregateFromImport(importPath)
-                    const entityName = this.extractEntityName(importPath)
-
-                    if (targetAggregate && entityName) {
-                        violations.push(
-                            AggregateBoundaryViolation.create(
-                                currentAggregate,
-                                targetAggregate,
-                                entityName,
-                                importPath,
-                                filePath,
-                                lineNumber,
-                            ),
-                        )
-                    }
-                }
-            }
-        }
-
-        return violations
+        return this.analyzeImports(code, filePath, currentAggregate)
     }

     /**
@@ -144,37 +89,7 @@ export class AggregateBoundaryDetector implements IAggregateBoundaryDetector {
      * @returns The aggregate name if found, undefined otherwise
      */
     public extractAggregateFromPath(filePath: string): string | undefined {
-        const normalizedPath = filePath.toLowerCase().replace(/\\/g, "/")
-
-        const domainMatch = /(?:^|\/)(domain)\//.exec(normalizedPath)
-        if (!domainMatch) {
-            return undefined
-        }
-
-        const domainEndIndex = domainMatch.index + domainMatch[0].length
-        const pathAfterDomain = normalizedPath.substring(domainEndIndex)
-        const segments = pathAfterDomain.split("/").filter(Boolean)
-
-        if (segments.length < 2) {
-            return undefined
-        }
-
-        if (this.entityFolderNames.has(segments[0])) {
-            if (segments.length < 3) {
-                return undefined
-            }
-            const aggregate = segments[1]
-            if (this.nonAggregateFolderNames.has(aggregate)) {
-                return undefined
-            }
-            return aggregate
-        }
-
-        const aggregate = segments[0]
-        if (this.nonAggregateFolderNames.has(aggregate)) {
-            return undefined
-        }
-        return aggregate
+        return this.pathAnalyzer.extractAggregateFromPath(filePath)
     }

     /**
@@ -185,197 +100,68 @@ export class AggregateBoundaryDetector implements IAggregateBoundaryDetector {
      * @returns True if the import crosses aggregate boundaries inappropriately
      */
     public isAggregateBoundaryViolation(importPath: string, currentAggregate: string): boolean {
-        const normalizedPath = importPath.replace(IMPORT_PATTERNS.QUOTE, "").toLowerCase()
-
-        if (!normalizedPath.includes("/")) {
-            return false
-        }
-
-        if (!normalizedPath.startsWith(".") && !normalizedPath.startsWith("/")) {
-            return false
-        }
-
-        // Check if import stays within the same bounded context
-        if (this.isInternalBoundedContextImport(normalizedPath)) {
-            return false
-        }
-
-        const targetAggregate = this.extractAggregateFromImport(normalizedPath)
-        if (!targetAggregate || targetAggregate === currentAggregate) {
-            return false
-        }
-
-        if (this.isAllowedImport(normalizedPath)) {
-            return false
-        }
-
-        return this.seemsLikeEntityImport(normalizedPath)
+        return this.importValidator.isViolation(importPath, currentAggregate)
     }

     /**
-     * Checks if the import is internal to the same bounded context
-     *
-     * An import like "../aggregates/Entity" from "repositories/Repo" stays within
-     * the same bounded context (one level up goes to the bounded context root).
-     *
-     * An import like "../../other-context/Entity" crosses bounded context boundaries.
+     * Analyzes all imports in code and detects violations
      */
-    private isInternalBoundedContextImport(normalizedPath: string): boolean {
-        const parts = normalizedPath.split("/")
-        const dotDotCount = parts.filter((p) => p === "..").length
-
-        /*
-         * If only one ".." and path goes into aggregates/entities folder,
-         * it's likely an internal import within the same bounded context
-         */
-        if (dotDotCount === 1) {
-            const nonDotParts = parts.filter((p) => p !== ".." && p !== ".")
-            if (nonDotParts.length >= 1) {
-                const firstFolder = nonDotParts[0]
-                // Importing from aggregates/entities within same bounded context is allowed
-                if (this.entityFolderNames.has(firstFolder)) {
-                    return true
-                }
-            }
-        }
-
-        return false
+    private analyzeImports(
+        code: string,
+        filePath: string,
+        currentAggregate: string,
+    ): AggregateBoundaryViolation[] {
+        const violations: AggregateBoundaryViolation[] = []
+        const lines = code.split("\n")
+
+        for (let i = 0; i < lines.length; i++) {
+            const line = lines[i]
+            const lineNumber = i + 1
+
+            const imports = this.importValidator.extractImports(line)
+            for (const importPath of imports) {
+                const violation = this.checkImport(
+                    importPath,
+                    currentAggregate,
+                    filePath,
+                    lineNumber,
+                )
+                if (violation) {
+                    violations.push(violation)
+                }
+            }
+        }
+
+        return violations
     }

     /**
-     * Checks if the import path is from an allowed folder (value-objects, events, etc.)
+     * Checks a single import for boundary violations
      */
-    private isAllowedImport(normalizedPath: string): boolean {
-        for (const folderName of this.allowedFolderNames) {
-            if (normalizedPath.includes(`/${folderName}/`)) {
-                return true
-            }
-        }
-        return false
-    }
-
-    /**
-     * Checks if the import seems to be an entity (not a value object, event, etc.)
-     *
-     * Note: normalizedPath is already lowercased, so we check if the first character
-     * is a letter (indicating it was likely PascalCase originally)
-     */
-    private seemsLikeEntityImport(normalizedPath: string): boolean {
-        const pathParts = normalizedPath.split("/")
-        const lastPart = pathParts[pathParts.length - 1]
-
-        if (!lastPart) {
-            return false
-        }
-
-        const filename = lastPart.replace(/\.(ts|js)$/, "")
-
-        if (filename.length > 0 && /^[a-z][a-z]/.exec(filename)) {
-            return true
-        }
-
-        return false
-    }
-
-    /**
-     * Extracts the aggregate name from an import path
-     *
-     * Handles both absolute and relative paths:
-     * - ../user/User → user
-     * - ../../domain/user/User → user
-     * - ../user/value-objects/UserId → user (but filtered as value object)
-     */
-    private extractAggregateFromImport(importPath: string): string | undefined {
-        const normalizedPath = importPath.replace(IMPORT_PATTERNS.QUOTE, "").toLowerCase()
-
-        const segments = normalizedPath.split("/").filter((seg) => seg !== ".." && seg !== ".")
-
-        if (segments.length === 0) {
+    private checkImport(
+        importPath: string,
+        currentAggregate: string,
+        filePath: string,
+        lineNumber: number,
+    ): AggregateBoundaryViolation | undefined {
+        if (!this.importValidator.isViolation(importPath, currentAggregate)) {
             return undefined
         }

-        for (let i = 0; i < segments.length; i++) {
-            if (
-                segments[i] === DDD_FOLDER_NAMES.DOMAIN ||
-                segments[i] === DDD_FOLDER_NAMES.AGGREGATES
-            ) {
-                if (i + 1 < segments.length) {
-                    if (
-                        this.entityFolderNames.has(segments[i + 1]) ||
-                        segments[i + 1] === DDD_FOLDER_NAMES.AGGREGATES
-                    ) {
-                        if (i + 2 < segments.length) {
-                            return segments[i + 2]
-                        }
-                    } else {
-                        return segments[i + 1]
-                    }
-                }
-            }
-        }
-
-        if (segments.length >= 2) {
-            const secondLastSegment = segments[segments.length - 2]
-
-            if (
-                !this.entityFolderNames.has(secondLastSegment) &&
-                !this.valueObjectFolderNames.has(secondLastSegment) &&
-                !this.allowedFolderNames.has(secondLastSegment) &&
-                secondLastSegment !== DDD_FOLDER_NAMES.DOMAIN
-            ) {
-                return secondLastSegment
-            }
-        }
-
-        if (segments.length === 1) {
-            return undefined
-        }
+        const targetAggregate = this.pathAnalyzer.extractAggregateFromImport(importPath)
+        const entityName = this.pathAnalyzer.extractEntityName(importPath)
+
+        if (targetAggregate && entityName) {
+            return AggregateBoundaryViolation.create(
+                currentAggregate,
+                targetAggregate,
+                entityName,
+                importPath,
+                filePath,
+                lineNumber,
+            )
+        }

         return undefined
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
|
||||||
* Extracts the entity name from an import path
|
|
||||||
*/
|
|
||||||
private extractEntityName(importPath: string): string | undefined {
|
|
||||||
const normalizedPath = importPath.replace(IMPORT_PATTERNS.QUOTE, "")
|
|
||||||
const segments = normalizedPath.split("/")
|
|
||||||
const lastSegment = segments[segments.length - 1]
|
|
||||||
|
|
||||||
if (lastSegment) {
|
|
||||||
return lastSegment.replace(/\.(ts|js)$/, "")
|
|
||||||
}
|
|
||||||
|
|
||||||
return undefined
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Extracts import paths from a line of code
|
|
||||||
*
|
|
||||||
* Handles various import statement formats:
|
|
||||||
* - import { X } from 'path'
|
|
||||||
* - import X from 'path'
|
|
||||||
* - import * as X from 'path'
|
|
||||||
* - const X = require('path')
|
|
||||||
*
|
|
||||||
* @param line - A line of code to analyze
|
|
||||||
* @returns Array of import paths found in the line
|
|
||||||
*/
|
|
||||||
private extractImports(line: string): string[] {
|
|
||||||
const imports: string[] = []
|
|
||||||
|
|
||||||
let match = IMPORT_PATTERNS.ES_IMPORT.exec(line)
|
|
||||||
while (match) {
|
|
||||||
imports.push(match[1])
|
|
||||||
match = IMPORT_PATTERNS.ES_IMPORT.exec(line)
|
|
||||||
}
|
|
||||||
|
|
||||||
match = IMPORT_PATTERNS.REQUIRE.exec(line)
|
|
||||||
while (match) {
|
|
||||||
imports.push(match[1])
|
|
||||||
match = IMPORT_PATTERNS.REQUIRE.exec(line)
|
|
||||||
}
|
|
||||||
|
|
||||||
return imports
|
|
||||||
}
|
|
||||||
}
|
}
|
||||||
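The `extractAggregateFromImport` helper removed in this diff (its logic now lives behind the detector's `pathAnalyzer`) encodes a small path-walking rule: prefer the segment after a `domain/` or `aggregates/` folder, otherwise fall back to the folder directly containing the imported file. A rough standalone sketch of that rule follows; the folder-name sets here are illustrative assumptions, not the detector's real configuration.

```typescript
// Illustrative folder sets (assumed for this sketch; the real detector
// reads entityFolderNames / allowedFolderNames from its configuration).
const ENTITY_FOLDERS = new Set(["entities", "models"]);
const ALLOWED_FOLDERS = new Set(["value-objects", "events", "errors"]);

function extractAggregateFromImport(importPath: string): string | undefined {
    // Strip quotes, lowercase, and drop relative-path segments.
    const normalized = importPath.replace(/['"`]/g, "").toLowerCase();
    const segments = normalized.split("/").filter((s) => s !== ".." && s !== ".");
    if (segments.length === 0) {
        return undefined;
    }

    // After a `domain/` or `aggregates/` segment, the next segment names the
    // aggregate (skipping over an entity folder if one is interposed).
    for (let i = 0; i < segments.length; i++) {
        if (segments[i] === "domain" || segments[i] === "aggregates") {
            if (i + 1 < segments.length) {
                if (ENTITY_FOLDERS.has(segments[i + 1]) || segments[i + 1] === "aggregates") {
                    if (i + 2 < segments.length) {
                        return segments[i + 2];
                    }
                } else {
                    return segments[i + 1];
                }
            }
        }
    }

    // Fallback: the folder directly containing the file names the aggregate,
    // unless it is a structural folder (entities, value-objects, domain, ...).
    if (segments.length >= 2) {
        const secondLast = segments[segments.length - 2];
        if (
            !ENTITY_FOLDERS.has(secondLast) &&
            !ALLOWED_FOLDERS.has(secondLast) &&
            secondLast !== "domain"
        ) {
            return secondLast;
        }
    }

    return undefined;
}
```

With this sketch, `../user/User` and `../../domain/user/User` both resolve to `user`, matching the examples in the removed doc comment.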
@@ -1,7 +1,10 @@
 import { IHardcodeDetector } from "../../domain/services/IHardcodeDetector"
 import { HardcodedValue } from "../../domain/value-objects/HardcodedValue"
-import { ALLOWED_NUMBERS, CODE_PATTERNS, DETECTION_KEYWORDS } from "../constants/defaults"
-import { HARDCODE_TYPES } from "../../shared/constants"
+import { BraceTracker } from "../strategies/BraceTracker"
+import { ConstantsFileChecker } from "../strategies/ConstantsFileChecker"
+import { ExportConstantAnalyzer } from "../strategies/ExportConstantAnalyzer"
+import { MagicNumberMatcher } from "../strategies/MagicNumberMatcher"
+import { MagicStringMatcher } from "../strategies/MagicStringMatcher"
 
 /**
  * Detects hardcoded values (magic numbers and strings) in TypeScript/JavaScript code
@@ -22,22 +25,19 @@ import { HARDCODE_TYPES } from "../../shared/constants"
  * ```
  */
 export class HardcodeDetector implements IHardcodeDetector {
-    private readonly ALLOWED_NUMBERS = ALLOWED_NUMBERS
+    private readonly constantsChecker: ConstantsFileChecker
+    private readonly braceTracker: BraceTracker
+    private readonly exportAnalyzer: ExportConstantAnalyzer
+    private readonly numberMatcher: MagicNumberMatcher
+    private readonly stringMatcher: MagicStringMatcher
 
-    private readonly ALLOWED_STRING_PATTERNS = [/^[a-z]$/i, /^\/$/, /^\\$/, /^\s+$/, /^,$/, /^\.$/]
-
-    /**
-     * Patterns to detect TypeScript type contexts where strings should be ignored
-     */
-    private readonly TYPE_CONTEXT_PATTERNS = [
-        /^\s*type\s+\w+\s*=/i, // type Foo = ...
-        /^\s*interface\s+\w+/i, // interface Foo { ... }
-        /^\s*\w+\s*:\s*['"`]/, // property: 'value' (in type or interface)
-        /\s+as\s+['"`]/, // ... as 'type'
-        /Record<.*,\s*import\(/, // Record with import type
-        /typeof\s+\w+\s*===\s*['"`]/, // typeof x === 'string'
-        /['"`]\s*===\s*typeof\s+\w+/, // 'string' === typeof x
-    ]
+    constructor() {
+        this.constantsChecker = new ConstantsFileChecker()
+        this.braceTracker = new BraceTracker()
+        this.exportAnalyzer = new ExportConstantAnalyzer(this.braceTracker)
+        this.numberMatcher = new MagicNumberMatcher(this.exportAnalyzer)
+        this.stringMatcher = new MagicStringMatcher(this.exportAnalyzer)
+    }
 
     /**
      * Detects all hardcoded values (both numbers and strings) in the given code
@@ -47,413 +47,43 @@ export class HardcodeDetector implements IHardcodeDetector {
      * @returns Array of detected hardcoded values with suggestions
      */
     public detectAll(code: string, filePath: string): HardcodedValue[] {
-        if (this.isConstantsFile(filePath)) {
+        if (this.constantsChecker.isConstantsFile(filePath)) {
             return []
         }
-        const magicNumbers = this.detectMagicNumbers(code, filePath)
-        const magicStrings = this.detectMagicStrings(code, filePath)
+
+        const magicNumbers = this.numberMatcher.detect(code)
+        const magicStrings = this.stringMatcher.detect(code)
 
         return [...magicNumbers, ...magicStrings]
     }
 
     /**
-     * Check if a file is a constants definition file or DI tokens file
-     */
-    private isConstantsFile(filePath: string): boolean {
-        const _fileName = filePath.split("/").pop() ?? ""
-        const constantsPatterns = [
-            /^constants?\.(ts|js)$/i,
-            /constants?\/.*\.(ts|js)$/i,
-            /\/(constants|config|settings|defaults|tokens)\.ts$/i,
-            /\/di\/tokens\.(ts|js)$/i,
-        ]
-        return constantsPatterns.some((pattern) => pattern.test(filePath))
-    }
-
-    /**
-     * Check if a line is inside an exported constant definition
-     */
-    private isInExportedConstant(lines: string[], lineIndex: number): boolean {
-        const currentLineTrimmed = lines[lineIndex].trim()
-
-        if (this.isSingleLineExportConst(currentLineTrimmed)) {
-            return true
-        }
-
-        const exportConstStart = this.findExportConstStart(lines, lineIndex)
-        if (exportConstStart === -1) {
-            return false
-        }
-
-        const { braces, brackets } = this.countUnclosedBraces(lines, exportConstStart, lineIndex)
-        return braces > 0 || brackets > 0
-    }
-
-    /**
-     * Check if a line is a single-line export const declaration
-     */
-    private isSingleLineExportConst(line: string): boolean {
-        if (!line.startsWith(CODE_PATTERNS.EXPORT_CONST)) {
-            return false
-        }
-
-        const hasObjectOrArray =
-            line.includes(CODE_PATTERNS.OBJECT_START) || line.includes(CODE_PATTERNS.ARRAY_START)
-
-        if (hasObjectOrArray) {
-            const hasAsConstEnding =
-                line.includes(CODE_PATTERNS.AS_CONST_OBJECT) ||
-                line.includes(CODE_PATTERNS.AS_CONST_ARRAY) ||
-                line.includes(CODE_PATTERNS.AS_CONST_END_SEMICOLON_OBJECT) ||
-                line.includes(CODE_PATTERNS.AS_CONST_END_SEMICOLON_ARRAY)
-
-            return hasAsConstEnding
-        }
-
-        return line.includes(CODE_PATTERNS.AS_CONST)
-    }
-
-    /**
-     * Find the starting line of an export const declaration
-     */
-    private findExportConstStart(lines: string[], lineIndex: number): number {
-        for (let currentLine = lineIndex; currentLine >= 0; currentLine--) {
-            const trimmed = lines[currentLine].trim()
-
-            const isExportConst =
-                trimmed.startsWith(CODE_PATTERNS.EXPORT_CONST) &&
-                (trimmed.includes(CODE_PATTERNS.OBJECT_START) ||
-                    trimmed.includes(CODE_PATTERNS.ARRAY_START))
-
-            if (isExportConst) {
-                return currentLine
-            }
-
-            const isTopLevelStatement =
-                currentLine < lineIndex &&
-                (trimmed.startsWith(CODE_PATTERNS.EXPORT) ||
-                    trimmed.startsWith(CODE_PATTERNS.IMPORT))
-
-            if (isTopLevelStatement) {
-                break
-            }
-        }
-
-        return -1
-    }
-
-    /**
-     * Count unclosed braces and brackets between two line indices
-     */
-    private countUnclosedBraces(
-        lines: string[],
-        startLine: number,
-        endLine: number,
-    ): { braces: number; brackets: number } {
-        let braces = 0
-        let brackets = 0
-
-        for (let i = startLine; i <= endLine; i++) {
-            const line = lines[i]
-            let inString = false
-            let stringChar = ""
-
-            for (let j = 0; j < line.length; j++) {
-                const char = line[j]
-                const prevChar = j > 0 ? line[j - 1] : ""
-
-                if ((char === "'" || char === '"' || char === "`") && prevChar !== "\\") {
-                    if (!inString) {
-                        inString = true
-                        stringChar = char
-                    } else if (char === stringChar) {
-                        inString = false
-                        stringChar = ""
-                    }
-                }
-
-                if (!inString) {
-                    if (char === "{") {
-                        braces++
-                    } else if (char === "}") {
-                        braces--
-                    } else if (char === "[") {
-                        brackets++
-                    } else if (char === "]") {
-                        brackets--
-                    }
-                }
-            }
-        }
-
-        return { braces, brackets }
-    }
-
-    /**
-     * Detects magic numbers in code (timeouts, ports, limits, retries, etc.)
-     *
-     * Skips allowed numbers (-1, 0, 1, 2, 10, 100, 1000) and values in exported constants
+     * Detects magic numbers in code
      *
      * @param code - Source code to analyze
-     * @param _filePath - File path (currently unused, reserved for future use)
+     * @param filePath - File path (used for constants file check)
      * @returns Array of detected magic numbers
      */
-    public detectMagicNumbers(code: string, _filePath: string): HardcodedValue[] {
-        const results: HardcodedValue[] = []
-        const lines = code.split("\n")
-
-        const numberPatterns = [
-            /(?:setTimeout|setInterval)\s*\(\s*[^,]+,\s*(\d+)/g,
-            /(?:maxRetries|retries|attempts)\s*[=:]\s*(\d+)/gi,
-            /(?:limit|max|min)\s*[=:]\s*(\d+)/gi,
-            /(?:port|PORT)\s*[=:]\s*(\d+)/g,
-            /(?:delay|timeout|TIMEOUT)\s*[=:]\s*(\d+)/gi,
-        ]
-
-        lines.forEach((line, lineIndex) => {
-            if (line.trim().startsWith("//") || line.trim().startsWith("*")) {
-                return
-            }
-
-            // Skip lines inside exported constants
-            if (this.isInExportedConstant(lines, lineIndex)) {
-                return
-            }
-
-            numberPatterns.forEach((pattern) => {
-                let match
-                const regex = new RegExp(pattern)
-
-                while ((match = regex.exec(line)) !== null) {
-                    const value = parseInt(match[1], 10)
-
-                    if (!this.ALLOWED_NUMBERS.has(value)) {
-                        results.push(
-                            HardcodedValue.create(
-                                value,
-                                HARDCODE_TYPES.MAGIC_NUMBER,
-                                lineIndex + 1,
-                                match.index,
-                                line.trim(),
-                            ),
-                        )
-                    }
-                }
-            })
-
-            const genericNumberRegex = /\b(\d{3,})\b/g
-            let match
-
-            while ((match = genericNumberRegex.exec(line)) !== null) {
-                const value = parseInt(match[1], 10)
-
-                if (
-                    !this.ALLOWED_NUMBERS.has(value) &&
-                    !this.isInComment(line, match.index) &&
-                    !this.isInString(line, match.index)
-                ) {
-                    const context = this.extractContext(line, match.index)
-                    if (this.looksLikeMagicNumber(context)) {
-                        results.push(
-                            HardcodedValue.create(
-                                value,
-                                HARDCODE_TYPES.MAGIC_NUMBER,
-                                lineIndex + 1,
-                                match.index,
-                                line.trim(),
-                            ),
-                        )
-                    }
-                }
-            }
-        })
-
-        return results
+    public detectMagicNumbers(code: string, filePath: string): HardcodedValue[] {
+        if (this.constantsChecker.isConstantsFile(filePath)) {
+            return []
+        }
+
+        return this.numberMatcher.detect(code)
     }
 
     /**
-     * Detects magic strings in code (URLs, connection strings, error messages, etc.)
-     *
-     * Skips short strings (≤3 chars), console logs, test descriptions, imports,
-     * and values in exported constants
+     * Detects magic strings in code
      *
      * @param code - Source code to analyze
-     * @param _filePath - File path (currently unused, reserved for future use)
+     * @param filePath - File path (used for constants file check)
      * @returns Array of detected magic strings
      */
-    public detectMagicStrings(code: string, _filePath: string): HardcodedValue[] {
-        const results: HardcodedValue[] = []
-        const lines = code.split("\n")
-
-        const stringRegex = /(['"`])(?:(?!\1).)+\1/g
-
-        lines.forEach((line, lineIndex) => {
-            if (
-                line.trim().startsWith("//") ||
-                line.trim().startsWith("*") ||
-                line.includes("import ") ||
-                line.includes("from ")
-            ) {
-                return
-            }
-
-            // Skip lines inside exported constants
-            if (this.isInExportedConstant(lines, lineIndex)) {
-                return
-            }
-
-            let match
-            const regex = new RegExp(stringRegex)
-
-            while ((match = regex.exec(line)) !== null) {
-                const fullMatch = match[0]
-                const value = fullMatch.slice(1, -1)
-
-                // Skip template literals (backtick strings with ${} interpolation)
-                if (fullMatch.startsWith("`") || value.includes("${")) {
-                    continue
-                }
-
-                if (!this.isAllowedString(value) && this.looksLikeMagicString(line, value)) {
-                    results.push(
-                        HardcodedValue.create(
-                            value,
-                            HARDCODE_TYPES.MAGIC_STRING,
-                            lineIndex + 1,
-                            match.index,
-                            line.trim(),
-                        ),
-                    )
-                }
-            }
-        })
-
-        return results
-    }
-
-    private isAllowedString(str: string): boolean {
-        if (str.length <= 1) {
-            return true
+    public detectMagicStrings(code: string, filePath: string): HardcodedValue[] {
+        if (this.constantsChecker.isConstantsFile(filePath)) {
+            return []
         }
 
-        return this.ALLOWED_STRING_PATTERNS.some((pattern) => pattern.test(str))
-    }
-
-    private looksLikeMagicString(line: string, value: string): boolean {
-        const lowerLine = line.toLowerCase()
-
-        if (
-            lowerLine.includes(DETECTION_KEYWORDS.TEST) ||
-            lowerLine.includes(DETECTION_KEYWORDS.DESCRIBE)
-        ) {
-            return false
-        }
-
-        if (
-            lowerLine.includes(DETECTION_KEYWORDS.CONSOLE_LOG) ||
-            lowerLine.includes(DETECTION_KEYWORDS.CONSOLE_ERROR)
-        ) {
-            return false
-        }
-
-        if (this.isInTypeContext(line)) {
-            return false
-        }
-
-        if (this.isInSymbolCall(line, value)) {
-            return false
-        }
-
-        if (this.isInImportCall(line, value)) {
-            return false
-        }
-
-        if (value.includes(DETECTION_KEYWORDS.HTTP) || value.includes(DETECTION_KEYWORDS.API)) {
-            return true
-        }
-
-        if (/^\d{2,}$/.test(value)) {
-            return false
-        }
-
-        return value.length > 3
-    }
-
-    private looksLikeMagicNumber(context: string): boolean {
-        const lowerContext = context.toLowerCase()
-
-        const configKeywords = [
-            DETECTION_KEYWORDS.TIMEOUT,
-            DETECTION_KEYWORDS.DELAY,
-            DETECTION_KEYWORDS.RETRY,
-            DETECTION_KEYWORDS.LIMIT,
-            DETECTION_KEYWORDS.MAX,
-            DETECTION_KEYWORDS.MIN,
-            DETECTION_KEYWORDS.PORT,
-            DETECTION_KEYWORDS.INTERVAL,
-        ]
-
-        return configKeywords.some((keyword) => lowerContext.includes(keyword))
-    }
-
-    private isInComment(line: string, index: number): boolean {
-        const beforeIndex = line.substring(0, index)
-        return beforeIndex.includes("//") || beforeIndex.includes("/*")
-    }
-
-    private isInString(line: string, index: number): boolean {
-        const beforeIndex = line.substring(0, index)
-        const singleQuotes = (beforeIndex.match(/'/g) ?? []).length
-        const doubleQuotes = (beforeIndex.match(/"/g) ?? []).length
-        const backticks = (beforeIndex.match(/`/g) ?? []).length
-
-        return singleQuotes % 2 !== 0 || doubleQuotes % 2 !== 0 || backticks % 2 !== 0
-    }
-
-    private extractContext(line: string, index: number): string {
-        const start = Math.max(0, index - 30)
-        const end = Math.min(line.length, index + 30)
-        return line.substring(start, end)
-    }
-
-    /**
-     * Check if a line is in a TypeScript type definition context
-     * Examples:
-     * - type Foo = 'a' | 'b'
-     * - interface Bar { prop: 'value' }
-     * - Record<X, import('path')>
-     * - ... as 'type'
-     */
-    private isInTypeContext(line: string): boolean {
-        const trimmedLine = line.trim()
-
-        if (this.TYPE_CONTEXT_PATTERNS.some((pattern) => pattern.test(trimmedLine))) {
-            return true
-        }
-
-        if (trimmedLine.includes("|") && /['"`][^'"`]+['"`]\s*\|/.test(trimmedLine)) {
-            return true
-        }
-
-        return false
-    }
-
-    /**
-     * Check if a string is inside a Symbol() call
-     * Example: Symbol('TOKEN_NAME')
-     */
-    private isInSymbolCall(line: string, stringValue: string): boolean {
-        const symbolPattern = new RegExp(
-            `Symbol\\s*\\(\\s*['"\`]${stringValue.replace(/[.*+?^${}()|[\]\\]/g, "\\$&")}['"\`]\\s*\\)`,
-        )
-        return symbolPattern.test(line)
-    }
-
-    /**
-     * Check if a string is inside an import() call
-     * Example: import('../../path/to/module.js')
-     */
-    private isInImportCall(line: string, stringValue: string): boolean {
-        const importPattern = /import\s*\(\s*['"`][^'"`]+['"`]\s*\)/
-        return importPattern.test(line) && line.includes(stringValue)
+        return this.stringMatcher.detect(code)
     }
 }
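The string-aware brace counting that the old `isInExportedConstant` check relied on (now owned by the `BraceTracker` strategy) is worth seeing in isolation: braces and brackets are only counted when the scanner is outside a string literal, so a `}` inside `"}"` does not close anything. The function below is a standalone illustration of that technique under the same rules, not the actual `BraceTracker` class.

```typescript
// Count braces/brackets left unclosed across a span of lines, ignoring any
// delimiter characters that appear inside ', ", or ` string literals.
function countUnclosed(lines: string[]): { braces: number; brackets: number } {
    let braces = 0;
    let brackets = 0;

    for (const line of lines) {
        let inString = false;
        let stringChar = "";

        for (let j = 0; j < line.length; j++) {
            const char = line[j];
            const prevChar = j > 0 ? line[j - 1] : "";

            // Toggle string state on unescaped quotes; only the quote kind
            // that opened the string can close it.
            if ((char === "'" || char === '"' || char === "`") && prevChar !== "\\") {
                if (!inString) {
                    inString = true;
                    stringChar = char;
                } else if (char === stringChar) {
                    inString = false;
                    stringChar = "";
                }
            }

            // Delimiters are only structural outside string literals.
            if (!inString) {
                if (char === "{") braces++;
                else if (char === "}") braces--;
                else if (char === "[") brackets++;
                else if (char === "]") brackets--;
            }
        }
    }

    return { braces, brackets };
}
```

For example, `['export const X = {', '    a: "}",']` reports one unclosed brace, because the `}` on the second line sits inside a string and is skipped.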
@@ -1,9 +1,9 @@
 import { IRepositoryPatternDetector } from "../../domain/services/RepositoryPatternDetectorService"
 import { RepositoryViolation } from "../../domain/value-objects/RepositoryViolation"
-import { LAYERS, REPOSITORY_VIOLATION_TYPES } from "../../shared/constants/rules"
-import { ORM_QUERY_METHODS } from "../constants/orm-methods"
-import { REPOSITORY_PATTERN_MESSAGES } from "../../domain/constants/Messages"
-import { REPOSITORY_METHOD_SUGGESTIONS } from "../constants/detectorPatterns"
+import { OrmTypeMatcher } from "../strategies/OrmTypeMatcher"
+import { MethodNameValidator } from "../strategies/MethodNameValidator"
+import { RepositoryFileAnalyzer } from "../strategies/RepositoryFileAnalyzer"
+import { RepositoryViolationDetector } from "../strategies/RepositoryViolationDetector"
 
 /**
  * Detects Repository Pattern violations in the codebase
@@ -36,84 +36,20 @@ import { REPOSITORY_METHOD_SUGGESTIONS } from "../constants/detectorPatterns"
  * ```
  */
 export class RepositoryPatternDetector implements IRepositoryPatternDetector {
-    private readonly ormTypePatterns = [
-        /Prisma\./,
-        /PrismaClient/,
-        /TypeORM/,
-        /@Entity/,
-        /@Column/,
-        /@PrimaryColumn/,
-        /@PrimaryGeneratedColumn/,
-        /@ManyToOne/,
-        /@OneToMany/,
-        /@ManyToMany/,
-        /@JoinColumn/,
-        /@JoinTable/,
-        /Mongoose\./,
-        /Schema/,
-        /Model</,
-        /Document/,
-        /Sequelize\./,
-        /DataTypes\./,
-        /FindOptions/,
-        /WhereOptions/,
-        /IncludeOptions/,
-        /QueryInterface/,
-        /MikroORM/,
-        /EntityManager/,
-        /EntityRepository/,
-        /Collection</,
-    ]
+    private readonly ormMatcher: OrmTypeMatcher
+    private readonly methodValidator: MethodNameValidator
+    private readonly fileAnalyzer: RepositoryFileAnalyzer
+    private readonly violationDetector: RepositoryViolationDetector
 
-    private readonly technicalMethodNames = ORM_QUERY_METHODS
-
-    private readonly domainMethodPatterns = [
-        /^findBy[A-Z]/,
-        /^findAll$/,
-        /^find[A-Z]/,
-        /^save$/,
-        /^saveAll$/,
-        /^create$/,
-        /^update$/,
-        /^delete$/,
-        /^deleteBy[A-Z]/,
-        /^deleteAll$/,
-        /^remove$/,
-        /^removeBy[A-Z]/,
-        /^removeAll$/,
-        /^add$/,
-        /^add[A-Z]/,
-        /^get[A-Z]/,
-        /^getAll$/,
-        /^search/,
-        /^list/,
-        /^has[A-Z]/,
-        /^is[A-Z]/,
-        /^exists$/,
-        /^exists[A-Z]/,
-        /^existsBy[A-Z]/,
-        /^clear[A-Z]/,
-        /^clearAll$/,
-        /^store[A-Z]/,
-        /^initialize$/,
-        /^initializeCollection$/,
-        /^close$/,
-        /^connect$/,
-        /^disconnect$/,
-        /^count$/,
-        /^countBy[A-Z]/,
-    ]
-
-    private readonly concreteRepositoryPatterns = [
-        /PrismaUserRepository/,
-        /MongoUserRepository/,
-        /TypeOrmUserRepository/,
-        /SequelizeUserRepository/,
-        /InMemoryUserRepository/,
-        /PostgresUserRepository/,
-        /MySqlUserRepository/,
-        /Repository(?!Interface)/,
-    ]
+    constructor() {
+        this.ormMatcher = new OrmTypeMatcher()
+        this.methodValidator = new MethodNameValidator(this.ormMatcher)
+        this.fileAnalyzer = new RepositoryFileAnalyzer()
+        this.violationDetector = new RepositoryViolationDetector(
+            this.ormMatcher,
+            this.methodValidator,
+        )
+    }
 
     /**
      * Detects all Repository Pattern violations in the given code
@@ -125,14 +61,16 @@ export class RepositoryPatternDetector implements IRepositoryPatternDetector {
     ): RepositoryViolation[] {
         const violations: RepositoryViolation[] = []
 
-        if (this.isRepositoryInterface(filePath, layer)) {
-            violations.push(...this.detectOrmTypesInInterface(code, filePath, layer))
-            violations.push(...this.detectNonDomainMethodNames(code, filePath, layer))
+        if (this.fileAnalyzer.isRepositoryInterface(filePath, layer)) {
+            violations.push(...this.violationDetector.detectOrmTypes(code, filePath, layer))
+            violations.push(...this.violationDetector.detectNonDomainMethods(code, filePath, layer))
         }
 
-        if (this.isUseCase(filePath, layer)) {
-            violations.push(...this.detectConcreteRepositoryUsage(code, filePath, layer))
-            violations.push(...this.detectNewRepositoryInstantiation(code, filePath, layer))
+        if (this.fileAnalyzer.isUseCase(filePath, layer)) {
+            violations.push(
+                ...this.violationDetector.detectConcreteRepositoryUsage(code, filePath, layer),
+            )
+            violations.push(...this.violationDetector.detectNewInstantiation(code, filePath, layer))
         }
 
         return violations
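The domain-language rule behind `isDomainMethodName` is two-step: reject a name outright if it is a raw ORM/technical verb, then accept it only if it matches one of the domain naming patterns. A reduced sketch of that rule follows; the two lists below are small representative subsets assumed for the example (the full sets live in `ORM_QUERY_METHODS` and the `MethodNameValidator` strategy).

```typescript
// Representative subsets for illustration only.
const TECHNICAL_METHOD_NAMES = ["query", "execute", "insert", "select", "upsert"];
const DOMAIN_METHOD_PATTERNS = [
    /^findBy[A-Z]/, // findByEmail, findById, ...
    /^findAll$/,
    /^save$/,
    /^delete$/,
    /^existsBy[A-Z]/,
];

function isDomainMethodName(methodName: string): boolean {
    // Raw ORM verbs never count as domain language, even if they would
    // otherwise match a pattern.
    if (TECHNICAL_METHOD_NAMES.includes(methodName)) {
        return false;
    }
    return DOMAIN_METHOD_PATTERNS.some((pattern) => pattern.test(methodName));
}
```

Under this sketch, `findByEmail` and `save` pass while `upsert` is flagged, which is the behavior the detector's use-case and interface checks build on.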
@@ -142,338 +80,27 @@ export class RepositoryPatternDetector implements IRepositoryPatternDetector {
      * Checks if a type is an ORM-specific type
      */
     public isOrmType(typeName: string): boolean {
-        return this.ormTypePatterns.some((pattern) => pattern.test(typeName))
+        return this.ormMatcher.isOrmType(typeName)
     }
 
     /**
      * Checks if a method name follows domain language conventions
      */
     public isDomainMethodName(methodName: string): boolean {
-        if ((this.technicalMethodNames as readonly string[]).includes(methodName)) {
-            return false
-        }
-
-        return this.domainMethodPatterns.some((pattern) => pattern.test(methodName))
+        return this.methodValidator.isDomainMethodName(methodName)
     }
 
     /**
      * Checks if a file is a repository interface
      */
     public isRepositoryInterface(filePath: string, layer: string | undefined): boolean {
-        if (layer !== LAYERS.DOMAIN) {
-            return false
-        }
-
-        return /I[A-Z]\w*Repository\.ts$/.test(filePath) && /repositories?\//.test(filePath)
+        return this.fileAnalyzer.isRepositoryInterface(filePath, layer)
     }
 
     /**
      * Checks if a file is a use case
      */
     public isUseCase(filePath: string, layer: string | undefined): boolean {
-        if (layer !== LAYERS.APPLICATION) {
-            return false
-        }
-
-        return /use-cases?\//.test(filePath) && /[A-Z][a-z]+[A-Z]\w*\.ts$/.test(filePath)
-    }
-
-    /**
-     * Detects ORM-specific types in repository interfaces
-     */
-    private detectOrmTypesInInterface(
-        code: string,
-        filePath: string,
-        layer: string | undefined,
-    ): RepositoryViolation[] {
-        const violations: RepositoryViolation[] = []
-        const lines = code.split("\n")
-
-        for (let i = 0; i < lines.length; i++) {
-            const line = lines[i]
-            const lineNumber = i + 1
-
-            const methodMatch =
-                /(\w+)\s*\([^)]*:\s*([^)]+)\)\s*:\s*.*?(?:Promise<([^>]+)>|([A-Z]\w+))/.exec(line)
-
-            if (methodMatch) {
-                const params = methodMatch[2]
-                const returnType = methodMatch[3] || methodMatch[4]
-
-                if (this.isOrmType(params)) {
-                    const ormType = this.extractOrmType(params)
-                    violations.push(
-                        RepositoryViolation.create(
-                            REPOSITORY_VIOLATION_TYPES.ORM_TYPE_IN_INTERFACE,
-                            filePath,
-                            layer || LAYERS.DOMAIN,
-                            lineNumber,
-                            `Method parameter uses ORM type: ${ormType}`,
-                            ormType,
-                        ),
-                    )
-                }
-
-                if (returnType && this.isOrmType(returnType)) {
-                    const ormType = this.extractOrmType(returnType)
-                    violations.push(
-                        RepositoryViolation.create(
-                            REPOSITORY_VIOLATION_TYPES.ORM_TYPE_IN_INTERFACE,
-                            filePath,
-                            layer || LAYERS.DOMAIN,
-                            lineNumber,
-                            `Method return type uses ORM type: ${ormType}`,
-                            ormType,
-                        ),
-                    )
-                }
-            }
-
-            for (const pattern of this.ormTypePatterns) {
-                if (pattern.test(line) && !line.trim().startsWith("//")) {
-                    const ormType = this.extractOrmType(line)
-                    violations.push(
-                        RepositoryViolation.create(
-                            REPOSITORY_VIOLATION_TYPES.ORM_TYPE_IN_INTERFACE,
-                            filePath,
-                            layer || LAYERS.DOMAIN,
-                            lineNumber,
-                            `Repository interface contains ORM-specific type: ${ormType}`,
-                            ormType,
-                        ),
-                    )
-                    break
-                }
-            }
-        }
-
-        return violations
-    }
-
-    /**
-     * Suggests better domain method names based on the original method name
-     */
-    private suggestDomainMethodName(methodName: string): string {
-        const lowerName = methodName.toLowerCase()
-        const suggestions: string[] = []
-
-        const suggestionMap: Record<string, string[]> = {
-            query: [
-                REPOSITORY_METHOD_SUGGESTIONS.SEARCH,
-                REPOSITORY_METHOD_SUGGESTIONS.FIND_BY_PROPERTY,
-            ],
-            select: [
-                REPOSITORY_METHOD_SUGGESTIONS.FIND_BY_PROPERTY,
-                REPOSITORY_METHOD_SUGGESTIONS.GET_ENTITY,
-            ],
-            insert: [
-                REPOSITORY_METHOD_SUGGESTIONS.CREATE,
-                REPOSITORY_METHOD_SUGGESTIONS.ADD_ENTITY,
-                REPOSITORY_METHOD_SUGGESTIONS.STORE_ENTITY,
-            ],
-            update: [
-                REPOSITORY_METHOD_SUGGESTIONS.UPDATE,
-                REPOSITORY_METHOD_SUGGESTIONS.MODIFY_ENTITY,
-            ],
-            upsert: [
-                REPOSITORY_METHOD_SUGGESTIONS.SAVE,
-                REPOSITORY_METHOD_SUGGESTIONS.STORE_ENTITY,
-            ],
-            remove: [
-                REPOSITORY_METHOD_SUGGESTIONS.DELETE,
-                REPOSITORY_METHOD_SUGGESTIONS.REMOVE_BY_PROPERTY,
-            ],
-            fetch: [
-                REPOSITORY_METHOD_SUGGESTIONS.FIND_BY_PROPERTY,
-                REPOSITORY_METHOD_SUGGESTIONS.GET_ENTITY,
-            ],
-            retrieve: [
-                REPOSITORY_METHOD_SUGGESTIONS.FIND_BY_PROPERTY,
-                REPOSITORY_METHOD_SUGGESTIONS.GET_ENTITY,
-            ],
-            load: [
-                REPOSITORY_METHOD_SUGGESTIONS.FIND_BY_PROPERTY,
-                REPOSITORY_METHOD_SUGGESTIONS.GET_ENTITY,
-            ],
-        }
-
-        for (const [keyword, keywords] of Object.entries(suggestionMap)) {
-            if (lowerName.includes(keyword)) {
-                suggestions.push(...keywords)
-            }
-        }
-
-        if (lowerName.includes("get") && lowerName.includes("all")) {
+        return this.fileAnalyzer.isUseCase(filePath, layer)
+    }
|
|
||||||
suggestions.push(
|
|
||||||
REPOSITORY_METHOD_SUGGESTIONS.FIND_ALL,
|
|
||||||
REPOSITORY_METHOD_SUGGESTIONS.LIST_ALL,
|
|
||||||
)
|
|
||||||
}
|
|
||||||
|
|
||||||
if (suggestions.length === 0) {
|
|
||||||
return REPOSITORY_METHOD_SUGGESTIONS.DEFAULT_SUGGESTION
|
|
||||||
}
|
|
||||||
|
|
||||||
return `Consider: ${suggestions.slice(0, 3).join(", ")}`
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Detects non-domain method names in repository interfaces
|
|
||||||
*/
|
|
||||||
private detectNonDomainMethodNames(
|
|
||||||
code: string,
|
|
||||||
filePath: string,
|
|
||||||
layer: string | undefined,
|
|
||||||
): RepositoryViolation[] {
|
|
||||||
const violations: RepositoryViolation[] = []
|
|
||||||
const lines = code.split("\n")
|
|
||||||
|
|
||||||
for (let i = 0; i < lines.length; i++) {
|
|
||||||
const line = lines[i]
|
|
||||||
const lineNumber = i + 1
|
|
||||||
|
|
||||||
const methodMatch = /^\s*(\w+)\s*\(/.exec(line)
|
|
||||||
|
|
||||||
if (methodMatch) {
|
|
||||||
const methodName = methodMatch[1]
|
|
||||||
|
|
||||||
if (!this.isDomainMethodName(methodName) && !line.trim().startsWith("//")) {
|
|
||||||
const suggestion = this.suggestDomainMethodName(methodName)
|
|
||||||
violations.push(
|
|
||||||
RepositoryViolation.create(
|
|
||||||
REPOSITORY_VIOLATION_TYPES.NON_DOMAIN_METHOD_NAME,
|
|
||||||
filePath,
|
|
||||||
layer || LAYERS.DOMAIN,
|
|
||||||
lineNumber,
|
|
||||||
`Method '${methodName}' uses technical name instead of domain language. ${suggestion}`,
|
|
||||||
undefined,
|
|
||||||
undefined,
|
|
||||||
methodName,
|
|
||||||
),
|
|
||||||
)
|
|
||||||
}
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
return violations
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Detects concrete repository usage in use cases
|
|
||||||
*/
|
|
||||||
private detectConcreteRepositoryUsage(
|
|
||||||
code: string,
|
|
||||||
filePath: string,
|
|
||||||
layer: string | undefined,
|
|
||||||
): RepositoryViolation[] {
|
|
||||||
const violations: RepositoryViolation[] = []
|
|
||||||
const lines = code.split("\n")
|
|
||||||
|
|
||||||
for (let i = 0; i < lines.length; i++) {
|
|
||||||
const line = lines[i]
|
|
||||||
const lineNumber = i + 1
|
|
||||||
|
|
||||||
const constructorParamMatch =
|
|
||||||
/constructor\s*\([^)]*(?:private|public|protected)\s+(?:readonly\s+)?(\w+)\s*:\s*([A-Z]\w*Repository)/.exec(
|
|
||||||
line,
|
|
||||||
)
|
|
||||||
|
|
||||||
if (constructorParamMatch) {
|
|
||||||
const repositoryType = constructorParamMatch[2]
|
|
||||||
|
|
||||||
if (!repositoryType.startsWith("I")) {
|
|
||||||
violations.push(
|
|
||||||
RepositoryViolation.create(
|
|
||||||
REPOSITORY_VIOLATION_TYPES.CONCRETE_REPOSITORY_IN_USE_CASE,
|
|
||||||
filePath,
|
|
||||||
layer || LAYERS.APPLICATION,
|
|
||||||
lineNumber,
|
|
||||||
`Use case depends on concrete repository '${repositoryType}'`,
|
|
||||||
undefined,
|
|
||||||
repositoryType,
|
|
||||||
),
|
|
||||||
)
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
const fieldMatch =
|
|
||||||
/(?:private|public|protected)\s+(?:readonly\s+)?(\w+)\s*:\s*([A-Z]\w*Repository)/.exec(
|
|
||||||
line,
|
|
||||||
)
|
|
||||||
|
|
||||||
if (fieldMatch) {
|
|
||||||
const repositoryType = fieldMatch[2]
|
|
||||||
|
|
||||||
if (
|
|
||||||
!repositoryType.startsWith("I") &&
|
|
||||||
!line.includes(REPOSITORY_PATTERN_MESSAGES.CONSTRUCTOR)
|
|
||||||
) {
|
|
||||||
violations.push(
|
|
||||||
RepositoryViolation.create(
|
|
||||||
REPOSITORY_VIOLATION_TYPES.CONCRETE_REPOSITORY_IN_USE_CASE,
|
|
||||||
filePath,
|
|
||||||
layer || LAYERS.APPLICATION,
|
|
||||||
lineNumber,
|
|
||||||
`Use case field uses concrete repository '${repositoryType}'`,
|
|
||||||
undefined,
|
|
||||||
repositoryType,
|
|
||||||
),
|
|
||||||
)
|
|
||||||
}
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
return violations
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Detects 'new Repository()' instantiation in use cases
|
|
||||||
*/
|
|
||||||
private detectNewRepositoryInstantiation(
|
|
||||||
code: string,
|
|
||||||
filePath: string,
|
|
||||||
layer: string | undefined,
|
|
||||||
): RepositoryViolation[] {
|
|
||||||
const violations: RepositoryViolation[] = []
|
|
||||||
const lines = code.split("\n")
|
|
||||||
|
|
||||||
for (let i = 0; i < lines.length; i++) {
|
|
||||||
const line = lines[i]
|
|
||||||
const lineNumber = i + 1
|
|
||||||
|
|
||||||
const newRepositoryMatch = /new\s+([A-Z]\w*Repository)\s*\(/.exec(line)
|
|
||||||
|
|
||||||
if (newRepositoryMatch && !line.trim().startsWith("//")) {
|
|
||||||
const repositoryName = newRepositoryMatch[1]
|
|
||||||
violations.push(
|
|
||||||
RepositoryViolation.create(
|
|
||||||
REPOSITORY_VIOLATION_TYPES.NEW_REPOSITORY_IN_USE_CASE,
|
|
||||||
filePath,
|
|
||||||
layer || LAYERS.APPLICATION,
|
|
||||||
lineNumber,
|
|
||||||
`Use case creates repository with 'new ${repositoryName}()'`,
|
|
||||||
undefined,
|
|
||||||
repositoryName,
|
|
||||||
),
|
|
||||||
)
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
return violations
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Extracts ORM type name from a code line
|
|
||||||
*/
|
|
||||||
private extractOrmType(line: string): string {
|
|
||||||
for (const pattern of this.ormTypePatterns) {
|
|
||||||
const match = line.match(pattern)
|
|
||||||
if (match) {
|
|
||||||
const startIdx = match.index || 0
|
|
||||||
const typeMatch = /[\w.]+/.exec(line.slice(startIdx))
|
|
||||||
return typeMatch ? typeMatch[0] : REPOSITORY_PATTERN_MESSAGES.UNKNOWN_TYPE
|
|
||||||
}
|
|
||||||
}
|
|
||||||
return REPOSITORY_PATTERN_MESSAGES.UNKNOWN_TYPE
|
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|||||||
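The detection methods above are all regex-driven line scans. A minimal standalone sketch of the `new Repository()` heuristic, using the same regex as the detector (the helper name `detectNewRepository` is ours, not part of the package):

```typescript
// Minimal sketch of the 'new Repository()' heuristic.
// Returns the concrete repository class name, or undefined if the line
// is clean or is a comment.
function detectNewRepository(line: string): string | undefined {
    const match = /new\s+([A-Z]\w*Repository)\s*\(/.exec(line)
    if (match && !line.trim().startsWith("//")) {
        return match[1]
    }
    return undefined
}
```

As in the detector, commented-out lines are skipped and only class names ending in `Repository` are flagged.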
packages/guardian/src/infrastructure/analyzers/SecretDetector.ts (new file, 168 lines)
@@ -0,0 +1,168 @@
import { createEngine } from "@secretlint/node"
import type { SecretLintConfigDescriptor } from "@secretlint/types"
import { ISecretDetector } from "../../domain/services/ISecretDetector"
import { SecretViolation } from "../../domain/value-objects/SecretViolation"
import { SECRET_KEYWORDS, SECRET_TYPE_NAMES } from "../../domain/constants/SecretExamples"

/**
 * Detects hardcoded secrets in TypeScript/JavaScript code
 *
 * Uses the industry-standard Secretlint library to detect 350+ types of secrets,
 * including AWS keys, GitHub tokens, NPM tokens, SSH keys, API keys, and more.
 *
 * All detected secrets are marked as CRITICAL severity because they represent
 * serious security risks that could lead to unauthorized access or data breaches.
 *
 * @example
 * ```typescript
 * const detector = new SecretDetector()
 * const code = `const AWS_KEY = "AKIA1234567890ABCDEF"`
 * const violations = await detector.detectAll(code, 'config.ts')
 * // Returns array of SecretViolation objects with CRITICAL severity
 * ```
 */
export class SecretDetector implements ISecretDetector {
    private readonly secretlintConfig: SecretLintConfigDescriptor = {
        rules: [
            {
                id: "@secretlint/secretlint-rule-preset-recommend",
            },
        ],
    }

    /**
     * Detects all types of hardcoded secrets in the provided code
     *
     * @param code - Source code to analyze
     * @param filePath - Path to the file being analyzed
     * @returns Promise resolving to array of secret violations
     */
    public async detectAll(code: string, filePath: string): Promise<SecretViolation[]> {
        try {
            const engine = await createEngine({
                cwd: process.cwd(),
                configFileJSON: this.secretlintConfig,
                formatter: "stylish",
                color: false,
            })

            const result = await engine.executeOnContent({
                content: code,
                filePath,
            })

            return this.parseOutputToViolations(result.output, filePath)
        } catch (_error) {
            return []
        }
    }

    private parseOutputToViolations(output: string, filePath: string): SecretViolation[] {
        const violations: SecretViolation[] = []

        if (!output || output.trim() === "") {
            return violations
        }

        const lines = output.split("\n")

        for (const line of lines) {
            const match = /^\s*(\d+):(\d+)\s+(error|warning)\s+(.+?)\s+(.+)$/.exec(line)

            if (match) {
                const [, lineNum, column, , message, ruleId] = match
                const secretType = this.extractSecretType(message, ruleId)

                const violation = SecretViolation.create(
                    filePath,
                    parseInt(lineNum, 10),
                    parseInt(column, 10),
                    secretType,
                    message,
                )

                violations.push(violation)
            }
        }

        return violations
    }

    private extractSecretType(message: string, ruleId: string): string {
        if (ruleId.includes(SECRET_KEYWORDS.AWS)) {
            if (message.toLowerCase().includes(SECRET_KEYWORDS.ACCESS_KEY)) {
                return SECRET_TYPE_NAMES.AWS_ACCESS_KEY
            }
            if (message.toLowerCase().includes(SECRET_KEYWORDS.SECRET)) {
                return SECRET_TYPE_NAMES.AWS_SECRET_KEY
            }
            return SECRET_TYPE_NAMES.AWS_CREDENTIAL
        }

        if (ruleId.includes(SECRET_KEYWORDS.GITHUB)) {
            if (message.toLowerCase().includes(SECRET_KEYWORDS.PERSONAL_ACCESS_TOKEN)) {
                return SECRET_TYPE_NAMES.GITHUB_PERSONAL_ACCESS_TOKEN
            }
            if (message.toLowerCase().includes(SECRET_KEYWORDS.OAUTH)) {
                return SECRET_TYPE_NAMES.GITHUB_OAUTH_TOKEN
            }
            return SECRET_TYPE_NAMES.GITHUB_TOKEN
        }

        if (ruleId.includes(SECRET_KEYWORDS.NPM)) {
            return SECRET_TYPE_NAMES.NPM_TOKEN
        }

        if (ruleId.includes(SECRET_KEYWORDS.GCP) || ruleId.includes(SECRET_KEYWORDS.GOOGLE)) {
            return SECRET_TYPE_NAMES.GCP_SERVICE_ACCOUNT_KEY
        }

        if (ruleId.includes(SECRET_KEYWORDS.PRIVATEKEY) || ruleId.includes(SECRET_KEYWORDS.SSH)) {
            if (message.toLowerCase().includes(SECRET_KEYWORDS.RSA)) {
                return SECRET_TYPE_NAMES.SSH_RSA_PRIVATE_KEY
            }
            if (message.toLowerCase().includes(SECRET_KEYWORDS.DSA)) {
                return SECRET_TYPE_NAMES.SSH_DSA_PRIVATE_KEY
            }
            if (message.toLowerCase().includes(SECRET_KEYWORDS.ECDSA)) {
                return SECRET_TYPE_NAMES.SSH_ECDSA_PRIVATE_KEY
            }
            if (message.toLowerCase().includes(SECRET_KEYWORDS.ED25519)) {
                return SECRET_TYPE_NAMES.SSH_ED25519_PRIVATE_KEY
            }
            return SECRET_TYPE_NAMES.SSH_PRIVATE_KEY
        }

        if (ruleId.includes(SECRET_KEYWORDS.SLACK)) {
            if (message.toLowerCase().includes(SECRET_KEYWORDS.BOT)) {
                return SECRET_TYPE_NAMES.SLACK_BOT_TOKEN
            }
            if (message.toLowerCase().includes(SECRET_KEYWORDS.USER)) {
                return SECRET_TYPE_NAMES.SLACK_USER_TOKEN
            }
            return SECRET_TYPE_NAMES.SLACK_TOKEN
        }

        if (ruleId.includes(SECRET_KEYWORDS.BASICAUTH)) {
            return SECRET_TYPE_NAMES.BASIC_AUTH_CREDENTIALS
        }

        if (message.toLowerCase().includes(SECRET_KEYWORDS.API_KEY)) {
            return SECRET_TYPE_NAMES.API_KEY
        }

        if (message.toLowerCase().includes(SECRET_KEYWORDS.TOKEN)) {
            return SECRET_TYPE_NAMES.AUTHENTICATION_TOKEN
        }

        if (message.toLowerCase().includes(SECRET_KEYWORDS.PASSWORD)) {
            return SECRET_TYPE_NAMES.PASSWORD
        }

        if (message.toLowerCase().includes(SECRET_KEYWORDS.SECRET)) {
            return SECRET_TYPE_NAMES.SECRET
        }

        return SECRET_TYPE_NAMES.SENSITIVE_DATA
    }
}
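`parseOutputToViolations` relies on a single regex over Secretlint's stylish-formatter output. A standalone sketch of that parsing step, using the same regex (the helper name and the sample line format are ours; note the lazy `(.+?)` captures only up to the first whitespace run, so multi-word messages would be split):

```typescript
// Parses one stylish-format line of the assumed shape
// "  <line>:<col>  error|warning  <message>  <ruleId>".
function parseStylishLine(
    line: string,
): { line: number; column: number; message: string; ruleId: string } | undefined {
    const match = /^\s*(\d+):(\d+)\s+(error|warning)\s+(.+?)\s+(.+)$/.exec(line)
    if (!match) {
        return undefined
    }
    const [, lineNum, column, , message, ruleId] = match
    return {
        line: parseInt(lineNum, 10),
        column: parseInt(column, 10),
        message,
        ruleId,
    }
}
```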
@@ -0,0 +1,177 @@
import { DDD_FOLDER_NAMES } from "../constants/detectorPatterns"
import { IMPORT_PATTERNS } from "../constants/paths"
import { FolderRegistry } from "./FolderRegistry"

/**
 * Analyzes file paths and imports to extract aggregate information
 *
 * Handles path normalization, aggregate extraction, and entity name detection
 * for aggregate boundary validation.
 */
export class AggregatePathAnalyzer {
    constructor(private readonly folderRegistry: FolderRegistry) {}

    /**
     * Extracts the aggregate name from a file path
     *
     * Handles patterns like:
     * - domain/aggregates/order/Order.ts → 'order'
     * - domain/order/Order.ts → 'order'
     * - domain/entities/order/Order.ts → 'order'
     */
    public extractAggregateFromPath(filePath: string): string | undefined {
        const normalizedPath = this.normalizePath(filePath)
        const segments = this.getPathSegmentsAfterDomain(normalizedPath)

        if (!segments || segments.length < 2) {
            return undefined
        }

        return this.findAggregateInSegments(segments)
    }

    /**
     * Extracts the aggregate name from an import path
     */
    public extractAggregateFromImport(importPath: string): string | undefined {
        const normalizedPath = importPath.replace(IMPORT_PATTERNS.QUOTE, "").toLowerCase()
        const segments = normalizedPath.split("/").filter((seg) => seg !== ".." && seg !== ".")

        if (segments.length === 0) {
            return undefined
        }

        return this.findAggregateInImportSegments(segments)
    }

    /**
     * Extracts the entity name from an import path
     */
    public extractEntityName(importPath: string): string | undefined {
        const normalizedPath = importPath.replace(IMPORT_PATTERNS.QUOTE, "")
        const segments = normalizedPath.split("/")
        const lastSegment = segments[segments.length - 1]

        if (lastSegment) {
            return lastSegment.replace(/\.(ts|js)$/, "")
        }

        return undefined
    }

    /**
     * Normalizes a file path for consistent processing
     */
    private normalizePath(filePath: string): string {
        return filePath.toLowerCase().replace(/\\/g, "/")
    }

    /**
     * Gets path segments after the 'domain' folder
     */
    private getPathSegmentsAfterDomain(normalizedPath: string): string[] | undefined {
        const domainMatch = /(?:^|\/)(domain)\//.exec(normalizedPath)
        if (!domainMatch) {
            return undefined
        }

        const domainEndIndex = domainMatch.index + domainMatch[0].length
        const pathAfterDomain = normalizedPath.substring(domainEndIndex)
        return pathAfterDomain.split("/").filter(Boolean)
    }

    /**
     * Finds aggregate name in path segments after domain folder
     */
    private findAggregateInSegments(segments: string[]): string | undefined {
        if (this.folderRegistry.isEntityFolder(segments[0])) {
            return this.extractFromEntityFolder(segments)
        }

        const aggregate = segments[0]
        if (this.folderRegistry.isNonAggregateFolder(aggregate)) {
            return undefined
        }

        return aggregate
    }

    /**
     * Extracts aggregate from entity folder structure
     */
    private extractFromEntityFolder(segments: string[]): string | undefined {
        if (segments.length < 3) {
            return undefined
        }

        const aggregate = segments[1]
        if (this.folderRegistry.isNonAggregateFolder(aggregate)) {
            return undefined
        }

        return aggregate
    }

    /**
     * Finds aggregate in import path segments
     */
    private findAggregateInImportSegments(segments: string[]): string | undefined {
        const aggregateFromDomainFolder = this.findAggregateAfterDomainFolder(segments)
        if (aggregateFromDomainFolder) {
            return aggregateFromDomainFolder
        }

        return this.findAggregateFromSecondLastSegment(segments)
    }

    /**
     * Finds aggregate after 'domain' or 'aggregates' folder in import
     */
    private findAggregateAfterDomainFolder(segments: string[]): string | undefined {
        for (let i = 0; i < segments.length; i++) {
            const isDomainOrAggregatesFolder =
                segments[i] === DDD_FOLDER_NAMES.DOMAIN ||
                segments[i] === DDD_FOLDER_NAMES.AGGREGATES

            if (!isDomainOrAggregatesFolder) {
                continue
            }

            if (i + 1 >= segments.length) {
                continue
            }

            const nextSegment = segments[i + 1]
            const isEntityOrAggregateFolder =
                this.folderRegistry.isEntityFolder(nextSegment) ||
                nextSegment === DDD_FOLDER_NAMES.AGGREGATES

            if (isEntityOrAggregateFolder) {
                return i + 2 < segments.length ? segments[i + 2] : undefined
            }

            return nextSegment
        }
        return undefined
    }

    /**
     * Extracts aggregate from second-to-last segment if applicable
     */
    private findAggregateFromSecondLastSegment(segments: string[]): string | undefined {
        if (segments.length >= 2) {
            const secondLastSegment = segments[segments.length - 2]

            if (
                !this.folderRegistry.isEntityFolder(secondLastSegment) &&
                !this.folderRegistry.isValueObjectFolder(secondLastSegment) &&
                !this.folderRegistry.isAllowedFolder(secondLastSegment) &&
                secondLastSegment !== DDD_FOLDER_NAMES.DOMAIN
            ) {
                return secondLastSegment
            }
        }

        return undefined
    }
}
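The core of `extractAggregateFromPath` is locating the `domain/` folder and splitting what follows. A standalone sketch of that step, using the same normalization and regex as the analyzer (the helper name `segmentsAfterDomain` is ours):

```typescript
// Normalizes a path and returns the segments after the 'domain' folder,
// or undefined if the path is not under a domain layer.
function segmentsAfterDomain(filePath: string): string[] | undefined {
    const normalized = filePath.toLowerCase().replace(/\\/g, "/")
    const domainMatch = /(?:^|\/)(domain)\//.exec(normalized)
    if (!domainMatch) {
        return undefined
    }
    const afterDomain = normalized.substring(domainMatch.index + domainMatch[0].length)
    return afterDomain.split("/").filter(Boolean)
}
```

For `src/domain/aggregates/order/Order.ts` this yields `["aggregates", "order", "order.ts"]`, which the analyzer then resolves to the aggregate `order` via the folder registry.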
@@ -0,0 +1,96 @@
/**
 * Tracks braces and brackets in code for context analysis
 *
 * Used to determine if a line is inside an exported constant
 * by counting unclosed braces and brackets.
 */
export class BraceTracker {
    /**
     * Counts unclosed braces and brackets between two line indices
     */
    public countUnclosed(
        lines: string[],
        startLine: number,
        endLine: number,
    ): { braces: number; brackets: number } {
        let braces = 0
        let brackets = 0

        for (let i = startLine; i <= endLine; i++) {
            const counts = this.countInLine(lines[i])
            braces += counts.braces
            brackets += counts.brackets
        }

        return { braces, brackets }
    }

    /**
     * Counts braces and brackets in a single line, ignoring characters
     * inside string literals
     */
    private countInLine(line: string): { braces: number; brackets: number } {
        let braces = 0
        let brackets = 0
        let inString = false
        let stringChar = ""

        for (let j = 0; j < line.length; j++) {
            const char = line[j]
            const prevChar = j > 0 ? line[j - 1] : ""

            this.updateStringState(
                char,
                prevChar,
                inString,
                stringChar,
                (newInString, newStringChar) => {
                    inString = newInString
                    stringChar = newStringChar
                },
            )

            if (!inString) {
                const counts = this.countChar(char)
                braces += counts.braces
                brackets += counts.brackets
            }
        }

        return { braces, brackets }
    }

    /**
     * Updates string tracking state
     */
    private updateStringState(
        char: string,
        prevChar: string,
        inString: boolean,
        stringChar: string,
        callback: (inString: boolean, stringChar: string) => void,
    ): void {
        if ((char === "'" || char === '"' || char === "`") && prevChar !== "\\") {
            if (!inString) {
                callback(true, char)
            } else if (char === stringChar) {
                callback(false, "")
            }
        }
    }

    /**
     * Counts a single character
     */
    private countChar(char: string): { braces: number; brackets: number } {
        if (char === "{") {
            return { braces: 1, brackets: 0 }
        } else if (char === "}") {
            return { braces: -1, brackets: 0 }
        } else if (char === "[") {
            return { braces: 0, brackets: 1 }
        } else if (char === "]") {
            return { braces: 0, brackets: -1 }
        }
        return { braces: 0, brackets: 0 }
    }
}
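The key subtlety in `BraceTracker` is that braces inside string literals must not count. A condensed standalone sketch of the per-line counting logic (same algorithm as `countInLine`, flattened into one function of our own naming):

```typescript
// Counts net open braces/brackets on a line, skipping any that appear
// inside ', ", or ` string literals (unescaped quotes toggle the state).
function countBraces(line: string): { braces: number; brackets: number } {
    let braces = 0
    let brackets = 0
    let inString = false
    let stringChar = ""

    for (let j = 0; j < line.length; j++) {
        const char = line[j]
        const prevChar = j > 0 ? line[j - 1] : ""

        if ((char === "'" || char === '"' || char === "`") && prevChar !== "\\") {
            if (!inString) {
                inString = true
                stringChar = char
            } else if (char === stringChar) {
                inString = false
                stringChar = ""
            }
        }

        if (!inString) {
            if (char === "{") braces++
            else if (char === "}") braces--
            else if (char === "[") brackets++
            else if (char === "]") brackets--
        }
    }

    return { braces, brackets }
}
```

So `const x = { a: "}" }` nets zero braces: the `}` inside the string is ignored, and the real pair cancels out.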
@@ -0,0 +1,21 @@
/**
 * Checks if a file is a constants definition file
 *
 * Identifies files that should be skipped for hardcode detection
 * since they are meant to contain constant definitions.
 */
export class ConstantsFileChecker {
    private readonly constantsPatterns = [
        /^constants?\.(ts|js)$/i,
        /constants?\/.*\.(ts|js)$/i,
        /\/(constants|config|settings|defaults|tokens)\.ts$/i,
        /\/di\/tokens\.(ts|js)$/i,
    ]

    /**
     * Checks if a file path represents a constants file
     */
    public isConstantsFile(filePath: string): boolean {
        return this.constantsPatterns.some((pattern) => pattern.test(filePath))
    }
}
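The checker is just the four patterns above applied with `some`. The same patterns, exercised standalone (wrapper function name is ours):

```typescript
// The same regexes as ConstantsFileChecker, without the class wrapper.
const constantsPatterns = [
    /^constants?\.(ts|js)$/i,
    /constants?\/.*\.(ts|js)$/i,
    /\/(constants|config|settings|defaults|tokens)\.ts$/i,
    /\/di\/tokens\.(ts|js)$/i,
]

function isConstantsFile(filePath: string): boolean {
    return constantsPatterns.some((pattern) => pattern.test(filePath))
}
```

The second pattern is what exempts files like `domain/constants/SecretExamples.ts` (the file added in 0.8.1) from hardcode detection.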
@@ -0,0 +1,112 @@
import { CODE_PATTERNS } from "../constants/defaults"
import { BraceTracker } from "./BraceTracker"

/**
 * Analyzes export const declarations in code
 *
 * Determines if a line is inside an exported constant declaration
 * to skip hardcode detection in constant definitions.
 */
export class ExportConstantAnalyzer {
    constructor(private readonly braceTracker: BraceTracker) {}

    /**
     * Checks if a line is inside an exported constant definition
     */
    public isInExportedConstant(lines: string[], lineIndex: number): boolean {
        const currentLineTrimmed = lines[lineIndex].trim()

        if (this.isSingleLineExportConst(currentLineTrimmed)) {
            return true
        }

        const exportConstStart = this.findExportConstStart(lines, lineIndex)
        if (exportConstStart === -1) {
            return false
        }

        const { braces, brackets } = this.braceTracker.countUnclosed(
            lines,
            exportConstStart,
            lineIndex,
        )

        return braces > 0 || brackets > 0
    }

    /**
     * Checks if a line is a single-line export const declaration
     */
    public isSingleLineExportConst(line: string): boolean {
        if (!line.startsWith(CODE_PATTERNS.EXPORT_CONST)) {
            return false
        }

        const hasObjectOrArray = this.hasObjectOrArray(line)

        if (hasObjectOrArray) {
            return this.hasAsConstEnding(line)
        }

        return line.includes(CODE_PATTERNS.AS_CONST)
    }

    /**
     * Finds the starting line of an export const declaration
     */
    public findExportConstStart(lines: string[], lineIndex: number): number {
        for (let currentLine = lineIndex; currentLine >= 0; currentLine--) {
            const trimmed = lines[currentLine].trim()

            if (this.isExportConstWithStructure(trimmed)) {
                return currentLine
            }

            if (this.isTopLevelStatement(trimmed, currentLine, lineIndex)) {
                break
            }
        }

        return -1
    }

    /**
     * Checks if line has object or array structure
     */
    private hasObjectOrArray(line: string): boolean {
        return line.includes(CODE_PATTERNS.OBJECT_START) || line.includes(CODE_PATTERNS.ARRAY_START)
    }

    /**
     * Checks if line has 'as const' ending
     */
    private hasAsConstEnding(line: string): boolean {
        return (
            line.includes(CODE_PATTERNS.AS_CONST_OBJECT) ||
            line.includes(CODE_PATTERNS.AS_CONST_ARRAY) ||
            line.includes(CODE_PATTERNS.AS_CONST_END_SEMICOLON_OBJECT) ||
            line.includes(CODE_PATTERNS.AS_CONST_END_SEMICOLON_ARRAY)
        )
    }

    /**
     * Checks if line is export const with object or array
     */
    private isExportConstWithStructure(trimmed: string): boolean {
        return (
            trimmed.startsWith(CODE_PATTERNS.EXPORT_CONST) &&
            (trimmed.includes(CODE_PATTERNS.OBJECT_START) ||
                trimmed.includes(CODE_PATTERNS.ARRAY_START))
        )
    }

    /**
     * Checks if line is a top-level statement
     */
    private isTopLevelStatement(trimmed: string, currentLine: number, lineIndex: number): boolean {
        return (
            currentLine < lineIndex &&
            (trimmed.startsWith(CODE_PATTERNS.EXPORT) || trimmed.startsWith(CODE_PATTERNS.IMPORT))
        )
    }
}
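The single-line check above reads its markers from `CODE_PATTERNS` in `../constants/defaults`, whose literal values are not shown in this diff. A sketch of the same logic with assumed values for those constants (both the constant values and the function name here are our assumptions, not the package's):

```typescript
// Assumed values for CODE_PATTERNS.EXPORT_CONST and CODE_PATTERNS.AS_CONST;
// the real values live in ../constants/defaults.
const EXPORT_CONST = "export const"
const AS_CONST = " as const"

function isSingleLineExportConst(line: string): boolean {
    if (!line.startsWith(EXPORT_CONST)) {
        return false
    }
    const hasObjectOrArray = line.includes("{") || line.includes("[")
    if (hasObjectOrArray) {
        // Object/array literals only count as single-line when they
        // close with "} as const" or "] as const" on the same line.
        return line.includes("} as const") || line.includes("] as const")
    }
    return line.includes(AS_CONST)
}
```

A line like `export const X = {` falls through to `findExportConstStart` plus brace counting instead, since the literal is not closed on that line.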
@@ -0,0 +1,72 @@
import { DDD_FOLDER_NAMES } from "../constants/detectorPatterns"

/**
 * Registry for DDD folder names used in aggregate boundary detection
 *
 * Centralizes folder name management for cleaner code organization
 * and easier maintenance of folder name rules.
 */
export class FolderRegistry {
    public readonly entityFolders: Set<string>
    public readonly valueObjectFolders: Set<string>
    public readonly allowedFolders: Set<string>
    public readonly nonAggregateFolders: Set<string>

    constructor() {
        this.entityFolders = new Set<string>([
            DDD_FOLDER_NAMES.ENTITIES,
            DDD_FOLDER_NAMES.AGGREGATES,
        ])

        this.valueObjectFolders = new Set<string>([
            DDD_FOLDER_NAMES.VALUE_OBJECTS,
            DDD_FOLDER_NAMES.VO,
        ])

        this.allowedFolders = new Set<string>([
            DDD_FOLDER_NAMES.VALUE_OBJECTS,
            DDD_FOLDER_NAMES.VO,
            DDD_FOLDER_NAMES.EVENTS,
            DDD_FOLDER_NAMES.DOMAIN_EVENTS,
            DDD_FOLDER_NAMES.REPOSITORIES,
            DDD_FOLDER_NAMES.SERVICES,
            DDD_FOLDER_NAMES.SPECIFICATIONS,
            DDD_FOLDER_NAMES.ERRORS,
            DDD_FOLDER_NAMES.EXCEPTIONS,
        ])

        this.nonAggregateFolders = new Set<string>([
            DDD_FOLDER_NAMES.VALUE_OBJECTS,
            DDD_FOLDER_NAMES.VO,
            DDD_FOLDER_NAMES.EVENTS,
            DDD_FOLDER_NAMES.DOMAIN_EVENTS,
            DDD_FOLDER_NAMES.REPOSITORIES,
            DDD_FOLDER_NAMES.SERVICES,
            DDD_FOLDER_NAMES.SPECIFICATIONS,
            DDD_FOLDER_NAMES.ENTITIES,
            DDD_FOLDER_NAMES.CONSTANTS,
            DDD_FOLDER_NAMES.SHARED,
            DDD_FOLDER_NAMES.FACTORIES,
            DDD_FOLDER_NAMES.PORTS,
            DDD_FOLDER_NAMES.INTERFACES,
            DDD_FOLDER_NAMES.ERRORS,
            DDD_FOLDER_NAMES.EXCEPTIONS,
        ])
    }

    public isEntityFolder(folderName: string): boolean {
        return this.entityFolders.has(folderName)
    }

    public isValueObjectFolder(folderName: string): boolean {
        return this.valueObjectFolders.has(folderName)
    }

    public isAllowedFolder(folderName: string): boolean {
        return this.allowedFolders.has(folderName)
    }

    public isNonAggregateFolder(folderName: string): boolean {
        return this.nonAggregateFolders.has(folderName)
    }
}
@@ -0,0 +1,150 @@
import { IMPORT_PATTERNS } from "../constants/paths"
import { AggregatePathAnalyzer } from "./AggregatePathAnalyzer"
import { FolderRegistry } from "./FolderRegistry"

/**
 * Validates imports for aggregate boundary violations
 *
 * Checks if imports cross aggregate boundaries inappropriately
 * and ensures proper encapsulation in DDD architecture.
 */
export class ImportValidator {
    constructor(
        private readonly folderRegistry: FolderRegistry,
        private readonly pathAnalyzer: AggregatePathAnalyzer,
    ) {}

    /**
     * Checks if an import violates aggregate boundaries
     */
    public isViolation(importPath: string, currentAggregate: string): boolean {
        const normalizedPath = this.normalizeImportPath(importPath)

        if (!this.isValidImportPath(normalizedPath)) {
            return false
        }

        if (this.isInternalBoundedContextImport(normalizedPath)) {
            return false
        }

        const targetAggregate = this.pathAnalyzer.extractAggregateFromImport(normalizedPath)
        if (!targetAggregate || targetAggregate === currentAggregate) {
            return false
        }

        if (this.isAllowedImport(normalizedPath)) {
            return false
        }

        return this.seemsLikeEntityImport(normalizedPath)
    }

    /**
     * Extracts all import paths from a line of code
     */
    public extractImports(line: string): string[] {
        const imports: string[] = []

        this.extractEsImports(line, imports)
        this.extractRequireImports(line, imports)

        return imports
    }

    /**
     * Normalizes an import path for consistent processing
     */
    private normalizeImportPath(importPath: string): string {
        return importPath.replace(IMPORT_PATTERNS.QUOTE, "").toLowerCase()
    }

    /**
     * Checks if import path is valid for analysis
     */
    private isValidImportPath(normalizedPath: string): boolean {
        if (!normalizedPath.includes("/")) {
            return false
        }

        if (!normalizedPath.startsWith(".") && !normalizedPath.startsWith("/")) {
            return false
        }

        return true
    }

    /**
     * Checks if import is internal to the same bounded context
     */
    private isInternalBoundedContextImport(normalizedPath: string): boolean {
        const parts = normalizedPath.split("/")
        const dotDotCount = parts.filter((p) => p === "..").length

        if (dotDotCount === 1) {
            const nonDotParts = parts.filter((p) => p !== ".." && p !== ".")
            if (nonDotParts.length >= 1) {
                const firstFolder = nonDotParts[0]
                if (this.folderRegistry.isEntityFolder(firstFolder)) {
                    return true
                }
            }
        }

        return false
    }

    /**
     * Checks if import is from an allowed folder
     */
    private isAllowedImport(normalizedPath: string): boolean {
        for (const folderName of this.folderRegistry.allowedFolders) {
            if (normalizedPath.includes(`/${folderName}/`)) {
                return true
            }
        }
        return false
    }

    /**
     * Checks if import seems to be an entity
     */
    private seemsLikeEntityImport(normalizedPath: string): boolean {
        const pathParts = normalizedPath.split("/")
        const lastPart = pathParts[pathParts.length - 1]

        if (!lastPart) {
            return false
        }

        const filename = lastPart.replace(/\.(ts|js)$/, "")

        if (filename.length > 0 && /^[a-z][a-z]/.exec(filename)) {
            return true
        }

        return false
    }

    /**
     * Extracts ES6 imports from a line
     */
    private extractEsImports(line: string, imports: string[]): void {
        let match = IMPORT_PATTERNS.ES_IMPORT.exec(line)
        while (match) {
            imports.push(match[1])
            match = IMPORT_PATTERNS.ES_IMPORT.exec(line)
        }
    }

    /**
     * Extracts CommonJS requires from a line
     */
    private extractRequireImports(line: string, imports: string[]): void {
        let match = IMPORT_PATTERNS.REQUIRE.exec(line)
        while (match) {
            imports.push(match[1])
            match = IMPORT_PATTERNS.REQUIRE.exec(line)
        }
    }
}
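The `isInternalBoundedContextImport` heuristic above can be sketched standalone. This is a minimal re-statement, not the real class: `isInternalImport` and the `"entities"` folder name are illustrative stand-ins for the method and the `DDD_FOLDER_NAMES` constants.

```typescript
// Sketch of the "exactly one '..' segment" heuristic: an import that climbs
// one level and lands in an entity folder stays inside the bounded context.
function isInternalImport(importPath: string, entityFolders: Set<string>): boolean {
    const parts = importPath.split("/")
    const dotDotCount = parts.filter((p) => p === "..").length
    if (dotDotCount !== 1) {
        return false
    }
    const nonDotParts = parts.filter((p) => p !== ".." && p !== ".")
    return nonDotParts.length >= 1 && entityFolders.has(nonDotParts[0])
}

const entityFolders = new Set(["entities"])
console.log(isInternalImport("../entities/Order", entityFolders)) // true
console.log(isInternalImport("../../billing/entities/Invoice", entityFolders)) // false
```

Two `..` segments mean the path escapes the aggregate's parent, so it is no longer treated as internal.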
@@ -0,0 +1,171 @@
import { HardcodedValue } from "../../domain/value-objects/HardcodedValue"
import { ALLOWED_NUMBERS, DETECTION_KEYWORDS } from "../constants/defaults"
import { HARDCODE_TYPES } from "../../shared/constants"
import { ExportConstantAnalyzer } from "./ExportConstantAnalyzer"

/**
 * Detects magic numbers in code
 *
 * Identifies hardcoded numeric values that should be extracted
 * to constants, excluding allowed values and exported constants.
 */
export class MagicNumberMatcher {
    private readonly numberPatterns = [
        /(?:setTimeout|setInterval)\s*\(\s*[^,]+,\s*(\d+)/g,
        /(?:maxRetries|retries|attempts)\s*[=:]\s*(\d+)/gi,
        /(?:limit|max|min)\s*[=:]\s*(\d+)/gi,
        /(?:port|PORT)\s*[=:]\s*(\d+)/g,
        /(?:delay|timeout|TIMEOUT)\s*[=:]\s*(\d+)/gi,
    ]

    constructor(private readonly exportAnalyzer: ExportConstantAnalyzer) {}

    /**
     * Detects magic numbers in code
     */
    public detect(code: string): HardcodedValue[] {
        const results: HardcodedValue[] = []
        const lines = code.split("\n")

        lines.forEach((line, lineIndex) => {
            if (this.shouldSkipLine(line, lines, lineIndex)) {
                return
            }

            this.detectInPatterns(line, lineIndex, results)
            this.detectGenericNumbers(line, lineIndex, results)
        })

        return results
    }

    /**
     * Checks if line should be skipped
     */
    private shouldSkipLine(line: string, lines: string[], lineIndex: number): boolean {
        if (line.trim().startsWith("//") || line.trim().startsWith("*")) {
            return true
        }

        return this.exportAnalyzer.isInExportedConstant(lines, lineIndex)
    }

    /**
     * Detects numbers in specific patterns
     */
    private detectInPatterns(line: string, lineIndex: number, results: HardcodedValue[]): void {
        this.numberPatterns.forEach((pattern) => {
            let match
            const regex = new RegExp(pattern)

            while ((match = regex.exec(line)) !== null) {
                const value = parseInt(match[1], 10)

                if (!ALLOWED_NUMBERS.has(value)) {
                    results.push(
                        HardcodedValue.create(
                            value,
                            HARDCODE_TYPES.MAGIC_NUMBER,
                            lineIndex + 1,
                            match.index,
                            line.trim(),
                        ),
                    )
                }
            }
        })
    }

    /**
     * Detects generic 3+ digit numbers
     */
    private detectGenericNumbers(line: string, lineIndex: number, results: HardcodedValue[]): void {
        const genericNumberRegex = /\b(\d{3,})\b/g
        let match

        while ((match = genericNumberRegex.exec(line)) !== null) {
            const value = parseInt(match[1], 10)

            if (this.shouldDetectNumber(value, line, match.index)) {
                results.push(
                    HardcodedValue.create(
                        value,
                        HARDCODE_TYPES.MAGIC_NUMBER,
                        lineIndex + 1,
                        match.index,
                        line.trim(),
                    ),
                )
            }
        }
    }

    /**
     * Checks if number should be detected
     */
    private shouldDetectNumber(value: number, line: string, index: number): boolean {
        if (ALLOWED_NUMBERS.has(value)) {
            return false
        }

        if (this.isInComment(line, index)) {
            return false
        }

        if (this.isInString(line, index)) {
            return false
        }

        const context = this.extractContext(line, index)
        return this.looksLikeMagicNumber(context)
    }

    /**
     * Checks if position is in a comment
     */
    private isInComment(line: string, index: number): boolean {
        const beforeIndex = line.substring(0, index)
        return beforeIndex.includes("//") || beforeIndex.includes("/*")
    }

    /**
     * Checks if position is in a string
     */
    private isInString(line: string, index: number): boolean {
        const beforeIndex = line.substring(0, index)
        const singleQuotes = (beforeIndex.match(/'/g) ?? []).length
        const doubleQuotes = (beforeIndex.match(/"/g) ?? []).length
        const backticks = (beforeIndex.match(/`/g) ?? []).length

        return singleQuotes % 2 !== 0 || doubleQuotes % 2 !== 0 || backticks % 2 !== 0
    }

    /**
     * Extracts context around a position
     */
    private extractContext(line: string, index: number): string {
        const start = Math.max(0, index - 30)
        const end = Math.min(line.length, index + 30)
        return line.substring(start, end)
    }

    /**
     * Checks if context suggests a magic number
     */
    private looksLikeMagicNumber(context: string): boolean {
        const lowerContext = context.toLowerCase()

        const configKeywords = [
            DETECTION_KEYWORDS.TIMEOUT,
            DETECTION_KEYWORDS.DELAY,
            DETECTION_KEYWORDS.RETRY,
            DETECTION_KEYWORDS.LIMIT,
            DETECTION_KEYWORDS.MAX,
            DETECTION_KEYWORDS.MIN,
            DETECTION_KEYWORDS.PORT,
            DETECTION_KEYWORDS.INTERVAL,
        ]

        return configKeywords.some((keyword) => lowerContext.includes(keyword))
    }
}
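The `isInString` guard above rests on quote parity: a position is "inside a string literal" when an odd number of quote characters of any one kind precede it on the line. A standalone sketch (the function name is illustrative, not the class method):

```typescript
// Quote-parity heuristic: count each quote kind before the position;
// any odd count means the position sits inside an open string literal.
function isInsideString(line: string, index: number): boolean {
    const before = line.substring(0, index)
    const singleQuotes = (before.match(/'/g) ?? []).length
    const doubleQuotes = (before.match(/"/g) ?? []).length
    const backticks = (before.match(/`/g) ?? []).length
    return singleQuotes % 2 !== 0 || doubleQuotes % 2 !== 0 || backticks % 2 !== 0
}

console.log(isInsideString('const s = "port 8080"', 15)) // true  (inside the quotes)
console.log(isInsideString("const port = 8080", 13)) // false (bare numeric literal)
```

Like any parity trick, it can be fooled by escaped quotes, but it is cheap and line-local, which fits a per-line scanner.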
@@ -0,0 +1,220 @@
import { HardcodedValue } from "../../domain/value-objects/HardcodedValue"
import { DETECTION_KEYWORDS } from "../constants/defaults"
import { HARDCODE_TYPES } from "../../shared/constants"
import { ExportConstantAnalyzer } from "./ExportConstantAnalyzer"
import {
    DYNAMIC_IMPORT_PATTERN_PARTS,
    REGEX_ESCAPE_PATTERN,
} from "../../domain/constants/SecretExamples"

/**
 * Detects magic strings in code
 *
 * Identifies hardcoded string values that should be extracted
 * to constants, excluding test code, console logs, and type contexts.
 */
export class MagicStringMatcher {
    private readonly stringRegex = /(['"`])(?:(?!\1).)+\1/g

    private readonly allowedPatterns = [/^[a-z]$/i, /^\/$/, /^\\$/, /^\s+$/, /^,$/, /^\.$/]

    private readonly typeContextPatterns = [
        /^\s*type\s+\w+\s*=/i,
        /^\s*interface\s+\w+/i,
        /^\s*\w+\s*:\s*['"`]/,
        /\s+as\s+['"`]/,
        /Record<.*,\s*import\(/,
        /typeof\s+\w+\s*===\s*['"`]/,
        /['"`]\s*===\s*typeof\s+\w+/,
    ]

    constructor(private readonly exportAnalyzer: ExportConstantAnalyzer) {}

    /**
     * Detects magic strings in code
     */
    public detect(code: string): HardcodedValue[] {
        const results: HardcodedValue[] = []
        const lines = code.split("\n")

        lines.forEach((line, lineIndex) => {
            if (this.shouldSkipLine(line, lines, lineIndex)) {
                return
            }

            this.detectStringsInLine(line, lineIndex, results)
        })

        return results
    }

    /**
     * Checks if line should be skipped
     */
    private shouldSkipLine(line: string, lines: string[], lineIndex: number): boolean {
        if (
            line.trim().startsWith("//") ||
            line.trim().startsWith("*") ||
            line.includes("import ") ||
            line.includes("from ")
        ) {
            return true
        }

        return this.exportAnalyzer.isInExportedConstant(lines, lineIndex)
    }

    /**
     * Detects strings in a single line
     */
    private detectStringsInLine(line: string, lineIndex: number, results: HardcodedValue[]): void {
        let match
        const regex = new RegExp(this.stringRegex)

        while ((match = regex.exec(line)) !== null) {
            const fullMatch = match[0]
            const value = fullMatch.slice(1, -1)

            if (this.shouldDetectString(fullMatch, value, line)) {
                results.push(
                    HardcodedValue.create(
                        value,
                        HARDCODE_TYPES.MAGIC_STRING,
                        lineIndex + 1,
                        match.index,
                        line.trim(),
                    ),
                )
            }
        }
    }

    /**
     * Checks if string should be detected
     */
    private shouldDetectString(fullMatch: string, value: string, line: string): boolean {
        if (fullMatch.startsWith("`") || value.includes("${")) {
            return false
        }

        if (this.isAllowedString(value)) {
            return false
        }

        return this.looksLikeMagicString(line, value)
    }

    /**
     * Checks if string is allowed (short strings, single chars, etc.)
     */
    private isAllowedString(str: string): boolean {
        if (str.length <= 1) {
            return true
        }

        return this.allowedPatterns.some((pattern) => pattern.test(str))
    }

    /**
     * Checks if line context suggests a magic string
     */
    private looksLikeMagicString(line: string, value: string): boolean {
        const lowerLine = line.toLowerCase()

        if (this.isTestCode(lowerLine)) {
            return false
        }

        if (this.isConsoleLog(lowerLine)) {
            return false
        }

        if (this.isInTypeContext(line)) {
            return false
        }

        if (this.isInSymbolCall(line, value)) {
            return false
        }

        if (this.isInImportCall(line, value)) {
            return false
        }

        if (this.isUrlOrApi(value)) {
            return true
        }

        if (/^\d{2,}$/.test(value)) {
            return false
        }

        return value.length > 3
    }

    /**
     * Checks if line is test code
     */
    private isTestCode(lowerLine: string): boolean {
        return (
            lowerLine.includes(DETECTION_KEYWORDS.TEST) ||
            lowerLine.includes(DETECTION_KEYWORDS.DESCRIBE)
        )
    }

    /**
     * Checks if line is console log
     */
    private isConsoleLog(lowerLine: string): boolean {
        return (
            lowerLine.includes(DETECTION_KEYWORDS.CONSOLE_LOG) ||
            lowerLine.includes(DETECTION_KEYWORDS.CONSOLE_ERROR)
        )
    }

    /**
     * Checks if line is in type context
     */
    private isInTypeContext(line: string): boolean {
        const trimmedLine = line.trim()

        if (this.typeContextPatterns.some((pattern) => pattern.test(trimmedLine))) {
            return true
        }

        if (trimmedLine.includes("|") && /['"`][^'"`]+['"`]\s*\|/.test(trimmedLine)) {
            return true
        }

        return false
    }

    /**
     * Checks if string is inside Symbol() call
     */
    private isInSymbolCall(line: string, stringValue: string): boolean {
        const escapedValue = stringValue.replace(
            /[.*+?^${}()|[\]\\]/g,
            REGEX_ESCAPE_PATTERN.DOLLAR_AMPERSAND,
        )
        const symbolPattern = new RegExp(`Symbol\\s*\\(\\s*['"\`]${escapedValue}['"\`]\\s*\\)`)
        return symbolPattern.test(line)
    }

    /**
     * Checks if string is inside import() call
     */
    private isInImportCall(line: string, stringValue: string): boolean {
        const importPattern = new RegExp(
            `import\\s*\\(\\s*['${DYNAMIC_IMPORT_PATTERN_PARTS.QUOTE_START}'${DYNAMIC_IMPORT_PATTERN_PARTS.QUOTE_END}"]\\s*\\)`,
        )
        return importPattern.test(line) && line.includes(stringValue)
    }

    /**
     * Checks if string contains URL or API reference
     */
    private isUrlOrApi(value: string): boolean {
        return value.includes(DETECTION_KEYWORDS.HTTP) || value.includes(DETECTION_KEYWORDS.API)
    }
}
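The `stringRegex` above uses a backreference with a negative lookahead so the match body may contain the *other* quote kinds without terminating early. A quick demonstration (the sample line and URL are hypothetical):

```typescript
// Capture the opening quote (group 1), then accept any character that is
// NOT that same quote, until the matching close quote.
const stringRegex = /(['"`])(?:(?!\1).)+\1/g

const line = 'url = "https://api.example.com" + path'
const literals = [...line.matchAll(stringRegex)].map((m) => m[0])
console.log(literals) // [ '"https://api.example.com"' ]
```

Because the backreference tracks which quote opened the literal, `"it's fine"` matches as one string instead of breaking at the apostrophe.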
@@ -0,0 +1,134 @@
import { REPOSITORY_METHOD_SUGGESTIONS } from "../constants/detectorPatterns"
import { OrmTypeMatcher } from "./OrmTypeMatcher"

/**
 * Validates repository method names for domain language compliance
 *
 * Ensures repository methods use domain language instead of
 * technical database terminology.
 */
export class MethodNameValidator {
    private readonly domainMethodPatterns = [
        /^findBy[A-Z]/,
        /^findAll$/,
        /^find[A-Z]/,
        /^save$/,
        /^saveAll$/,
        /^create$/,
        /^update$/,
        /^delete$/,
        /^deleteBy[A-Z]/,
        /^deleteAll$/,
        /^remove$/,
        /^removeBy[A-Z]/,
        /^removeAll$/,
        /^add$/,
        /^add[A-Z]/,
        /^get[A-Z]/,
        /^getAll$/,
        /^search/,
        /^list/,
        /^has[A-Z]/,
        /^is[A-Z]/,
        /^exists$/,
        /^exists[A-Z]/,
        /^existsBy[A-Z]/,
        /^clear[A-Z]/,
        /^clearAll$/,
        /^store[A-Z]/,
        /^initialize$/,
        /^initializeCollection$/,
        /^close$/,
        /^connect$/,
        /^disconnect$/,
        /^count$/,
        /^countBy[A-Z]/,
    ]

    constructor(private readonly ormMatcher: OrmTypeMatcher) {}

    /**
     * Checks if a method name follows domain language conventions
     */
    public isDomainMethodName(methodName: string): boolean {
        if (this.ormMatcher.isTechnicalMethod(methodName)) {
            return false
        }

        return this.domainMethodPatterns.some((pattern) => pattern.test(methodName))
    }

    /**
     * Suggests better domain method names
     */
    public suggestDomainMethodName(methodName: string): string {
        const lowerName = methodName.toLowerCase()
        const suggestions: string[] = []

        this.collectSuggestions(lowerName, suggestions)

        if (lowerName.includes("get") && lowerName.includes("all")) {
            suggestions.push(
                REPOSITORY_METHOD_SUGGESTIONS.FIND_ALL,
                REPOSITORY_METHOD_SUGGESTIONS.LIST_ALL,
            )
        }

        if (suggestions.length === 0) {
            return REPOSITORY_METHOD_SUGGESTIONS.DEFAULT_SUGGESTION
        }

        return `Consider: ${suggestions.slice(0, 3).join(", ")}`
    }

    /**
     * Collects method name suggestions based on keywords
     */
    private collectSuggestions(lowerName: string, suggestions: string[]): void {
        const suggestionMap: Record<string, string[]> = {
            query: [
                REPOSITORY_METHOD_SUGGESTIONS.SEARCH,
                REPOSITORY_METHOD_SUGGESTIONS.FIND_BY_PROPERTY,
            ],
            select: [
                REPOSITORY_METHOD_SUGGESTIONS.FIND_BY_PROPERTY,
                REPOSITORY_METHOD_SUGGESTIONS.GET_ENTITY,
            ],
            insert: [
                REPOSITORY_METHOD_SUGGESTIONS.CREATE,
                REPOSITORY_METHOD_SUGGESTIONS.ADD_ENTITY,
                REPOSITORY_METHOD_SUGGESTIONS.STORE_ENTITY,
            ],
            update: [
                REPOSITORY_METHOD_SUGGESTIONS.UPDATE,
                REPOSITORY_METHOD_SUGGESTIONS.MODIFY_ENTITY,
            ],
            upsert: [
                REPOSITORY_METHOD_SUGGESTIONS.SAVE,
                REPOSITORY_METHOD_SUGGESTIONS.STORE_ENTITY,
            ],
            remove: [
                REPOSITORY_METHOD_SUGGESTIONS.DELETE,
                REPOSITORY_METHOD_SUGGESTIONS.REMOVE_BY_PROPERTY,
            ],
            fetch: [
                REPOSITORY_METHOD_SUGGESTIONS.FIND_BY_PROPERTY,
                REPOSITORY_METHOD_SUGGESTIONS.GET_ENTITY,
            ],
            retrieve: [
                REPOSITORY_METHOD_SUGGESTIONS.FIND_BY_PROPERTY,
                REPOSITORY_METHOD_SUGGESTIONS.GET_ENTITY,
            ],
            load: [
                REPOSITORY_METHOD_SUGGESTIONS.FIND_BY_PROPERTY,
                REPOSITORY_METHOD_SUGGESTIONS.GET_ENTITY,
            ],
        }

        for (const [keyword, candidates] of Object.entries(suggestionMap)) {
            if (lowerName.includes(keyword)) {
                suggestions.push(...candidates)
            }
        }
    }
}
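The pattern check above reduces to a `some()` over anchored regexes. A reduced sketch with four representative patterns (the real validator carries about 35):

```typescript
// Anchored patterns: "findBy" must be followed by a capital (a property
// name), while "save"/"exists" must match the whole method name exactly.
const domainPatterns = [/^findBy[A-Z]/, /^save$/, /^deleteBy[A-Z]/, /^exists$/]
const isDomainMethodName = (name: string) =>
    domainPatterns.some((pattern) => pattern.test(name))

console.log(isDomainMethodName("findByEmail")) // true
console.log(isDomainMethodName("selectWhere")) // false
```

Anchoring with `^`/`$` is what keeps technical names like `saveRawRow` or `existsInTable` from slipping through the exact-match patterns.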
@@ -0,0 +1,68 @@
import { ORM_QUERY_METHODS } from "../constants/orm-methods"
import { REPOSITORY_PATTERN_MESSAGES } from "../../domain/constants/Messages"

/**
 * Matches and validates ORM-specific types and patterns
 *
 * Identifies ORM-specific types (Prisma, TypeORM, Mongoose, etc.)
 * that should not appear in domain layer repository interfaces.
 */
export class OrmTypeMatcher {
    private readonly ormTypePatterns = [
        /Prisma\./,
        /PrismaClient/,
        /TypeORM/,
        /@Entity/,
        /@Column/,
        /@PrimaryColumn/,
        /@PrimaryGeneratedColumn/,
        /@ManyToOne/,
        /@OneToMany/,
        /@ManyToMany/,
        /@JoinColumn/,
        /@JoinTable/,
        /Mongoose\./,
        /Schema/,
        /Model</,
        /Document/,
        /Sequelize\./,
        /DataTypes\./,
        /FindOptions/,
        /WhereOptions/,
        /IncludeOptions/,
        /QueryInterface/,
        /MikroORM/,
        /EntityManager/,
        /EntityRepository/,
        /Collection</,
    ]

    /**
     * Checks if a type name is an ORM-specific type
     */
    public isOrmType(typeName: string): boolean {
        return this.ormTypePatterns.some((pattern) => pattern.test(typeName))
    }

    /**
     * Extracts ORM type name from a code line
     */
    public extractOrmType(line: string): string {
        for (const pattern of this.ormTypePatterns) {
            const match = line.match(pattern)
            if (match) {
                const startIdx = match.index || 0
                const typeMatch = /[\w.]+/.exec(line.slice(startIdx))
                return typeMatch ? typeMatch[0] : REPOSITORY_PATTERN_MESSAGES.UNKNOWN_TYPE
            }
        }
        return REPOSITORY_PATTERN_MESSAGES.UNKNOWN_TYPE
    }

    /**
     * Checks if a method name is a technical ORM method
     */
    public isTechnicalMethod(methodName: string): boolean {
        return (ORM_QUERY_METHODS as readonly string[]).includes(methodName)
    }
}
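The matcher above flags any type name containing an ORM-specific token. A standalone sketch with three representative patterns from the list (type names in the examples are hypothetical):

```typescript
// Unanchored patterns: any occurrence of an ORM token anywhere in the
// type expression counts as a leak into the domain layer.
const ormPatterns = [/Prisma\./, /@Entity/, /FindOptions/]
const isOrmType = (typeName: string) =>
    ormPatterns.some((pattern) => pattern.test(typeName))

console.log(isOrmType("Prisma.UserWhereInput")) // true
console.log(isOrmType("UserId")) // false
```

Unanchored matching is deliberate here: `Promise<Prisma.User[]>` should be flagged even though the ORM token is buried inside a generic.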
@@ -0,0 +1,31 @@
import { LAYERS } from "../../shared/constants/rules"

/**
 * Analyzes files to determine their role in the repository pattern
 *
 * Identifies repository interfaces and use cases based on file paths
 * and architectural layer conventions.
 */
export class RepositoryFileAnalyzer {
    /**
     * Checks if a file is a repository interface
     */
    public isRepositoryInterface(filePath: string, layer: string | undefined): boolean {
        if (layer !== LAYERS.DOMAIN) {
            return false
        }

        return /I[A-Z]\w*Repository\.ts$/.test(filePath) && /repositories?\//.test(filePath)
    }

    /**
     * Checks if a file is a use case
     */
    public isUseCase(filePath: string, layer: string | undefined): boolean {
        if (layer !== LAYERS.APPLICATION) {
            return false
        }

        return /use-cases?\//.test(filePath) && /[A-Z][a-z]+[A-Z]\w*\.ts$/.test(filePath)
    }
}
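The repository-interface convention above combines two path regexes: an `I`-prefixed PascalCase filename ending in `Repository.ts`, located under a `repository/` or `repositories/` folder. Applied directly (the sample paths are hypothetical):

```typescript
// Both conditions must hold: naming convention AND folder convention.
const isRepositoryInterfacePath = (filePath: string) =>
    /I[A-Z]\w*Repository\.ts$/.test(filePath) && /repositories?\//.test(filePath)

console.log(isRepositoryInterfacePath("src/domain/repositories/IUserRepository.ts")) // true
console.log(isRepositoryInterfacePath("src/infrastructure/PostgresUserRepository.ts")) // false
```

The second path fails on both counts: no `I` prefix and no `repositories/` segment, so a concrete adapter in infrastructure is never mistaken for a domain port.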
@@ -0,0 +1,285 @@
import { RepositoryViolation } from "../../domain/value-objects/RepositoryViolation"
import { LAYERS, REPOSITORY_VIOLATION_TYPES } from "../../shared/constants/rules"
import { REPOSITORY_PATTERN_MESSAGES } from "../../domain/constants/Messages"
import { OrmTypeMatcher } from "./OrmTypeMatcher"
import { MethodNameValidator } from "./MethodNameValidator"

/**
 * Detects specific repository pattern violations
 *
 * Handles detection of ORM types, non-domain methods, concrete repositories,
 * and repository instantiation violations.
 */
export class RepositoryViolationDetector {
    constructor(
        private readonly ormMatcher: OrmTypeMatcher,
        private readonly methodValidator: MethodNameValidator,
    ) {}

    /**
     * Detects ORM types in repository interface
     */
    public detectOrmTypes(
        code: string,
        filePath: string,
        layer: string | undefined,
    ): RepositoryViolation[] {
        const violations: RepositoryViolation[] = []
        const lines = code.split("\n")

        for (let i = 0; i < lines.length; i++) {
            const line = lines[i]
            const lineNumber = i + 1

            this.detectOrmInMethod(line, lineNumber, filePath, layer, violations)
            this.detectOrmInLine(line, lineNumber, filePath, layer, violations)
        }

        return violations
    }

    /**
     * Detects non-domain method names
     */
    public detectNonDomainMethods(
        code: string,
        filePath: string,
        layer: string | undefined,
    ): RepositoryViolation[] {
        const violations: RepositoryViolation[] = []
        const lines = code.split("\n")

        for (let i = 0; i < lines.length; i++) {
            const line = lines[i]
            const lineNumber = i + 1

            const methodMatch = /^\s*(\w+)\s*\(/.exec(line)

            if (methodMatch) {
                const methodName = methodMatch[1]

                if (
                    !this.methodValidator.isDomainMethodName(methodName) &&
                    !line.trim().startsWith("//")
                ) {
                    const suggestion = this.methodValidator.suggestDomainMethodName(methodName)
                    violations.push(
                        RepositoryViolation.create(
                            REPOSITORY_VIOLATION_TYPES.NON_DOMAIN_METHOD_NAME,
                            filePath,
                            layer || LAYERS.DOMAIN,
                            lineNumber,
                            `Method '${methodName}' uses technical name instead of domain language. ${suggestion}`,
                            undefined,
                            undefined,
                            methodName,
                        ),
                    )
                }
            }
        }

        return violations
    }

    /**
     * Detects concrete repository usage
     */
    public detectConcreteRepositoryUsage(
        code: string,
        filePath: string,
        layer: string | undefined,
    ): RepositoryViolation[] {
        const violations: RepositoryViolation[] = []
        const lines = code.split("\n")

        for (let i = 0; i < lines.length; i++) {
            const line = lines[i]
            const lineNumber = i + 1

            this.detectConcreteInConstructor(line, lineNumber, filePath, layer, violations)
            this.detectConcreteInField(line, lineNumber, filePath, layer, violations)
        }

        return violations
    }

    /**
     * Detects new Repository() instantiation
     */
    public detectNewInstantiation(
        code: string,
        filePath: string,
        layer: string | undefined,
    ): RepositoryViolation[] {
        const violations: RepositoryViolation[] = []
        const lines = code.split("\n")

        for (let i = 0; i < lines.length; i++) {
            const line = lines[i]
            const lineNumber = i + 1

            const newRepositoryMatch = /new\s+([A-Z]\w*Repository)\s*\(/.exec(line)

            if (newRepositoryMatch && !line.trim().startsWith("//")) {
                const repositoryName = newRepositoryMatch[1]
                violations.push(
                    RepositoryViolation.create(
                        REPOSITORY_VIOLATION_TYPES.NEW_REPOSITORY_IN_USE_CASE,
                        filePath,
                        layer || LAYERS.APPLICATION,
                        lineNumber,
                        `Use case creates repository with 'new ${repositoryName}()'`,
                        undefined,
                        repositoryName,
                    ),
                )
            }
        }

        return violations
    }

    /**
     * Detects ORM types in method signatures
     */
    private detectOrmInMethod(
        line: string,
        lineNumber: number,
        filePath: string,
        layer: string | undefined,
        violations: RepositoryViolation[],
    ): void {
        const methodMatch =
            /(\w+)\s*\([^)]*:\s*([^)]+)\)\s*:\s*.*?(?:Promise<([^>]+)>|([A-Z]\w+))/.exec(line)

        if (methodMatch) {
            const params = methodMatch[2]
            const returnType = methodMatch[3] || methodMatch[4]

            if (this.ormMatcher.isOrmType(params)) {
                const ormType = this.ormMatcher.extractOrmType(params)
                violations.push(
                    RepositoryViolation.create(
                        REPOSITORY_VIOLATION_TYPES.ORM_TYPE_IN_INTERFACE,
                        filePath,
                        layer || LAYERS.DOMAIN,
                        lineNumber,
                        `Method parameter uses ORM type: ${ormType}`,
                        ormType,
                    ),
                )
            }

            if (returnType && this.ormMatcher.isOrmType(returnType)) {
                const ormType = this.ormMatcher.extractOrmType(returnType)
                violations.push(
                    RepositoryViolation.create(
                        REPOSITORY_VIOLATION_TYPES.ORM_TYPE_IN_INTERFACE,
                        filePath,
                        layer || LAYERS.DOMAIN,
                        lineNumber,
                        `Method return type uses ORM type: ${ormType}`,
                        ormType,
                    ),
                )
            }
        }
    }

    /**
     * Detects ORM types in general code line
     */
    private detectOrmInLine(
        line: string,
        lineNumber: number,
        filePath: string,
        layer: string | undefined,
        violations: RepositoryViolation[],
    ): void {
        if (this.ormMatcher.isOrmType(line) && !line.trim().startsWith("//")) {
            const ormType = this.ormMatcher.extractOrmType(line)
            violations.push(
                RepositoryViolation.create(
                    REPOSITORY_VIOLATION_TYPES.ORM_TYPE_IN_INTERFACE,
                    filePath,
                    layer || LAYERS.DOMAIN,
                    lineNumber,
                    `Repository interface contains ORM-specific type: ${ormType}`,
                    ormType,
                ),
            )
        }
    }

    /**
     * Detects concrete repository in constructor
     */
    private detectConcreteInConstructor(
        line: string,
        lineNumber: number,
        filePath: string,
        layer: string | undefined,
        violations: RepositoryViolation[],
    ): void {
        const constructorParamMatch =
            /constructor\s*\([^)]*(?:private|public|protected)\s+(?:readonly\s+)?(\w+)\s*:\s*([A-Z]\w*Repository)/.exec(
                line,
            )

        if (constructorParamMatch) {
            const repositoryType = constructorParamMatch[2]

            if (!repositoryType.startsWith("I")) {
                violations.push(
                    RepositoryViolation.create(
                        REPOSITORY_VIOLATION_TYPES.CONCRETE_REPOSITORY_IN_USE_CASE,
|
||||||
|
filePath,
|
||||||
|
layer || LAYERS.APPLICATION,
|
||||||
|
lineNumber,
|
||||||
|
`Use case depends on concrete repository '${repositoryType}'`,
|
||||||
|
undefined,
|
||||||
|
repositoryType,
|
||||||
|
),
|
||||||
|
)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Detects concrete repository in field
|
||||||
|
*/
|
||||||
|
private detectConcreteInField(
|
||||||
|
line: string,
|
||||||
|
lineNumber: number,
|
||||||
|
filePath: string,
|
||||||
|
layer: string | undefined,
|
||||||
|
violations: RepositoryViolation[],
|
||||||
|
): void {
|
||||||
|
const fieldMatch =
|
||||||
|
/(?:private|public|protected)\s+(?:readonly\s+)?(\w+)\s*:\s*([A-Z]\w*Repository)/.exec(
|
||||||
|
line,
|
||||||
|
)
|
||||||
|
|
||||||
|
if (fieldMatch) {
|
||||||
|
const repositoryType = fieldMatch[2]
|
||||||
|
|
||||||
|
if (
|
||||||
|
!repositoryType.startsWith("I") &&
|
||||||
|
!line.includes(REPOSITORY_PATTERN_MESSAGES.CONSTRUCTOR)
|
||||||
|
) {
|
||||||
|
violations.push(
|
||||||
|
RepositoryViolation.create(
|
||||||
|
REPOSITORY_VIOLATION_TYPES.CONCRETE_REPOSITORY_IN_USE_CASE,
|
||||||
|
filePath,
|
||||||
|
layer || LAYERS.APPLICATION,
|
||||||
|
lineNumber,
|
||||||
|
`Use case field uses concrete repository '${repositoryType}'`,
|
||||||
|
undefined,
|
||||||
|
repositoryType,
|
||||||
|
),
|
||||||
|
)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
@@ -86,6 +86,7 @@ export const SEVERITY_ORDER: Record<SeverityLevel, number> = {
  * Violation type to severity mapping
  */
 export const VIOLATION_SEVERITY_MAP = {
+    SECRET_EXPOSURE: SEVERITY_LEVELS.CRITICAL,
     CIRCULAR_DEPENDENCY: SEVERITY_LEVELS.CRITICAL,
     REPOSITORY_PATTERN: SEVERITY_LEVELS.CRITICAL,
     AGGREGATE_BOUNDARY: SEVERITY_LEVELS.CRITICAL,
@@ -11,6 +11,7 @@ export const RULES = {
     DEPENDENCY_DIRECTION: "dependency-direction",
     REPOSITORY_PATTERN: "repository-pattern",
     AGGREGATE_BOUNDARY: "aggregate-boundary",
+    SECRET_EXPOSURE: "secret-exposure",
 } as const
 
 /**
@@ -102,32 +103,35 @@ export const NAMING_PATTERNS = {
  * Common verbs for use cases
  */
 export const USE_CASE_VERBS = [
+    "Aggregate",
     "Analyze",
-    "Create",
-    "Update",
-    "Delete",
-    "Get",
-    "Find",
-    "List",
-    "Search",
-    "Validate",
-    "Calculate",
-    "Generate",
-    "Send",
-    "Fetch",
-    "Process",
-    "Execute",
-    "Handle",
-    "Register",
+    "Approve",
     "Authenticate",
     "Authorize",
-    "Import",
-    "Export",
-    "Place",
+    "Calculate",
     "Cancel",
-    "Approve",
-    "Reject",
+    "Collect",
     "Confirm",
+    "Create",
+    "Delete",
+    "Execute",
+    "Export",
+    "Fetch",
+    "Find",
+    "Generate",
+    "Get",
+    "Handle",
+    "Import",
+    "List",
+    "Parse",
+    "Place",
+    "Process",
+    "Register",
+    "Reject",
+    "Search",
+    "Send",
+    "Update",
+    "Validate",
 ] as const
 
 /**
320 packages/guardian/tests/unit/domain/SecretViolation.test.ts Normal file
@@ -0,0 +1,320 @@
import { describe, it, expect } from "vitest"
import { SecretViolation } from "../../../src/domain/value-objects/SecretViolation"

describe("SecretViolation", () => {
    describe("create", () => {
        it("should create a secret violation with all properties", () => {
            const violation = SecretViolation.create(
                "src/config/aws.ts",
                10,
                15,
                "AWS Access Key",
                "AKIA1234567890ABCDEF",
            )

            expect(violation.file).toBe("src/config/aws.ts")
            expect(violation.line).toBe(10)
            expect(violation.column).toBe(15)
            expect(violation.secretType).toBe("AWS Access Key")
            expect(violation.matchedPattern).toBe("AKIA1234567890ABCDEF")
        })

        it("should create a secret violation with GitHub token", () => {
            const violation = SecretViolation.create(
                "src/config/github.ts",
                5,
                20,
                "GitHub Personal Access Token",
                "ghp_1234567890abcdefghijklmnopqrstuv",
            )

            expect(violation.secretType).toBe("GitHub Personal Access Token")
            expect(violation.file).toBe("src/config/github.ts")
        })

        it("should create a secret violation with NPM token", () => {
            const violation = SecretViolation.create(".npmrc", 1, 1, "NPM Token", "npm_abc123xyz")

            expect(violation.secretType).toBe("NPM Token")
        })
    })

    describe("getters", () => {
        it("should return file path", () => {
            const violation = SecretViolation.create(
                "src/config/aws.ts",
                10,
                15,
                "AWS Access Key",
                "test",
            )

            expect(violation.file).toBe("src/config/aws.ts")
        })

        it("should return line number", () => {
            const violation = SecretViolation.create(
                "src/config/aws.ts",
                10,
                15,
                "AWS Access Key",
                "test",
            )

            expect(violation.line).toBe(10)
        })

        it("should return column number", () => {
            const violation = SecretViolation.create(
                "src/config/aws.ts",
                10,
                15,
                "AWS Access Key",
                "test",
            )

            expect(violation.column).toBe(15)
        })

        it("should return secret type", () => {
            const violation = SecretViolation.create(
                "src/config/aws.ts",
                10,
                15,
                "AWS Access Key",
                "test",
            )

            expect(violation.secretType).toBe("AWS Access Key")
        })

        it("should return matched pattern", () => {
            const violation = SecretViolation.create(
                "src/config/aws.ts",
                10,
                15,
                "AWS Access Key",
                "AKIA1234567890ABCDEF",
            )

            expect(violation.matchedPattern).toBe("AKIA1234567890ABCDEF")
        })
    })

    describe("getMessage", () => {
        it("should return formatted message for AWS Access Key", () => {
            const violation = SecretViolation.create(
                "src/config/aws.ts",
                10,
                15,
                "AWS Access Key",
                "test",
            )

            expect(violation.getMessage()).toBe("Hardcoded AWS Access Key detected")
        })

        it("should return formatted message for GitHub token", () => {
            const violation = SecretViolation.create(
                "src/config/github.ts",
                5,
                20,
                "GitHub Token",
                "test",
            )

            expect(violation.getMessage()).toBe("Hardcoded GitHub Token detected")
        })

        it("should return formatted message for NPM token", () => {
            const violation = SecretViolation.create(".npmrc", 1, 1, "NPM Token", "test")

            expect(violation.getMessage()).toBe("Hardcoded NPM Token detected")
        })
    })

    describe("getSuggestion", () => {
        it("should return multi-line suggestion", () => {
            const violation = SecretViolation.create(
                "src/config/aws.ts",
                10,
                15,
                "AWS Access Key",
                "test",
            )

            const suggestion = violation.getSuggestion()

            expect(suggestion).toContain("1. Use environment variables")
            expect(suggestion).toContain("2. Use secret management services")
            expect(suggestion).toContain("3. Never commit secrets")
            expect(suggestion).toContain("4. If secret was committed, rotate it immediately")
            expect(suggestion).toContain("5. Add secret files to .gitignore")
        })

        it("should return the same suggestion for all secret types", () => {
            const awsViolation = SecretViolation.create(
                "src/config/aws.ts",
                10,
                15,
                "AWS Access Key",
                "test",
            )

            const githubViolation = SecretViolation.create(
                "src/config/github.ts",
                5,
                20,
                "GitHub Token",
                "test",
            )

            expect(awsViolation.getSuggestion()).toBe(githubViolation.getSuggestion())
        })
    })

    describe("getExampleFix", () => {
        it("should return AWS-specific example for AWS Access Key", () => {
            const violation = SecretViolation.create(
                "src/config/aws.ts",
                10,
                15,
                "AWS Access Key",
                "test",
            )

            const example = violation.getExampleFix()

            expect(example).toContain("AWS")
            expect(example).toContain("process.env.AWS_ACCESS_KEY_ID")
            expect(example).toContain("credentials provider")
        })

        it("should return GitHub-specific example for GitHub token", () => {
            const violation = SecretViolation.create(
                "src/config/github.ts",
                5,
                20,
                "GitHub Token",
                "test",
            )

            const example = violation.getExampleFix()

            expect(example).toContain("GitHub")
            expect(example).toContain("process.env.GITHUB_TOKEN")
            expect(example).toContain("GitHub Apps")
        })

        it("should return NPM-specific example for NPM token", () => {
            const violation = SecretViolation.create(".npmrc", 1, 1, "NPM Token", "test")

            const example = violation.getExampleFix()

            expect(example).toContain("NPM")
            expect(example).toContain(".npmrc")
            expect(example).toContain("process.env.NPM_TOKEN")
        })

        it("should return SSH-specific example for SSH Private Key", () => {
            const violation = SecretViolation.create(
                "src/config/ssh.ts",
                1,
                1,
                "SSH Private Key",
                "test",
            )

            const example = violation.getExampleFix()

            expect(example).toContain("SSH")
            expect(example).toContain("readFileSync")
            expect(example).toContain("SSH_KEY_PATH")
        })

        it("should return SSH RSA-specific example for SSH RSA Private Key", () => {
            const violation = SecretViolation.create(
                "src/config/ssh.ts",
                1,
                1,
                "SSH RSA Private Key",
                "test",
            )

            const example = violation.getExampleFix()

            expect(example).toContain("SSH")
            expect(example).toContain("RSA PRIVATE KEY")
        })

        it("should return Slack-specific example for Slack token", () => {
            const violation = SecretViolation.create(
                "src/config/slack.ts",
                1,
                1,
                "Slack Bot Token",
                "test",
            )

            const example = violation.getExampleFix()

            expect(example).toContain("Slack")
            expect(example).toContain("process.env.SLACK_BOT_TOKEN")
        })

        it("should return API Key example for generic API key", () => {
            const violation = SecretViolation.create("src/config/api.ts", 1, 1, "API Key", "test")

            const example = violation.getExampleFix()

            expect(example).toContain("API")
            expect(example).toContain("process.env.API_KEY")
            expect(example).toContain("secret management service")
        })

        it("should return generic example for unknown secret type", () => {
            const violation = SecretViolation.create(
                "src/config/unknown.ts",
                1,
                1,
                "Unknown Secret",
                "test",
            )

            const example = violation.getExampleFix()

            expect(example).toContain("process.env.SECRET_KEY")
            expect(example).toContain("secret management")
        })
    })

    describe("getSeverity", () => {
        it("should always return critical severity", () => {
            const violation = SecretViolation.create(
                "src/config/aws.ts",
                10,
                15,
                "AWS Access Key",
                "test",
            )

            expect(violation.getSeverity()).toBe("critical")
        })

        it("should return critical severity for all secret types", () => {
            const types = [
                "AWS Access Key",
                "GitHub Token",
                "NPM Token",
                "SSH Private Key",
                "Slack Token",
                "API Key",
            ]

            types.forEach((type) => {
                const violation = SecretViolation.create("test.ts", 1, 1, type, "test")
                expect(violation.getSeverity()).toBe("critical")
            })
        })
    })
})
@@ -0,0 +1,277 @@
import { describe, it, expect, beforeEach } from "vitest"
import { SecretDetector } from "../../../src/infrastructure/analyzers/SecretDetector"

describe("SecretDetector", () => {
    let detector: SecretDetector

    beforeEach(() => {
        detector = new SecretDetector()
    })

    describe("detectAll", () => {
        it("should return empty array for code without secrets", async () => {
            const code = `
                const greeting = "Hello World"
                const count = 42
                function test() {
                    return true
                }
            `

            const violations = await detector.detectAll(code, "test.ts")

            expect(violations).toHaveLength(0)
        })

        it("should return empty array for normal environment variable usage", async () => {
            const code = `
                const apiKey = process.env.API_KEY
                const dbUrl = process.env.DATABASE_URL
            `

            const violations = await detector.detectAll(code, "config.ts")

            expect(violations).toHaveLength(0)
        })

        it("should handle empty code", async () => {
            const violations = await detector.detectAll("", "empty.ts")

            expect(violations).toHaveLength(0)
        })

        it("should handle code with only comments", async () => {
            const code = `
                // This is a comment
                /* Multi-line
                   comment */
            `

            const violations = await detector.detectAll(code, "comments.ts")

            expect(violations).toHaveLength(0)
        })

        it("should handle multiline strings without secrets", async () => {
            const code = `
                const template = \`
                    Hello World
                    This is a test
                    No secrets here
                \`
            `

            const violations = await detector.detectAll(code, "template.ts")

            expect(violations).toHaveLength(0)
        })

        it("should handle code with URLs", async () => {
            const code = `
                const apiUrl = "https://api.example.com/v1"
                const websiteUrl = "http://localhost:3000"
            `

            const violations = await detector.detectAll(code, "urls.ts")

            expect(violations).toHaveLength(0)
        })

        it("should handle imports and requires", async () => {
            const code = `
                import { something } from "some-package"
                const fs = require('fs')
            `

            const violations = await detector.detectAll(code, "imports.ts")

            expect(violations).toHaveLength(0)
        })

        it("should return violations with correct file path", async () => {
            const code = `const secret = "test-secret-value"`
            const filePath = "src/config/secrets.ts"

            const violations = await detector.detectAll(code, filePath)

            violations.forEach((v) => {
                expect(v.file).toBe(filePath)
            })
        })

        it("should handle .js files", async () => {
            const code = `const test = "value"`

            const violations = await detector.detectAll(code, "test.js")

            expect(violations).toBeInstanceOf(Array)
        })

        it("should handle .jsx files", async () => {
            const code = `const Component = () => <div>Test</div>`

            const violations = await detector.detectAll(code, "Component.jsx")

            expect(violations).toBeInstanceOf(Array)
        })

        it("should handle .tsx files", async () => {
            const code = `const Component: React.FC = () => <div>Test</div>`

            const violations = await detector.detectAll(code, "Component.tsx")

            expect(violations).toBeInstanceOf(Array)
        })

        it("should handle errors gracefully", async () => {
            const code = null as unknown as string

            const violations = await detector.detectAll(code, "test.ts")

            expect(violations).toHaveLength(0)
        })

        it("should handle malformed code gracefully", async () => {
            const code = "const = = ="

            const violations = await detector.detectAll(code, "malformed.ts")

            expect(violations).toBeInstanceOf(Array)
        })
    })

    describe("parseOutputToViolations", () => {
        it("should parse empty output", async () => {
            const code = ""

            const violations = await detector.detectAll(code, "test.ts")

            expect(violations).toHaveLength(0)
        })

        it("should handle whitespace-only output", async () => {
            const code = "   \n   \n   "

            const violations = await detector.detectAll(code, "test.ts")

            expect(violations).toHaveLength(0)
        })
    })

    describe("extractSecretType", () => {
        it("should handle various secret types correctly", async () => {
            const code = `const value = "test"`

            const violations = await detector.detectAll(code, "test.ts")

            violations.forEach((v) => {
                expect(v.secretType).toBeTruthy()
                expect(typeof v.secretType).toBe("string")
                expect(v.secretType.length).toBeGreaterThan(0)
            })
        })
    })

    describe("integration", () => {
        it("should work with TypeScript code", async () => {
            const code = `
                interface Config {
                    apiKey: string
                }

                const config: Config = {
                    apiKey: process.env.API_KEY || "default"
                }
            `

            const violations = await detector.detectAll(code, "config.ts")

            expect(violations).toBeInstanceOf(Array)
        })

        it("should work with ES6+ syntax", async () => {
            const code = `
                const fetchData = async () => {
                    const response = await fetch(url)
                    return response.json()
                }

                const [data, setData] = useState(null)
            `

            const violations = await detector.detectAll(code, "hooks.ts")

            expect(violations).toBeInstanceOf(Array)
        })

        it("should work with JSX/TSX", async () => {
            const code = `
                export const Button = ({ onClick }: Props) => {
                    return <button onClick={onClick}>Click me</button>
                }
            `

            const violations = await detector.detectAll(code, "Button.tsx")

            expect(violations).toBeInstanceOf(Array)
        })

        it("should handle concurrent detections", async () => {
            const code1 = "const test1 = 'value1'"
            const code2 = "const test2 = 'value2'"
            const code3 = "const test3 = 'value3'"

            const [result1, result2, result3] = await Promise.all([
                detector.detectAll(code1, "file1.ts"),
                detector.detectAll(code2, "file2.ts"),
                detector.detectAll(code3, "file3.ts"),
            ])

            expect(result1).toBeInstanceOf(Array)
            expect(result2).toBeInstanceOf(Array)
            expect(result3).toBeInstanceOf(Array)
        })
    })

    describe("edge cases", () => {
        it("should handle very long code", async () => {
            const longCode = "const value = 'test'\n".repeat(1000)

            const violations = await detector.detectAll(longCode, "long.ts")

            expect(violations).toBeInstanceOf(Array)
        })

        it("should handle special characters in code", async () => {
            const code = `
                const special = "!@#$%^&*()_+-=[]{}|;:',.<>?"
                const unicode = "日本語 🚀"
            `

            const violations = await detector.detectAll(code, "special.ts")

            expect(violations).toBeInstanceOf(Array)
        })

        it("should handle code with regex patterns", async () => {
            const code = `
                const pattern = /^[A-Z0-9._%+-]+@[A-Z0-9.-]+\\.[A-Z]{2,}$/i
                const matches = text.match(pattern)
            `

            const violations = await detector.detectAll(code, "regex.ts")

            expect(violations).toBeInstanceOf(Array)
        })

        it("should handle code with template literals", async () => {
            const code = `
                const message = \`Hello \${name}, your balance is \${balance}\`
            `

            const violations = await detector.detectAll(code, "template.ts")

            expect(violations).toBeInstanceOf(Array)
        })
    })
})
315 pnpm-lock.yaml generated
@@ -80,6 +80,18 @@ importers:
 
   packages/guardian:
     dependencies:
+      '@secretlint/core':
+        specifier: ^11.2.5
+        version: 11.2.5
+      '@secretlint/node':
+        specifier: ^11.2.5
+        version: 11.2.5
+      '@secretlint/secretlint-rule-preset-recommend':
+        specifier: ^11.2.5
+        version: 11.2.5
+      '@secretlint/types':
+        specifier: ^11.2.5
+        version: 11.2.5
       commander:
         specifier: ^12.1.0
         version: 12.1.0
@@ -154,6 +166,12 @@ packages:
     resolution: {integrity: sha512-J4Jarr0SohdrHcb40gTL4wGPCQ952IMWF1G/MSAQfBAPvA9ZKApYhpxcY7PmehVePve+ujpus1dGsJ7dPxz8Kg==}
     engines: {node: ^18.19.1 || ^20.11.1 || >=22.0.0, npm: ^6.11.0 || ^7.5.6 || >=8.0.0, yarn: '>= 1.13.0'}
 
+  '@azu/format-text@1.0.2':
+    resolution: {integrity: sha512-Swi4N7Edy1Eqq82GxgEECXSSLyn6GOb5htRFPzBDdUkECGXtlf12ynO5oJSpWKPwCaUssOu7NfhDcCWpIC6Ywg==}
+
+  '@azu/style-format@1.0.1':
+    resolution: {integrity: sha512-AHcTojlNBdD/3/KxIKlg8sxIWHfOtQszLvOpagLTO+bjC3u7SAszu1lf//u7JJC50aUSH+BVWDD/KvaA6Gfn5g==}
+
   '@babel/code-frame@7.27.1':
     resolution: {integrity: sha512-cjQ7ZlQ0Mv3b47hABuTevyTuYN4i+loJKGeV9flcCgIK37cCXRh+L1bd3iBHlynerhQ7BhCkn2BPbQUL+rGqFg==}
     engines: {node: '>=6.9.0'}
@@ -1040,6 +1058,40 @@ packages:
     cpu: [x64]
     os: [win32]
 
+  '@secretlint/config-loader@11.2.5':
+    resolution: {integrity: sha512-pUiH5xc3x8RLEDq+0dCz65v4kohtfp68I7qmYPuymTwHodzjyJ089ZbNdN1ZX8SZV4xZLQsFIrRLn1lJ55QyyQ==}
+    engines: {node: '>=20.0.0'}
+
+  '@secretlint/core@11.2.5':
+    resolution: {integrity: sha512-PZNpBd6+KVya2tA3o1oC2kTWYKju8lZG9phXyQY7geWKf+a+fInN4/HSYfCQS495oyTSjhc9qI0mNQEw83PY2Q==}
+    engines: {node: '>=20.0.0'}
+
+  '@secretlint/formatter@11.2.5':
+    resolution: {integrity: sha512-9XBMeveo1eKXMC9zLjA6nd2lb5JjUgjj8NUpCo1Il8jO4YJ12k7qXZk3T/QJup+Kh0ThpHO03D9C1xLDIPIEPQ==}
+    engines: {node: '>=20.0.0'}
+
+  '@secretlint/node@11.2.5':
+    resolution: {integrity: sha512-nPdtUsTzDzBJzFiKh80/H5+2ZRRogtDuHhnNiGtF7LSHp8YsQHU5piAVbESdV0AmUjbWijAjscIsWqvtU+2JUQ==}
+    engines: {node: '>=20.0.0'}
+
+  '@secretlint/profiler@11.2.5':
+    resolution: {integrity: sha512-evQ2PeO3Ub0apWIPaXJy8lMDO1OFgvgQhZd+MhYLcLHgR559EtJ9V02Sh5c10wTLkLAtJ+czlJg2kmlt0nm8fw==}
+
+  '@secretlint/resolver@11.2.5':
+    resolution: {integrity: sha512-Zn9+Gj7cRNjEDX8d1NYZNjTG9/Wjlc8N+JvARFYYYu6JxfbtkabhFxzwxBLkRZ2ZCkPCCnuXJwepcgfVXSPsng==}
+
+  '@secretlint/secretlint-rule-preset-recommend@11.2.5':
+    resolution: {integrity: sha512-FAnp/dPdbvHEw50aF9JMPF/OwW58ULvVXEsk+mXTtBD09VJZhG0vFum8WzxMbB98Eo4xDddGzYtE3g27pBOaQA==}
+    engines: {node: '>=20.0.0'}
+
+  '@secretlint/source-creator@11.2.5':
+    resolution: {integrity: sha512-+ApoNDS4uIaLb2PG9PPEP9Zu1HDBWpxSd/+Qlb3MzKTwp2BG9sbUhvpGgxuIHFn7pMWQU60DhzYJJUBpbXZEHQ==}
+    engines: {node: '>=20.0.0'}
+
+  '@secretlint/types@11.2.5':
+    resolution: {integrity: sha512-iA7E+uXuiEydOwv8glEYM4tCHnl8C7wTgLxg+3upHhH/iSSnefWfoRqrJwVBhwxPg4MDoypVI7Oal7bX7/ne+w==}
+    engines: {node: '>=20.0.0'}
+
   '@sinclair/typebox@0.34.41':
     resolution: {integrity: sha512-6gS8pZzSXdyRHTIqoqSVknxolr1kzfy4/CeDnrzsVz8TTIWUbOBr6gnzOmTYJ3eXQNh4IYHIGi5aIL7sOZ2G/g==}
 
@@ -1052,6 +1104,21 @@ packages:
   '@standard-schema/spec@1.0.0':
     resolution: {integrity: sha512-m2bOd0f2RT9k8QJx1JN85cZYyH1RqFBdlwtkSlf4tBDYLCiiZnv1fIIwacK6cqwXavOydf0NPToMQgpKq+dVlA==}
 
+  '@textlint/ast-node-types@15.4.0':
+    resolution: {integrity: sha512-IqY8i7IOGuvy05wZxISB7Me1ZyrvhaQGgx6DavfQjH3cfwpPFdDbDYmMXMuSv2xLS1kDB1kYKBV7fL2Vi16lRA==}
+
+  '@textlint/linter-formatter@15.4.0':
+    resolution: {integrity: sha512-rfqOZmnI1Wwc/Pa4LK+vagvVPmvxf9oRsBRqIOB04DwhucingZyAIJI/TyG18DIDYbP2aFXBZ3oOvyAxHe/8PQ==}
+
+  '@textlint/module-interop@15.4.0':
+    resolution: {integrity: sha512-uGf+SFIfzOLCbZI0gp+2NLsrkSArsvEWulPP6lJuKp7yRHadmy7Xf/YHORe46qhNyyxc8PiAfiixHJSaHGUrGg==}
+
+  '@textlint/resolver@15.4.0':
+    resolution: {integrity: sha512-Vh/QceKZQHFJFG4GxxIsKM1Xhwv93mbtKHmFE5/ybal1mIKHdqF03Z9Guaqt6Sx/AeNUshq0hkMOEhEyEWnehQ==}
+
+  '@textlint/types@15.4.0':
+    resolution: {integrity: sha512-ZMwJgw/xjxJufOD+IB7I2Enl9Si4Hxo04B76RwUZ5cKBKzOPcmd6WvGe2F7jqdgmTdGnfMU+Bo/joQrjPNIWqg==}
+
   '@tokenizer/inflate@0.3.1':
     resolution: {integrity: sha512-4oeoZEBQdLdt5WmP/hx1KZ6D3/Oid/0cUb2nk4F0pTDAWy+KCH3/EnAkZF/bvckWo8I33EqBm01lIPgmgc8rCA==}
     engines: {node: '>=18'}
@@ -1488,6 +1555,10 @@ packages:
     resolution: {integrity: sha512-gKXj5ALrKWQLsYG9jlTRmR/xKluxHV+Z9QEwNIgCfM1/uwPMCuzVVnh5mwTd+OuBZcwSIMbqssNWRm1lE51QaQ==}
     engines: {node: '>=8'}
 
+  ansi-escapes@7.2.0:
+    resolution: {integrity: sha512-g6LhBsl+GBPRWGWsBtutpzBYuIIdBkLEvad5C/va/74Db018+5TZiyA26cZJAr3Rft5lprVqOIPxf5Vid6tqAw==}
+    engines: {node: '>=18'}
+
   ansi-regex@5.0.1:
     resolution: {integrity: sha512-quJQXlTSUGL2LH9SUXo8VwsY4soanhgo6LNSm84E1LBcE8s3O0wpdiRzyR9z/ZZJMlMWv37qOOb9pdJlMUEKFQ==}
     engines: {node: '>=8'}
@@ -1538,6 +1609,10 @@ packages:
   ast-v8-to-istanbul@0.3.8:
     resolution: {integrity: sha512-szgSZqUxI5T8mLKvS7WTjF9is+MVbOeLADU73IseOcrqhxr/VAvy6wfoVE39KnKzA7JRhjF5eUagNlHwvZPlKQ==}
 
+  astral-regex@2.0.0:
+    resolution: {integrity: sha512-Z7tMw1ytTXt5jqMcOP+OQteU1VuNK9Y02uuJtKQ1Sv69jXQKKg5cibLwGJow8yzZP+eAc18EmLGPal0bp36rvQ==}
+    engines: {node: '>=8'}
+
   asynckit@0.4.0:
     resolution: {integrity: sha512-Oei9OH4tRh0YqU3GxhX79dM/mwVgvbZJaSNaRk+bshkj0S5cfHcgYakreBjrHwatXKbz+IoIdYLxrKim2MjW0Q==}
 
@@ -1576,9 +1651,16 @@ packages:
     resolution: {integrity: sha512-a28v2eWrrRWPpJSzxc+mKwm0ZtVx/G8SepdQZDArnXYU/XS+IF6mp8aB/4E+hH1tyGCoDo3KlUCdlSxGDsRkAw==}
|
resolution: {integrity: sha512-a28v2eWrrRWPpJSzxc+mKwm0ZtVx/G8SepdQZDArnXYU/XS+IF6mp8aB/4E+hH1tyGCoDo3KlUCdlSxGDsRkAw==}
|
||||||
hasBin: true
|
hasBin: true
|
||||||
|
|
||||||
|
binaryextensions@6.11.0:
|
||||||
|
resolution: {integrity: sha512-sXnYK/Ij80TO3lcqZVV2YgfKN5QjUWIRk/XSm2J/4bd/lPko3lvk0O4ZppH6m+6hB2/GTu+ptNwVFe1xh+QLQw==}
|
||||||
|
engines: {node: '>=4'}
|
||||||
|
|
||||||
bl@4.1.0:
|
bl@4.1.0:
|
||||||
resolution: {integrity: sha512-1W07cM9gS6DcLperZfFSj+bWLtaPGSOHWhPiGzXmvVJbRLdG82sH/Kn8EtW1VqWVA54AKf2h5k5BbnIbwF3h6w==}
|
resolution: {integrity: sha512-1W07cM9gS6DcLperZfFSj+bWLtaPGSOHWhPiGzXmvVJbRLdG82sH/Kn8EtW1VqWVA54AKf2h5k5BbnIbwF3h6w==}
|
||||||
|
|
||||||
|
boundary@2.0.0:
|
||||||
|
resolution: {integrity: sha512-rJKn5ooC9u8q13IMCrW0RSp31pxBCHE3y9V/tp3TdWSLf8Em3p6Di4NBpfzbJge9YjjFEsD0RtFEjtvHL5VyEA==}
|
||||||
|
|
||||||
brace-expansion@1.1.12:
|
brace-expansion@1.1.12:
|
||||||
resolution: {integrity: sha512-9T9UjW3r0UW5c1Q7GTwllptXwhvYmEzFhzMfZ9H7FQWt+uZePjZPjBP/W1ZEyZ1twGWom5/56TF4lPcqjnDHcg==}
|
resolution: {integrity: sha512-9T9UjW3r0UW5c1Q7GTwllptXwhvYmEzFhzMfZ9H7FQWt+uZePjZPjBP/W1ZEyZ1twGWom5/56TF4lPcqjnDHcg==}
|
||||||
|
|
||||||
@@ -1638,6 +1720,10 @@ packages:
|
|||||||
resolution: {integrity: sha512-oKnbhFyRIXpUuez8iBMmyEa4nbj4IOQyuhc/wy9kY7/WVPcwIO9VA668Pu8RkO7+0G76SLROeyw9CpQ061i4mA==}
|
resolution: {integrity: sha512-oKnbhFyRIXpUuez8iBMmyEa4nbj4IOQyuhc/wy9kY7/WVPcwIO9VA668Pu8RkO7+0G76SLROeyw9CpQ061i4mA==}
|
||||||
engines: {node: '>=10'}
|
engines: {node: '>=10'}
|
||||||
|
|
||||||
|
chalk@5.6.2:
|
||||||
|
resolution: {integrity: sha512-7NzBL0rN6fMUW+f7A6Io4h40qQlG+xGmtMxfbnH/K7TAtt8JQWVQK+6g0UXKMeVJoyV5EkkNsErQ8pVD3bLHbA==}
|
||||||
|
engines: {node: ^12.17.0 || ^14.13 || >=16.0.0}
|
||||||
|
|
||||||
char-regex@1.0.2:
|
char-regex@1.0.2:
|
||||||
resolution: {integrity: sha512-kWWXztvZ5SBQV+eRgKFeh8q5sLuZY2+8WUIzlxWVTg+oGwY14qylx1KbKzHd8P6ZYkAg0xyIDU9JMHhyJMZ1jw==}
|
resolution: {integrity: sha512-kWWXztvZ5SBQV+eRgKFeh8q5sLuZY2+8WUIzlxWVTg+oGwY14qylx1KbKzHd8P6ZYkAg0xyIDU9JMHhyJMZ1jw==}
|
||||||
engines: {node: '>=10'}
|
engines: {node: '>=10'}
|
||||||
@@ -1801,6 +1887,10 @@ packages:
|
|||||||
eastasianwidth@0.2.0:
|
eastasianwidth@0.2.0:
|
||||||
resolution: {integrity: sha512-I88TYZWc9XiYHRQ4/3c5rjjfgkjhLyW2luGIheGERbNQ6OY7yTybanSpDXZa8y7VUP9YmDcYa+eyq4ca7iLqWA==}
|
resolution: {integrity: sha512-I88TYZWc9XiYHRQ4/3c5rjjfgkjhLyW2luGIheGERbNQ6OY7yTybanSpDXZa8y7VUP9YmDcYa+eyq4ca7iLqWA==}
|
||||||
|
|
||||||
|
editions@6.22.0:
|
||||||
|
resolution: {integrity: sha512-UgGlf8IW75je7HZjNDpJdCv4cGJWIi6yumFdZ0R7A8/CIhQiWUjyGLCxdHpd8bmyD1gnkfUNK0oeOXqUS2cpfQ==}
|
||||||
|
engines: {ecmascript: '>= es5', node: '>=4'}
|
||||||
|
|
||||||
electron-to-chromium@1.5.259:
|
electron-to-chromium@1.5.259:
|
||||||
resolution: {integrity: sha512-I+oLXgpEJzD6Cwuwt1gYjxsDmu/S/Kd41mmLA3O+/uH2pFRO/DvOjUyGozL8j3KeLV6WyZ7ssPwELMsXCcsJAQ==}
|
resolution: {integrity: sha512-I+oLXgpEJzD6Cwuwt1gYjxsDmu/S/Kd41mmLA3O+/uH2pFRO/DvOjUyGozL8j3KeLV6WyZ7ssPwELMsXCcsJAQ==}
|
||||||
|
|
||||||
@@ -1818,6 +1908,10 @@ packages:
|
|||||||
resolution: {integrity: sha512-d4lC8xfavMeBjzGr2vECC3fsGXziXZQyJxD868h2M/mBI3PwAuODxAkLkq5HYuvrPYcUtiLzsTo8U3PgX3Ocww==}
|
resolution: {integrity: sha512-d4lC8xfavMeBjzGr2vECC3fsGXziXZQyJxD868h2M/mBI3PwAuODxAkLkq5HYuvrPYcUtiLzsTo8U3PgX3Ocww==}
|
||||||
engines: {node: '>=10.13.0'}
|
engines: {node: '>=10.13.0'}
|
||||||
|
|
||||||
|
environment@1.1.0:
|
||||||
|
resolution: {integrity: sha512-xUtoPkMggbz0MPyPiIWr1Kp4aeWJjDZ6SMvURhimjdZgsRuDplF5/s9hcgGhyXMhs+6vpnuoiZ2kFiu3FMnS8Q==}
|
||||||
|
engines: {node: '>=18'}
|
||||||
|
|
||||||
error-ex@1.3.4:
|
error-ex@1.3.4:
|
||||||
resolution: {integrity: sha512-sqQamAnR14VgCr1A618A3sGrygcpK+HEbenA/HiEAkkUwcZIIB/tgWqHFxWgOyDh4nB4JCRimh79dR5Ywc9MDQ==}
|
resolution: {integrity: sha512-sqQamAnR14VgCr1A618A3sGrygcpK+HEbenA/HiEAkkUwcZIIB/tgWqHFxWgOyDh4nB4JCRimh79dR5Ywc9MDQ==}
|
||||||
|
|
||||||
@@ -2249,6 +2343,10 @@ packages:
|
|||||||
resolution: {integrity: sha512-HGYWWS/ehqTV3xN10i23tkPkpH46MLCIMFNCaaKNavAXTF1RkqxawEPtnjnGZ6XKSInBKkiOA5BKS+aZiY3AvA==}
|
resolution: {integrity: sha512-HGYWWS/ehqTV3xN10i23tkPkpH46MLCIMFNCaaKNavAXTF1RkqxawEPtnjnGZ6XKSInBKkiOA5BKS+aZiY3AvA==}
|
||||||
engines: {node: '>=8'}
|
engines: {node: '>=8'}
|
||||||
|
|
||||||
|
istextorbinary@9.5.0:
|
||||||
|
resolution: {integrity: sha512-5mbUj3SiZXCuRf9fT3ibzbSSEWiy63gFfksmGfdOzujPjW3k+z8WvIBxcJHBoQNlaZaiyB25deviif2+osLmLw==}
|
||||||
|
engines: {node: '>=4'}
|
||||||
|
|
||||||
iterare@1.2.1:
|
iterare@1.2.1:
|
||||||
resolution: {integrity: sha512-RKYVTCjAnRthyJes037NX/IiqeidgN1xc3j1RjFfECFp28A1GVwK9nA+i0rJPaHqSZwygLzRnFlzUuHFoWWy+Q==}
|
resolution: {integrity: sha512-RKYVTCjAnRthyJes037NX/IiqeidgN1xc3j1RjFfECFp28A1GVwK9nA+i0rJPaHqSZwygLzRnFlzUuHFoWWy+Q==}
|
||||||
engines: {node: '>=6'}
|
engines: {node: '>=6'}
|
||||||
@@ -2473,6 +2571,9 @@ packages:
|
|||||||
lodash.merge@4.6.2:
|
lodash.merge@4.6.2:
|
||||||
resolution: {integrity: sha512-0KpjqXRVvrYyCsX1swR/XTK0va6VQkQM6MNo7PqW77ByjAhoARA8EfrP1N4+KlKj8YS0ZUCtRT/YUuhyYDujIQ==}
|
resolution: {integrity: sha512-0KpjqXRVvrYyCsX1swR/XTK0va6VQkQM6MNo7PqW77ByjAhoARA8EfrP1N4+KlKj8YS0ZUCtRT/YUuhyYDujIQ==}
|
||||||
|
|
||||||
|
lodash.truncate@4.4.2:
|
||||||
|
resolution: {integrity: sha512-jttmRe7bRse52OsWIMDLaXxWqRAmtIUccAQ3garviCqJjafXOfNMO0yMfNpdD6zbGaTU0P5Nz7e7gAT6cKmJRw==}
|
||||||
|
|
||||||
lodash@4.17.21:
|
lodash@4.17.21:
|
||||||
resolution: {integrity: sha512-v2kDEe57lecTulaDIuNTPy3Ry4gLGJ6Z1O3vE1krgXZNrsQ+LFTGHVxVjcXPs17LhbZVGedAJv8XZ1tvj5FvSg==}
|
resolution: {integrity: sha512-v2kDEe57lecTulaDIuNTPy3Ry4gLGJ6Z1O3vE1krgXZNrsQ+LFTGHVxVjcXPs17LhbZVGedAJv8XZ1tvj5FvSg==}
|
||||||
|
|
||||||
@@ -2657,6 +2758,10 @@ packages:
|
|||||||
resolution: {integrity: sha512-LaNjtRWUBY++zB5nE/NwcaoMylSPk+S+ZHNB1TzdbMJMny6dynpAGt7X/tl/QYq3TIeE6nxHppbo2LGymrG5Pw==}
|
resolution: {integrity: sha512-LaNjtRWUBY++zB5nE/NwcaoMylSPk+S+ZHNB1TzdbMJMny6dynpAGt7X/tl/QYq3TIeE6nxHppbo2LGymrG5Pw==}
|
||||||
engines: {node: '>=10'}
|
engines: {node: '>=10'}
|
||||||
|
|
||||||
|
p-map@7.0.4:
|
||||||
|
resolution: {integrity: sha512-tkAQEw8ysMzmkhgw8k+1U/iPhWNhykKnSk4Rd5zLoPJCuJaGRPo6YposrZgaxHKzDHdDWWZvE/Sk7hsL2X/CpQ==}
|
||||||
|
engines: {node: '>=18'}
|
||||||
|
|
||||||
p-try@2.2.0:
|
p-try@2.2.0:
|
||||||
resolution: {integrity: sha512-R4nPAVTAU0B9D35/Gk3uJf/7XYbQcyohSKdvAxIRSNghFl4e71hVoGnBNQz9cWaXxO2I10KTC+3jMdvvoKw6dQ==}
|
resolution: {integrity: sha512-R4nPAVTAU0B9D35/Gk3uJf/7XYbQcyohSKdvAxIRSNghFl4e71hVoGnBNQz9cWaXxO2I10KTC+3jMdvvoKw6dQ==}
|
||||||
engines: {node: '>=6'}
|
engines: {node: '>=6'}
|
||||||
@@ -2725,6 +2830,9 @@ packages:
|
|||||||
resolution: {integrity: sha512-HRDzbaKjC+AOWVXxAU/x54COGeIv9eb+6CkDSQoNTt4XyWoIJvuPsXizxu/Fr23EiekbtZwmh1IcIG/l/a10GQ==}
|
resolution: {integrity: sha512-HRDzbaKjC+AOWVXxAU/x54COGeIv9eb+6CkDSQoNTt4XyWoIJvuPsXizxu/Fr23EiekbtZwmh1IcIG/l/a10GQ==}
|
||||||
engines: {node: '>=8'}
|
engines: {node: '>=8'}
|
||||||
|
|
||||||
|
pluralize@2.0.0:
|
||||||
|
resolution: {integrity: sha512-TqNZzQCD4S42De9IfnnBvILN7HAW7riLqsCyp8lgjXeysyPlX5HhqKAcJHHHb9XskE4/a+7VGC9zzx8Ls0jOAw==}
|
||||||
|
|
||||||
pluralize@8.0.0:
|
pluralize@8.0.0:
|
||||||
resolution: {integrity: sha512-Nc3IT5yHzflTfbjgqWcCPpo7DaKy4FnpB0l/zCAW0Tc7jxAiuqSxHasntB3D7887LSrA93kDJ9IXovxJYxyLCA==}
|
resolution: {integrity: sha512-Nc3IT5yHzflTfbjgqWcCPpo7DaKy4FnpB0l/zCAW0Tc7jxAiuqSxHasntB3D7887LSrA93kDJ9IXovxJYxyLCA==}
|
||||||
engines: {node: '>=4'}
|
engines: {node: '>=4'}
|
||||||
@@ -2767,6 +2875,9 @@ packages:
|
|||||||
randombytes@2.1.0:
|
randombytes@2.1.0:
|
||||||
resolution: {integrity: sha512-vYl3iOX+4CKUWuxGi9Ukhie6fsqXqS9FE2Zaic4tNFD2N2QQaXOMFbuKK4QmDHC0JO6B1Zp41J0LpT0oR68amQ==}
|
resolution: {integrity: sha512-vYl3iOX+4CKUWuxGi9Ukhie6fsqXqS9FE2Zaic4tNFD2N2QQaXOMFbuKK4QmDHC0JO6B1Zp41J0LpT0oR68amQ==}
|
||||||
|
|
||||||
|
rc-config-loader@4.1.3:
|
||||||
|
resolution: {integrity: sha512-kD7FqML7l800i6pS6pvLyIE2ncbk9Du8Q0gp/4hMPhJU6ZxApkoLcGD8ZeqgiAlfwZ6BlETq6qqe+12DUL207w==}
|
||||||
|
|
||||||
react-is@18.3.1:
|
react-is@18.3.1:
|
||||||
resolution: {integrity: sha512-/LLMVyas0ljjAtoYiPqYiL8VWXzUUdThrmU5+n20DZv+a+ClRoevUzw5JxU+Ieh5/c87ytoTBV9G1FiKfNJdmg==}
|
resolution: {integrity: sha512-/LLMVyas0ljjAtoYiPqYiL8VWXzUUdThrmU5+n20DZv+a+ClRoevUzw5JxU+Ieh5/c87ytoTBV9G1FiKfNJdmg==}
|
||||||
|
|
||||||
@@ -2894,6 +3005,10 @@ packages:
|
|||||||
resolution: {integrity: sha512-g9Q1haeby36OSStwb4ntCGGGaKsaVSjQ68fBxoQcutl5fS1vuY18H3wSt3jFyFtrkx+Kz0V1G85A4MyAdDMi2Q==}
|
resolution: {integrity: sha512-g9Q1haeby36OSStwb4ntCGGGaKsaVSjQ68fBxoQcutl5fS1vuY18H3wSt3jFyFtrkx+Kz0V1G85A4MyAdDMi2Q==}
|
||||||
engines: {node: '>=8'}
|
engines: {node: '>=8'}
|
||||||
|
|
||||||
|
slice-ansi@4.0.0:
|
||||||
|
resolution: {integrity: sha512-qMCMfhY040cVHT43K9BFygqYbUPFZKHOg7K73mtTWJRb8pyP3fzf4Ixd5SzdEJQ6MRUg/WBnOLxghZtKKurENQ==}
|
||||||
|
engines: {node: '>=10'}
|
||||||
|
|
||||||
source-map-js@1.2.1:
|
source-map-js@1.2.1:
|
||||||
resolution: {integrity: sha512-UXWMKhLOwVKb728IUtQPXxfYU+usdybtUrK/8uGE8CQMvrhOpwvzDBwj0QhSL7MQc7vIsISBG8VQ8+IDQxpfQA==}
|
resolution: {integrity: sha512-UXWMKhLOwVKb728IUtQPXxfYU+usdybtUrK/8uGE8CQMvrhOpwvzDBwj0QhSL7MQc7vIsISBG8VQ8+IDQxpfQA==}
|
||||||
engines: {node: '>=0.10.0'}
|
engines: {node: '>=0.10.0'}
|
||||||
@@ -2972,6 +3087,9 @@ packages:
|
|||||||
resolution: {integrity: sha512-KIy5nylvC5le1OdaaoCJ07L+8iQzJHGH6pWDuzS+d07Cu7n1MZ2x26P8ZKIWfbK02+XIL8Mp4RkWeqdUCrDMfg==}
|
resolution: {integrity: sha512-KIy5nylvC5le1OdaaoCJ07L+8iQzJHGH6pWDuzS+d07Cu7n1MZ2x26P8ZKIWfbK02+XIL8Mp4RkWeqdUCrDMfg==}
|
||||||
engines: {node: '>=18'}
|
engines: {node: '>=18'}
|
||||||
|
|
||||||
|
structured-source@4.0.0:
|
||||||
|
resolution: {integrity: sha512-qGzRFNJDjFieQkl/sVOI2dUjHKRyL9dAJi2gCPGJLbJHBIkyOHxjuocpIEfbLioX+qSJpvbYdT49/YCdMznKxA==}
|
||||||
|
|
||||||
superagent@10.2.3:
|
superagent@10.2.3:
|
||||||
resolution: {integrity: sha512-y/hkYGeXAj7wUMjxRbB21g/l6aAEituGXM9Rwl4o20+SX3e8YOSV6BxFXl+dL3Uk0mjSL3kCbNkwURm8/gEDig==}
|
resolution: {integrity: sha512-y/hkYGeXAj7wUMjxRbB21g/l6aAEituGXM9Rwl4o20+SX3e8YOSV6BxFXl+dL3Uk0mjSL3kCbNkwURm8/gEDig==}
|
||||||
engines: {node: '>=14.18.0'}
|
engines: {node: '>=14.18.0'}
|
||||||
@@ -2988,6 +3106,10 @@ packages:
|
|||||||
resolution: {integrity: sha512-MpUEN2OodtUzxvKQl72cUF7RQ5EiHsGvSsVG0ia9c5RbWGL2CI4C7EpPS8UTBIplnlzZiNuV56w+FuNxy3ty2Q==}
|
resolution: {integrity: sha512-MpUEN2OodtUzxvKQl72cUF7RQ5EiHsGvSsVG0ia9c5RbWGL2CI4C7EpPS8UTBIplnlzZiNuV56w+FuNxy3ty2Q==}
|
||||||
engines: {node: '>=10'}
|
engines: {node: '>=10'}
|
||||||
|
|
||||||
|
supports-hyperlinks@3.2.0:
|
||||||
|
resolution: {integrity: sha512-zFObLMyZeEwzAoKCyu1B91U79K2t7ApXuQfo8OuxwXLDgcKxuwM+YvcbIhm6QWqz7mHUH1TVytR1PwVVjEuMig==}
|
||||||
|
engines: {node: '>=14.18'}
|
||||||
|
|
||||||
symbol-observable@4.0.0:
|
symbol-observable@4.0.0:
|
||||||
resolution: {integrity: sha512-b19dMThMV4HVFynSAM1++gBHAbk2Tc/osgLIBZMKsyqh34jb2e8Os7T6ZW/Bt3pJFdBTd2JwAnAAEQV7rSNvcQ==}
|
resolution: {integrity: sha512-b19dMThMV4HVFynSAM1++gBHAbk2Tc/osgLIBZMKsyqh34jb2e8Os7T6ZW/Bt3pJFdBTd2JwAnAAEQV7rSNvcQ==}
|
||||||
engines: {node: '>=0.10'}
|
engines: {node: '>=0.10'}
|
||||||
@@ -2996,10 +3118,18 @@ packages:
|
|||||||
resolution: {integrity: sha512-MeQTA1r0litLUf0Rp/iisCaL8761lKAZHaimlbGK4j0HysC4PLfqygQj9srcs0m2RdtDYnF8UuYyKpbjHYp7Jw==}
|
resolution: {integrity: sha512-MeQTA1r0litLUf0Rp/iisCaL8761lKAZHaimlbGK4j0HysC4PLfqygQj9srcs0m2RdtDYnF8UuYyKpbjHYp7Jw==}
|
||||||
engines: {node: ^14.18.0 || >=16.0.0}
|
engines: {node: ^14.18.0 || >=16.0.0}
|
||||||
|
|
||||||
|
table@6.9.0:
|
||||||
|
resolution: {integrity: sha512-9kY+CygyYM6j02t5YFHbNz2FN5QmYGv9zAjVp4lCDjlCw7amdckXlEt/bjMhUIfj4ThGRE4gCUH5+yGnNuPo5A==}
|
||||||
|
engines: {node: '>=10.0.0'}
|
||||||
|
|
||||||
tapable@2.3.0:
|
tapable@2.3.0:
|
||||||
resolution: {integrity: sha512-g9ljZiwki/LfxmQADO3dEY1CbpmXT5Hm2fJ+QaGKwSXUylMybePR7/67YW7jOrrvjEgL1Fmz5kzyAjWVWLlucg==}
|
resolution: {integrity: sha512-g9ljZiwki/LfxmQADO3dEY1CbpmXT5Hm2fJ+QaGKwSXUylMybePR7/67YW7jOrrvjEgL1Fmz5kzyAjWVWLlucg==}
|
||||||
engines: {node: '>=6'}
|
engines: {node: '>=6'}
|
||||||
|
|
||||||
|
terminal-link@4.0.0:
|
||||||
|
resolution: {integrity: sha512-lk+vH+MccxNqgVqSnkMVKx4VLJfnLjDBGzH16JVZjKE2DoxP57s6/vt6JmXV5I3jBcfGrxNrYtC+mPtU7WJztA==}
|
||||||
|
engines: {node: '>=18'}
|
||||||
|
|
||||||
terser-webpack-plugin@5.3.14:
|
terser-webpack-plugin@5.3.14:
|
||||||
resolution: {integrity: sha512-vkZjpUjb6OMS7dhV+tILUW6BhpDR7P2L/aQSAv+Uwk+m8KATX9EccViHTJR2qDtACKPIYndLGCyl3FMo+r2LMw==}
|
resolution: {integrity: sha512-vkZjpUjb6OMS7dhV+tILUW6BhpDR7P2L/aQSAv+Uwk+m8KATX9EccViHTJR2qDtACKPIYndLGCyl3FMo+r2LMw==}
|
||||||
engines: {node: '>= 10.13.0'}
|
engines: {node: '>= 10.13.0'}
|
||||||
@@ -3025,6 +3155,13 @@ packages:
|
|||||||
resolution: {integrity: sha512-cAGWPIyOHU6zlmg88jwm7VRyXnMN7iV68OGAbYDk/Mh/xC/pzVPlQtY6ngoIH/5/tciuhGfvESU8GrHrcxD56w==}
|
resolution: {integrity: sha512-cAGWPIyOHU6zlmg88jwm7VRyXnMN7iV68OGAbYDk/Mh/xC/pzVPlQtY6ngoIH/5/tciuhGfvESU8GrHrcxD56w==}
|
||||||
engines: {node: '>=8'}
|
engines: {node: '>=8'}
|
||||||
|
|
||||||
|
text-table@0.2.0:
|
||||||
|
resolution: {integrity: sha512-N+8UisAXDGk8PFXP4HAzVR9nbfmVJ3zYLAWiTIoqC5v5isinhr+r5uaO8+7r3BMfuNIufIsA7RdpVgacC2cSpw==}
|
||||||
|
|
||||||
|
textextensions@6.11.0:
|
||||||
|
resolution: {integrity: sha512-tXJwSr9355kFJI3lbCkPpUH5cP8/M0GGy2xLO34aZCjMXBaK3SoPnZwr/oWmo1FdCnELcs4npdCIOFtq9W3ruQ==}
|
||||||
|
engines: {node: '>=4'}
|
||||||
|
|
||||||
tinybench@2.9.0:
|
tinybench@2.9.0:
|
||||||
resolution: {integrity: sha512-0+DUvqWMValLmha6lr4kD8iAMK1HzV0/aKnCtWb9v9641TnP/MFb7Pc2bxoxQjTXAErryXVgUOfv2YqNllqGeg==}
|
resolution: {integrity: sha512-0+DUvqWMValLmha6lr4kD8iAMK1HzV0/aKnCtWb9v9641TnP/MFb7Pc2bxoxQjTXAErryXVgUOfv2YqNllqGeg==}
|
||||||
|
|
||||||
@@ -3217,6 +3354,10 @@ packages:
|
|||||||
resolution: {integrity: sha512-kiGUalWN+rgBJ/1OHZsBtU4rXZOfj/7rKQxULKlIzwzQSvMJUUNgPwJEEh7gU6xEVxC0ahoOBvN2YI8GH6FNgA==}
|
resolution: {integrity: sha512-kiGUalWN+rgBJ/1OHZsBtU4rXZOfj/7rKQxULKlIzwzQSvMJUUNgPwJEEh7gU6xEVxC0ahoOBvN2YI8GH6FNgA==}
|
||||||
engines: {node: '>=10.12.0'}
|
engines: {node: '>=10.12.0'}
|
||||||
|
|
||||||
|
version-range@4.15.0:
|
||||||
|
resolution: {integrity: sha512-Ck0EJbAGxHwprkzFO966t4/5QkRuzh+/I1RxhLgUKKwEn+Cd8NwM60mE3AqBZg5gYODoXW0EFsQvbZjRlvdqbg==}
|
||||||
|
engines: {node: '>=4'}
|
||||||
|
|
||||||
vite@7.2.4:
|
vite@7.2.4:
|
||||||
resolution: {integrity: sha512-NL8jTlbo0Tn4dUEXEsUg8KeyG/Lkmc4Fnzb8JXN/Ykm9G4HNImjtABMJgkQoVjOBN/j2WAwDTRytdqJbZsah7w==}
|
resolution: {integrity: sha512-NL8jTlbo0Tn4dUEXEsUg8KeyG/Lkmc4Fnzb8JXN/Ykm9G4HNImjtABMJgkQoVjOBN/j2WAwDTRytdqJbZsah7w==}
|
||||||
engines: {node: ^20.19.0 || >=22.12.0}
|
engines: {node: ^20.19.0 || >=22.12.0}
|
||||||
@@ -3441,6 +3582,12 @@ snapshots:
|
|||||||
transitivePeerDependencies:
|
transitivePeerDependencies:
|
||||||
- chokidar
|
- chokidar
|
||||||
|
|
||||||
|
'@azu/format-text@1.0.2': {}
|
||||||
|
|
||||||
|
'@azu/style-format@1.0.1':
|
||||||
|
dependencies:
|
||||||
|
'@azu/format-text': 1.0.2
|
||||||
|
|
||||||
'@babel/code-frame@7.27.1':
|
'@babel/code-frame@7.27.1':
|
||||||
dependencies:
|
dependencies:
|
||||||
'@babel/helper-validator-identifier': 7.28.5
|
'@babel/helper-validator-identifier': 7.28.5
|
||||||
@@ -4344,6 +4491,68 @@ snapshots:
|
|||||||
'@rollup/rollup-win32-x64-msvc@4.53.3':
|
'@rollup/rollup-win32-x64-msvc@4.53.3':
|
||||||
optional: true
|
optional: true
|
||||||
|
|
||||||
|
'@secretlint/config-loader@11.2.5':
|
||||||
|
dependencies:
|
||||||
|
'@secretlint/profiler': 11.2.5
|
||||||
|
'@secretlint/resolver': 11.2.5
|
||||||
|
'@secretlint/types': 11.2.5
|
||||||
|
ajv: 8.17.1
|
||||||
|
debug: 4.4.3
|
||||||
|
rc-config-loader: 4.1.3
|
||||||
|
transitivePeerDependencies:
|
||||||
|
- supports-color
|
||||||
|
|
||||||
|
'@secretlint/core@11.2.5':
|
||||||
|
dependencies:
|
||||||
|
'@secretlint/profiler': 11.2.5
|
||||||
|
'@secretlint/types': 11.2.5
|
||||||
|
debug: 4.4.3
|
||||||
|
structured-source: 4.0.0
|
||||||
|
transitivePeerDependencies:
|
||||||
|
- supports-color
|
||||||
|
|
||||||
|
'@secretlint/formatter@11.2.5':
|
||||||
|
dependencies:
|
||||||
|
'@secretlint/resolver': 11.2.5
|
||||||
|
'@secretlint/types': 11.2.5
|
||||||
|
'@textlint/linter-formatter': 15.4.0
|
||||||
|
'@textlint/module-interop': 15.4.0
|
||||||
|
'@textlint/types': 15.4.0
|
||||||
|
chalk: 5.6.2
|
||||||
|
debug: 4.4.3
|
||||||
|
pluralize: 8.0.0
|
||||||
|
strip-ansi: 7.1.2
|
||||||
|
table: 6.9.0
|
||||||
|
terminal-link: 4.0.0
|
||||||
|
transitivePeerDependencies:
|
||||||
|
- supports-color
|
||||||
|
|
||||||
|
'@secretlint/node@11.2.5':
|
||||||
|
dependencies:
|
||||||
|
'@secretlint/config-loader': 11.2.5
|
||||||
|
'@secretlint/core': 11.2.5
|
||||||
|
'@secretlint/formatter': 11.2.5
|
||||||
|
'@secretlint/profiler': 11.2.5
|
||||||
|
'@secretlint/source-creator': 11.2.5
|
||||||
|
'@secretlint/types': 11.2.5
|
||||||
|
debug: 4.4.3
|
||||||
|
p-map: 7.0.4
|
||||||
|
transitivePeerDependencies:
|
||||||
|
- supports-color
|
||||||
|
|
||||||
|
'@secretlint/profiler@11.2.5': {}
|
||||||
|
|
||||||
|
'@secretlint/resolver@11.2.5': {}
|
||||||
|
|
||||||
|
'@secretlint/secretlint-rule-preset-recommend@11.2.5': {}
|
||||||
|
|
||||||
|
'@secretlint/source-creator@11.2.5':
|
||||||
|
dependencies:
|
||||||
|
'@secretlint/types': 11.2.5
|
||||||
|
istextorbinary: 9.5.0
|
||||||
|
|
||||||
|
'@secretlint/types@11.2.5': {}
|
||||||
|
|
||||||
'@sinclair/typebox@0.34.41': {}
|
'@sinclair/typebox@0.34.41': {}
|
||||||
|
|
||||||
'@sinonjs/commons@3.0.1':
|
'@sinonjs/commons@3.0.1':
|
||||||
@@ -4356,6 +4565,35 @@ snapshots:
|
|||||||
|
|
||||||
'@standard-schema/spec@1.0.0': {}
|
'@standard-schema/spec@1.0.0': {}
|
||||||
|
|
||||||
|
'@textlint/ast-node-types@15.4.0': {}
|
||||||
|
|
||||||
|
'@textlint/linter-formatter@15.4.0':
|
||||||
|
dependencies:
|
||||||
|
'@azu/format-text': 1.0.2
|
||||||
|
'@azu/style-format': 1.0.1
|
||||||
|
'@textlint/module-interop': 15.4.0
|
||||||
|
'@textlint/resolver': 15.4.0
|
||||||
|
'@textlint/types': 15.4.0
|
||||||
|
chalk: 4.1.2
|
||||||
|
debug: 4.4.3
|
||||||
|
js-yaml: 3.14.2
|
||||||
|
lodash: 4.17.21
|
||||||
|
pluralize: 2.0.0
|
||||||
|
string-width: 4.2.3
|
||||||
|
strip-ansi: 6.0.1
|
||||||
|
table: 6.9.0
|
||||||
|
text-table: 0.2.0
|
||||||
|
transitivePeerDependencies:
|
||||||
|
- supports-color
|
||||||
|
|
||||||
|
'@textlint/module-interop@15.4.0': {}
|
||||||
|
|
||||||
|
'@textlint/resolver@15.4.0': {}
|
||||||
|
|
||||||
|
'@textlint/types@15.4.0':
|
||||||
|
dependencies:
|
||||||
|
'@textlint/ast-node-types': 15.4.0
|
||||||
|
|
||||||
'@tokenizer/inflate@0.3.1':
|
'@tokenizer/inflate@0.3.1':
|
||||||
dependencies:
|
dependencies:
|
||||||
debug: 4.4.3
|
debug: 4.4.3
|
||||||
@@ -4865,6 +5103,10 @@ snapshots:
|
|||||||
dependencies:
|
dependencies:
|
||||||
type-fest: 0.21.3
|
type-fest: 0.21.3
|
||||||
|
|
||||||
|
ansi-escapes@7.2.0:
|
||||||
|
dependencies:
|
||||||
|
environment: 1.1.0
|
||||||
|
|
||||||
ansi-regex@5.0.1: {}
|
ansi-regex@5.0.1: {}
|
||||||
|
|
||||||
ansi-regex@6.2.2: {}
|
ansi-regex@6.2.2: {}
|
||||||
@@ -4904,6 +5146,8 @@ snapshots:
|
|||||||
estree-walker: 3.0.3
|
estree-walker: 3.0.3
|
||||||
js-tokens: 9.0.1
|
js-tokens: 9.0.1
|
||||||
|
|
||||||
|
astral-regex@2.0.0: {}
|
||||||
|
|
||||||
asynckit@0.4.0: {}
|
asynckit@0.4.0: {}
|
||||||
|
|
||||||
babel-jest@30.2.0(@babel/core@7.28.5):
|
babel-jest@30.2.0(@babel/core@7.28.5):
|
||||||
@@ -4964,12 +5208,18 @@ snapshots:
|
|||||||
|
|
||||||
baseline-browser-mapping@2.8.31: {}
|
baseline-browser-mapping@2.8.31: {}
|
||||||
|
|
||||||
|
binaryextensions@6.11.0:
|
||||||
|
dependencies:
|
||||||
|
editions: 6.22.0
|
||||||
|
|
||||||
bl@4.1.0:
|
bl@4.1.0:
|
||||||
dependencies:
|
dependencies:
|
||||||
buffer: 5.7.1
|
buffer: 5.7.1
|
||||||
inherits: 2.0.4
|
inherits: 2.0.4
|
||||||
readable-stream: 3.6.2
|
readable-stream: 3.6.2
|
||||||
|
|
||||||
|
boundary@2.0.0: {}
|
||||||
|
|
||||||
brace-expansion@1.1.12:
|
brace-expansion@1.1.12:
|
||||||
dependencies:
|
dependencies:
|
||||||
balanced-match: 1.0.2
|
balanced-match: 1.0.2
|
||||||
@@ -5031,6 +5281,8 @@ snapshots:
|
|||||||
ansi-styles: 4.3.0
|
ansi-styles: 4.3.0
|
||||||
supports-color: 7.2.0
|
supports-color: 7.2.0
|
||||||
|
|
||||||
|
chalk@5.6.2: {}
|
||||||
|
|
||||||
char-regex@1.0.2: {}
|
char-regex@1.0.2: {}
|
||||||
|
|
||||||
chardet@2.1.1: {}
|
chardet@2.1.1: {}
|
||||||
@@ -5155,6 +5407,10 @@ snapshots:
|
|||||||
|
|
||||||
eastasianwidth@0.2.0: {}
|
eastasianwidth@0.2.0: {}
|
||||||
|
|
||||||
|
editions@6.22.0:
|
||||||
|
dependencies:
|
||||||
|
version-range: 4.15.0
|
||||||
|
|
||||||
electron-to-chromium@1.5.259: {}
|
electron-to-chromium@1.5.259: {}
|
||||||
|
|
||||||
emittery@0.13.1: {}
|
emittery@0.13.1: {}
|
||||||
@@ -5168,6 +5424,8 @@ snapshots:
|
|||||||
graceful-fs: 4.2.11
|
graceful-fs: 4.2.11
|
||||||
tapable: 2.3.0
|
tapable: 2.3.0
|
||||||
|
|
||||||
|
environment@1.1.0: {}
|
||||||
|
|
||||||
error-ex@1.3.4:
|
error-ex@1.3.4:
|
||||||
dependencies:
|
dependencies:
|
||||||
is-arrayish: 0.2.1
|
is-arrayish: 0.2.1
|
||||||
@@ -5647,6 +5905,12 @@ snapshots:
|
|||||||
html-escaper: 2.0.2
|
html-escaper: 2.0.2
|
||||||
istanbul-lib-report: 3.0.1
|
istanbul-lib-report: 3.0.1
|
||||||
|
|
||||||
|
istextorbinary@9.5.0:
|
||||||
|
dependencies:
|
||||||
|
binaryextensions: 6.11.0
|
||||||
|
editions: 6.22.0
|
||||||
|
textextensions: 6.11.0
|
||||||
|
|
||||||
iterare@1.2.1: {}
|
iterare@1.2.1: {}
|
||||||
|
|
||||||
jackspeak@3.4.3:
|
jackspeak@3.4.3:
|
||||||
@@ -6041,6 +6305,8 @@ snapshots:
|
|||||||
|
|
||||||
lodash.merge@4.6.2: {}
|
lodash.merge@4.6.2: {}
|
||||||
|
|
||||||
|
lodash.truncate@4.4.2: {}
|
||||||
|
|
||||||
lodash@4.17.21: {}
|
lodash@4.17.21: {}
|
||||||
|
|
||||||
log-symbols@4.1.0:
|
log-symbols@4.1.0:
|
||||||
@@ -6204,6 +6470,8 @@ snapshots:
|
|||||||
dependencies:
|
dependencies:
|
||||||
p-limit: 3.1.0
|
p-limit: 3.1.0
|
||||||
|
|
||||||
|
p-map@7.0.4: {}
|
||||||
|
|
||||||
p-try@2.2.0: {}
|
p-try@2.2.0: {}
|
||||||
|
|
||||||
package-json-from-dist@1.0.1: {}
|
package-json-from-dist@1.0.1: {}
|
||||||
@@ -6255,6 +6523,8 @@ snapshots:
|
|||||||
dependencies:
|
dependencies:
|
||||||
find-up: 4.1.0
|
find-up: 4.1.0
|
||||||
|
|
||||||
|
pluralize@2.0.0: {}
|
||||||
|
|
||||||
pluralize@8.0.0: {}
|
pluralize@8.0.0: {}
|
||||||
|
|
||||||
postcss@8.5.6:
|
postcss@8.5.6:
|
||||||
@@ -6291,6 +6561,15 @@ snapshots:
|
|||||||
dependencies:
|
dependencies:
|
||||||
safe-buffer: 5.2.1
|
safe-buffer: 5.2.1
|
||||||
|
|
||||||
|
rc-config-loader@4.1.3:
|
||||||
|
dependencies:
|
||||||
|
debug: 4.4.3
|
||||||
|
js-yaml: 4.1.1
|
||||||
|
json5: 2.2.3
|
||||||
|
require-from-string: 2.0.2
|
||||||
|
transitivePeerDependencies:
|
||||||
|
- supports-color
|
||||||
|
|
||||||
react-is@18.3.1: {}
|
react-is@18.3.1: {}
|
||||||
|
|
||||||
readable-stream@3.6.2:
|
readable-stream@3.6.2:
|
||||||
@@ -6441,6 +6720,12 @@ snapshots:
|
|||||||
|
|
||||||
slash@3.0.0: {}
|
slash@3.0.0: {}
|
||||||
|
|
||||||
|
slice-ansi@4.0.0:
|
||||||
|
dependencies:
|
||||||
|
ansi-styles: 4.3.0
|
||||||
|
astral-regex: 2.0.0
|
||||||
|
is-fullwidth-code-point: 3.0.0
|
||||||
|
|
||||||
source-map-js@1.2.1: {}
|
source-map-js@1.2.1: {}
|
||||||
|
|
||||||
source-map-support@0.5.13:
|
source-map-support@0.5.13:
|
||||||
@@ -6510,6 +6795,10 @@ snapshots:
|
|||||||
dependencies:
|
dependencies:
|
||||||
'@tokenizer/token': 0.3.0
|
'@tokenizer/token': 0.3.0
|
||||||
|
|
||||||
|
structured-source@4.0.0:
|
||||||
|
dependencies:
|
||||||
|
boundary: 2.0.0
|
||||||
|
|
||||||
superagent@10.2.3:
|
superagent@10.2.3:
|
||||||
dependencies:
|
dependencies:
|
||||||
component-emitter: 1.3.1
|
component-emitter: 1.3.1
|
||||||
@@ -6539,14 +6828,32 @@ snapshots:
|
|||||||
dependencies:
|
dependencies:
|
||||||
has-flag: 4.0.0
|
has-flag: 4.0.0
|
||||||
|
|
||||||
|
supports-hyperlinks@3.2.0:
|
||||||
|
dependencies:
|
||||||
|
has-flag: 4.0.0
|
||||||
|
supports-color: 7.2.0
|
||||||
|
|
||||||
symbol-observable@4.0.0: {}
|
symbol-observable@4.0.0: {}
|
||||||
|
|
||||||
synckit@0.11.11:
|
synckit@0.11.11:
|
||||||
dependencies:
|
dependencies:
|
||||||
'@pkgr/core': 0.2.9
|
'@pkgr/core': 0.2.9
|
||||||
|
|
||||||
|
table@6.9.0:
|
||||||
|
dependencies:
|
||||||
|
ajv: 8.17.1
|
||||||
|
lodash.truncate: 4.4.2
|
||||||
|
slice-ansi: 4.0.0
|
||||||
|
string-width: 4.2.3
|
||||||
|
strip-ansi: 6.0.1
|
||||||
|
|
||||||
tapable@2.3.0: {}
|
tapable@2.3.0: {}
|
||||||
|
|
||||||
|
terminal-link@4.0.0:
|
||||||
|
dependencies:
|
||||||
|
ansi-escapes: 7.2.0
|
||||||
|
supports-hyperlinks: 3.2.0
|
||||||
|
|
||||||
terser-webpack-plugin@5.3.14(webpack@5.100.2):
|
terser-webpack-plugin@5.3.14(webpack@5.100.2):
|
||||||
dependencies:
|
dependencies:
|
||||||
'@jridgewell/trace-mapping': 0.3.31
|
'@jridgewell/trace-mapping': 0.3.31
|
||||||
@@ -6569,6 +6876,12 @@ snapshots:
|
|||||||
glob: 7.2.3
|
glob: 7.2.3
|
||||||
minimatch: 3.1.2
|
minimatch: 3.1.2
|
||||||
|
|
||||||
|
text-table@0.2.0: {}
|
||||||
|
|
||||||
|
textextensions@6.11.0:
|
||||||
|
dependencies:
|
||||||
|
editions: 6.22.0
|
||||||
|
|
||||||
tinybench@2.9.0: {}
|
tinybench@2.9.0: {}
|
||||||
|
|
||||||
tinyexec@0.3.2: {}
|
tinyexec@0.3.2: {}
|
||||||
@@ -6770,6 +7083,8 @@ snapshots:
|
|||||||
'@types/istanbul-lib-coverage': 2.0.6
|
'@types/istanbul-lib-coverage': 2.0.6
|
||||||
convert-source-map: 2.0.0
|
convert-source-map: 2.0.0
|
||||||
|
|
||||||
|
version-range@4.15.0: {}
|
||||||
|
|
||||||
vite@7.2.4(@types/node@22.19.1)(terser@5.44.1)(tsx@4.20.6):
|
vite@7.2.4(@types/node@22.19.1)(terser@5.44.1)(tsx@4.20.6):
|
||||||
dependencies:
|
dependencies:
|
||||||
esbuild: 0.25.12
|
esbuild: 0.25.12
|
||||||
|