Documentation ¶
Overview ¶
Package analyzer provides code analysis functionality for Go projects.
The analyzer examines parsed Go source code metrics and applies configurable thresholds to identify code quality issues. It supports:
- File and function-level metrics analysis (size, complexity, comments)
- Anti-pattern detection through a modular detector system
- Test coverage integration via go test -cover
- Dependency analysis and circular dependency detection
- Aggregate metrics calculation (percentiles, averages, top N)
Usage ¶
Create an analyzer with a configuration and a status reporter:
cfg := config.Default()
a := analyzer.NewAnalyzer(&cfg.Analysis, statusReporter) // statusReporter implements status.Reporter
Run analysis on parsed metrics:
result, err := a.Analyze(projectPath, metrics)
if err != nil {
    log.Fatal(err)
}
The analyzer returns an AnalysisResult containing:
- Detected issues (warnings and info)
- Aggregate metrics (averages, percentiles)
- Test coverage data (if enabled)
- Dependency analysis results (if enabled)
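For example, a caller might print a one-line summary before rendering the issues (a minimal sketch; assumes the fmt package is imported and result comes from the Usage example above):
fmt.Printf("%s: %d files, %d functions, %d issues\n",
    result.ProjectPath, result.TotalFiles, result.TotalFunctions, len(result.Issues))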
Index ¶
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
func SortIssuesBySeverity ¶
func SortIssuesBySeverity(issues []*Issue)
SortIssuesBySeverity sorts issues by severity and then by file.
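A typical call sorts the issues collected in an analysis result before reporting (sketch; result is an *AnalysisResult):
SortIssuesBySeverity(result.Issues) // orders result.Issues by severity, then by file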
Types ¶
type AggregateMetrics ¶
type AggregateMetrics struct {
AverageFunctionLength float64 `json:"average_function_length"`
FunctionLengthP95 int `json:"function_length_p95"` // 95th percentile
CommentRatio float64 `json:"comment_ratio"` // Overall comment ratio
LargestFiles []*FileSize `json:"largest_files"` // Top 10
AverageComplexity float64 `json:"average_complexity"`
ComplexityP95 int `json:"complexity_p95"` // 95th percentile
MostComplexFunctions []*FunctionInfo `json:"most_complex_functions"` // Top 10
}
AggregateMetrics contains statistical aggregations across all analyzed files.
This type provides high-level insights into codebase quality:
- Central tendencies (averages)
- Distribution metrics (95th percentiles)
- Outlier identification (largest files, most complex functions)
- Overall documentation quality (comment ratio)
Percentile metrics (P95) help identify outliers while being resistant to extreme values, making them more reliable than maximums for threshold setting.
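For illustration, a nearest-rank 95th percentile can be computed as follows (a sketch of one common method using the standard sort and math packages; the package's exact calculation may differ):
// p95 returns the nearest-rank 95th percentile of values,
// or 0 for an empty slice.
func p95(values []int) int {
    if len(values) == 0 {
        return 0
    }
    sorted := append([]int(nil), values...)
    sort.Ints(sorted)
    rank := int(math.Ceil(0.95*float64(len(sorted)))) - 1
    return sorted[rank]
}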
type AnalysisResult ¶
type AnalysisResult struct {
ProjectPath string `json:"project_path"`
TotalFiles int `json:"total_files"`
TotalLines int `json:"total_lines"`
TotalCodeLines int `json:"total_code_lines"`
TotalFunctions int `json:"total_functions"`
Metrics *AggregateMetrics `json:"metrics"`
Files []*FileAnalysis `json:"files"`
Issues []*Issue `json:"issues"`
Coverage *CoverageReport `json:"coverage,omitempty"`
Dependencies *DependencyReport `json:"dependencies,omitempty"`
}
AnalysisResult contains the complete analysis results for a codebase.
This is the primary output type from the Analyzer.Analyze() method. It aggregates:
- Summary statistics (file count, line counts, function counts)
- Aggregate metrics (averages, percentiles, top-N lists)
- Per-file analysis details
- All detected issues sorted by severity
- Optional coverage report (if enabled)
- Optional dependency analysis (if enabled)
The result is designed to be serializable to JSON for programmatic consumption and can be passed to any Reporter implementation for output formatting.
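For example, the result can be written out as JSON with the standard library (sketch; assumes encoding/json, log, and os are imported):
data, err := json.MarshalIndent(result, "", "  ")
if err != nil {
    log.Fatal(err)
}
os.Stdout.Write(data)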
type Analyzer ¶
type Analyzer interface {
// Analyze takes parsed file metrics and produces an analysis result
Analyze(projectPath string, metrics []*parserPkg.FileMetrics) (*AnalysisResult, error)
}
Analyzer defines the interface for analyzing parsed code metrics.
func NewAnalyzer ¶
func NewAnalyzer(cfg *config.AnalysisConfig, statusReporter status.Reporter) Analyzer
NewAnalyzer creates a new MetricsAnalyzer with the given configuration and status reporter.
type CircularDependency ¶ added in v0.2.0
type CircularDependency struct {
Cycle []string `json:"cycle"` // The circular dependency chain
}
CircularDependency represents a circular import dependency between packages.
The Cycle field contains the package names forming the circular chain; the first and last entries are the same package, closing the cycle. For example:
["pkg/a", "pkg/b", "pkg/c", "pkg/a"]
Circular dependencies prevent compilation and indicate architectural problems that should be resolved by restructuring package boundaries.
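A reporter might render each cycle as an arrow-separated chain (sketch; assumes fmt and strings are imported and report is a *DependencyReport):
for _, cd := range report.CircularDependencies {
    fmt.Println("cycle:", strings.Join(cd.Cycle, " -> "))
}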
type CoverageReport ¶ added in v0.2.0
type CoverageReport struct {
Packages []*PackageCoverage `json:"packages"`
AverageCoverage float64 `json:"average_coverage"`
LowCoverageCount int `json:"low_coverage_count"`
}
CoverageReport contains test coverage analysis results.
This report is generated by running go test -cover on all packages (excluding test files, vendor, and other configured patterns). It provides:
- Per-package coverage percentages
- Average coverage across all packages
- Count of packages below the configured threshold
Packages without tests are marked as skipped. Packages with compilation errors include error messages in the Error field.
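For example, a caller could summarize the report after analysis (sketch; result is an *AnalysisResult with coverage enabled):
if result.Coverage != nil {
    fmt.Printf("average coverage: %.1f%% (%d packages below threshold)\n",
        result.Coverage.AverageCoverage, result.Coverage.LowCoverageCount)
}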
type DependencyReport ¶ added in v0.2.0
type DependencyReport struct {
Packages []*PackageDependencies `json:"packages"`
CircularDependencies []*CircularDependency `json:"circular_dependencies,omitempty"`
TotalPackages int `json:"total_packages"`
HighImportCount int `json:"high_import_count"` // Packages exceeding import threshold
HighExternalCount int `json:"high_external_count"` // Packages with too many external deps
}
DependencyReport contains comprehensive dependency analysis results.
This report categorizes imports and identifies dependency-related issues:
- Import categorization (stdlib, internal, external)
- Packages exceeding import thresholds
- Packages with too many external dependencies
- Circular dependencies (if detection is enabled)
Circular dependencies are particularly problematic as they can cause compilation errors and indicate architectural issues.
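A quick health check might read the aggregate counters (sketch; result is an *AnalysisResult):
if dep := result.Dependencies; dep != nil {
    fmt.Printf("%d packages: %d over the import threshold, %d with too many external deps\n",
        dep.TotalPackages, dep.HighImportCount, dep.HighExternalCount)
}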
type FileAnalysis ¶
type FileAnalysis struct {
Path string `json:"path"`
Metrics *parser.FileMetrics `json:"metrics"`
LargeFile bool `json:"large_file"` // Exceeds threshold
}
FileAnalysis contains detailed analysis results for a single file.
This type combines the raw metrics from parsing with analysis results:
- All parsed metrics (lines, functions, complexity, etc.)
- Large file flag (exceeds configured threshold)
Used in verbose reporting mode to show per-file details.
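In verbose mode, a reporter might list only the flagged files (sketch; result is an *AnalysisResult):
for _, f := range result.Files {
    if f.LargeFile {
        fmt.Println("large file:", f.Path)
    }
}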
type FileSize ¶
FileSize represents a file and its line count for size-based reporting.
Used in the "largest files" report to identify files that may benefit from being split into smaller, more maintainable modules.
type FunctionInfo ¶ added in v0.2.0
type FunctionInfo struct {
File string `json:"file"`
Function string `json:"function"`
Complexity int `json:"complexity"`
Lines int `json:"lines"`
}
FunctionInfo represents function information for complexity reporting.
Used in the "most complex functions" report to identify functions that may need refactoring or additional testing due to high cyclomatic complexity. Includes both complexity and size metrics for context.
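For example, printing the top-N list from the aggregate metrics (sketch; result.Metrics comes from Analyze):
for _, fn := range result.Metrics.MostComplexFunctions {
    fmt.Printf("%s: %s (complexity %d, %d lines)\n",
        fn.File, fn.Function, fn.Complexity, fn.Lines)
}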
type Issue ¶
type Issue = detectors.Issue
Issue is an alias for detectors.Issue, provided for backward compatibility and cleaner imports.
This allows external packages to reference analyzer.Issue instead of analyzer/detectors.Issue, simplifying the API surface.
type MetricsAnalyzer ¶
type MetricsAnalyzer struct {
// contains filtered or unexported fields
}
MetricsAnalyzer implements Analyzer for basic metrics analysis.
func (*MetricsAnalyzer) Analyze ¶
func (ma *MetricsAnalyzer) Analyze(projectPath string, metrics []*parserPkg.FileMetrics) (*AnalysisResult, error)
Analyze performs comprehensive code quality analysis on parsed file metrics.
This is the main analysis pipeline that orchestrates all analysis phases:
1. Metrics Aggregation:
- Accumulates file and function counts
- Detects large files and long functions
- Identifies high-complexity functions
2. Anti-Pattern Detection:
- Runs all enabled detectors (parameters, nesting, returns, magic numbers, etc.)
- Requires re-parsing to obtain AST for pattern matching
3. Statistical Analysis:
- Calculates aggregate metrics (averages, percentiles)
- Identifies outliers (largest files, most complex functions)
- Computes overall comment ratio
4. Coverage Analysis (optional):
- Runs go test -cover if enabled
- Detects packages below coverage threshold
5. Dependency Analysis:
- Categorizes imports (stdlib, internal, external)
- Detects circular dependencies
- Identifies packages with too many dependencies
The method is fault-tolerant: coverage and dependency failures produce warnings but don't fail the entire analysis.
Returns a complete AnalysisResult with all findings, or an error if the core analysis cannot be completed.
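Because the optional phases degrade gracefully, callers should treat Coverage and Dependencies as possibly nil (sketch; ma is a *MetricsAnalyzer):
result, err := ma.Analyze(projectPath, metrics)
if err != nil {
    log.Fatal(err) // core analysis failed
}
if result.Coverage == nil {
    // coverage was disabled or failed with a warning
}
if result.Dependencies == nil {
    // dependency analysis was disabled or failed with a warning
}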
type PackageCoverage ¶ added in v0.2.0
type PackageCoverage struct {
PackagePath string `json:"package_path"`
Coverage float64 `json:"coverage"`
Error string `json:"error,omitempty"`
Skipped bool `json:"skipped"`
}
PackageCoverage represents test coverage results for a single Go package.
The Coverage field is only meaningful when both Error and Skipped are false. Skipped indicates no test files were found. Error contains compilation or test execution errors if any occurred.
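These semantics suggest checking Error and Skipped before using Coverage (sketch; report is a *CoverageReport):
for _, pc := range report.Packages {
    switch {
    case pc.Error != "":
        fmt.Printf("%s: error: %s\n", pc.PackagePath, pc.Error)
    case pc.Skipped:
        fmt.Printf("%s: no tests\n", pc.PackagePath)
    default:
        fmt.Printf("%s: %.1f%% coverage\n", pc.PackagePath, pc.Coverage)
    }
}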
type PackageDependencies ¶ added in v0.2.0
type PackageDependencies struct {
PackageName string `json:"package_name"`
StdlibImports []string `json:"stdlib_imports"`
InternalImports []string `json:"internal_imports"`
ExternalImports []string `json:"external_imports"`
TotalImports int `json:"total_imports"`
ExternalImportCount int `json:"external_import_count"`
}
PackageDependencies represents import analysis for a single Go package.
Imports are categorized into three types:
- StdlibImports: Go standard library packages (e.g., "fmt", "net/http")
- InternalImports: Project-internal packages (same module path prefix)
- ExternalImports: Third-party dependencies (external module paths)
This categorization helps identify external dependencies and assess package coupling.
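For example, a per-package import summary (sketch; report is a *DependencyReport):
for _, pd := range report.Packages {
    fmt.Printf("%s: %d stdlib, %d internal, %d external (%d total)\n",
        pd.PackageName, len(pd.StdlibImports), len(pd.InternalImports),
        pd.ExternalImportCount, pd.TotalImports)
}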