# quality-metrics
Manage quality through automated dashboards, DORA metrics, and policy-driven quality gates. Track defect density, MTTD, and test effectiveness to maintain high-velocity software delivery.
## Introduction
The quality-metrics skill is a specialized engineering toolkit designed for teams operating at high velocity who need objective, data-driven visibility into their software development lifecycle. It moves beyond simple pass/fail testing by providing actionable insights through automated dashboards, historical trend analysis, and proactive quality gate enforcement. This skill integrates with the broader agentic-qe fleet to synthesize metrics from test execution, production intelligence, and coverage analysis into a unified view for stakeholders.
- Automated Dashboard Generation: Automatically creates and populates Grafana-ready dashboards covering DORA metrics (deployment frequency, lead time, change failure rate), stability (MTTD, MTTR), and process health (code review time, flaky test rates).
- Policy-Driven Quality Gates: Defines and enforces complex quality gates at commit, PR, and release stages, handling conditional logic such as blocking deployments when test coverage dips below 80% or critical security vulnerabilities are detected.
- Advanced Trend Analysis: Predicts future quality states by analyzing 90-day trends. It compares current performance against historical data, enabling teams to proactively address degradation before it impacts production.
- Fleet Coordination: Orchestrates specialized agents including `qe-quality-analyzer`, `qe-test-executor`, and `qe-production-intelligence`. It acts as the central brain for deciding whether a software build meets the rigorous criteria required for promotion.
- Strategic Visibility: Translates raw execution data into stakeholder-friendly reports, helping teams justify infrastructure and testing investments with clear data on defect escape rates and test effectiveness ratios.
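As an illustration of the policy-driven gates described above, the sketch below evaluates a build against coverage and vulnerability thresholds. The types and names here (`GatePolicy`, `evaluateGate`) are hypothetical, not part of the skill's actual API:

```typescript
interface GatePolicy {
  minCoverage: number;       // fraction, e.g. 0.80 blocks deploys below 80% coverage
  maxCriticalVulns: number;  // e.g. 0 blocks on any critical finding
}

interface BuildMetrics {
  coverage: number;
  criticalVulns: number;
}

interface GateVerdict {
  pass: boolean;
  reasons: string[];
}

// Evaluate a build's metrics against a blocking gate policy,
// collecting a human-readable reason for each violated threshold.
function evaluateGate(policy: GatePolicy, metrics: BuildMetrics): GateVerdict {
  const reasons: string[] = [];
  if (metrics.coverage < policy.minCoverage) {
    reasons.push(`coverage ${metrics.coverage} is below minimum ${policy.minCoverage}`);
  }
  if (metrics.criticalVulns > policy.maxCriticalVulns) {
    reasons.push(`${metrics.criticalVulns} critical vulnerabilities found (limit ${policy.maxCriticalVulns})`);
  }
  return { pass: reasons.length === 0, reasons };
}

// A build with 75% coverage fails an 80% gate even with no vulnerabilities.
const verdict = evaluateGate(
  { minCoverage: 0.8, maxCriticalVulns: 0 },
  { coverage: 0.75, criticalVulns: 0 },
);
```

Returning structured reasons rather than a bare boolean makes the gate's decision auditable, which matters when a blocked deployment needs to be explained to stakeholders.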
- Use this skill when establishing KPIs for continuous delivery pipelines or performing post-mortems on release stability.
- Input requirements include build IDs, environment targets (staging/production), and policy definitions (in JSON or TS format).
- Adhere to the core philosophy of measuring outcomes rather than vanity activities: prioritize bug escape rates and MTTR over raw test case counts.
- Ensure thresholds are set to drive behavior; use blocking gates only for mission-critical code quality requirements to avoid developer friction.
- The skill maintains a memory namespace under `aqe/quality-metrics/` for storing historical trends and past gate evaluation results.
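A policy definition of the kind listed under input requirements might look like the following TypeScript sketch. The schema (`GateRule`, `QualityPolicy`, and all field names) is an assumption for illustration only; consult the skill's actual policy format:

```typescript
// Hypothetical policy schema — field names are illustrative, not the
// skill's documented format.
interface GateRule {
  metric: string;
  operator: ">=" | "<=" | "==";
  threshold: number;
  blocking: boolean; // blocking gates stop promotion; non-blocking ones only warn
}

interface QualityPolicy {
  buildId: string;
  environment: "staging" | "production";
  gates: GateRule[];
}

const releasePolicy: QualityPolicy = {
  buildId: "build-1234",
  environment: "staging",
  gates: [
    { metric: "coverage", operator: ">=", threshold: 0.8, blocking: true },
    { metric: "criticalVulnerabilities", operator: "==", threshold: 0, blocking: true },
    { metric: "flakyTestRate", operator: "<=", threshold: 0.03, blocking: false },
  ],
};

// Only mission-critical checks are marked blocking, in line with the
// guidance above about avoiding developer friction.
const blockingGates = releasePolicy.gates.filter((g) => g.blocking).length;
```

Keeping the flaky-test threshold non-blocking follows the advice above: it surfaces a warning trend without halting every deployment.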
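The outcome metrics and trend analysis described above reduce to simple arithmetic. This sketch computes a defect escape rate and a naive 90-day degradation check; the function names and the 1.2× tolerance are illustrative choices, not the skill's implementation:

```typescript
// Defect escape rate: defects that reached production divided by all
// defects found — an outcome metric, unlike raw test case counts.
function defectEscapeRate(escaped: number, total: number): number {
  return total === 0 ? 0 : escaped / total;
}

function average(values: number[]): number {
  return values.reduce((sum, v) => sum + v, 0) / values.length;
}

// Flag degradation when the latest 7-day average exceeds the full
// 90-day baseline by a tolerance factor (a deliberately simple check).
function isDegrading(history: number[], window = 7, tolerance = 1.2): boolean {
  const recent = average(history.slice(-window));
  return recent > average(history) * tolerance;
}

// 5 of 50 defects escaped to production → 10% escape rate.
const escapeRate = defectEscapeRate(5, 50);

// A flaky-test rate that jumps in the final week trips the trend check.
const flakyRates = [...Array(83).fill(0.02), ...Array(7).fill(0.05)];
const degrading = isDegrading(flakyRates);
```

A real implementation would likely weight recent samples or fit a regression over the 90-day window; the point here is only that the comparison against historical data is cheap to compute from stored trend history.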
## Repository Stats
- Stars: 329
- Forks: 65
- Open Issues: 4
- Language: TypeScript
- Default Branch: main
- Sync Status: Idle
- Last Synced: Apr 28, 2026, 01:05 PM