Hemut Co. · Engineering
Engineering Benchmarks
§1 Overall Score · 10 metrics tracked
- 2 Elite
- 4 Low
- 4 Unknown
Strengths
- ✓ Core 4 velocity: 13.8 PRs/eng/wk (Elite)
- ✓ Core 4 efficiency: <$15/PR (Elite)
- ✓ AndrewFE: $7/PR — best in class
- ✓ CI pass rate: 88% (High)
- ✓ AI adoption is industry-leading
Critical Gaps
- ⚠ MTTR = ∞ (0 Sentry resolved/90d)
- ⚠ Ticket linkage 29% vs 80% target
- ⚠ Harshiv CI at 50%
- ⚠ Gabriele $/PR: $182 (very high)
Not Yet Tracked
- ○ Deployment frequency
- ○ Lead time commit→prod
- ○ Change failure rate
- ○ Test coverage %
§2 Top Performers vs Benchmarks · Founding engineers only
| Engineer | PRs/wk | $/PR | CI % | Ticket Link | Opus % |
|---|---|---|---|---|---|
Elite threshold: ≥10 PRs/wk · <$15/PR · ≥95% CI · ≥90% ticket linkage
§3 DORA Four Key Metrics · Source: Google State of DevOps 2023
2 of 4 DORA metrics are Unknown — not because performance is bad, but because deployment events aren’t logged.
Adding deploy tracking is a 2-line CI change that unlocks Deployment Frequency and Lead Time immediately.
| Metric | Description | Hemut Now | Tier | Elite | Required Action |
|---|---|---|---|---|---|
| Deployment Frequency | How often code reaches production | Not tracked | Unknown | Multiple/day | Add deploy event to CI: log timestamp on every Vercel prod deploy. 2-line change. Unlocks this metric and Lead Time. |
| Lead Time for Changes | Commit → production | Not tracked | Unknown | < 1 hour | Available automatically once deploy tracking is added. Lead Time = deploy_ts − merge_ts. |
| Mean Time to Restore | How fast prod incidents are resolved | ∞ (0 resolved/90d) | Low | < 1 hour | CRITICAL: 0 Sentry issues resolved in 90 days. Assign fix owners to 6 prod issues this sprint. Each resolved issue moves MTTR from ∞ to a real number. |
| Change Failure Rate | % of deploys causing incidents | Not tracked | Unknown | 0 – 5% | Computable from Sentry + deploy log: incidents opened within 24h of deploy / total deploys. Requires deploy tracking. |
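The Change Failure Rate formula in the last row can be sketched as a short script. A hypothetical example, assuming a deploys.log and an incidents.log with one epoch-second timestamp per line (the file names, format, and sample data are illustrative assumptions, not existing Hemut infrastructure):

```shell
# Change Failure Rate = deploys followed by an incident within 24h / total deploys.
# Sample data: three deploys, one incident 1h after the first deploy.
cat > deploys.log <<'EOF'
1700000000
1700100000
1700200000
EOF
cat > incidents.log <<'EOF'
1700003600
EOF
cfr=$(awk 'NR == FNR { inc[++n] = $1; next }        # first file: incident timestamps
     { total++                                      # second file: one deploy per line
       for (i = 1; i <= n; i++)
         if (inc[i] >= $1 && inc[i] < $1 + 86400) { failed++; break } }
     END { printf "%.0f", 100 * failed / total }' incidents.log deploys.log)
echo "Change failure rate: ${cfr}%"
rm -f deploys.log incidents.log
```

With the sample logs this prints `Change failure rate: 33%` (1 of 3 deploys had an incident inside its 24h window). The same deploy log also yields Lead Time via deploy_ts − merge_ts.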
§4 AI-Era Engineering Metrics · LinearB 2024 · Swarmia 2024 · Hemut internal
| Metric | Hemut Now | Tier | Elite | Source | Action |
|---|---|---|---|---|---|
| CI Pass Rate | 88% | High | ≥ 95% | LinearB 2024 | 88% is already High. Harshiv at 50% is the critical outlier — enforce test-before-push. Target: 92% by Jun 2026. |
| Ticket Linkage Rate | 29% | Low | ≥ 90% | Swarmia 2024 | 29% is Low. Ashish 0% (170 PRs), AndrewFE 0% (146 PRs), Gabriele 0% (24 PRs) are the main gaps. Enforce via PR template. |
| AI Spend $/PR — team avg | TBD | Unknown | < $15 | Hemut internal | Team average is dragged up by Gabriele ($182/PR) and Pranav ($49/PR); the core 4 remain Elite. |
| AI Spend $/PR — core 4 | < $15 | Elite | < $15 | Hemut internal | The Sumedh/Ashish/AndrewFE/Avetis average is already Elite. AndrewFE's $7/PR is best in class. Protect this by keeping Sonnet as the default. |
| PRs/Engineer/Week — team | 3.7 | Medium | ≥ 10 | LinearB 2024 | 3.7 team avg is Medium — dragged down by interns and non-engineers. The core 4 average of 13.8/wk is Elite. Exclude interns from the team velocity metric. |
| Lines of Code per $1 | 118 | High | ≥ 200 | Hemut internal (CC seat) | 118 lines/$ is High. Improve by switching high-Opus users to Sonnet (5× cheaper per token, same output quality). |
§5 Quick Wins · 1 sprint each · highest ROI first
MTTR — Resolve 6 production Sentry issues
Each resolved issue moves MTTR from ∞ to a real number. Assign fix owners in Monday's sprint meeting. The 6 production issues are in sentry.html. One sprint to close all 6.
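How the number materializes can be sketched as follows: assuming each resolved Sentry issue yields a (first_seen, resolved) epoch-second pair, MTTR is the mean of the gaps. The resolved.log name and the sample pairs are made up for illustration; with zero resolved issues the mean is undefined, which is the current infinity.

```shell
# MTTR sketch: mean of (resolved_ts - first_seen_ts) over resolved issues, in hours.
# Two hypothetical resolved issues: 2h and 3h to restore.
cat > resolved.log <<'EOF'
1700000000 1700007200
1700100000 1700110800
EOF
mttr=$(awk '{ sum += $2 - $1; n++ }
            END { if (n) printf "%.1f", sum / n / 3600; else print "inf" }' resolved.log)
echo "MTTR: ${mttr}h"
rm -f resolved.log
```

With the sample pairs this prints `MTTR: 2.5h`.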
Ticket Linkage — PR template enforcement
Ashish (0%, 170 PRs) + AndrewFE (0%, 146 PRs) + Gabriele (0%, 24 PRs) = 340 unlinked PRs. Adding a single PR template with 'Linear: SWAG-xxx' field closes 38% of the team gap immediately.
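One way to make the template stick is a CI check that fails any PR without a 'Linear: SWAG-xxx' line. A sketch, assuming the PR description has been written to a file first; PR_BODY_FILE and the sample body are hypothetical stand-ins for the real PR payload.

```shell
# Fail CI when the PR description lacks a Linear ticket line (SWAG-123 style).
# PR_BODY_FILE and the sample body below are hypothetical.
PR_BODY_FILE="${PR_BODY_FILE:-pr_body.txt}"
printf 'Fix flaky retry logic.\nLinear: SWAG-482\n' > "$PR_BODY_FILE"
if grep -Eq 'Linear: SWAG-[0-9]+' "$PR_BODY_FILE"; then
  echo "ticket linkage: ok"
else
  echo "ticket linkage: missing 'Linear: SWAG-xxx' line" >&2
  exit 1
fi
```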
Deploy tracking — 2 lines in CI
Add a curl to the Vercel deploy hook that logs timestamp to a file or webhook. Unlocks Deployment Frequency + Lead Time — the two Unknown DORA metrics.
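A sketch of what those two lines could look like as a post-deploy CI step; the deploys.log path and the DEPLOY_WEBHOOK_URL variable are illustrative names, not existing Hemut configuration.

```shell
# The two lines: append an ISO-8601 timestamp + commit SHA per production deploy,
# then optionally POST the record to a webhook. deploys.log and DEPLOY_WEBHOOK_URL
# are illustrative names only.
echo "$(date -u +%Y-%m-%dT%H:%M:%SZ) ${GITHUB_SHA:-local}" >> deploys.log
[ -z "${DEPLOY_WEBHOOK_URL:-}" ] || curl -fsS --data-urlencode "deploy=$(tail -n 1 deploys.log)" "$DEPLOY_WEBHOOK_URL"
```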
Opus → Sonnet for 3 engineers
Gabriele, AndrewFE, Darius all >85% Opus. Switching routine tasks to claude-sonnet-4-6 saves ~$800/month at current run rate with no quality loss.
§6 90-Day Improvement Targets · Fill in actuals each quarter
| Metric | Mar 2026 (now) | Jun 2026 target | Jun 2026 actual | Sep 2026 target |
|---|---|---|---|---|
| MTTR | ∞ (0 resolved) | < 48 hours | — | TBD |
| Ticket linkage | 29% | 60% | — | TBD |
| CI pass rate | 88% | 92% | — | TBD |
| $/PR (team avg) | TBD | $22 | — | TBD |
| Deploy frequency | Not tracked | Tracked + ≥5/wk | — | TBD |
| Change failure rate | Not tracked | Tracked + <15% | — | TBD |
| PRs/wk (core 4) | 13.8 | 14+ | — | TBD |
How to update: run ./run.sh each quarter. The Hemut Now column recalculates from live data automatically; fill in the Actual columns manually after each quarter closes.