The interesting thing isn’t the spike. It’s what happened after.

607 commits in a single week makes a good headline. But look at the 52-week chart and you’ll notice something more telling: the weeks after each breakthrough didn’t fall back to baseline. The floor rose. Twice. And each time, it stayed up.

That’s the story worth telling. Not a peak — a permanent shift in what “normal” output looks like, visible in 52 weeks of public GitHub contribution data at github.com/bhandrigan.

Before the Brain: March through October 2025

For eight months, I was building across multiple codebases — platform work, data pipelines, analytics infrastructure, websites. Real work, real output. But look at the contribution graph for that stretch and you’ll see a familiar rhythm: bursts followed by valleys. A 345-contribution week in May, then single digits. A 295-contribution week in August, then back to 63. The active-week average was roughly 50 contributions, and many weeks were zero.

I know this pattern. Every developer knows this pattern. You sit down Monday morning with a different project than you left Friday afternoon, and the first hour — sometimes two — goes to context rebuilding. Re-reading architecture docs. Re-establishing the conventions you’d decided on three days ago. Pasting code snippets into prompts so the AI has some idea what you’re working on.

The AI tools I was using at the time had no memory between sessions. Every conversation started from scratch. I’d describe the same architecture, the same constraints, the same preferences — over and over, across every tool, every project. The tools were individually capable. The problem was that each session existed in isolation. Knowledge didn’t accumulate. Patterns didn’t carry forward. The output bursts you see in that pre-brain chart were fueled by sheer effort, and they were unsustainable because the overhead of context management scaled with the number of projects I was running simultaneously.

That friction was the problem I set out to solve.

Building the Brain: November 2025

What happened in the second week of November 2025 is visible as the tallest bar on the chart: 816 contributions. The all-time peak.

This was two weeks of raw construction velocity — the intelligence infrastructure being built from scratch. The system that would become the grāmatr℠ pipeline: persistent context management, an embedding pipeline for project knowledge, a graph structure for entity relationships, a security model for per-user encryption. Everything that would allow an AI session to start where the last one left off instead of starting from zero.
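The shape of that system can be sketched very loosely: a store that persists embedded project knowledge between sessions and retrieves the most relevant entries at session start. Everything below is a stand-in — the hashing "embedding", the `ContextStore` class, and the sample entries are mine for illustration, not the grāmatr℠ implementation:

```python
import hashlib
import math

def embed(text: str, dims: int = 256) -> list[float]:
    """Toy bag-of-words hashing embedding (stands in for a real embedding model)."""
    vec = [0.0] * dims
    for token in text.lower().split():
        bucket = int(hashlib.md5(token.encode()).hexdigest(), 16) % dims
        vec[bucket] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def cosine(a: list[float], b: list[float]) -> float:
    # Vectors are pre-normalized, so the dot product is the cosine similarity.
    return sum(x * y for x, y in zip(a, b))

class ContextStore:
    """Persists project knowledge across sessions; recalls it by similarity."""

    def __init__(self):
        self.entries: list[tuple[str, list[float]]] = []

    def remember(self, text: str) -> None:
        self.entries.append((text, embed(text)))

    def recall(self, query: str, k: int = 2) -> list[str]:
        q = embed(query)
        ranked = sorted(self.entries, key=lambda e: cosine(q, e[1]), reverse=True)
        return [text for text, _ in ranked[:k]]
```

A new session would call `recall` with its opening request and start from the retrieved context instead of from zero — that is the "start where the last one left off" property, reduced to its smallest shape.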

Greenfield construction. Every commit shipped directly into the new system because there was nothing to merge into yet. The 816-contribution week and the 475-contribution week before it are the highest contribution weeks in the entire 52-week dataset, and they’re the first proof that the brain worked. The output was already higher than any prior week — months before the routing breakthrough that came next.

That’s the first step-change. The second one was bigger.

The Floor Rises: November 23, 2025 through February 2026

The first brain-assisted week landed at 198 contributions. Then a dip to 76 as the system stabilized. Then something I hadn’t experienced before: sustained elevation.

Week after week, the numbers held above pre-brain baseline. 391 in early December — building data pipelines, analytics infrastructure, architecture documentation across multiple projects. 209. 189. Even over the holidays, output held. January started slow — 45, 34 — but that was holiday recovery, not a regression. By late January: 170. Then the week of February 1: 518 contributions.

Here’s what changed in the day-to-day experience. I’d start a session and the context was already there. Not because the system retrieved my last conversation — because it had learned the project patterns. When I switched from writing infrastructure code to reviewing content drafts for a different project, the context switched with me. Not because I told it to — because the system recognized the shift in intent and adjusted what it delivered.

The constant overhead of context management — the ten to thirty minutes at the start of every session explaining what I was working on, what the constraints were, what conventions we’d established — was gone. That time went directly into building.

The average across this entire phase was roughly 200 contributions per week. Four times the pre-brain baseline. Not a spike. A new floor.

And I want to be clear about what this felt like, because the numbers alone don’t convey it. It felt like flow states became easier to reach and harder to break. The friction that used to knock me out of deep work — hunting for architecture decisions, re-explaining project context, switching between tools and losing the thread — was absorbed by the pipeline. I could hold more projects in active rotation because each project’s context persisted and loaded automatically. The cognitive load dropped. The output rose. And it stayed up.

The Investment Period: February through March 2026

Then I stopped shipping and started building again.

The chart shows it: 80, 130, 187, 49, then near zero. Visible output dropped because I was investing in the next layer of infrastructure — the classifier, the routing engine, the generation pipeline. The system that would take the brain-assisted approach and make it architecturally smarter: pre-classifying every request before it reached a language model, routing context dynamically instead of loading everything, running classification on-device in under 100 milliseconds.

The near-zero week includes time away from the keyboard entirely. But most of this period was deep technical work that doesn’t show up as GitHub contributions — testing local model fidelity, building classification heads, structuring the routing architecture that would eventually become patent-pending.

From the outside, it looked like the productivity story had ended. From the inside, I was building the multiplier for the next step-change.

The Second Floor Rise: March 2026

The routing engine came online the week of March 22. The second breakthrough hit.

562 contributions that first week. 376 on the GitHub calendar the following week — and the git log shows what those 376 actually contained: 607 commits touching 354,489 lines of changed code across 1,203 files. The discrepancy matters, because it tells you something specific about what the routing breakthrough enabled.

The March work shipped through feature branches, pull requests, code review, squash merges, automated CI/CD, and 15 tagged production releases. The GitHub contribution calendar shows 376 because squash-merged branches collapse into a single contribution — the calendar is the public lower bound; the git log is authoritative.

Better practices AND more output, simultaneously. That’s the compounding story. The industry tradeoff has always been: ship fast or ship clean. The routing breakthrough is what made both possible at the same time.

The pre-classification pipeline meant every request got routed to exactly the right context before the language model ever saw it. Instead of dumping 40,000 tokens of project context into every interaction, the system delivered approximately 1,200 targeted tokens — a 97% reduction in context noise with improved response quality. The routing wasn’t just faster. It was smarter. And it ran entirely on-device, no API calls, no latency, no cloud dependency for the classification step.
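As a rough illustration of the pre-classification idea — not the patent-pending implementation — a request gets tagged with an intent by a cheap local classifier, and only the matching context slice is delivered, falling back to everything only on a miss. The keyword rules, slice names, and text below are invented for the sketch:

```python
# Hypothetical context slices; in the real system these would be retrieved,
# not hard-coded.
CONTEXT_SLICES = {
    "infra":   "deploy targets, CI pipelines, terraform modules, rollback runbook",
    "data":    "pipeline DAGs, table schemas, retention rules, warehouse layout",
    "content": "style guide, tone notes, publishing calendar, draft checklist",
}

def classify(request: str) -> str:
    """Stand-in for the on-device classification head (keyword rules here)."""
    tokens = set(request.lower().split())
    rules = {
        "infra":   {"deploy", "terraform", "rollback"},
        "data":    {"pipeline", "schema", "etl"},
        "content": {"draft", "article", "copy"},
    }
    for intent, keywords in rules.items():
        if tokens & keywords:
            return intent
    return "unknown"

def route(request: str) -> str:
    """Deliver one targeted slice; fall back to the full dump only on a miss."""
    return CONTEXT_SLICES.get(classify(request), " ".join(CONTEXT_SLICES.values()))

# The payoff: a targeted slice is a fraction of the full context dump.
full_tokens = sum(len(s.split()) for s in CONTEXT_SLICES.values())
targeted_tokens = len(route("review the draft article").split())
```

The classification step involves no model call at all here, which is the point of the design: deciding *what* context to load is far cheaper than loading it, so it can run locally before the language model is ever invoked.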

The post-routing average settled around 470 contributions per week. Nearly ten times the pre-brain baseline.

The Story the Chart Tells

Pull back and look at the full 52 weeks. Not the peaks — the floors.

Phase one: roughly 50 contributions per week on active weeks, with many weeks at zero. Effort-driven output that couldn’t sustain.

Phase two: the brain comes online. The floor rises to roughly 200 per week. Four times baseline. Sustained across months, through holidays, through project switches.

Phase three: the routing engine comes online. The floor rises again to roughly 470 per week. Nearly ten times baseline. Velocity and discipline together — feature branches, pull requests, code review, automated CI/CD, and 15 tagged production releases.

Two architectural breakthroughs. Each one permanently raised the baseline. The second one compounded on the first — it didn’t just add velocity, it added velocity while improving code quality practices. The contribution calendar shows the sustained elevation. The git log data confirms the scope behind the numbers.

This isn’t a story about one exceptional week. It’s a story about two step-changes in what a normal week looks like — visible in a year of public data, verifiable by anyone who checks the contribution graph.

What This Means

I want to be careful about overclaiming, because it would be easy to extrapolate one person’s data into universal conclusions.

What the data shows concretely, for one practitioner: persistent context that compounds across sessions produced a measurable, sustained floor rise in output. Not a one-time spike. A permanent elevation. And when intelligent routing was added — delivering the right context at the right time instead of all context all the time — the floor rose again.

The question I keep coming back to: what would it mean if the floor rose for every practitioner on a team? Not the peak performance — the baseline. The sustained, week-over-week, month-over-month output floor. Run the math on even a modest floor rise across twenty or fifty or a thousand practitioners and the numbers get very large very fast.
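To make "run the math" concrete, here is the back-of-envelope version. The floor values come from the data above; the team size and the assumption that the rise transfers linearly to other practitioners are mine, not the dataset's:

```python
# Back-of-envelope only: the team size and linear-transfer assumption
# are hypothetical.
baseline_floor = 50    # pre-brain active-week average (contributions/week)
brain_floor = 200      # phase-two floor (contributions/week)
practitioners = 50     # assumed team size
weeks = 52

extra_per_person = (brain_floor - baseline_floor) * weeks  # 150 * 52 = 7,800
team_extra = extra_per_person * practitioners              # 390,000 per year
```

Even granting that contributions are a crude proxy for value, the scaling is what matters: the per-person delta multiplies, it doesn’t average out.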

The full data, methodology, and verification details are on the proof page. The contribution graph is public at github.com/bhandrigan. Detailed git log data is available on request for due diligence. Contribution activity is publicly disclosed — repositories are private, but the activity data is independently verifiable.

This is 52 weeks of compounding intelligence, visible in the data. Two breakthroughs. Two floor rises. The interesting thing was never the spike. It was always what came after.