Research · March 10, 2026 · 7 min read

Your Lab's Institutional Memory Is Graduating This May


Every spring, the same thing happens in computational research labs worldwide.

A PhD student defends. There's a reception with sheet cake from Safeway. Photos get posted to the lab Twitter. Two weeks later, the PI sits down to run the analysis workflow that student spent three years perfecting — and realizes they can't reproduce it from the documentation.

Not because the student was careless. They documented everything. There's a README. There's a wiki entry. There's a GitHub repository with commits.

But the README doesn't explain why they chose ENCUT=520 eV for that particular class of perovskites, and not 450. The wiki entry doesn't capture the three-month detour into a dead-end functional that taught them something crucial about the system. The commits show what changed, not why.

The student carried all of that in their head. They graduated. And now it's gone.

The Numbers Nobody Talks About

A computational materials science PhD student spends roughly five years accumulating domain expertise: validated workflows, parameter intuitions, and a catalog of dead ends that taught them what not to try.

Reproducing that expertise from scratch takes a new student twelve to eighteen months minimum — and they'll make most of the same mistakes along the way.

If your group has four PhD students at any given time, and each one takes a year to reach full productivity after joining, you're losing roughly 4 person-years of productive research time per graduation cycle just to knowledge reconstruction.

At a fully-loaded cost of $60K/year for a grad student stipend plus overhead, that's $240K per four-year cycle walking out the door — not in salary, but in unrealized time.
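The arithmetic above can be sketched as a tiny model. The function name and all figures are the article's own assumptions (four students, a one-year ramp-up, $60K fully-loaded cost), not measured data:

```python
# Back-of-envelope model of knowledge-reconstruction cost.
# All inputs are the article's stated assumptions, not measurements.

def reconstruction_cost(n_students: int,
                        ramp_up_years: float,
                        loaded_cost_per_year: float) -> float:
    """Dollar value of productive time lost while replacements rebuild context."""
    return n_students * ramp_up_years * loaded_cost_per_year

lost_person_years = 4 * 1.0  # four students, one year each to full productivity
cost = reconstruction_cost(n_students=4,
                           ramp_up_years=1.0,
                           loaded_cost_per_year=60_000)
print(f"{lost_person_years:.0f} person-years, ${cost:,.0f} per cycle")
```

Varying `ramp_up_years` is the lever the rest of this piece is about: any infrastructure that shortens ramp-up scales the savings directly.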

Why Documentation Doesn't Solve This

The instinct is to document more. We've all tried it.

The problem isn't quantity of documentation — it's that documentation captures procedures, not reasoning. A new student can follow the procedure and still not understand the reasoning behind it. When the procedure doesn't work on a new system (and it won't), they don't know whether to change the KPOINTS, the cutoff, the smearing, or all three.

The reasoning lives in the connections between experiments. It lives in the conversation the PI had with the student in a group meeting about why titanium dioxide and tin dioxide need different approaches even though they look structurally similar. It lives in the literature review the postdoc wrote in a draft that never became a paper.

Notion, Confluence, GitHub wikis — they're all containers for information. None of them are systems for connecting that information when someone asks a question they've never thought to document. This is why research wikis fail — not because of which tool you chose, but because of the structural gap between documentation and reasoning.

What Actually Works

The labs that handle this best share one trait: they treat knowledge management as a first-class activity, not an afterthought. They structure it around research context, not deliverables.

The difference: deliverable-focused documentation asks "what did we do?" Context-focused documentation asks "what did we learn, and what does that mean for everything we're working on?"

A VASP calculation has a deliverable (the output files) and context (why we ran this calculation, what hypothesis we were testing, what we found that surprised us, and what we'd do differently now). The deliverable is reproducible from the files. The context is what lets the next person build on it rather than reconstruct it.
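As a sketch of the distinction, a context record for a single calculation might look like the following. Every field name and value here is an illustrative assumption for this article's ENCUT example, not a real ResearchOS schema or an actual result:

```python
# Hypothetical "context record" for one calculation.
# Field names and values are illustrative, not a real schema or real data.

calc_context = {
    # The deliverable: reproducible from the files themselves.
    "deliverable": "relax_run_042/OUTCAR",
    # The context: what the files cannot tell the next student.
    "hypothesis": "520 eV cutoff is needed to converge stress for this perovskite",
    "alternatives_tested": ["ENCUT=450", "ENCUT=520"],
    "surprise": "the lower cutoff looked converged in energy but not in stress",
    "would_do_differently": "run the cutoff sweep before the full relaxation",
}

# The deliverable answers "what did we do?";
# the remaining fields answer "what did we learn?"
for field in ("hypothesis", "surprise", "would_do_differently"):
    print(f"{field}: {calc_context[field]}")
```

The point is not this particular format; it is that the four context fields are exactly what a README, a wiki page, and a commit log routinely omit.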

This is the core insight behind ResearchOS. It captures and connects research context — across experiments, literature, and conversations — in a way that makes the institutional memory of the lab queryable rather than buried.

When a new student joins your group and asks, "How do we handle surface terminations for this class of catalysts?", ResearchOS can draw on three years of your group's specific decisions, failures, and learnings to give them an answer grounded in your lab's actual practice, not generic documentation.

The Graduation Season Test

Here is a practical test. Pick any PhD student in your lab who is within twelve months of defending. Then ask:

Graduation Test — answer honestly

If they left tomorrow, what would take more than two weeks for someone else to reconstruct?

Could a new student reproduce their most complex workflow from existing documentation alone?

How long would it take to get the lab's full methodological context into someone new joining today?

If the honest answer to any of these makes you uncomfortable, you have the institutional memory problem.

The good news: this is a solvable problem. It just requires treating knowledge management the way you treat your code — as something worth designing, not just accumulating.

Every PhD graduation is not just a milestone. It's a scheduled knowledge loss event. The labs that treat it that way are the ones that build infrastructure before it happens.


Probe / ResearchOS

ResearchOS was built for the reasoning problem, not the storage problem. If you're running a computational research lab and want to see what that looks like in practice, we're working with founding labs at R1 universities through June 2026.

probe.onstratum.com →
Sean / Stratum
© 2026 Stratum · hello@onstratum.com · onstratum.com