Attempting to map the first stirrings of blockchain research is, in some ways, an exercise in patience. By 2009, the concept was barely on the radar for most research institutions. Bitcoin’s white paper had just started circulating, and outside a handful of cryptography forums, few recognized how transformative distributed ledgers might become. For statisticians and economists hoping to capture those early research and development efforts, the ISIC code system seems, at first glance, like an imperfect but necessary tool. Specifically, class 7210: research and experimental development on natural sciences and engineering.


But there’s the rub. ISIC 7210 is as broad as it is necessary. It includes not just computer science and cryptography but also physics, chemistry, biology—almost any field one could imagine. Blockchain, tucked away in its infancy, is unlikely to appear as a separate line item. Instead, the work hides in generic project titles, in university grant proposals, and in the technical output of corporate labs, often camouflaged by more conventional research themes.


So how does one begin to document blockchain R&D in 2009 using this classification? It’s a process built on inference and careful screening. First, gather a comprehensive list of research organizations and labs—university-based and corporate—registered or active under ISIC 7210. National registries may help, but here, too, there’s often a need to dig through annual reports or research grant databases. Some countries are more transparent than others.
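The first pass described above can be sketched as a simple registry filter. This is a minimal illustration, not a real registry interface: the CSV layout, the column names, and the organization names are all hypothetical stand-ins for whatever a national registry actually exposes.

```python
import csv
import io

# Hypothetical registry extract. The columns ("org_name", "isic_code",
# "country") and the rows are illustrative, not a real registry schema.
REGISTRY_CSV = """org_name,isic_code,country
Example University Crypto Lab,7210,NL
Example Corp Research Division,7210,US
Example Retail Analytics,6201,US
"""

def orgs_under_isic(csv_text, code="7210"):
    """Return the names of organizations registered under the given ISIC class."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [row["org_name"] for row in reader if row["isic_code"] == code]
```

In practice the same filter would run over an export from a national business register or a grant database; the point is only that the wide net starts with everything under 7210, before any blockchain-specific screening.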


From this pool, the next step is to look for signals—subtle or otherwise—of blockchain-related inquiry. In 2009, almost no one would have used the term “blockchain” outright. More likely, projects fell under cryptography, distributed computing, peer-to-peer networks, or data integrity. It’s not elegant, but screening project abstracts and publication titles for phrases like “distributed ledger,” “consensus algorithm,” “decentralized database,” or even “trustless network” can be surprisingly revealing.
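That phrase screening can be sketched in a few lines. The phrase list below mirrors the proxies named above, plus two obvious neighbors; it is illustrative rather than exhaustive, and, as the text notes, "blockchain" itself is deliberately absent because the term was not in circulation in 2009.

```python
# Proxy phrases for blockchain-adjacent work circa 2009.
SIGNAL_PHRASES = [
    "distributed ledger",
    "consensus algorithm",
    "decentralized database",
    "trustless network",
    "peer-to-peer",
    "distributed timestamping",
]

def screen_abstract(abstract):
    """Return the signal phrases that appear in an abstract, case-insensitively."""
    text = abstract.lower()
    return [phrase for phrase in SIGNAL_PHRASES if phrase in text]
```

Run over a corpus of 2009 project abstracts, anything returning a non-empty list is a candidate for closer reading, not a confirmed hit.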


A further layer of screening comes from reviewing faculty or research staff profiles. University computer science departments, particularly those with strengths in cryptography or distributed systems, are natural places to start. Sometimes, you find a cluster of researchers who would later be known for blockchain work. Retrospective analysis of their early publications—occasionally only tangentially related to blockchain—can reveal the trajectory of the field before it had a name.


Corporate labs are both easier and harder to assess. On one hand, some larger technology firms maintain archives of their R&D projects. On the other, much of the work is either unpublished or cloaked in non-disclosure. Still, patent databases offer another route. Searching for applications related to cryptographic ledgers or distributed consensus filed in 2009 can surface some of the earliest commercial blockchain experiments, even if the inventors themselves didn’t describe them that way at the time.
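The patent-database pass reduces to a filter on filing year and title terms. Everything here is a stand-in: the record structure, the field names, and the sample titles are hypothetical, since real patent APIs each have their own schema.

```python
from datetime import date

def early_ledger_patents(records, year=2009,
                         terms=("cryptographic ledger", "distributed consensus")):
    """Filter hypothetical patent records (dicts with 'title' and 'filed')
    to those filed in the target year whose titles mention a search term."""
    hits = []
    for rec in records:
        if rec["filed"].year != year:
            continue
        title = rec["title"].lower()
        if any(term in title for term in terms):
            hits.append(rec["title"])
    return hits

# Illustrative sample, not real filings.
SAMPLE = [
    {"title": "Distributed consensus in a transaction network", "filed": date(2009, 6, 1)},
    {"title": "Cryptographic ledger for audit trails", "filed": date(2011, 2, 3)},
    {"title": "Compression of video streams", "filed": date(2009, 4, 9)},
]
```

A real search would also sweep abstracts and claims, not just titles, and would likely use the database's own classification codes alongside free-text terms.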


It’s tempting to rely on keyword searches alone, but there are pitfalls. Language is slippery, and terms that seem definitive now—“blockchain,” for instance—were almost nonexistent then. Instead, analysts must lean on context. Was this a project about securing peer-to-peer transactions? About enabling distributed timestamping? About trustless value exchange? Sometimes it’s only with hindsight that the true nature of a research initiative becomes clear.
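One way to "lean on context" rather than single keywords is to require hits across several independent signal categories before flagging a project. The categories and cue phrases below are a hypothetical grouping for illustration; the idea is only that a securing-transactions cue plus a timestamping cue plus a trustlessness cue is far stronger evidence than any one phrase alone.

```python
# Hypothetical signal categories: each group of cues points at a different
# facet of what would later be called blockchain work.
SIGNAL_CATEGORIES = {
    "p2p": ["peer-to-peer", "p2p network"],
    "timestamping": ["distributed timestamping", "timestamp server"],
    "trust": ["trustless", "without a trusted third party"],
}

def context_score(abstract):
    """Count how many distinct signal categories an abstract touches.
    Requiring two or more guards against single-keyword false positives."""
    text = abstract.lower()
    return sum(
        any(cue in text for cue in cues)
        for cues in SIGNAL_CATEGORIES.values()
    )
```

A threshold of two or three on this score is still a heuristic for triage, not a verdict; hindsight review remains the final arbiter.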


A peculiar challenge emerges in the university sector. Research grant agencies may assign projects to broad ISIC codes without much specificity. So, a grant for “innovative data structures in distributed systems” might cover early blockchain experiments, or it might not. Here, interviews and direct correspondence can be surprisingly effective. Reaching out to principal investigators, or reading interviews they’ve given in later years, occasionally produces the kind of color that official records lack.


In drawing up guidelines for documenting early blockchain R&D under ISIC 7210, a few best practices stand out. Begin with as wide a net as the data allows—don’t filter too early. Use layered screening, starting with keywords but moving quickly to abstracts, patent filings, and, where possible, staff or inventor histories. Pay attention to the ambiguity of project titles and remember that misclassification is almost inevitable at this early stage. Keep a running log of methodology: what sources were checked, what filters applied, and, perhaps most important, what was missed or left uncertain.
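The running methodology log suggested above can be as lightweight as an append-only list of structured entries, later dumped to JSON for the project record. The field names here ("source", "filters", "gaps") are one possible shape, not a formal standard.

```python
import json

def log_step(log, source, filters, gaps):
    """Append one screening step to an in-memory methodology log.
    Recording what was checked, how it was filtered, and what was
    missed or left uncertain."""
    log.append({"source": source, "filters": filters, "gaps": gaps})
    return log

methodology = []
log_step(methodology, "national R&D registry",
         ["isic=7210"], "two countries lacked 2009 coverage")
log_step(methodology, "patent database",
         ["filed=2009", "title keywords"], "unpublished corporate work not captured")

# Serialize for the project record.
record = json.dumps(methodology, indent=2)
```

Keeping the "gaps" field mandatory is the useful discipline: it forces each step to state what was missed, which is exactly the information official records tend to lack.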


The process, in truth, is as much about recognizing absence as presence. There will always be projects that slip through the cracks—work done quietly, outside the bounds of formal classification, only later understood for its significance. By combining registry data, public records, and patient archival work, it becomes possible to reconstruct a plausible account of how blockchain research began to seed itself in the institutional landscape. Some signals remain faint, but a patient approach can reveal connections invisible to more hasty methods.