Why I shared my Intelligence Notebook with Data Maturity Matters
A guest post by Barry Hurd
Note from the editor: Barry has been part of the Data Maturity Matters open-source consortium for over a year, contributing directly to the academic development of the Data Readiness Levels (DRL) and data maturity standards frameworks. As the consortium grows, we are keen to create platforms that amplify the expert voices, innovations, and thinking driving this work forward. Thanks to Barry for agreeing to share his thinking here:
Over a year ago I started working with Antonia Manoochehri on Data Readiness Levels: the standards framework the Data Maturity Matters consortium is building to help organisations understand where their data actually stands before they throw money at a problem they can’t describe.
That same data problem was yet another reason I was building the Intelligence Notebook: a knowledge graph visualiser that turns structured data into interactive 3D networks you can think with. And the more I worked on defining Data Readiness Levels (DRL), the more I realised these two things share a single pipeline and problem space. Data Maturity Matters is building the upstream discipline. The notebook is what becomes possible downstream when that discipline is in place.
So I want to share what the notebook does and why this community is the right audience for it.
It Offers a Different Analytical Lens
The notebook currently has 20 view modes.
Not for spectacle, but because different questions demand different spatial arrangements of the same data.
Working in 3D graph environments is not for everybody.
For the critically minded analyst, though, it serves a practical purpose: a fraud analyst needs a different spatial arrangement than a geopolitical strategist, who needs a different view than a compliance officer, who needs a different view than an HR analyst mapping organisational influence.
Same data, radically different cognitive frames. That’s not a feature list; it’s the difference between seeing data and understanding it with an applied purpose.
It Accelerates Research Work: Structurally, Not Incrementally
Most researchers don’t start with structure. The key to the Intelligence Notebook is that the first investigation builds the concept into a graph, and the notebook does this automatically with AI from basic data sets or problem statements. Every subsequent investigation inherits everything that already exists. New entities connect to old ones. Old relationships inform new questions. The marginal cost drops while the marginal value increases with every node added.
The notebook accelerates this with AI Import, which extracts entities and relationships from unstructured text, and a Deep Research Panel that generates structured research prompts across ten domains: background, network mapping, financial assessment, risk evaluation, geopolitical context, technology positioning, media sentiment, personnel profiling, competitive intelligence, and future outlook. These prompts run against live web search and return findings that auto-populate as new graph entities.
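The inheritance idea above can be sketched in a few lines. This is a minimal illustration, not the notebook's actual API or schema: the `KnowledgeGraph` class, its field names, and the example entities are all assumptions made for the sketch. The point is simply that merging new findings into an existing graph enriches old entities rather than duplicating them, which is why each investigation gets cheaper.

```python
# Minimal sketch (hypothetical, not the notebook's real API) of merging
# AI-extracted entities and relationships into an existing graph, so that
# each new investigation inherits everything already there.

class KnowledgeGraph:
    def __init__(self):
        self.entities = {}   # name -> attribute dict
        self.edges = set()   # (source, relation, target) triples

    def merge(self, entities, relationships):
        """Add new findings; existing entities are enriched, not duplicated."""
        for name, attrs in entities.items():
            self.entities.setdefault(name, {}).update(attrs)
        for src, rel, dst in relationships:
            self.edges.add((src, rel, dst))

g = KnowledgeGraph()
# First investigation seeds the graph with one entity.
g.merge({"Acme Corp": {"type": "company"}}, [])
# A later import connects a new entity to the existing one.
g.merge(
    {"Jane Doe": {"type": "person"}},
    [("Jane Doe", "director_of", "Acme Corp")],
)
```

Because `merge` is idempotent on edges and additive on attributes, the marginal cost of each new finding is one dictionary update, while every prior node becomes a potential connection point.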
For me this collapses research that would take a team days into hours or minutes. Not because the thinking is automated, but because the mechanical overhead of information management disappears. You stop holding relationships in your head and start actually analysing them.
Enhancing Analysis at Stage Zero
This is the part most relevant to what Data Maturity Matters is building with Data Readiness Levels (DRL) and Total Data Quality Management (TDQM).
A senior analyst’s edge isn’t just knowing more facts: it’s carrying an internal map of how things connect. Experienced researchers have learned to connect the dots: which entity types cluster together and what relationship patterns signal risk.
That mapping capability took years of exposure to build.
The notebook externalises it.
When a junior analyst opens Confidence Depth and sees an entire high-priority cluster sitting on the “unconfirmed” layer, the visual encoding delivers the insight directly: no ten years of experience required. When they switch to Counter-Intelligence View and see a low-confidence external entity two hops from a critical asset, the spatial arrangement does the analytical work that would otherwise require deep institutional knowledge.
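The “low-confidence entity within two hops of a critical asset” pattern is, structurally, just a bounded graph search. A minimal sketch of that logic, with made-up field names and data (the notebook's actual schema and views are not shown here):

```python
from collections import deque

# Hypothetical sketch: flag low-confidence entities within max_hops of a
# critical asset -- the pattern the Counter-Intelligence View surfaces
# spatially. Data shapes and thresholds are assumptions for illustration.

def flag_nearby_low_confidence(adjacency, confidence, asset,
                               max_hops=2, threshold=0.5):
    """Breadth-first search from `asset`, collecting entities below the
    confidence threshold that are reachable within max_hops."""
    seen = {asset}
    queue = deque([(asset, 0)])
    flagged = []
    while queue:
        node, depth = queue.popleft()
        if depth >= max_hops:
            continue  # do not expand beyond the hop limit
        for neighbour in adjacency.get(node, []):
            if neighbour not in seen:
                seen.add(neighbour)
                if confidence.get(neighbour, 0.0) < threshold:
                    flagged.append((neighbour, depth + 1))
                queue.append((neighbour, depth + 1))
    return flagged

# Toy data: a critical server linked to a vendor, which is linked to a
# low-confidence shell company two hops out.
adjacency = {
    "critical-server": ["vendor-x"],
    "vendor-x": ["shell-co"],
}
confidence = {"vendor-x": 0.9, "shell-co": 0.2}
```

Running `flag_nearby_low_confidence(adjacency, confidence, "critical-server")` surfaces `shell-co` at two hops: the same conclusion a senior analyst would reach from institutional knowledge, made mechanical.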
Yet graph visualisation doesn’t replace expertise. It democratises the structural intuition that expertise provides. For organisations investing in People Analytics and AI enablement, this closes the gap between data readiness and analytical capability across your entire team, not just your most experienced people.
It Doesn’t Replace Humans: Strategic Experts Get Multiplied Tenfold
On the other end of the spectrum, senior analysts get multiplied rather than merely assisted or replaced.
A strategic expert who would manually research one entity’s background, then its financial exposure, then its network connections (sequentially, over days) can now launch all ten research dimensions simultaneously through the Deep Research Panel and review synthesised findings as they return.
The AI handles the mechanical research.
The expert applies judgment.
When you pair experienced human judgment with parallel automated research across ten analytical dimensions at once, you’re not saving someone 20% of their time. You’re multiplying what they can produce in a given period tenfold while keeping quality rooted in the expertise that actually matters. That’s not incremental improvement: it’s a structural multiplication of your best people.
The Bridge Between Data Maturity and Data Intelligence
You can’t build a meaningful knowledge graph on immature data. The entity extraction, relationship mapping, confidence scoring, and provenance tracking that make the notebook useful all depend on data that’s been collected, structured, and validated properly.
Data Maturity Matters (DMM) builds the upstream discipline. The Intelligence Notebook demonstrates what becomes possible downstream. Together they tell a complete story: mature your data foundations, then transform that mature data into visual, interactive, explorable knowledge that accelerates every analyst on your team.
The notebook processes 500+ OSINT data types across 50+ domains, supports hundreds of entity fields and relationship fields, and integrates NATO/Admiralty grading and ICD 203 quality frameworks. It was built for intelligence work, but the philosophy applies anywhere you have complex, connected data and humans who need to make decisions about it.
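For readers unfamiliar with the NATO/Admiralty system mentioned above: it grades every piece of information on two independent axes, source reliability (A through F) and information credibility (1 through 6). A minimal sketch of attaching such a grade to a finding; the class and field names are illustrative assumptions, not the notebook's schema:

```python
from dataclasses import dataclass

# The Admiralty (NATO) grading scales. These letter/number meanings are
# the standard ones; the surrounding class is a hypothetical sketch.
RELIABILITY = {"A": "completely reliable", "B": "usually reliable",
               "C": "fairly reliable", "D": "not usually reliable",
               "E": "unreliable", "F": "reliability cannot be judged"}
CREDIBILITY = {1: "confirmed", 2: "probably true", 3: "possibly true",
               4: "doubtful", 5: "improbable", 6: "truth cannot be judged"}

@dataclass(frozen=True)
class AdmiraltyGrade:
    reliability: str   # source axis, A-F
    credibility: int   # information axis, 1-6

    def __post_init__(self):
        # Reject grades outside the two standard scales.
        if self.reliability not in RELIABILITY:
            raise ValueError(f"reliability must be A-F, got {self.reliability!r}")
        if self.credibility not in CREDIBILITY:
            raise ValueError(f"credibility must be 1-6, got {self.credibility!r}")

    def label(self) -> str:
        return (f"{self.reliability}{self.credibility}: "
                f"{RELIABILITY[self.reliability]}, "
                f"{CREDIBILITY[self.credibility]}")
```

A grade like `AdmiraltyGrade("B", 2)` reads as “B2: usually reliable, probably true”, which is exactly the kind of structured confidence signal a Confidence Depth view can layer spatially.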
At the end of the day, I just want to leave data professionals thinking about working with better data at scale, both creatively and scientifically.
Looking forward to the conversation with this community and to continuing the work with Antonia on bridging the gap between data maturity and data intelligence.
Feel free to reach out and ask questions (https://www.linkedin.com/in/barryhurd/): I like a good intelligence conversation.