A Strategic Brain for the R&D Ecosystem
How to build a shared analytic capability for public R&D investment
When UKRI was set up seven years ago, one of its aims was, in the words of its first Chair Sir John Kingman, to provide a ‘strategic brain’ for the UK’s R&D system.
It’s not a phrase you hear much nowadays, but the sentiment hasn’t gone away; it resurfaces periodically in institutions and documents, including the Office of Science and Technology Strategy and the Science and Technology Framework. What all these policies have in common is the reasonable belief that better understanding of the research and innovation landscape in the UK would make for better public investment decisions.
I think it’s time we had another crack at this idea. There is a growing appreciation across Government that good data is essential for good investment decisions, and a welcome drive across DSIT to prioritise how public investment supports the development of so-called ‘critical technologies’ like AI and quantum. And with budgets tight, it is important to be strategic in how public money is spent.
What follows is a rough outline for how we might build an analytic capability to support innovation investment decisions.
Summary: A new analytic capability, shared between DSIT and UKRI, whose job is to provide a timely, detailed source of R&I analytics to inform decision-making, measure impact, and make the R&I system more legible to investors and to the public.
Why is it needed? When I last worked in Government, less than a decade ago, UK R&D policy focused on increasing the quantity of research funded. The May Government’s headline research policy was to increase UK R&D investment to 2.4% of GDP, by increasing public R&D funding to drive this growth. (The quality of research funded was to some extent assumed to be a solved problem, given strong incentives towards “excellence” in the system itself compounded by the influence of the successive Research Excellence Frameworks.) But in the intervening years, UK policymakers have increasingly focused on the quality of R&D investment too: the idea that funding the right research and innovation can lead to disproportionate benefits. This makes some sense if you believe (a) that the value of R&D investments is very heterogeneous and (b) that better investment processes increase the odds of backing high-return projects.
“Mission-oriented innovation” is one flavour of this - predicated as it is on the idea that innovation investments can be coordinated with one another, aligned with broader government actions, and usefully directed, and therefore deliver greater returns. A totally different policy, but based on the same underlying conceptual logic, is the establishment of funders like ARIA. Or the government’s R&D investment in so-called “critical technologies”. Or the aim of aligning R&D funding with the emerging opportunities presented by AI.
If you want to make good investments, you need a good understanding of the R&D landscape: where the UK’s strengths are; where historical investment has gone; how academic research is linked to business growth; and what opportunities fall between the cracks of academic disciplines or old-fashioned SIC codes.
This in turn relies on good information: making the most of the large and growing amount of data that exists about the innovation system and about government spending on it, and exploiting the potential of new analytic techniques (including AI, which is already being used to great effect to analyse R&D portfolios in other industries).
As well as helping make better investments, better data and analysis would help us hold ourselves to account for how public money is spent, and help DSIT act as an “intelligent customer” for the R&D it funds. It would also help communicate the benefits that come from investment in science - an important part of public accountability. And of course, data helps mitigate the kind of groupthink that can take hold (as Paul Nightingale and James Phillips’s refreshing report on UK R&I strengths in key technologies shows).
We also need to think carefully about workflows: how the new R&D analysis gets used in practice to inform decision-making. I remember attending ministerial meetings back in 2017 where decisions were made about investments in the Industrial Strategy Challenge Fund. These were large applied R&D projects, worth tens of millions of pounds; large packs of analysis were prepared for each proposal, but were often referred to in… let’s say, a fairly cursory way. I’m convinced we can do better than that. There’s an important service design project to be done here, to work out the best way of making better R&D data salient during the investment process. This is very doable, but it is a non-negotiable part of the investment.
With that in mind, here are four design principles that a world-class R&D analysis function should follow:
Create useful knowledge: Analysis needs to be useful; data for data’s sake is pointless. The aim should be to produce insight in a timely way that can inform UKRI and DSIT investment decisions. (This is likely to involve some iteration and careful design work – an interesting applied social science question.)
Be intellectually ambitious: this is an area where the UK can genuinely aspire to global leadership among research funders. The unit should seek to be an exemplar for other research funders around the world, and should seek to exploit novel data sources and the potential of AI to make sense of the data we already have but don’t use. Let’s think big.
Work in partnership: currently UKRI and DSIT have their own analytic functions. To some extent this is understandable, as they have different remits; but the need to understand the UK’s R&I ecosystem and the impact of public investment is a shared interest. Pooling resources will reduce duplication and (because this is a fixed-costs game) increase quality. The unit could also work closely with the Economic Growth mission board, which suffers from a lack of good innovation data, and with HM Treasury and HMRC (who among other things control data on R&D tax credits – the dark matter of public R&D funding).
Be pragmatic between “buy” and “build”. There are some excellent data resources on the R&I ecosystem produced by businesses (including scientometrics, business microdata, innovation input and output data such as patents, creative economy inputs, or primary equity investment, and novel analyses such as bespoke industry classification). Researchers in fields like innovation studies have been working on the analysis of national systems of innovation for decades. Equally, the Government has access to useful proprietary information like UKRI’s Databank or administrative data on firms. The new unit needs the capability to decide pragmatically what to buy access to, what to build within government, and where to partner with external experts, for example through fellowship schemes.
I think the time has come to up our game when it comes to R&D analysis. As global competition intensifies and budgets tighten, the countries that can see their innovation landscapes most clearly will have the strategic advantage. This plan presents a serious opportunity to increase the effectiveness and value for money of our investments in R&D, by being significantly more ambitious in how we understand our innovation ecosystem.
If you’re interested, please get in touch!
Stian,
one of the big challenges here is persuading the Research Councils and Research England to do anything differently on the basis of the data analytics that you suggest collecting. We have always had good data about the geographic distribution of grant money, for example, but there has never been any willingness to do anything to the "proposal-peer review-issue grants" system that would alter the outcome in a way consistent with policy objectives like levelling up. That said, I fully support your thinking - we have focused for many years on growing the _inputs_ to the research system, with a pleasant but naive assumption that if we looked after the inputs the outputs would take care of themselves. I don't think we can or should take that for granted.
good luck!
John
Hi Stian - as you can imagine this is all music to my ears!
Additional considerations:
- Invest in the boring stuff: build data and modeling infrastructure to deliver insights closer to real time vs. discrete reports and papers that are out of date by the time they are published.
- Co-locate data scientists and policy experts: ensure that government bodies have embedded analytical capabilities that can respond to urgent policy priorities; complement this tactical capability with R&D centres of expertise / external teams developing new methods to push the frontier. A hot take here is to minimise reliance on consultants who lack local context / advanced capabilities.
- Use mixed methods: combine data analysis, experiments and qualitative information from interviews and stakeholder engagement to create virtuous cycles of pattern recognition / hypothesis development and testing.