Independent report

Organising for Digital Delivery

Published 22 July 2021

Organising for digital delivery - Report from the Digital Economy Council, 9 September 2020

Foreword

Report presented to the Digital Economy Council, 1 October 2020. The findings from this report are being taken forward by the new Central Digital and Data Office (CDDO) and the Modernisation & Reform Team in the Cabinet Office. The report will also inform decisions made in the next Spending Review.

Background

The UK Civil Service faces some of its greatest challenges in a generation. As the country manages the coronavirus pandemic and prepares for EU Exit, we continue to look ahead to the challenges and opportunities presented by new technologies and a changing workforce. The opportunity, of course, is significant. There are some 500,000 civil servants currently engaged in delivering services to the public, many of them spending a material proportion of their time on tasks which could arguably be delivered more effectively, and with a better experience for citizens, through technology and automation. The Civil Service Digital, Data and Technology (DDaT) function is integral to this work, both at the centre and across departments, and the Government has set an ambition to make UK Government digital services the best in the world, meeting or exceeding the benchmark set globally by the best public or private sector standards.

Against this background, the Cabinet Office and the Department for Digital, Culture, Media and Sport (DCMS) recently invited the Digital Economy Council to advise on how best to organise the Government’s DDaT function going forward to meet these new challenges. The focus was on central Government but the analysis and recommendations may equally have relevance to broader public sector organisations including agencies, non-Departmental public bodies and local authorities. The report is a summary of input from members of the Digital Economy Council, a review of past documentation, interviews with senior leaders including responsible Ministers past and present, practitioners across Government and industry, and discussions with various international agencies. We are grateful in particular for detailed feedback on an earlier draft of this report from a wide range of experts and senior leaders across both Government and industry.

This report summarises the current status of the DDaT function across Government, highlights successes and challenges and makes a number of recommendations based on both current global best practice and the specifics of managing the DDaT function within the UK.

The current situation

GDS was first established in 2011 as a central delivery unit with a remit to raise the profile of digital as a transformation lever across Government, improve the performance of Government Departments in delivering digital services, and with the intent of making interactions between the citizen and government ‘digital by default’. A lot has been achieved over that period with notable successes including:

  • The creation of a central product organisation within GDS delivering important, and in many cases award winning, new cross-Government services such as GOV.UK, Notify, and Pay, helping the UK reach the top spot on the United Nations e-government and e-participation survey in 2016.
  • A significant up-weighting of the capability of digital delivery across multiple Government Departments, including the hiring of experienced senior digital leaders at Departments such as HMRC, MOD, DWP and the Home Office, and across broader public sector organisations and agencies such as NHSX, the Land Registry and the DVLA.
  • The development of technical and policy guidance across Government on foundational topics such as data security, data protection, Open data, API design, Cloud migration, and supplier selection.
  • Solid progress across multiple departments in migrating legacy technology platforms to lower cost, more scalable, Cloud-based infrastructures with some Departments, such as DFE, now entirely Cloud enabled.
  • Financial benefits estimated at £2.3BN over the four-year SR15 period, primarily from spend controls and consolidation of procurement.

The benefits of this capability building have been seen in a number of recent examples where UK Government Departments or Agencies have been able to build and deploy world-class services at a pace which might have been unimaginable as recently as five or ten years ago. Best practice examples include:

  • NHSX’s performance in delivering the system to support the national rollout of a Coronavirus testing service in less than four weeks from initial conception to deployment.
  • HMRC’s launch of the furlough payment scheme which, again from a standing start, succeeded in paying the wages of over nine million workers with virtually no technical or scaling issues and overwhelmingly positive feedback from both workers and employers.
  • The Home Office’s EU settlement scheme registration service which succeeded in registering over 3.5 million EU (non-UK) nationals through an entirely paperless digital service.

Each of these examples shared a number of common elements from global best practice, including:

(1) A willingness to make ruthless trade-offs to define a “minimum loveable product” for launch, with the technologists “in the room” for the discussions required to define the most critical launch scope (often referred to as P0).
(2) Cloud-based deployments to enable rapid, secure and cost-effective scaling of services in response to rapid changes in the volume of demand.
(3) Extensive use of service calls through Application Programming Interfaces (APIs) to re-use existing services for critical product elements such as identity validation and payments.
(4) A clear focus on customer interface design, incorporating insights from organisations such as the Behavioural Insights Team, to create the simplest possible customer experience.
(5) The deployment of techniques such as A/B testing or WebLab to generate statistically significant insights into user reaction to experimental changes in product design.
(6) In most cases, an ongoing commitment to, and resourcing of, the product development team to ensure rapid iteration and improvement of the product in response to observed customer behaviour, or to add features removed from the initial launch scope.
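By way of illustration of the A/B testing discipline referred to above, the statistical core is typically a simple significance test on the difference between two observed rates. The sketch below (not drawn from any Government service; all figures are hypothetical) shows a two-proportion z-test of the kind commonly underpinning such experiments:

```python
# Illustrative sketch: a two-proportion z-test, the standard statistical core
# of an A/B test on a user metric such as a form-completion rate.
# All figures below are hypothetical.
from math import sqrt, erf

def ab_test_p_value(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value via the standard normal CDF
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Hypothetical experiment: variant B appears to complete more often
p = ab_test_p_value(conv_a=4_200, n_a=50_000, conv_b=4_550, n_b=50_000)
print(f"p-value: {p:.5f}")  # well below 0.05, so the difference is significant
```

A result below the conventional 0.05 threshold is what allows a product team to act on an experimental design change with statistical confidence rather than anecdote.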

Examples of this nature are a testament to the hard work and significant progress made over the past nine years since the original formation of the GDS, and provide a degree of confidence and reassurance that the Government’s goal should be achievable. They demonstrate that, at its best, the UK Government is entirely capable of delivering world class digital services at pace and scale.

The gap to the ambition

Notwithstanding the significant progress made over the past nine years, our research has uncovered a number of significant challenges with the current approach. Whilst the UK ranked first in the United Nations survey in 2016, it has since fallen behind countries such as Denmark, Australia and South Korea in more recent surveys. There is a clear sense emerging from all our investigations that the current arrangements are unlikely to deliver the Government’s ambition to make UK Government digital services the best in the world, consistently meeting or exceeding the benchmark set globally by the best public or private sector standards.

Our research has uncovered seven barriers to delivering the Government’s ambition:

Challenge 1: Uncertain quality of technical product delivery.

Whilst there have been many examples of effective project and service delivery, these have been offset by numerous examples of projects which have failed to deliver to the required specification, or have come in significantly behind schedule and over budget. The failed attempt to deliver an effective identity service (a service which other Governments around the world and many commercial organisations have had available for many years or even decades) is a case in point. The details of that project were covered in an NAO investigation in 2019 (HC1926), but in reviewing several of these less effective projects, a number of common themes emerge in which the root cause was often a failure to follow established global best practices in product development. These include clear technical product ownership, an iterative approach to feature development with a ruthless focus on minimising feature and scope creep, and a commitment to ongoing product maintenance and feature development in the light of observed user behaviour – all features observed in the more successful recent projects. The Ministry of Justice gave us a typical illustration of the product maintenance challenge: “Back in 2014/15 we built a pretty good service that helps citizens book visits with people in prisons. We did tons of research and got to a 60% take up after launch. But once built the funding disappeared, the service atrophied, the tech stack atrophied and take up has fallen right away.” Or in the words of one of our technology CEOs, “building technology services is not the same as building bridges – it’s revenue not capex.”

Challenge 2: Unaddressed legacy systems and technical debt.

“Technical debt” refers to the situation where important operational services are provided by out of date “legacy systems”, often built on obsolete technical platforms or using programming languages that are no longer widely supported. This brings a number of challenges including very high “keeping the lights on” (KTLO) maintenance costs, data and cyber-security risks, and an inability to develop new functionality on technologies or systems that are no longer widely supported. The challenge is by no means limited to Government services, as there is a universal temptation to invest in new feature development over the “worthy but dull” task of ensuring security and stability in the underlying platforms. But the situation is particularly acute across multiple Government Departments. A recent analysis by Government Security indicates that almost 50% of current Government IT spend (£2.3BN out of a total central Government spend of £4.7BN in 2019) is dedicated to KTLO activity on outdated legacy systems, with an estimated £13-22BN risk over the coming five years. The study further assesses that some Departmental services fail to meet even the minimum cyber-security standards, and the inability to extract usable data from these legacy systems has been cited by multiple interviewees as one of the greatest barriers to process transformation and innovation across Government. By way of example, the Home Office (the Department with the largest single technology spend), whilst having a clear understanding of the risks and after 3-4 years of effort, has not been able to retire any of its twelve large operational legacy systems.

Challenge 3: Relatively weak operational performance monitoring.

A fairly standard practice in most leading private or public sector organisations is to establish a routine (typically quarterly) cadence of operational performance reviews in which the central functions review the performance of each operating division (or in this case Government Department). This would typically involve reviewing both a set of ongoing operational metrics (system uptime, volume of attempted cyber-attacks, customer satisfaction and adoption metrics, cost and efficiency performance and so forth), along with reviewing progress on major development projects so that any issues can be identified at a relatively early stage and any necessary corrective action taken. This routine discipline is not currently in place in any systematic way across the civil service. GDS put in place a performance management system in 2012, but this has fallen into abeyance with only 12% (91/777) of services currently providing updates, and with even the supporting technology now obsolete and viewed as vulnerable to cyber attack.

Challenge 4: Failure to leverage scale.

A consistent message emerging from discussions with Departmental CDIOs is a level of frustration around the level of duplication in areas such as standard procurement practices (“Why does every Department have to negotiate a separate contract for [standard industry software licence]”), an inability to share best practice between different Departments, and a lack of consistency in technical standards.

Challenge 5: Missed opportunities in leveraging Government held data sets.

A truism of much of the current popular narrative around digital technology, encapsulated in slogans such as “data is the new oil”, is that significant value can be created from access to, control of, or ownership of large data sets. The reality is that value is only created when large data sets are combined with the analytical and process capability to use them to influence action or decision making at scale across an organisation. Leading practitioners have a clear understanding that collecting data is a necessary but by no means sufficient condition for value creation, and invest as much or even more effort in building systems that use this data to drive, frequently automated, decision making or action taking. Our investigations suggest that many Government Departments are investing significant sums in collecting and storing often very large datasets but making little use of this data to influence action or decision making. As one senior external hire put it, “My biggest surprise when I arrived was how little we do with our data.”

Challenge 6: Low technical fluency across senior Civil Service leadership.

Underpinning many of the issues surfaced during this review is a general concern around the relatively under-developed level of digital expertise amongst senior Civil Service leadership. This contrasts with the emerging position in the commercial world in which technology is increasingly seen as a critical delivery lever (alongside people and money) and where it is becoming increasingly expected that senior leaders have a clear understanding of how to deploy technology effectively as an organisational lever. At a minimum leaders should be capable of auditing effectively the performance of their digital functions, including having a realistic expectation of how long projects should take, what they should cost, and what questions to ask in order to assess whether delivery is on or off-track.

Challenge 7: Confusion over the role of the central functions.

The Government Digital Service (GDS) has gone through several different incarnations in the course of its nine-year history, from an early era of “digital agitation”, through the migration of policy responsibility for data to DCMS in 2018, to a move away from central support as individual Departments have developed their own capabilities – albeit different departments have progressed at different paces with a wide variety in capability. This has, perhaps inevitably, raised a concern that GDS has to some extent “lost its way” and would benefit from a refocusing around its central mission. When McKinsey surveyed permanent secretaries about the role of the centre in late 2018 they found that, of all the centrally operated functions, departments had least confidence in GDS.

The remainder of this report sets out a series of recommendations for addressing these challenges.

The way forward

To address these challenges and give the UK the best chance of delivering on its ambition to make UK Government digital services the best in the world, we are making eight recommendations.

  1. Build mechanisms to put the citizen at the heart of all design decisions
  2. Strengthen the accountability of Departments and their Permanent Secretaries
  3. Hire a Permanent Secretary level head of function
  4. Re-focus and add teeth to the centre
  5. Create clear investment swim lanes to address the legacy debt
  6. Set up a quarterly business review process
  7. Invest in developing the technical fluency of senior civil service leadership
  8. Create a Government data application centre of excellence

Recommendation 1: Build mechanisms to put the citizen at the heart of all design decisions

Our first recommendation is primarily one about culture and behaviours but it is fundamental. Many of the world’s best public sector and commercial organisations have made a point of placing customer or citizen needs at the centre of decision-making and product design. Indeed it was one of the guiding principles when GDS was established. It is such a truism that it is rare to find organisations or leadership teams that will argue against this principle but, to quote Jeff Bezos of Amazon on the topic: “good intentions don’t work, mechanisms do”. The point here is that it is unlikely to be enough merely for the Government to assert the importance of placing citizen needs at the heart of the digital transformation. Leading public sector and commercial organisations go beyond this and put in place best-practice mechanisms to ensure that these good intentions translate into concrete actions. Examples of these mechanisms include:

  • Amazon’s CXBR review process, which ensures that, often at an early stage of product design, any product is subject to a structured, independent review by a trained expert in customer experience design, so that no opportunity is missed to create the clearest and simplest design from the customer perspective.
  • Clear and timely operational dashboards, so that senior leaders have granular, daily visibility into the lived customer experience of the services they are providing.
  • Specific mechanisms to ensure that senior leadership remains closely connected to the actual citizen or customer user experience, so that those in decision-making positions are constantly informed by a clear understanding of both the data and the anecdotal examples of the day to day reality of the customer experience.

We recommend that the Government establishes and enforces a similar set of mechanisms to make citizen experience central to design decisions. Doing so will be essential if the Government is to build what was expressed to us as “a culture where public servants see their primary obligation as building and operating services around the needs of the citizens who pay for them.”

Recommendation 2: Strengthen the accountability of Departments and their Permanent Secretaries

A constant theme emerging in our discussions has been the balance of activity and accountability between individual departments and the “centre” (whether Cabinet Office, HM Treasury or the GDS). This is similar to the challenges faced by many large multinationals with substantial operating divisions or country operations (in contrast to single product line or single division entities where a single central service often works well). In these more complex situations the general consensus is that a federated “hub and spoke” model is the most effective. This puts the primary accountability for delivery with the operating divisions with a level of “freedom within a framework” but supported by a strong, expert centre with real teeth, responsible for technical architecture and standards but federating out the tooling to the operating divisions.

As with all organisational questions there are advantages and disadvantages to any given approach. But the optimal approach here seems clearly to be to ensure that the primary accountability for technology delivery rests with individual Government Departments and their Permanent Secretaries. Crucially, however, this needs to be supported by a relatively small but strong central organisation focused on (1) holding the Departments to account for delivery, (2) setting the technical and data standards to ensure interoperability, and (3) delivering services where it can add value over and above what individual departments can achieve alone. This appears consistent with the direction of travel of recent years with the appointment of strong CDIOs in many individual departments, and aligns with the formal financial and delivery responsibilities that rest with the individual Permanent Secretaries and other Accounting Officers. Indeed, a strong central assurance function was something that Departmental CDIOs in our discussions said they would welcome, with those that had been hired externally surprised and, to an extent, disappointed to find it absent.

Recommendation 3: Hire a Permanent Secretary level head of function

Whilst we see primary delivery responsibility resting with individual Departments, we see the need to appoint a senior, Permanent Secretary level, Government Chief Digital Officer (GCDO) and head of function, along the lines of the Government’s Chief Scientist, Chief Medical Officer, and Chief Statistician. The report team contributed to the advertisement for the GCDO role, which was launched at the end of August.

Recommendation 4: Re-focus and add teeth to the centre

With the majority of the work in Departments, we see it as essential to re-focus and add teeth to a, perhaps smaller, but more capable central function within Cabinet Office, under the leadership of the GCDO, focusing on the areas where it can add value over and above that provided by the work of individual departments. We recommend therefore creating a small team taking ownership of the following six areas, which will help to hold Departments to account for delivery whilst driving interoperability between Departments.

  • Direct management of the central Government Digital product function, owning critical services such as the GOV.UK communication portal, identity validation, payments and other current and future digital services which are most effectively delivered centrally across Government.
  • Strengthening and leading a central planning, budgeting and monitoring team and process, to ensure that over £5bn of Government investment in technology is being allocated effectively, and to monitor and audit the performance of Departments against their technology commitments.
  • Setting and enforcing the technical standards needed to ensure efficient delivery of systems. This will include the creation of appropriate frameworks around technology choices, security and privacy requirements, the design of data standards, and APIs to ensure effective interoperability both within and between different Government Departments.
  • Providing leadership of the DDAT function, including creating a stronger sense of professional community with appropriate job descriptions and career paths, developing appropriate training modules, setting expectations around remuneration, and ensuring an effective transfer of talent between Departments, and between Government and industry, for both effective delivery and professional development.
  • Identifying opportunities for cost savings in technology procurement through the negotiation of central procurement contracts or, where effective, direct procurement of equipment and services across Government. Various experts made the point to us that the Government procurement machinery was written for a world of outsourcers, not cloud platforms. There is a clear opportunity to take a hard look at this and make the current procurement processes and systems more fit for the modern technology world. Some industry leaders equally made the point that different Government Departments have very different levels of knowledge and experience in how to obtain the best value from working with external industry partners.
  • Leading the quarterly business review process set out in Recommendation 6 below.

Recommendation 5: Create clear investment “swim lanes” to address the legacy debt

The current spending review process sets capital over four years and revenue/opex over three years. Within this, technology spend appears to be set Department by Department within the overall spending review settlements with no central oversight within Cabinet Office or HMT of the overall level of technology spend, the split between elimination of technical debt versus new feature development, or the trade-offs between different proposals from different Government Departments. One perhaps inevitable consequence of this has been the under-investment in resolving legacy issues and technical debt. We recommend instead that the GCDO should take a more active role in challenging technology spend during the spending review process and in addition lead an annual review of technology spend where appropriate re-allocating funds to different swim lanes and between different Government Departments within the overall envelope of the spending review. Since starting this work we are pleased to see that a clear challenge process along these lines has been put in place for the current spending review round (summary attached at Annex A), and that there is a new initiative to report technical debt to the audit and risk register for departments, with a non-executive champion for each department.

Recommendation 6: Set up a quarterly business review process

One of the more striking observations on the current approach is the lack of any regular inspection or audit mechanism of operational performance. We recommend setting up such a process - consistent with wider proposals for improving the measurement of departmental performance, and led by the central planning and monitoring team under the GCDO - to review with each major Department on a quarterly basis their performance on operational metrics (system uptime, volume of attempted cyber-attacks, customer satisfaction and adoption metrics, cost and efficiency performance and so forth), and progress on project delivery. One subsidiary benefit of such a mechanism should also be to improve the flow of best practice learning between Government Departments and / or the demand for centrally built services, since a single central team would be reviewing the activity of multiple Government Departments on a regular cadence. In the spirit of ensuring that the centre has “teeth” we would expect to see performance at these review sessions (good and bad) feeding into the overall appraisal of Departmental CDIO and Permanent Secretary performance.

Recommendation 7: Invest in developing the technical fluency of senior civil service leadership

The field of data and informatics is still a relatively new one – British mathematician Alan Turing’s seminal paper, “On computable numbers, with an application to the Entscheidungsproblem”, which first presented the notion of a “universal machine”, was published only in 1937. The first recognisable “universal computing machine” came into being only in the late 1940s (in Manchester), and the world wide web was only launched (again largely through the work of British scientist Tim Berners-Lee) in 1989. The first iPhone launched in 2007, social media services such as Facebook, WeChat, Instagram and Snapchat launched in the 2000s and 2010s, and we are already starting to take for granted machine-learning driven applications such as image and speech recognition which only began to launch at scale within the past five years.

In such a fast-moving environment, it is understandable that organisational and leadership understanding of the principles and decision-making frameworks that should underpin the effective deployment of technology have not always kept pace with the speed of development of that technology. The Senior Civil Service is not alone amongst leadership groups in lacking a solid understanding of technology but we feel strongly that this is a gap that needs to be addressed if the UK is serious in its ambition to make UK Government digital services the best in the world. Leading commercial and public sector organisations here are increasingly investing in this area, including making extensive use of online training and collaboration tools to provide both focused education and facilitate best-practice knowledge sharing. The Government itself organised sessions for Permanent Secretaries with the Oxford Internet Institute some years ago but these appear to have fallen into abeyance. We recommend therefore investing in developing training programs to ensure that all senior leaders are, at a minimum, capable of auditing effectively the performance of their digital functions, including having a realistic expectation of how long projects should take, what they should cost, and what questions to ask in order to assess whether delivery is on or off-track.

Recommendation 8: Create a Government data application centre of excellence

Our final recommendation seeks to address the challenge of missed opportunities in leveraging Government held data sets. Whilst we see (relatively) clear ownership, currently residing within DCMS, for national data policy (covering topics such as privacy, security, ethics, and innovation), ownership of the question of how Government should best seek to create value from the data sets it owns or controls has only recently been clarified. Given the complexity of this topic and the relative (global) scarcity of top talent in this field, we recommend creating a central team under the GCDO to develop best practice thinking and guidelines in this area to support the work of individual Government Departments.

It is worth saying at this point that we do not recommend the approach that is sometimes advocated of creating universal “data lakes” in which all data is somehow brought together in a single universal location. The better approach is to put the energy into clearly mapping the location of the canonical Government data sets and building the appropriate technical standards, security protocols, and APIs to enable services to make calls to these data sets when particular information is required. A related question is whether the Government should invest in creating and enforcing a common set of technologies that all Departments should use. Whilst desirable in some ways, we do not think the centre needs to (or should) enforce a common set of technologies. We absolutely see a role for the centre in defining the architecture and standards that will be critical for enabling different services to operate effectively together and make the relevant data / service calls. We also see a role for the centre in maintaining the catalogue of which data sets and which services sit where. But the whole point of defining clear data structures and APIs is that you do not then have to take on the (very hard) task of standardising the technologies used to deliver each service. All that matters is that an API call can say: “if I give you x,y,z data in the correct format, return me a,b,c response in a reasonable time”. The internal structure and underlying technology needed to define how the service translates x,y,z into a,b,c is irrelevant provided it meets minimum engineering standards of uptime, security, response time and so forth.
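The separation between a published API contract and its internal implementation can be illustrated with a minimal sketch. The service name, fields and data below are entirely hypothetical and not drawn from any actual Government system; the point is only that a caller written against the contract is indifferent to the technology behind it:

```python
# Illustrative sketch: a published API contract (postcode in, address records
# out) separated from its implementation. All names and data are hypothetical.
from abc import ABC, abstractmethod

class AddressLookup(ABC):
    """The published contract: a postcode in an agreed format in, a list of
    canonical address records in an agreed format out."""
    @abstractmethod
    def lookup(self, postcode: str) -> list[dict]: ...

class InMemoryLookup(AddressLookup):
    """One department's implementation. It could equally wrap a legacy
    mainframe or a cloud service - callers neither know nor care."""
    _DATA = {"SW1A 1AA": [{"line1": "Buckingham Palace", "city": "London"}]}

    def lookup(self, postcode: str) -> list[dict]:
        # Normalise input, then answer from the local store
        return self._DATA.get(postcode.upper(), [])

def caller(service: AddressLookup) -> str:
    # Written purely against the contract, not against any implementation
    records = service.lookup("sw1a 1aa")
    return records[0]["city"] if records else "unknown"

print(caller(InMemoryLookup()))  # → London
```

Because `caller` depends only on the `AddressLookup` contract, the owning department can replace the implementation wholesale - provided the new one honours the same inputs, outputs and minimum engineering standards - without any change to consuming services.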

Organisational implications

Adopting these recommendations would have the following organisational implications:

  • The current GDS organisation would be re-focused as a product organisation delivering the specific central cross-government services with a Director General level head reporting directly to the GCDO.
  • The creation of a separate, focused, expert central team, reporting directly to the GCDO, with responsibility for: (1) running the planning, budgeting, monitoring and auditing processes to ensure Departments deliver against their technology commitments, (2) developing and enforcing the technical standards needed to ensure efficient delivery and inter-operability of systems, (3) professional leadership of the DDAT function, and (4) updating technology procurement.
  • The establishment of a Government data application centre of excellence under the GCDO, most likely building on current best practice which exists in areas where there is already a solid repository of expertise such as ONS (for core national data and statistical correctness), the Met Office (super-computer capabilities and weather-related data), GCHQ (intelligence-related data), and NHSX (health data).
  • Finally, we see it as particularly important that the Government invests in ensuring that critical roles are staffed by expert, fully employed individuals and is cautious about relying too heavily on contractors or external consultants for core design or architectural decisions that require a long-term and holistic view across Government.

In conclusion, we are confident, given the progress to date, that if the Government were to adopt these eight recommendations then its ambition to make UK Government digital services the best in the world is entirely achievable. Equally, though, we are clear that the Government is not yet on a path to achieve this goal. These recommendations are designed as a package: we believe that collectively they are sufficient, and we do not believe any of them is unnecessary.

Annex A: Financial Planning Process

To ensure that Government investment in technology is allocated effectively, and to monitor and audit the performance of Departments against their technology commitments, we propose that the central planning, budgeting and monitoring team should carry out:

  • Challenge sessions as part of the Comprehensive Spending Review process
  • Annual reviews of key DDaT initiatives

Challenge Sessions

The Comprehensive Spending Review (CSR) provides a chance to reform and modernise the way government operates; to tackle some of our biggest risks and to accelerate our adoption of secure, scalable and sustainable cloud-based and digital technologies that improve our operations and service delivery.

The Cabinet Office has been working with departments up to COO and CDIO levels for more than a year to make the case centrally for prioritising automation, legacy technology, greater leveraging and sharing of data and digital transformation at this CSR. Departments have a number of ambitious or essential projects to pursue and we want to take full advantage of this opportunity.

Those departments with the largest known legacy technology risks, together with wider technology transformation needs, will be invited to challenge sessions. These sessions will provide an opportunity to quality-assure their bids and give the centre the visibility to ensure these cases are fed into the CSR with a clear determination on priority. The challenge sessions will review and assess departments’ proposals for automation, legacy and wider digital transformation at this CSR. The ambition is to highlight significant risks or transformation opportunities that may not normally be prioritised in departmental spending when competing with wider policy objectives, to ensure alignment between departments with complementary proposals, and to identify opportunities for scaling initiatives beyond individual departmental bids.

It is inevitable that there will be more good, necessary bids than there will be resources available within the overall envelope. HM Treasury colleagues will have to prioritise and we will look to use this process to support them to do this.

Annual Reviews

Following the completion of the Comprehensive Spending Review a central planning team will schedule annual reviews of the key initiatives that merit regular, central oversight.

The objective of these reviews will be to assess progress not just at individual department level but also across Government by initiative type, for example legacy, automation or data. They will review progress against planned milestones to ensure momentum is being maintained and also audit whether investments are delivering the returns and outcomes anticipated.

The most critical programmes and projects need stable and assured funding over longer periods of time, allowing for the strategic improvement of services. However, the annual reviews should enable re-allocation of funds where appropriate, increasing or decreasing budgets depending on need and justification. For example, where initiatives are not delivering to plan, the process should provide support in identifying remediation options; or, where discovery activity has shown that initiatives are not cost-effective, it should be possible to reallocate funding to other projects.

There should also be consideration of how to fund worthy initiatives that emerge between spending rounds. Frictionless seed funding should be made available for small-scale discoveries, with the opportunity for scaling funding, creating an innovation-friendly environment.

How we might make a more adaptable funding process work in practice will need further consideration. Whilst departments need certainty on investment levels, a CSR process every 3-4 years is not hugely flexible, and we need to identify how we might create a more agile approach. The CSR process this year would be an ideal opportunity to look at how funding for key initiatives can be assured, potentially outwith individual departmental budget envelopes, followed up with an initial annual review process in 2021.