The Data Futures Programme, launched in 2017, aims to modernise the collection of student data by funders and regulators in UK higher education, improving data quality, reducing administrative burden, and enhancing transparency. Initial alpha and beta pilots took place in 2018 and 2019, with Jisc joining in late 2019. The HESA Data Platform (HDP) went live in March 2023, with institutions required to submit their datasets by October 2023. However, the Office for Students, the Scottish Funding Council, the Higher Education Funding Council for Wales and the Department of Education – Northern Ireland extended the submission deadlines because institutions had difficulty delivering valid returns on time. These delays and the challenges experienced by institutions led to a decision in November 2023 to commission an independent investigation and report by PwC. That report, Data Futures: Independent review, has now been published by the Office for Students.

Scope

The independent review focused on the Data Futures Programme up to and including the 2022/23 data collection, which concluded in April 2024. The review examined several key areas, including the rationale, roadmap, outcomes and benefits of the programme, stakeholder relationships, governance, funding methodology, prior assurances, sector consultations, decision-making processes, and the tools for monitoring and evaluating the programme. Findings were shared with a Steering Group comprising members from the Statutory Customers and relevant sector bodies.

Findings from the report included:

  • Governance

The Data Futures governance structure faced challenges including unclear roles, optimism bias, a lack of critical risk assessment and inconsistent tracking, which led to stakeholder scepticism and institutional concerns. These concerns prompted an independent review by KPMG in 2022, which produced an overall positive report alongside three medium-rated risk areas for improvement. No further internal audits or independent reviews of the Data Futures project took place during its lifetime.

  • Sector Readiness

Institutions' data collection processes were often inefficient, generating poorly structured data that did not align with statutory data reporting models; these challenges affected both in-year and annual data collection. Outdated student records systems, staffing shortages, employee turnover and limited investment caused ongoing data quality issues and administrative burden. The 2022/23 data collection demanded extensive time from data teams because too few individuals had the relevant skills, and this will have a medium-term impact on the sector's capabilities. Student records system providers' investment in the programme varied, and there was a gap between readiness and successful delivery.

  • Delivery of the Programme

Jisc, as the technical delivery partner, faced recruitment difficulties amid the COVID-19 pandemic, which delayed essential development work. Staffing vacancies, including a prolonged absence of a testing lead, increased the burden on institutions and disrupted delivery timelines.

  • Changing Requirements

The Data Futures Programme underwent numerous changes over the years, affecting the data model, the nature of the collection, and the planned in-year collections. These changes reduced sector confidence and engagement and compressed the timeline for software development. Technical changes made in response to bugs identified after the return's 'go live' led to increased errors and workload, making it difficult for institutions to know whether errors lay in their own underlying data, in the data manipulation required to deliver the return, in the changes implemented by their student records system provider, or in false negatives introduced by errors in Jisc quality rules. As a result, institutions commonly dealt with a dramatically increased volume of errors relative to the prior year: one institution reported moving from 10,000 errors to be fixed in early summer 2022 to 250,000 errors to be fixed in early summer 2023.

  • Testing

One post that remained vacant for a substantial period (though this is no longer the case) was that of a permanent testing lead. Between October 2021 and March 2023, three Test Leads were recruited, but between October 2022 and February 2023 no Test Lead was in post, and this was a critical period for the delivery of the 2022/23 collection. As a result, the software testing undertaken by Jisc throughout the programme, up to and including the 2022/23 data collection, did not align with commonly accepted standards of good practice, and this had a substantial negative impact on the experience and burden for institutions.

  • Internal Project Management

Ongoing benefits management is crucial to project delivery, ensuring intended outcomes are achieved and tangible value is delivered to stakeholders, but HESA and Jisc's approach has not aligned with common good practice.

The project initiated a benefits assessment in 2016 and developed a benefits tracker for the project team. Although benefits realisation was discussed in some Quarterly Review Group meetings, including following the February 2023 meeting, there is no consistent, formal benefits review process with clear roles and responsibilities. The tracker has not been regularly reviewed or maintained: ten benefits have incomplete information, and none of the 32 benefits has its realisation status recorded. A benefits review is planned following the publication of the first datasets using the Data Futures return. The absence of a benefits review and tracking process during programme delivery has limited the programme's ability to assess the value it is delivering over time.

The Programme Board's Terms of Reference refer to an 'Independent Advisor' to the Board, but none was appointed. An Independent Advisor could have brought additional expertise, skill and experience in project and programme management to support the Board in its review of the progress information it received.

  • Risk Reporting

A detailed Risk Log for the programme was maintained, tracking risks, issues, decisions, and changes. It was reported on at each Programme Board meeting, though the concerns raised within the Log were not always acted upon. The Programme Board did not sufficiently challenge the impact of these risks on the viability of programme delivery, with the same risks being raised repeatedly in reports over several months and no clear path back to 'green' being sought or identified.