02.05.18

Inaccurate and unchecked: problems with local council spending data

How can we hold local authorities to account when government figures are wrong?

At first glance, it appeared to be a great story. Eighteen councils in England had nothing in their reserves to protect against financial uncertainty. The pressure local authorities are under is well publicised, but if these figures were correct, the situation would be far worse than feared.

What gave the story credence was the source. It was not a tip-off or the Bureau’s own calculations, but data supplied by local authorities and published by the government. It would have been a great scoop.

Only, it wasn’t. The figures were either wrong or misleading. And not just one figure, but all of them. In reality, the councils in question collectively held almost £100m in these reserves, according to their own local records.

So the story the Bureau had was a different one entirely: the government had published data that incorrectly showed more than a dozen local authorities had run out of the money they should hold in reserve to protect against financial uncertainty.

Not only that, but no-one had noticed the discrepancy. Not the councils that compiled the figures, nor the Ministry of Housing, Communities and Local Government, which vetted and then released these misleading figures eight months ago as part of what is supposed to be a comprehensive and reliable record of local government spending.


We only came across the discrepancies ourselves while fact-checking for our series of deep-dive pieces on council finances. And because we did, we took it upon ourselves to rigorously check each data point against the local source of information.

The Bureau contacted all the affected local authorities. The majority said the figures were not an accurate reflection of their financial position. Some said this was the result of “human error” on their part. Others, including crisis-hit Northamptonshire, which actually had £11.7m in this fund, could not explain what had gone wrong.

These data discrepancies, and the responses from the councils, strongly suggest that the published record of how councils spend public money is not only inaccurate but unchecked.

Stock photograph of local authority housing, by Katharine Quarmby/the Bureau

Council responses

Local authorities are required by law to know how much money they have in their reserves before they set their budgets. The vast majority split their ‘usable’ reserves - or ‘rainy day’ funds - between an ‘earmarked’ reserve, designated for specific future costs (such as redundancy payments or those related to PFI schemes), and an ‘unallocated’ reserve that, crucially, helps protect against unforeseen problems.

It is this latter reserve which some local authorities are not recording correctly.

A finance officer at one of the councils told the Bureau the data entry process had “not been taken seriously enough”, a concern echoed by local government experts speaking to the Bureau.  

Cheltenham Borough Council said it had made an error when the forms were filled out. New procedures are in place to make sure future submissions are “verified and approved” by the Chief Financial Officer, the council said.

South Hams District Council said the latest form was “completed by a new member of staff” who had omitted to include the figure in question. “We can confirm that we have put procedures in place to ensure that this anomaly does not happen in the future”, a council spokesperson said.  

But the Bureau’s research shows this was not an anomaly: a similar discrepancy had been recorded on last year’s forms as well.

Looking at the data recorded for 2016/17, 15 local authorities were incorrectly shown as having nothing in these reserves, 14 of which - including South Hams and Cheltenham - are on this year’s list. Oddly, the years before that showed only a handful of local authorities with empty unallocated reserves.
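The check itself is simple to automate. Below is a minimal sketch of the kind of validation that would flag these cases, assuming the government’s reserves data has been exported to a CSV file; the file and column names here are hypothetical, not those of the official release.

```python
import pandas as pd

# Hypothetical export of the government's reserves data; the official
# release uses different file and column names.
df = pd.read_csv("reserves_outturn.csv")  # columns: authority, year, unallocated

# Flag any authority recorded as holding nothing in its unallocated
# ('rainy day') reserves - a red flag worth verifying against the
# council's own published accounts.
zeroes = df[df["unallocated"] == 0]

for year, group in zeroes.groupby("year"):
    print(f"{year}: {len(group)} authorities showing empty unallocated reserves")

# Authorities flagged in more than one year point to a systematic
# reporting problem rather than a one-off data-entry error.
repeats = zeroes.groupby("authority")["year"].nunique()
print(repeats[repeats > 1])
```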

To our knowledge, no-one spotted these mistakes, even though the data is supposed to adhere to the Code of Practice for Official Statistics. According to the publication notes for the most recent release, the data had been subject to “rigorous pre-defined validation tests” performed by the local authority, then by the Ministry of Housing, Communities and Local Government, as well as by CIPFA, the Chartered Institute of Public Finance and Accountancy, acting as independent financial experts.

The Bureau approached the Ministry with our findings and asked for an explanation. A spokesperson said: “While we apply robust quality checks at all stages of the data collection and compilation process, we are ultimately reliant on local authorities to resolve data quality issues and continue to work with them on the issue.”  

To make matters more confusing, three of the 18 councils genuinely had nothing in their unallocated reserves, as the data stated. Each told the Bureau it had put the money into its earmarked reserves instead, but could not clearly explain why.

Historical data is only part of the problem

The Bureau also spent four months looking in detail at the future financial plans (known as draft budgets) of local authorities, and found further problems in those reports.

Draft budgets contain key information about how councils plan to spend public money in the coming financial year (and beyond). It is important that they are accessible and understandable, for journalists and the public. But in many cases they are not.

Attempting to aggregate this data for the first time, the Bureau met barriers at every stage. Firstly, it was difficult even to establish which authorities publish draft budgets in advance, despite requesting this information under the Freedom of Information Act.

Then the reports themselves were extremely difficult to access and standardise. There are 353 local authorities in England. Each report sits on its own page (draft budgets are not centralised in one place) and councils name the reports in dozens of different ways, making them hard even to locate. In fact, there was little uniformity in any sense, apart from the documents being published, on the whole, in the cumbersome PDF format. To make matters worse, the reports are often inaccessible to people who use accessibility software due to sight impairment, because most screen readers have difficulty scanning PDF documents.
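Even once a report has been located, turning a PDF into analysable text takes further work. As a minimal sketch of that step, assuming the pdfplumber library and a hypothetically named downloaded file:

```python
import pdfplumber

# Hypothetical file name: each council publishes its draft budget under
# whatever name, and in whatever place, it chooses.
with pdfplumber.open("draft_budget_2018-19.pdf") as pdf:
    text = "\n".join(page.extract_text() or "" for page in pdf.pages)

# The extracted text still needs manual checking: tables are often
# flattened, and scanned PDFs yield no text at all without OCR.
print(text[:500])
```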

Local authorities also divide and define their spending differently, which made comparing spending on vital services, such as adult and children’s social care, difficult and time-consuming. Croydon splits its spending across three main headings: people, place and resources. Others use more categories. Barnet, for example, has six: adults & safeguarding; assets, regeneration & growth; children, education libraries & safeguarding; community leadership; environment; and policy & resources.
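Comparing such figures programmatically therefore means first mapping each council’s own headings onto a common schema. The sketch below illustrates the idea; the shared categories and the mappings are hypothetical, and in practice many headings straddle several categories.

```python
# Hypothetical mapping from council-specific budget headings to a
# shared schema; real budget lines rarely align this cleanly.
COMMON_SCHEMA = {
    "croydon": {
        "people": "social care & education",
        "place": "environment & housing",
        "resources": "corporate services",
    },
    "barnet": {
        "adults & safeguarding": "social care & education",
        "children, education libraries & safeguarding": "social care & education",
        "assets, regeneration & growth": "environment & housing",
        "environment": "environment & housing",
        "community leadership": "corporate services",
        "policy & resources": "corporate services",
    },
}

def normalise(council: str, heading: str) -> str:
    """Translate a council-specific heading into the shared schema."""
    return COMMON_SCHEMA[council.lower()][heading.lower()]

print(normalise("Barnet", "Adults & Safeguarding"))  # social care & education
```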

Our aim in making this database was to provide accessibility and transparency around proposed spending cuts at a local, regional and national level. But these issues, small and large, were a big impediment to that.

It meant that, in order to provide vital information on proposed local spending to journalists and the public, we had to spend months trawling hundreds of council documents and manually recording and verifying the information.

These issues are compounded if we return to the source of our initial concerns - the government’s datasets.

Once approved, the information in these draft budgets will form the basis of the figures that councils provide to government. While we do not have to aggregate that data manually, because all historical budget and spending data is pulled into one place, we have already identified flaws in it.

An experienced local reporter, given enough time, can decipher and report council plans, and they do so up and down the country. However, building an accurate picture of what is happening in neighbouring areas, or across a region as a whole, is a big challenge. It is even more daunting for the average member of the public.

This matters because, as councils cut back services, people are increasingly forced to seek support across local authority boundaries. After nearly a decade of austerity, understanding the cumulative impact of reductions in local government funding has become vitally important.

Doing this requires the information published by councils, and by the government, to be standardised and accessible. At the very least it should be accurate.

If it falls short of these standards, as has been the case here, then local government spending and its impact on people’s lives is harder to measure and scrutinise. This is particularly relevant since the Audit Commission, which used to scrutinise local government, was abolished in 2015.

As Tony Travers, a professor of government at the London School of Economics, told us during our investigation: “With the Audit Commission gone, good investigative journalism is pretty well our only way of finding out where councils face serious financial difficulties.”

That is why, despite the barriers, we set out to build this database and will continue to argue for improvements in the publication and transparency of this vitally important data.


This article was published in partnership with the Local Government Chronicle.