Bettering local public services in Ireland


Credit: Elmar Gubisch on iStock

In this blog, Dr Greg Stride from the LGIU’s Local Democracy Research Centre examines the Local Government Management Agency’s (LGMA) recent report on Irish local government services. Greg finds that the approach taken by LGMA makes an important contribution to understanding the state of local government in Ireland in the run-up to the local elections and that equivalent organisations in other countries seeking to understand the state of local government services can learn from their research.

In April 2024, the LGMA published a report, the third in a series of annual ‘customer service’ reports on local government across Ireland. The research sought to explore residents’ perspectives on local services, including their awareness of who provides local services, their satisfaction with local services, who uses different services, their engagements with local authorities, where residents get their news about local services, and where they could see areas for improvement. Overall, the exercise provided a detailed overview of the state of local services in 2023 from the perspective of service users and analysed useful trends from across the last three years.

Why measure performance in this way?

The question of how to measure local governments’ performance has been a live one in England – where I do much of my research – for the last few years. A new body has been set up, called the Office for Local Government, that has a strategic objective to “increase understanding – among citizens, civil society, central government and local government itself – about data on the performance of local authorities.” Increasing understanding about data is an interesting phrase, different from increasing understanding of local government, or improving the collection of data, for example. I could certainly do with some help when it comes to understanding data about the performance of local authorities – like what sort of data on the performance of local authorities exists? And how far should we trust it?

It is still not entirely clear, to me or anyone else in the sector as far as I can tell, what the best ways to measure local government performance are. We are quite good at figuring out when things have gone wrong, but no sensible review of performance across any organisation would rate everything as either ‘fine’ or ‘catastrophic.’ The LGMA research has an answer to this question. They ask residents.

Asking residents for their opinions puts service users at the heart of local government performance, as they should be. The methods and rationale for measuring service satisfaction, the difference between customer satisfaction and service satisfaction, and criticisms of previous surveys are well set out by the IPA in a paper from 2020.

What did they find?

From a survey of over 2,000 residents, the LGMA found several useful results. However, I will limit myself to the three I think are most interesting and relevant to the local elections.

1. Awareness – nearly everyone knows something local government provides, but there are big gaps.

99% of residents were aware that some or all of the 29 listed services are provided by local government. This is undoubtedly good news for local democracy – residents should know what they are voting on to keep local elected politicians accountable for service quality. However, a few services have relatively low awareness (business support has the lowest, at under 50%).

There were some variations across demographic and regional variables – younger residents, for example, were generally less aware across the board, and especially so when it comes to electoral registration (although whether this is a vote-moving issue for anyone is up for debate). In general, any evidence of lower understanding across demographic groups is cause for concern, as it represents less understanding of what your vote can achieve.

What would have been interesting, and may have been included in the survey but not in the results, is a ‘false positive’ option. The inclusion of a policy area that isn’t controlled by local government but plausibly sounds like it could be (policing, maybe) could have demonstrated the extent to which residents hold local government responsible for local issues over which it has no powers. Again, an important question when it comes to local elections.

2. There is variation in satisfaction across services, but no information on where

The survey contains very useful information on which services are considered satisfactory (libraries) and which have relatively higher levels of dissatisfaction (housing). This is useful context, and could translate into local government approaching these policy areas in different ways. The survey, perhaps rightly given this is not the place to make league tables, does not allow for comparison between different local government areas.

This means that service satisfaction is presented as a unified score across Ireland, rather than disaggregated, despite the possibility of local variation. There are very good reasons for this, based on sample size and methodology. But overall, it does mean that when we hear only 44% of respondents are satisfied with ‘housing,’ not only do we not know what they mean by housing, as the survey report explains, but we also don’t know whether this varies across local areas. Therefore, the extent to which any individual local authority can act on this information is limited without significant additional research into what people mean and where exactly the causes of dissatisfaction lie. The focus groups mentioned at the end of the report are an important way to strengthen and interrogate the conclusions found in these data.

3. People who use services are more likely to demonstrate positive sentiments – but also, who doesn’t use services?

On page 45 of the report, there is an interesting table showing that there are a few areas where service users demonstrated positive sentiments about local government (saying they are improving digital services, for example), especially when compared to people who do not use services.

The key message that I took from this table is slightly different. The total number of respondents looks to be 2,066, and only 85 of those had not used any service. Nearly everyone uses local government services, and this is worth remembering. When it comes to customer satisfaction, it is a rare, privileged and perhaps risky position to have pretty much everyone in the whole country as a customer.

What can other places learn from this research?

In England, there are also attempts to measure residents’ satisfaction with local government. The English LGA’s research from February 2024 is a good example, and allows for some interesting comparisons with Ireland (England’s 54% satisfaction with library services pales in comparison to 91% satisfaction in Ireland).

The two reports, in particular the carefully constructed and theorised LGMA report, demonstrate the importance of centring residents’ perspectives in how we understand local government service performance. Equally, the LGMA report is a great example of how to present the limitations of data, and how not to claim more than your data allows. There is no indication in the LGMA report that this is the be-all and end-all of local government performance, and no attempt to disingenuously compare local authorities based on limited data.

Overall, the research demonstrates that when we want to understand how well local government is performing, we need to look at a whole host of different measures, and that resident satisfaction is one we cannot afford to ignore. It is a lesson that may need to be better appreciated in England.
