Cultures of Evidence beyond the Health Sector: Understanding policy decision-making in English local government for improving action on the social determinants of health. Project 2: Ethnography of policy decisions

Project title

Cultures of Evidence beyond the Health Sector: Understanding policy decision-making in English local government for improving action on the social determinants of health. Project 2: Ethnography of policy decisions

Project reference


Final report date

01 April 2014

Project start date

1 September 2012

Project end date

31 October 2013

Project duration

14 months

Project keywords

Decision Making; Evidence; Built Environment; Local Authorities; Evaluation; Collaboration; Qualitative

Lead investigator(s)
  • Dr Judy Green, Department of Health Services Research and Policy, London School of Hygiene and Tropical Medicine (Professor of Sociology of Health)
  • Dr Karen Lock, Department of Health Services Research and Policy, London School of Hygiene and Tropical Medicine (Reader in Public Health)
  • Dr Mark Petticrew, Department of Social and Environmental Health Research, London School of Hygiene and Tropical Medicine (Professor of Public Health Evaluation)
  • Dr Gemma Phillips, Department of Health Services Research and Policy, London School of Hygiene and Tropical Medicine 
  • Dr Sarah Milton, Department of Health Services Research and Policy, London School of Hygiene and Tropical Medicine 
  • Dr Fred Martineau, Department of Global Health and Development, London School of Hygiene and Tropical Medicine
  • Dr Lesley Mountford, School for Public Health Research, London School of Hygiene and Tropical Medicine
  • Dr Matt Egan, Department of Social and Environmental Health Research, London School of Hygiene and Tropical Medicine 
  • Mrs Elizabeth McGill (nee Tyner), Department of Health Services Research and Policy, London School of Hygiene and Tropical Medicine
  • Professor David Hunter, School of Medicine, Durham University
  • Professor Margaret Whitehead, Institute of Psychology Health and Society, University of Liverpool 

Project objectives


  1. To conduct an ethnographic study of how decisions affecting the public health are made in local government, with a focus on identifying the role of different kinds of evidence used in those decisions;
  2. To use housing as a case study to identify what contextual factors inhibit and facilitate the conduct of evaluative research in non-health settings.


  1. To conduct two or more case studies of a live policy development within local government settings to understand how decisions in areas impacting on public health (outside the health sector) are made;
  2. To identify the range of actors involved in the policy development and decisions, their roles and the formal and informal sources of knowledge drawn on to inform decisions;
  3. To identify the relative role of research evidence on effectiveness and cost-effectiveness in those decisions;
  4. To understand better the perspectives of those in non-health statutory sectors on their needs for evidence-informed decision making for public health benefit and the contexts in which this is most likely to be utilised;
  5. To interview key informants on trials in housing with public health outcomes to identify key factors influencing the decision to collaborate on a trial in the non-health sector.

Changes to project objectives

The research was entirely qualitative; no quantitative data were collected in the case studies. We used an ethnographic approach to generate the qualitative data for each of the case studies.

We added a series of three workshops with local government practitioners in the built environment. These stakeholder workshops were not explicitly detailed in the original application, which focused on the ethnographic case studies of policy development and the housing interview study. They were developed to complement those components by allowing more directed discussion with users about the nature of evidence and its use and production in local authority non-health sectors, and thus to address one of the study objectives (Objective d) more specifically.

We no longer focused the project outputs so exclusively on knowledge transfer strategies, and instead considered more broadly what the study can teach us about informing future evaluative research with local authority sectors and about how research can better influence the policy development process.

Brief summary

In our programme of work on knowledge needs and practices in English local government we sought to answer the following questions:

  • What knowledge resources are used to inform decisions?
  • Where does academic research evidence fit in this knowledge and how is it used?
  • How is the success of policy and practice assessed? (What does ‘evaluation’ look like?)
  • When and why is academic research used for the purpose of evaluation?

In contrast to much previous work in this field, we deliberately avoided the normative stance that all decisions should be supported by research evidence and that academic research methods are the best way of evaluating practice. Building on previous research, we designed a systematic approach to generate both breadth and depth of understanding.

We began with a systematic review of qualitative (mostly interview) studies asking local policymakers and decision-makers in sectors outside of public health to describe their understanding of ‘evidence’ as used in their work and to discuss what determines their use of academic research evidence. This review informed three subsequent primary research studies. An ethnographic study of policy and practice in local government provided in-depth data about the types of knowledge used and valued in everyday work and how the ‘success’ of activities is defined and demonstrated by officers. We also conducted a qualitative interview study to understand when, why and how academic research approaches have been used to evaluate work in local government, focussing on housing and urban regeneration programmes. Finally, we used focus groups with policymakers in the built environment sector at the local and regional government level to address more directly the types of knowledge they require in their work, with a focus on their perceptions of what researchers could add to this.

Ethnography: Working for the public health: politics and evidence in local government in England

What knowledge is valued in local government practice?

Local government officers place great importance on their knowledge of the local environment in identifying ‘problems’ and deciding on appropriate responses. This knowledge is based on both:

  • ‘hard’ data (e.g. crime statistics); and
  • experiential knowledge and professional expertise (e.g. judging when a …).

What is the context of decision-making?

Local government officers manage a range of stakeholders and differing agendas in deciding policy and activities (national government policy, local businesses and residents, local politicians, executive managers). Their local data (‘evidence’) is put into conversation with their professional expertise to manage these different needs and agendas, to decide on the most appropriate course of action in a given situation, and to justify that decision.

Are ideas and approaches transferred between local authorities?

A narrative of uniqueness supports the importance of local ideas and knowledge and is accompanied by a sense of competition with other local authorities. ‘Innovation’ in practice is also highly valued in the culture of local government. A public narrative of ‘sharing best practice’ does not necessarily lead to transfer of ideas and approaches because of restrictions in time or finances and because often ideas from other areas are not seen to fit the local needs and issues.

How is success defined and demonstrated?

Evaluative activities are primarily about justifying decisions and practices to local actors or to higher tiers of government in relation to the range of stakeholders and the different frameworks of accountability that they represent (legal, financial, local participatory/consumer democracy, New Public Management). This range of agendas and accountabilities provides a range of outcomes for judging success. In some senses, an activity will rarely ‘fail’ against all of these outcomes.

Evaluation is not about judging the universal merit of an approach and deciding whether to continue or discontinue it. Within the narrative of uniqueness and innovation, practice is continually evolved, responding to new knowledge from a range of sources and to changes in context (e.g. budgets, public opinion, and national legislation).

Housing case study

Taking housing as a case study, this qualitative study sought to explore the factors that contribute to successful evaluative research in the non-health sector. There have been many calls for more evaluative evidence on the upstream determinants of health, with a consequent focus on the barriers to the production (and uptake) of research evidence. Whilst this literature has been useful in outlining the challenges in collaborative research, this study sought to move beyond this and focus on why evaluations do happen.

Eight studies were purposively selected from a systematic review of published intervention studies in housing that both included health outcomes and utilised a controlled design. The sample covered a range of countries; a range of intervention scope (from large-scale regeneration to smaller household interventions); and studies that found evidence of effect as well as studies that found none. For each sampled study, we interviewed the principal investigator and at least one other participant involved in the study (e.g. a research user or non-academic collaborator). A total of 16 interviews were undertaken, with four public health specialists, seven academics, four local authority employees and one national authority employee (some individuals were involved with more than one study).

Interviews used a topic guide to elicit information on how and why the controlled studies were undertaken, where the idea came from, who was involved, what the benefits had been to them and their institution, why they got involved, and their views on what inhibits and facilitates research in the non-health sector. With participant consent, interviews were audio recorded and transcribed, and transcripts were analysed using thematic content analysis. The study was approved by the LSHTM Ethics Committee.

We found that, despite reporting challenges similar to those described by practitioners in studies of why evaluations are not undertaken, the 'barriers' detailed were not insurmountable and did not prevent evaluative evidence being generated and published. In particular, 'lack of understanding' of the need for evaluative evidence was neither a particular barrier, nor as tied to institutional setting as is often assumed (by both the literature and our own interviewees). There was more understanding among non-academics of research design rationales than might have been anticipated. Conversely, academic research was often described as more flexible and ‘messy’ than published papers suggest.

Different ‘cultures of evidence’ across academic and practice sectors were reported but were not seen as barriers. Similarly, that ‘success’ was differently framed across collaborating institutions was not, in practice, a barrier. Both academic and local authority partners reported that anecdotal, qualitative work and common sense might be more appropriate indicators of success than the quantitative evaluation results.

Importantly, most of the studies in our sample were reported as arising from ongoing networks, and the ideas for evaluations often arose organically through the maintenance of these relationships. Crucially, ongoing networks allowed partners to exploit ‘windows of opportunity’ in which funding calls, or planned interventions, provided the possibility for evaluation.

Exploring ‘success’ rather than ‘failure’ enabled us to shed a different light on how to foster evaluation in the non-health sector. We suggest that for public health practitioners interested in developing the evidence base for areas such as housing, the question is not one of ‘challenges to be overcome’. Instead of focusing efforts on education about the need for controlled designs for evaluation, a more productive way forward might be to maximise the potential for the right conditions for evaluations to arise. From our interview accounts, these conditions are likely to be: an existing network of collaborators who can take advantage of windows of opportunity for evaluations, and who have the commitment, resources and trust to mobilise the larger networks needed to deliver them.


Workshops with built environment practitioners

A series of three half-day workshops was held with practitioners working in the built environment. These workshops aimed to elucidate how those working on the ‘upstream determinants’ of health conceive of, and utilise, evidence and information in their work. Participants working on areas of the built environment in local government were recruited on the basis of the geographical location of their work: the first workshop comprised international participants, the second recruited decision makers from London, and the third was formed of practitioners from the North West of England. The workshops used a focus group methodology; each had four to six participants and was chaired by a senior academic, who used a topic guide and prompts to facilitate the discussion. Each workshop was audio recorded and transcribed. Both deductive and inductive analysis techniques were employed: a coding structure was defined based on the existing literature on evidence and policy and on researchers’ discussions of emergent findings, and the codes were applied to the data and refined throughout the analysis. All participants provided informed consent to partake in the study, and ethical approval was obtained through the London School of Hygiene and Tropical Medicine.

A total of fifteen decision makers, from a range of backgrounds including planning, housing and leisure services, participated in the workshops. Analysis of the workshops is still ongoing; some emergent findings are presented below. Participants had wide conceptualisations of what constitutes evidence, including routinely collected data sources, surveys, maps, projections, records, government guidelines, anecdotes, case studies, models, drawings, photographs and academic research evidence. More emphasis was placed on utilising data and information than on ‘evidence’ in the more academic sense. Decision makers in local government underscored that case studies, either of their own innovative work or from a similar context, are more useful than generalised evidence. Within this discussion, they specifically highlighted the issue of generalisability, particularly in terms of geography and temporality: decision makers in local government believe that evidence emerging from areas most similar to their own, and evidence that is relatively recently produced, is the most useful for making decisions.

The decision makers identified a range of influences and considerations to rationalise both the use and non-use of evidence when making decisions. Evidence is used when it provides support and justification for actions or programmes; it is also used to create policies that will be tested through public inquiry, and where a legal requirement exists to use evidence. Decision makers also described how evidence is particularly useful when bidding for money, in order to show the value of a chosen project or course of action. On the other hand, decision makers also identified some of the constraints on evidence use, such as the political nature of local government. Specifically, politicians are seen as needing to balance a range of priorities and considerations, of which utilising evidence is only one. In addition, decision makers, particularly those from the North West of England, highlighted the constraints of the current economic environment. Given the current budget cuts, they described a climate of reduced resources for gathering and analysing data, a lack of funds to commission research, and an overall emphasis on costs and cost reduction. Finally, the decision makers described a number of challenges to utilising evidence, including understanding and employing suitable methods for evaluation, the complexity and relevance of academic studies, and the challenge of synthesising large bodies of evidence.

This study highlights the need for locally relevant evidence for local government decision makers working in the built environment. It has shown that public health and the built environment may have different cultures of evidence, but there is now an opportunity, as public health has moved from the NHS to local government, to work with, and learn from, one another. This study helps lay the foundation for useful engagement between public health and the built environment sectors of local government by achieving a deeper understanding of their evidence cultures.

Plain English summary

Using evidence to describe problems and decide on solutions is very important to public health professionals, such as using research from universities to decide how to promote healthier lifestyles amongst the public. In April 2013 public health was moved out of the NHS into local authorities. Public health professionals now have to change their ways of working to fit more with how people work in local government. Many health professionals have been worried that people working in local government do not think of evidence in the same way that they do and therefore question how good their work can be. It is also a concern for researchers – if people in local government aren’t interested in the work that they do and the evidence they provide, what is the point of their work?

Rather than assume that these views and opinions are true, we decided to do some work investigating how decisions are made in local authorities around services that might impact on people’s health, such as alcohol and gambling licensing, housing, parks and leisure facilities, community safety and transport. We did three things:

  • We spent time in local authorities watching officers work and asking them about the things that they do;
  • We asked people who had been involved in evaluating housing projects using the types of methods used by university researchers why they had done this and what they thought about it; and
  • We asked senior politicians and local authority officers about what they think about when they are making strategic decisions, and what things count as evidence.

We found that:

  • Officers in local authorities have lots of people to keep happy: residents, businesses, politicians and national government. They have to work out solutions and ways of working that are acceptable to all these people. This means that thinking about evidence from surveys or university research is not always high on their agenda;
  • Local authorities don’t make much use of research from universities because it doesn’t fit with the way they see the world;
  • They feel that they know what is right for their local population based on their experience of working there;
  • They like to think about issues and solutions locally for themselves and don’t often like to transfer in projects and programmes from other areas without testing them out locally because they don’t feel anyone else can know as well as they do what works in their area;
  • If they are going to think about using ideas from other areas, officers like to have a detailed description of what was done and what the other areas are like. This is more important than lots of information about what happened as a result;
  • Officers often feel that they have a sense of whether a project has worked well based on how the public and people involved respond. This is more important and relevant to them than the types of evidence that university researchers can provide.


Published articles

  1. McGill E, Egan M, Petticrew M, Mountford L, Milton S, Whitehead M, Lock K. Trading quality for relevance: non-health decision-makers’ use of evidence on the social determinants of health. BMJ Open 2015;5:e007053. doi:10.1136/bmjopen-2014-007053.
  2. Milton S, Petticrew M, Green J. (2014) Why do local authorities sometimes undertake controlled evaluations of health impact? A qualitative case study of interventions in housing. Public Health. doi:10.1016/j.puhe.2014.10.009.
  3. Phillips, G, Green, J (2015) Working for the public health: politics, localism and epistemologies of practice. Sociology of Health and Illness. doi:10.1111/1467-9566.12214.
  4. Gorsky M, Lock K, Hogarth S. (2014) Public health and English local government: historical perspectives on the impact of ‘returning home’. Journal of Public Health. doi: 10.1093/pubmed/fdt131.
  5. Petticrew M, McKee M, Lock K, Green J, Phillips G. (2013) In Search of Social Equipoise. BMJ. 2013;347:f4016.
  6. Anderson W, Egan M, Pinto A, Mountford L (2014) Planning for Public Health – building the local. The Journal of the Town and Country Planning Association; 341-47 (editorial).
  7. Effectiveness Matters: Housing improvement and home safety. Produced by the Centre for Reviews and Dissemination (CRD) in collaboration with the NIHR SPHR, LSHTM and the MRC SPHSU (University of Glasgow). Nov 2014 (editorial).
  8. Steinbach R. Policy Decisions in the Transport Sector. London: SPHR@L, 2013 (SPHR@L working paper - for practitioners).
  9. Phillips G. An evidence-culture shock for public health in local government. SPHR@L website, 14 Oct 2014 (blog).

Conference presentations

  1. McGill E, Egan M, Lock K, Mountford L, Whitehead M, Petticrew M. How evidence on the social determinants of health is understood and utilised by non-health sector decision makers in four countries: qualitative findings. 7th European Public Health Conference (EUPHA). Mind the gap: reducing inequalities in health and health care. Glasgow, 20 Nov 2014.
  2. Petticrew M. The policy environment. Royal College of Physicians of Edinburgh Symposium on use of evidence in health inequalities policy. Edinburgh, 8 May 2014.
  3. Lorenc T, Phillips G. Evidence beyond the health sector. NIHR SPHR Annual Scientific Meeting. London, 8 Oct 2013.
  4. Lock K, Petticrew M, Green J, McKee M, Phillips G, Lorenc T, Steinbach R, Martineau F, Tyner E. Cultures of evidence beyond the health sector: understanding decision-making processes in English local government for improving action on social determinants of health. NIHR SPHR Annual Scientific Meeting, Sheffield, 10 Oct 2012.

Seminars and workshops

  1. Lock K. The nature of evidence for “healthy planning”. ESRC Seminar Series: Evidence, Governance and Policies for Health and Planning, Bristol, 29 Jun 2015. (presentation)
  2. Craig P, Petticrew M, Egan M. When to evaluate? Balancing pragmatism with rigour. NIHR SPHR Annual Scientific Meeting. Sheffield, 22 Oct 2014. (workshop)
  3. Phillips G. Politics, localism and epistemologies of practice: How will evidence-based public health fare in the world of local government? SPHR@L seminar series, London, 16 Oct 2014. (presentation)
  4. Phillips G, Egan M. Politics, localism and epistemologies of practice: feedback session to local practitioners. Islington Local Authority, London, Oct 2014. (workshop)
  5. Ponsford R, Egan M, Korjonen H, Ford J, Hughes E, Petticrew M, Lock K. Building capacity through knowledge translation: a mixed methods randomised controlled trial of an online community of practice (CoP). LSHTM Annual Symposium, London, 17 Sept 2014. (poster)

Public involvement

This was a year of development work to inform the future research programme at LSHTM. As such, the research aims and objectives were determined by the research team in relation to our current knowledge and understanding. The level of public involvement in the design, conduct and interpretation of these particular research projects was therefore limited to the strategic direction provided by the management committee which includes advisors from local government.


This programme of research has led to two clear actions. First, officers and practitioners identified challenges surrounding suitable metrics and data for local government. In response, we are currently producing a practitioner output for planners and developers that seeks to address some of these issues. Second, participants identified ongoing challenges surrounding evaluation of their work. With public health now located within local authorities in England, our work showed that the flow of better evaluation evidence could be increased if academics focused more on developing collaborative networks and on finding common areas of interest between local authority practitioners and public health researchers; this is likely to be more fruitful than emphasising differences and fostering the view that evaluation research is difficult. In response to these findings and needs, SPHR@L has begun a series of workshops in which local government decision-makers can learn about, and work through collaboratively, their evaluation needs and methods. The first workshop was held in April 2014 and received positive feedback from participants. We plan to continue revising the agenda and running these workshops in the future.

This project was funded by the National Institute for Health Research School for Public Health Research (project number SPHR-LSH-PH1-LAS).

Department of Health Disclaimer

The views and opinions expressed therein are those of the authors and do not necessarily reflect those of the NIHR School for Public Health Research, NIHR, NHS or the Department of Health.