
Missed opportunities in the evaluation of public health interventions: a case study of physical activity programmes

Overview of attention for article published in BMC Public Health, August 2017

About this Attention Score

  • In the top 5% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (94th percentile)

Mentioned by

  • News: 1 news outlet
  • Twitter: 65 tweeters
  • Facebook: 2 Facebook pages

Citations

  • Dimensions: 2 citations

Readers on

  • Mendeley: 39 readers
Title
Missed opportunities in the evaluation of public health interventions: a case study of physical activity programmes
Published in
BMC Public Health, August 2017
DOI 10.1186/s12889-017-4683-z
Pubmed ID
Authors

Sarah Hanson, Andy Jones

Abstract

Evidence-based approaches are requisite in evaluating public health programmes. Nowhere are they more necessary than in physical activity interventions, where evidence of effectiveness is often poor, especially within hard-to-reach groups. Our study reports on the quality of the evaluation of a government-funded walking programme in five 'Walking Cities' in England. Cities were required to undertake a simple but robust evaluation using the Standard Evaluation Framework (SEF) for physical activity interventions to enable high-quality, consistent evaluation. Our aim was not to evaluate the outcomes of this programme but to assess whether the evaluation process had been effective in generating new and reliable evidence on intervention design and on what had worked in 'real world' circumstances. Funding applications and final reports produced by the funder and the five walking cities were obtained. These totalled 16 documents, which were systematically analysed against the 52 criteria in the SEF. Data were cross-checked between the documents at the bid and reporting stages with reference to the SEF guidance notes. Generally, the SEF reporting requirements were not followed well: the rationale for the interventions was poorly described, and neither the target population nor the method of recruitment was precisely specified. Demographics of individual participants, including socio-economic status, were poorly reported, despite being a key criterion for funding. Our study of the evaluations demonstrated a missed opportunity to confidently establish what worked and what did not work in walking programmes with particular populations. This limited the potential for evidence synthesis and for highlighting innovative practice warranting further investigation. Our findings suggest a mandate for evaluability assessment. Used at the planning stage, this may have ensured the development of realistic objectives and, crucially, may have identified innovative practice to implement and evaluate. Logic models may also have helped in the development of the intervention and its means of capturing evidence prior to implementation. Research-practice partnerships between universities and practitioners could enhance this process. A lack of conceptual clarity means that replication and scaling-up of effective interventions are difficult and the opportunity to learn from failure is lost.

Twitter Demographics

The data shown below were collected from the profiles of the 65 tweeters who shared this research output.

Mendeley readers

The data shown below were compiled from readership statistics for the 39 Mendeley readers of this research output.

Geographical breakdown

Country    Count    As %
Unknown       39    100%

Demographic breakdown

Readers by professional status    Count    As %
Researcher                            9     23%
Student > Ph.D. Student               8     21%
Unspecified                           6     15%
Student > Master                      5     13%
Other                                 3      8%
Other                                 8     21%
Readers by discipline             Count    As %
Unspecified                          13     33%
Medicine and Dentistry                8     21%
Sports and Recreations                6     15%
Social Sciences                       5     13%
Psychology                            4     10%
Other                                 3      8%

Attention Score in Context

This research output has an Altmetric Attention Score of 50. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 30 April 2018.
  • All research outputs: #320,656 of 13,022,627 outputs
  • Outputs from BMC Public Health: #294 of 8,914 outputs
  • Outputs of similar age: #13,797 of 266,238 outputs
  • Outputs of similar age from BMC Public Health: #1 of 1 outputs
Altmetric has tracked 13,022,627 research outputs across all sources so far. Compared to these, this one has done particularly well: it is in the 97th percentile, placing it in the top 5% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 8,914 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 10.9. This one has done particularly well, scoring higher than 96% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 266,238 tracked outputs that were published within six weeks on either side of this one in any source. This one has done particularly well, scoring higher than 94% of its contemporaries.
We're also able to compare this research output to 1 other output from the same source published within six weeks on either side of this one. This one has scored higher than all of them.
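
The exact method Altmetric uses to rank outputs is not described on this page, but the age-adjusted percentiles above are essentially a count of how many same-age outputs score lower. As a rough, hedged illustration only, the Python sketch below shows one way such a percentile rank could be computed; the pool of contemporary scores is invented for demonstration and is not Altmetric data.

    # Illustrative sketch only: rank one attention score against a pool of
    # "contemporary" outputs (e.g. those published within six weeks on either
    # side). The contemporary scores below are made up, not real Altmetric data.

    def percentile_rank(score, contemporary_scores):
        """Return the share of contemporaries this score exceeds, as a percentage."""
        if not contemporary_scores:
            return 100.0
        beaten = sum(1 for s in contemporary_scores if s < score)
        return 100.0 * beaten / len(contemporary_scores)

    if __name__ == "__main__":
        this_output = 50                                   # Attention Score reported above
        contemporaries = [1, 2, 2, 3, 5, 8, 12, 40, 55]    # hypothetical same-age pool
        print(f"{percentile_rank(this_output, contemporaries):.0f}th percentile")

With the hypothetical pool shown, the output scores higher than 8 of 9 contemporaries, i.e. roughly the 89th percentile; the real figures on this page are computed over hundreds of thousands of tracked outputs.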