Doyle, Aoife M; Mulhern, Emma; Rosen, James; Appleford, Gabrielle; Atchison, Christina; Bottomley, Christian; Hargreaves, James R; Weinberger, Michelle (2019) Challenges and opportunities in evaluating programmes incorporating human-centred design: lessons learnt from the evaluation of Adolescents 360. Gates Open Research, 3. p. 1472. DOI: https://doi.org/10.12688/gatesopenres.12998.1
Abstract
Adolescents 360 (A360) is a four-year initiative (2016–2020) to increase the use of modern contraception among 15–19-year-old girls in Nigeria, Ethiopia and Tanzania. The innovative A360 approach is led by human-centred design (HCD), combined with social marketing, developmental neuroscience, public health, sociocultural anthropology and youth engagement ‘lenses’, and aims to create context-specific, youth-driven solutions that respond to the needs of adolescent girls. The A360 external evaluation includes a process evaluation, a quasi-experimental outcome evaluation, and a cost-effectiveness study. We reflect on the evaluation opportunities and challenges associated with measuring the application and impact of this novel HCD-led design approach.

For the process evaluation, participant observations were key to capturing the depth of the fast-paced, highly iterative HCD process and to understanding decision-making within it. The evaluation team had to be flexible and align closely with the implementers’ work plan. Because of the iterative HCD process, key information such as intervention components, settings, and eligible populations remained unclear and changed during development of the outcome evaluation and cost-effectiveness protocols, making the study design process more time-consuming and resource-intensive. Because substantial time and resources went into creating a new design approach, separating one-off “creation” costs from the costs of actually implementing the programme was challenging. Opportunities included the potential to inform programmatic decision-making in real time, ensuring that interventions adequately met the contextualised needs in targeted areas.

Robust evaluation of interventions designed using HCD, a promising and increasingly popular approach, is warranted yet challenging. Future HCD-based initiatives should consider a phased evaluation: focusing initially on programme theory refinement and process evaluation, and then, once the intervention details are clearer, following with outcome evaluation and cost-effectiveness analysis. A phased approach would delay the availability of evaluation findings but would allow a more appropriate and tailored evaluation design.
| Item Type | Article |
|---|---|
| Faculty and Department | Faculty of Epidemiology and Population Health > Dept of Infectious Disease Epidemiology & International Health (2023-); Faculty of Public Health and Policy > Public Health, Environments and Society |
| Elements ID | 132628 |