Project Description
Collating research evidence with implementation experiences
Introduction
The MTBA learning project aims to bring together research and programme evidence by documenting implementers’ valuable knowledge, which rarely makes it into peer-reviewed literature. By collating research evidence with practical experience, the project seeks to elevate programme evidence to the level of research data, so that the two can interact and help researchers and practitioners better understand the link between child marriage and girls’ sexuality. In this way, the learning project aims to improve the effectiveness of child marriage programmes and to ensure that future research is relevant for practice.
This article outlines how research evidence and practitioner experience have been collected during the implementation of the MTBA learning project. The summary of methodologies used is followed by a reflection on their implementation, aimed at informing future learning efforts on what programmes should start doing, stop doing, and do differently to bridge the fields of research and practice.
How was research evidence collected?
Research evidence was collected internally and externally. External evidence from peer-reviewed articles, grey literature, and reports on child marriage and adolescent sexual and reproductive health and rights (SRHR) was compiled into an article database. Furthermore, publicly available data, such as Demographic and Health Survey (DHS) data, were analysed to review the situation of child marriage and programmatic interventions in each of the nine selected countries (an illustrative sketch of this kind of analysis follows the list below). Internal MTBA data included evaluation data and qualitative data collected as part of the programme. MTBA qualitative data were analysed using Dedoose, in most cases following a thematic approach with multiple coders and a tiered coding system. The analysis of external and internal data was presented within the learning project team and informed the development of the tools used to collect implementation experiences. Findings from implementation experience gathering, the Spark fund, youth-led research, and national learning events then guided further, topic-specific research data collection that fed into integrated knowledge products. The core team took stock of findings from the international learning event and refined these into learning products on the following four key topics:
- Agency & control of sexuality
- Changing marriageability social norms
- Working with parents
- Building on existing community knowledge and practice
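As an illustration of the secondary analysis of publicly available data mentioned above, the sketch below computes a standard child marriage indicator, the share of women aged 20–24 first married or in union before age 18, from a DHS individual recode file. This is a minimal sketch rather than the project’s actual scripts: the file name is hypothetical, and the variable names follow standard DHS recode conventions (v012 for current age, v511 for age at first cohabitation, v005 for the sample weight), which should be checked against the country-specific documentation before reuse.

```python
# Illustrative sketch only: share of women aged 20-24 first married or in union
# before age 18, computed from a DHS individual recode (IR) file.
# Assumptions: the file name is hypothetical; v012 (current age), v511 (age at
# first cohabitation), and v005 (sample weight) follow standard DHS recode naming.
import pandas as pd

ir = pd.read_stata(
    "dhs_individual_recode.dta",
    columns=["v012", "v511", "v005"],
    convert_categoricals=False,
)

# Restrict to women aged 20-24, as in the standard indicator definition
cohort = ir[(ir["v012"] >= 20) & (ir["v012"] <= 24)].copy()

# Flag respondents first married/in union before 18; v511 is missing for
# never-married women, and the comparison then evaluates to False
cohort["married_before_18"] = cohort["v511"] < 18

# Apply DHS sample weights (v005 is stored multiplied by 1,000,000)
weights = cohort["v005"] / 1_000_000
prevalence = (cohort["married_before_18"] * weights).sum() / weights.sum()

print(f"Women aged 20-24 first married before age 18: {prevalence:.1%}")
```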
How was implementation experience collected?
The learning project aimed to document implementation experiences within and beyond the More Than Brides Alliance, including experiences from organisations in other Dutch-funded child marriage alliances: Girls Not Brides and the UNICEF/UNFPA Global Programme to End Child Marriage. To produce an overall snapshot and an in-depth understanding of how child marriage interventions address female adolescent sexuality, three different streams of implementation experience data collection were set up:
- Stream 1: MTBA child marriage programme description survey: This survey aimed to document key programme strategies applied in child marriage programming and the extent to which partners who implemented MTBA’s Marriage No Child’s Play intervention addressed connections between child marriage and adolescent girls’ sexuality. The survey was sent out online to all MTBA partners in India, Malawi, Mali, Niger, and Pakistan and received 49 responses.
- Stream 2: Child marriage programme scoping survey: This survey aimed to map the wider landscape of child marriage programming within and beyond the nine MTBA learning project focus countries to understand whether and how child marriage programmes address topics related to female adolescent sexuality. This survey was sent out online through different national and international child marriage networks, resulting in 95 responses from implementers in over ten countries. Many questions included in the programme scoping survey corresponded to those in the MTBA programme description survey, allowing results to be combined and analysed side-by-side.
- Stream 3: Gathering implementation experiences: Building on the results of the two programme surveys, country learning project coordinators conducted 63 interviews across the nine focus countries to document the experiences of programme implementers. After preliminary analysis, these were followed by 17 follow-up interviews on selected topics.
Survey data were collected using KoBo ToolBox and exported to Excel for cleaning. The results of the internal and external surveys were merged, and analyses were conducted in Stata 16. De-identified data were programmed into interactive dashboards using Google Sheets and Google Data Studio.1

To conduct interviews and oversee the different learning activities, country coordinators were recruited in each of the nine learning project focus countries. The selection criteria included engagement and networks in child marriage programming and basic qualitative research skills.2 The country coordinators proposed interview respondents based on their networks and on survey responses that they felt warranted follow-up. Interview guides to collect implementation experiences were developed jointly by the global learning project team and the country coordinators. The guides followed the project cycle: respondents were first asked about their understanding of the link between child marriage and girls’ sexuality, followed by questions about how sexuality was discussed during their projects’ inception, preparation, implementation, and evaluation phases. This structure was intended to capture the different aspects of integrating sexuality-related topics throughout the project cycle. To personalise the interviews, the guides were adapted based on respondents’ answers to the online surveys and, where available, other existing knowledge about the programme intervention in question. The country learning coordinators made the final selection of interview respondents, who were mostly mid-management-level child marriage technical or project coordination staff. Due to the COVID-19 crisis, both interview preparation and most of the interviews themselves took place online or by phone.
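As an illustration of the survey data handling described above, the sketch below combines exports from the two surveys and removes directly identifying fields before the data are shared for dashboarding. It is a minimal sketch under assumed file and column names, not the project’s actual pipeline, which relied on Excel, Stata 16, Google Sheets, and Google Data Studio.

```python
# Minimal sketch of combining the two survey exports and de-identifying them
# before dashboarding. File names, column names, and identifying fields are
# assumptions for illustration only.
import pandas as pd

# KoBo ToolBox exports for the internal (MTBA) and external (scoping) surveys
mtba = pd.read_excel("mtba_programme_description_survey.xlsx")
scoping = pd.read_excel("child_marriage_programme_scoping_survey.xlsx")

# Keep track of the source so combined analyses can still be disaggregated
mtba["survey"] = "MTBA programme description"
scoping["survey"] = "Programme scoping"

# Harmonise overlapping questions to shared column names (hypothetical mapping)
scoping = scoping.rename(
    columns={"org_country": "country", "addresses_sexuality": "sexuality_topics"}
)

combined = pd.concat([mtba, scoping], ignore_index=True)

# Drop directly identifying fields before the data leave the analysis team
identifying_columns = ["respondent_name", "email", "organisation_name", "phone"]
deidentified = combined.drop(columns=identifying_columns, errors="ignore")

# Export for the interactive dashboards (e.g. via Google Sheets / Data Studio)
deidentified.to_csv("combined_surveys_deidentified.csv", index=False)
```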
The first round of interviews with practitioners was recorded and summarised by the country coordinators. These summaries were then analysed using qualitative data analysis software (MAXQDA). Through deductive coding, an overview was created of the intervention strategies and sexuality themes addressed in programming; through inductive coding, common success factors and challenges in addressing sexuality within child marriage programmes were analysed. A small number of follow-up interviews were conducted with respondents who had shared interesting insights about one or more of these common success factors or challenges, such as working with parents, engaging religious leaders, or using peer education strategies. Besides collecting practitioner experiences through interviews, country coordinators also gathered visual materials from the interventions discussed, such as pictures of activities or video interviews with respondents. Due to the COVID-19 crisis, it was only possible to collect these materials in five of the nine focus countries.
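To make the coding step more concrete, the sketch below tallies coded interview segments by intervention strategy and sexuality theme, the kind of overview that the deductive coding produced. It assumes a coded-segments export (for example, a spreadsheet exported from MAXQDA) with hypothetical column names and is not the team’s actual analysis workflow.

```python
# Illustrative tally of deductively coded segments, assuming a coded-segments
# export (e.g. from MAXQDA) with hypothetical columns: 'document' (interview
# summary), 'strategy' (intervention strategy code), and 'theme' (sexuality
# theme code). Not the project's actual analysis workflow.
import pandas as pd

segments = pd.read_excel("coded_segments_export.xlsx")

# How often each sexuality theme appears under each intervention strategy
overview = pd.crosstab(segments["strategy"], segments["theme"])

# Number of distinct interviews in which each strategy-theme pairing occurs,
# which avoids over-counting interviews with many coded segments
by_interview = (
    segments.drop_duplicates(subset=["document", "strategy", "theme"])
    .groupby(["strategy", "theme"])
    .size()
    .unstack(fill_value=0)
)

print(overview)
print(by_interview)
```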
How was research evidence collated with implementation experiences?
Integration of research data with practice data was an iterative process: reviewing research evidence (which, in many cases, was available first) and noting initial ‘findings’; validating or vetting those findings against the practice data to see where they agreed and disagreed; and then returning to the research data to challenge or expand the earlier findings. At the global project coordination level, initial findings were shared during monthly project content meetings. These meetings informed follow-up measures, such as developing the implementation experience methods, interview guides, and follow-up interview plans, and selecting focus areas for the national and international learning events.
In seven of the nine focus countries, the existing research evidence and the implementation experience data gathered formed the basis for learning and exchange during national learning events organised in the last quarter of 2020. These events aimed to provide opportunities for learning within the country and to test whether the preliminary analysis of the data made sense to participants. During each event, background research was presented on the drivers of child marriage in that country and on the links between child marriage and norms and values about sexuality. The implementation experience presentations mostly focused on key barriers to addressing sexuality and on promising intervention strategies. In workgroup sessions, specific intervention strategies were discussed in further detail. In the post-event surveys, most participants indicated that they found the national learning events relevant and insightful and would welcome further learning about the links between sexuality and child marriage.
Insights from research and practitioner experience were brought together with preliminary insights from youth-led research and the learning Spark fund in the first version of a child marriage and sexuality resources section on the website. Further opportunities for learning and exchange based on the outcomes of all learning components were provided in a three-day hybrid online and face-to-face learning event involving over 230 participants across the nine focus countries. Based on the exchange during the international event, additional learning products were written, and the child marriage and sexuality section of the MTBA website was updated.
What did we learn from using this approach?
Reflecting on our experience in the learning project, our team sees the following core strengths and challenges related to the process of collating existing research with practice:
Main Strengths
- This project sought to bring together programme and research evidence and to ‘weave’ that evidence together in an atypical way. Researchers tend to focus on ‘research data’ (surveys, focus groups, and interviews collected as part of the research process), whereas programmes tend to focus on ‘programme evidence’ (testimonials and feedback from participant surveys and interviews conducted during that particular programme). We tried to mix these methods by staggering them. For example, secondary analysis of research data, such as DHS data, informed context-specific interview tools, and the findings from those interviews influenced further secondary analysis and literature review. In our view, this strengthens the understanding of these topics because findings are triangulated across research and programmes.
- This project brought together team members from diverse backgrounds, including researchers and practitioners willing to engage with these topics from their unique perspectives, allowing for engaged discussions that fed into knowledge products applicable across many different contexts.
- The use of a hybrid interactive international workshop enabled input from many more participants in each country than would have been possible had the event taken place in a single location.
Main Challenges
- At times, it was difficult to guide practitioners to reflect on how their programmes address drivers related to sexuality as they tended to speak about drivers of child marriage more generally.
- It was difficult to gather in-depth reflections from practitioners, particularly in low-resource settings and during the COVID-19 pandemic. Due to COVID-19, interviews needed to be conducted remotely, including by phone in areas where access to a reliable internet connection is limited. As others have noted, the COVID-19 pandemic also increased caretaking responsibilities as schools were closed, making it difficult to find mutually agreeable times to connect across time zones.
- Power and trust dynamics may have influenced the content that was shared with the learning team, as practitioners often tried to showcase the success of their interventions instead of offering critical reflections on the challenges they faced. This limitation is particularly difficult to overcome through distanced interviews.
- The learning project’s diverse, multi-organisational team confronted the challenge of bridging different ways of viewing the same data, information, or tasks in order to create coherent products. There was a tendency for researchers and NGO staff to conform to their respective ‘roles’ and perhaps to apply too narrow a lens when reviewing others’ products at various moments of the project.
Further insights about the methods from researchers and practitioners who participated in interviews and during events are published in the final evaluation of the MTBA learning project. Based on our internal reflection on the project’s efforts to collate evidence and practice, we offer the following recommendations for future learning trajectories in general and for those focused on child marriage programmes specifically:
Start doing:
- Explore ways to connect programme implementers and researchers more regularly to understand what each is doing and learning ‘as a field,’ as we did in the programme scoping survey. Too often, researchers and practitioners are siloed in projects and don’t look across to see the main trends within a field or topic area from a research or practitioner perspective.
- Stimulate more applied research that can provide in-depth recommendations for practitioners in a certain country context.
- Rigorously collect and publish evidence from programmes in scientific journals where possible, so this evidence gets the same visibility as research evidence.
- Make more and stronger linkages between research insights on the drivers of child marriage and the programming experiences of those working on these issues on the ground.
- Integrate more in-depth critical reflection and learning about drivers of child marriage in the design process of a programme to maximise interventions’ relevance and effectiveness.
Do differently:
- Allow sufficient time for training and reflection with interviewers to gain more in-depth reflections from interviews.
- When possible, conduct interviews face-to-face to facilitate a stronger rapport between interviewers and respondents.
- Let practitioner recommendations inform research reflection more directly. For example, what does research tell us about how programmes can work more effectively with parents?
- Involve more country-based research experts in tool development, interviewing, and analysis to gain well-developed, context-specific perspectives and to strengthen linkages between research and practice in the focus countries.
- Use available tools and training materials to enhance the effectiveness of implementation experience gathering.3
Stop doing:
- Stop asking practitioners broad questions about child marriage drivers and interventions as this does not lead to the sharing of in-depth perspectives and insights. Instead, acknowledge power structures in North-South collaborations and donor-grantee relationships and first focus on building rapport and trust. When a connection is built, interviewers may consider discussing specific pathways by which norms and fears related to adolescent girls’ sexuality lead to child marriage in a given context and ask specific follow-up questions. Discussions may also be led in another direction based on input from both the interviewer and the interviewee.