When the Danish Refugee Council (DRC) started its three-year gender-based violence (GBV) prevention program with refugee and host communities in Jordan and Lebanon in 2022, the regional and country teams had already planned to use pre- and post-tests and satisfaction surveys to monitor self-reported attitudinal and behavioral changes at the individual and household levels. However, they sought to integrate another approach that would enable them to measure behavior change and impact at the community level over the full duration of the program. In consulting InterAction’s Gender-Based Violence Prevention Evaluation Framework (GBV PEF), a resource aimed at helping organizations measure and evaluate the outcomes of their GBV prevention work in humanitarian contexts, the teams identified Results Journals as a potentially innovative and intuitive approach.
- Results Journals
Results Journals are a tool for collecting data about behavior change over time. What makes it a “journal” is the use of a community-based record of changes; what makes it a “results” journal is the focus on behavior changes within the community itself, rather than on progress in delivering a program or set of activities.
The teams were initially attracted to the Results Journal method because it appeared simple and aligned closely with concepts that GBV teams were already grappling with, rather than using purely monitoring and evaluation lingo. As a result, both the protection and MEAL teams were open to and enthusiastic about the idea. The long-term funding, largely stable operating contexts, and collective buy-in made the program ripe for testing the approach.
“We jointly decided [with the MEAL team] to use the Results Journals, because out of the evaluation approaches outlined in the PEF, it appeared the most straightforward and easiest for protection staff to understand and implement.”
Aleta Morn, Global Protection Advisor, DRC
Following an initial consultation with InterAction, the DRC teams undertook an intensive process of adapting and operationalizing the guidance in the GBV PEF in order to launch the Results Journals. First, program and MEAL team members participated in a workshop to refine the program theory of change, which was contextualized to the country-level dynamics and to the specific prevention curricula being utilized by the program. This process enabled the teams to clarify the expected results of the program and to formulate a total of ten progress markers around these results, standardized across both countries and multiple curricula. DRC’s teams subsequently convened a series of validation sessions with community participants to further refine the phrasing of the progress markers, aligning them with local realities and generating community buy-in. With support from the regional coordinator, the teams designed the journal tool on the basis of the agreed language.
- Progress markers
Progress markers are indicators of community-based change in behavior, attitudes, beliefs, and norms, which mark the steps along the path to the broad-based change identified in the program vision. Progress markers are often identified at three levels: “Expect to see” (basic changes that we strongly expect to see as a result of direct participation in programmatic activities); “Like to see” (desired changes that we hope to see as a result of participation in programmatic activities); and “Love to see” (transformational changes among direct and indirect participants).
Prior to rolling out the tool, the teams developed a set of standard operating procedures (SOPs) to detail the methodology and outline how colleagues within the program and MEAL teams would each be involved throughout the process. The teams also proposed engaging a mix of local facilitators of the prevention curricula, civil society partners, and direct participants to serve as journal keepers, who would regularly collect data. This diversity of actors was selected to ensure that a range of perspectives was captured, including those with knowledge of the GBV prevention curricula as well as those with deep and ongoing experience of community-level dynamics. The DRC teams provided training to these various journal keepers on the Results Journal method and produced an accompanying reference document aimed at building and maintaining their understanding of the initiative. To date, DRC has conducted eight months of data collection with consistent involvement from the journal keepers, and in January 2024 the teams produced an initial report of findings from Jordan.
Lessons Learned
Through this rigorous piloting process, the DRC teams identified several key challenges. First, the teams struggled to design appropriate progress markers. The initial phrasing was determined to be too conceptual and heavy with humanitarian jargon. Despite efforts to give space for community feedback and refinement during the validation sessions, DRC found that this approach was largely unsuccessful due to prevailing power dynamics, and that a more collaborative design process with participants could have been more effective. Journal keepers were also confused about how to document observed changes. This was especially true in cases where observed changes were negative, given that progress markers are typically framed as positive changes. As a result, many journal keepers marked the frequency as high, based on their personal beliefs about what “should be,” but then provided qualitative examples that demonstrated a weak presence of the progress marker, leading to inaccuracies in the data. The teams found that rephrasing the progress markers as questions about the extent to which certain dynamics were present would enable the journal keepers to respond more accurately. Issues of literacy and data protection among the community participants who served as journal keepers further undermined data quality.
The DRC teams also faced difficulties formulating progress markers that bridged the gap between representing desired behavior changes within the wider community and results attributable to program activities. Notably, most of the examples being collected to demonstrate the progress markers do not reference DRC programming, which could undermine efforts to articulate the specific contributions of DRC activities to desired results and the overall change story to external stakeholders moving forward. Overall, further guidance on how to identify relevant progress markers and how to phrase them as precisely and simply as possible could have ameliorated these issues.
“The progress markers need to be connecting how the community feels about GBV to what the programming is doing.”
Maria Makayonok, Regional Protection Coordinator, DRC
The DRC teams further ascertained that their attempt to launch the Results Journals across all program locations without first piloting the approach and resolving initial difficulties was overly ambitious. Notably, the process of preparing to roll out the Results Journals took considerably more time than expected. By the time the SOPs and tools were ready and the journal keepers began collecting data, program implementation had already been underway for nearly a year. As a result, the first round of data did not represent a proper baseline. Moreover, the teams had attempted to reflect three separate prevention curricula within one standardized set of ten progress markers, and monitoring all of these progress markers proved challenging. Ultimately, during a review meeting in March 2024, the teams decided to reduce the number of progress markers to five, sacrificing comprehensiveness in order to improve feasibility. In addition, the budget had not accounted for the extensive staff time invested in the design, training, data collection, and analysis activities. For example, the team lacked a dedicated Information Management Officer to support the process. The team also faced dwindling motivation from MEAL colleagues, who tended to see the Results Journal activities as an extra step that wasn’t always clearly linked to program reporting. As a result of these factors, the team accumulated a large amount of data very quickly, which created a backlog in data entry and analysis. Nonetheless, the teams found the Data Entry and Exploration Platform (DEEP) software to be a free and useful collaborative tool for efficiently storing, processing, and analyzing their data. Data analysis and reporting for Lebanon are still ongoing.
“Once we got to the step of analyzing the Results Journals, the qualitative data they provided was impactful and really helped to illustrate examples of change. This only became evident once the journal entries were analyzed, showing the added value of using the Results Journals approach.”
Aleta Morn, Global Protection Advisor, DRC
Despite these challenges, the DRC teams recognized the Results Journals as a valuable method for generating detailed and formative information that improved their understanding of current community perceptions and reaffirmed that the GBV prevention programming was focused on the most relevant issues and needs. The method also highlighted areas where DRC teams could further tailor their future programming, such as adolescent self-esteem. During the recent review workshop, team members also acknowledged how the findings from the Results Journals had surfaced new opportunities to introduce activities related to policy advocacy to complement existing awareness-raising activities at the community level.
Recommendations for Practitioners
· Keep it small – When piloting Results Journals for the first time, consider testing them in only a few locations and with no more than five to six progress markers in order to address any challenges and familiarize teams with the process before adopting them at scale.
· Let communities lead – Engage members of affected communities directly in the process of identifying and phrasing context-specific progress markers to embed their perspectives in the design and overcome the implicit power dynamics that tend to arise during validation sessions.
· Allocate sufficient time for design and analysis – Ample time is required to develop tools and processes for data collection and analysis and to train teams and journal keepers. Preparations should ideally be finalized before program implementation begins. Similarly, the process should leave time to translate and digitize handwritten entries, and to analyze and interpret the data.
· Nurture buy-in from MEL and program teams – Leadership should proactively support MEL and program teams to embrace a new outcome-oriented way of thinking and to work closely together to design and implement the approach.
· Budget for required resources – Ensure sufficient staff time and resources to effectively design and implement Results Journals, including dedicated positions such as Information Management Officers, as well as funding for printing of journals and training of journal keepers.