Blog
What we’re thinking about, what we’re working on, and what we want to accomplish.
In a world with increasing access to data, indices serve a purpose: they condense complex concepts into a single data point that can be compared across different contexts.
It was with this in mind that we created the Girls’ Opportunity Index in partnership with World Vision. World Vision sought to explore the relationship between the opportunities a girl was provided and child marriage rates. The activity involved only secondary national-level data, and creating an index seemed perfect for allowing country-to-country comparisons.
As members of the American Evaluation Association (AEA), our core values reflect those of the Association. One of AEA’s Guiding Principles for evaluation focuses on common good and equity: Evaluators strive to contribute to the common good and advancement of an equitable and just society. Our recent client, the Network of the National Library of Medicine (NNLM) at the University of Washington, contracted our team with this goal in mind. In partnership with their team of experts, InformEd designed a suite of resources for conducting evaluations with specific populations – K-12 Education, LGBTQIA+, Race & Ethnicity, and Rural Health – that encourages sustainability in M&E processes and procedures.
InformEd had the pleasure of partnering with Davidson College during the fall of 2020 to work with one of their students as part of their new ‘High Impact Experiential Learning’ class. Grace was a phenomenal intern for our organization, lending her design expertise to finalize the pitch deck for our latest program, Learning Links.
The COVID-19 pandemic has been hard. Each person in the world is adjusting to this new (temporary) reality, facing his or her own challenges or heartache. We’re thinking of you. We’re wishing we could do more to help. With that desire to do more to help, we’ve recently connected with the flower farmers from Pike Place and Seattle Neighborhood Farmers markets.
Earlier this month, InformEd facilitated a capacity building workshop sponsored by the Washington State Arts Commission (ArtsWA) during their Creative Forces Summit II. Over 20 organizations were present at the Summit where we led a series of activities that culminated in logic models for arts programs providing services to military and veteran populations.
Most of the projects we work on ultimately aim to benefit children. Getting the perspectives and opinions of the primary beneficiaries is key, but facilitating effective and insightful discussions with children can be much more difficult than interviewing adults. Here are a few strategies I recently found useful while facilitating child Focus Group Discussions (FGDs) in Nepal.
InformEd International is using a Principles-Focused Evaluation approach to guide the Developmental Evaluation (DE) for Save the Children’s School Leadership and Management (SLaM) model. Guiding principles give direction to what the project should accomplish. They are helpful in complex interventions that need to be adaptive and responsive while still being organized around clear ways of navigating decisions and action. In the face of complexity, guiding principles provide grounding and orientation on how to proceed.
InformEd International recently supported a review of World Vision International’s Literacy Boost pilot programme results from 2013 to 2017. The review analyzed reading assessment results from 11 randomized controlled trials across 10 countries (11 programmes) in sub-Saharan Africa and South Asia.
InformEd International is using developmental evaluation (DE) to facilitate the creation of Save the Children’s School Leadership and Management (SLaM) model.
Last month, the InformEd team traveled to San Francisco, joining more than 2,500 education professionals for the 63rd Comparative and International Education Society conference under the theme ‘Education for Sustainability.’ We shared two presentations on the role of school leadership and management as a sustainability driver – one a paper entitled, “Exploring and Measuring the Relationship between the School’s Learning Environment and Learning Outcomes: Results from Cambodia, Uganda, and Zimbabwe,” and the other a PechaKucha session on “Trials and Tribulations of Generating Evidence in Contextualized Education Programming.”
This week, staff from the InformEd team are in Nepal beginning a 2-year developmental evaluation. In partnership with Save the Children and the government of Nepal, we will be working toward the creation of a program for improving school leadership and management. Over the coming months, we will be sharing what we learn through a series of posts. Before diving into our evaluation plan, though, let’s review why we are focusing on school leadership and management, why the teams decided to use developmental evaluation for the process, and our initial theory of change.
To build a piece of that vibrant community locally here in the Pacific Northwest (and to fill the void between annual AEA conferences), we’re kicking off the Seattle Evaluation Association. We’ve been working with the American Evaluation Association to establish the official Local Affiliate. While we’re spearheading the initiative, we’re looking for folks across all different sectors of evaluation to join. Our goal is to establish a vibrant community of evaluators in the greater Seattle area. We want to create a space for collaboration, support, and the exchange of ideas.
Most of the initiatives and programmes that InformEd works with target children. Gathering, understanding, and quantifying child perspectives is essential. I’ve spent several years trying different mechanisms for child focus group discussions. The challenge has always been: how do we get the quieter child to voice her opinion? I’ve tried disaggregating groups by sex, but there still seem to be one or two children who dominate the conversation. The other children quickly follow suit, simply agreeing with their peers.
Recently, while carrying out an evaluation for Save the Children Norway, I tried the H-Method for child participation in the evaluation. It worked well and it ignited some very informative and interesting discussions among the children. Here’s how it works so that you can give it a go yourself:
The InformEd team spent last week in Cleveland, Ohio attending the American Evaluation Association (AEA) conference. If you haven’t had a chance to check out the AEA, I highly recommend it. It’s a great conference and I promise that you will leave with new skills, ideas, and perspectives. I first attended AEA in 2013. I still remember sitting in on my first data visualization presentation. I hadn’t given much thought to data visualization up until that point and then it dawned on me – a chance to combine creativity with data. More importantly, a way to keep attention and effectively communicate complex topics.
Flash forward to tonight, the discussions from AEA are swirling in my head, I’m coming off a 2-day workshop on Communicating Data Effectively with the very impressive and inspiring Stephanie Evergreen, and I’m watching the mid-term election results. Graphs and statistics are flying across the screen as pundits analyze incoming results.
While InformEd works in international development, it’s beneficial for us to learn from other data folks. Elections provide us with a unique opportunity to look at how different media outlets are communicating the same results. Let’s look at some of the best and worst data visualizations on election night.
How much did the program cost for each new student who was able to reach a high level of literacy? We didn’t have the right data for a cost-effectiveness analysis, so we carried out a cost-impact comparison.
The method might be appropriate for you, so we've broken down our analysis into five simple steps. The core cost-per-outcome arithmetic behind the comparison is sketched below.
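To make the idea concrete, here is a minimal sketch of the kind of calculation a cost-impact comparison involves. All figures and the simple comparison-group approach below are hypothetical illustrations for this post, not numbers or steps from our actual analysis.

```python
# Hypothetical figures only -- illustrating the cost-per-outcome idea,
# not the program's actual data or the five-step method described above.

total_program_cost = 250_000        # total program spend in USD (assumed)
students_reached = 4_000            # students served by the program (assumed)

# Share of students reaching the literacy benchmark at endline,
# in program schools vs. comparison schools (assumed)
pct_proficient_program = 0.38
pct_proficient_comparison = 0.22

# Additional students reaching the benchmark attributable to the program
additional_readers = students_reached * (pct_proficient_program - pct_proficient_comparison)

cost_per_additional_reader = total_program_cost / additional_readers
print(f"Additional proficient readers: {additional_readers:.0f}")
print(f"Cost per additional proficient reader: ${cost_per_additional_reader:,.2f}")
```

Dividing total cost by the estimated number of additional children reaching the benchmark gives a single comparable figure, which is what makes the cost-impact comparison useful when full cost-effectiveness data aren't available.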