Friday, November 30, 2018
I can’t believe it’s almost December, and that means I’m coming to the end of my year as AEA president. It’s been an amazing ride! In this, my last installment of the newsletter column, I’d like to take a moment to reflect on the year and share a few highlights, a nudge and a wish.
First highlight: CLEVELAND!! Wow – it was great to see so many AEA members at the Evaluation 2018 conference in Cleveland! In fact, total attendance on-site was 3,206. Incredible!
I was teased quite a bit this year for my unbridled enthusiasm for having the conference in Cleveland, but I hope that those of you who were able to attend realized why I was excited. I know the weather wasn’t great, but the theme, terrific sessions, networking opportunities and exchange of knowledge kept attendees engaged. The food trucks with live music, top-floor bar with a view, science center and Rock and Roll Hall of Fame weren’t bad, either!
In addition to the in-person Cleveland conference experience, we had 628 people attend the virtual conference. These folks were able to view the plenaries and presidential strand sessions online in real-time. If you weren’t able to attend in person or virtually during the conference dates, keep an eye out for more information from AEA, or information on eval.org, on how you can access the virtual conference archives.
Another highlight for me this year was the opportunity to visit eight of the AEA Local Affiliates. I so enjoyed meeting evaluators in the Boston area (where I live), Denver, Atlanta, Ohio, Chicago, Vermont and at the Eastern Evaluation Research Society conference in New Jersey. I visited a range of venues (bars, coffee shops, a resort, universities, and even an aquarium) and heard from all kinds of evaluators who are focused on a variety of projects, programs, initiatives, theories, practices, approaches, populations and challenges. I hope to be able to continue these visits into my year as past-president, and through those visits hear more about how we can build a stronger relationship between AEA and its affiliates.
One of my real joys this year was working with the AEA staff – Anisha, Zachary, Natalie, Laura, Milos, Derrick, Lauren, Zack, Kelly, Brigid, Jayne, Elizabeth, Ryan, Jessica, Pete and Kristin. They are fabulous! They are consummate professionals who have our values front of mind in every interaction and decision. As president-elect, I managed the AEA hiring process that resulted in bringing our Executive Director, Anisha Lewis, on board. I’ve worked closely with her this year as she learned about AEA (and all our quirkiness) and brought her association management savvy to the role. We are one lucky association, and I can’t wait to see what great things Anisha does. If you were at the conference, I hope you had the chance to get to know our AEA team. If you haven’t met them in person or virtually, reach out – they love hearing from members!
Over the course of my year as president, I focused on working with the board to shape up our governance policies and procedures, and to govern in a disciplined and visionary way; ensuring that we’re on strong footing financially now and in the future; reaching out to Local Affiliates; enhancing transparency through activities like the Town Hall Forums; and collaborating with Anisha as she got settled in her role. I worked with a terrific group of AEA staff and program committee volunteers to plan a great conference, too!
In all of these interactions, I’ve been reminded that AEA members are smart, dedicated, opinionated, funny, creative, persuasive and passionate professionals, committed to using their super powers to speak truth to power, bring evidence to bear on decision making and make this world a better place. And, being evaluators, we don’t always agree on who, how, when, where and what to do, or even what a better world would look like when we achieve it and how we would know it when we see it.
With this in mind, I have one nudge for our association and our field: Whether we’re expressing strongly held opinions in late-night emails, on EvalTalk, on Twitter, in conversations at the conference, at local, national and international meetings, or even to AEA staff and leadership, let’s keep it respectful. We all work hard in our day jobs, and we all have lives outside of our evaluator-lives. Our membership in AEA helps us to connect, learn, share and grow professionally and personally. We all want AEA to be the best it can be – whatever that means to each of us – and we need to work together to make that a reality.
My wish for AEA? That as an association – a membership association – AEA offers members a place to call their “professional home,” and that we create, together, the theoretical, practical, technical, analytical, social and relational foundation that allows us to nurture our evaluation super powers and use them to speak truth to power, bring the wonders of evaluation to even more people and make the world a better place. President or not, I’m in it to make my wish a reality. Join me!
Walking the Talk is a column authored by AEA members, sharing what the association's values mean to them and how these values guide and impact their work. Interested in contributing? Email the AEA editor, Kristin Fields, at kfields@eval.org.
I became an evaluator because I love my community and I want to play a role in making our communities better for everyone. Early in my career, I recognized the power of evaluation to help nonprofits, foundations, governments and others in the social sector understand what works and what doesn’t so they can improve their programs and services. I have been deeply committed to the evaluation field as a tool for social change ever since, but I have struggled over the years to figure out how an individual evaluator could drive social change. Here I outline two ways I have tried to approach this.
Being intentional about the power of evaluation from the start.
This year, my team at Vantage Evaluation and I dug deep into this question: How could a for-profit evaluation consulting firm contribute to the larger evaluation and nonprofit fields? How could a for-profit evaluation consulting firm help evaluation deliver on its potential beyond the walls of our offices?
I am thrilled to report that this process has reaffirmed my dedication to evaluation as a tool for social good. Our team committed to do more than plan and execute evaluations: to intentionally evolve the way that purpose-driven organizations in our community and beyond think about and use evaluation. From here on out, our focus is to change the conversation about evaluation from collecting data “because we have to” to using evaluation as a learning process for strategic improvements. That change in focus starts with the acts of individual evaluators. Every time a single evaluator helps a single social sector leader experience the power of evaluation, a small shift happens. The way to change the world is not through big leaps but through the accumulation of thousands of these micro-moments.
Building social change into every moment.
Towards this end, my team and I work to educate all leaders (inside and outside the nonprofit field) about the power of evaluation, train and support on-the-ground program staff to infuse basic evaluation practices into their day-to-day work, and partner with organizations to plan and execute evaluation projects with an eye toward strategic learning.
These practice areas are infused with our values as an organization and the values of AEA membership: We define “good” evaluation as high-quality, culturally responsive and useful work that contributes to program and policy improvement to enhance the public good. We continually seek opportunities to learn and grow so we can evolve our understanding of evaluation practices and, by extension, our clients’ understanding. And we value different ways of understanding from evaluators and program partners with different backgrounds, perspectives, ways of thinking and approaches. It is only by articulating and embracing the values that underlie our evaluation practice that we can move toward a world where evaluation drives social change.
Elena Harman, PhD is the CEO and Founder of Vantage Evaluation. Elena is the author of “Learning on Purpose: How Each Nonprofit Position Can Use Evaluation To Get the Answers They Need,” coming in January 2019 from CharityChannel Press. She is the 2019 President of the Colorado Evaluator’s Network.
Do you lead or participate in one of AEA's Topical Interest Groups (TIGs)? We want to hear from you and spotlight your work. Send an email to the AEA editor, Kristin Fields (kfields@eval.org) to share news, updates and articles for consideration in an upcoming AEA newsletter.
From Nick Hart, chair of the Evaluation Policy Task Force (EPTF)
AEA’s Evaluation Policy Task Force (EPTF) announced during the fall conference, Evaluation 2018, that its members are exploring potential updates to AEA’s framework for helping agencies develop and sustain evaluation capacity. AEA’s “Evaluation Roadmap for More Effective Government” was originally produced in response to then-President Barack Obama’s calls to increase the use of evidence in policymaking.
Since its publication nearly a decade ago, the framework has been used by several U.S. federal agencies to inform the development of their written evaluation policies. While the framework has been useful in shaping numerous policies, much has also changed in the evaluation landscape.
In 2017, the U.S. Commission on Evidence-Based Policymaking issued unanimous recommendations that suggested formalizing the role of evaluation in federal agencies with chief evaluation officers. AEA publicly applauded the commission’s approach and its recommendations about evaluation capacity. In 2018, the White House’s Office of Management and Budget announced a government reform proposal that would encourage greater evaluation capacity, with senior evaluation leaders across government. AEA leaders similarly embraced this promising approach as “welcome efforts to strengthen government’s evidence-building and evaluation capacity.”
Given the changing landscape, AEA’s EPTF is considering updates to the Roadmap to ensure its continued relevance to current discussion of evaluation policy. While the task force received direct input during a session at the conference in Cleveland, we encourage all members to provide input to the process.
The Task Force welcomes input from AEA members through January 31, 2019, on the 17 recommendations in the Roadmap that address scope and coverage, management, quality and independence, and transparency issues relevant for developing evaluation capacity in agencies. AEA members are encouraged to offer suggestions for improving the recommendations, including those that might be prioritized, added, modified or removed, along with supporting rationale. Comments can be provided directly to evaluationpolicy@eval.org.
From Sheila B. Robinson, Potent Presentations Initiative Coordinator
Did you present at Evaluation 2018? Did you come home with any reflective notes about your presentation? Did you attend any presentations and take notes as I suggested in last month’s newsletter article?
Now is a good time to take stock – revisit your memories or notes about your own presentations or ones you attended and harvest key ideas that can serve to improve your presentation practice going forward.
Why is reflection important?
Reflection is an important part of learning. To learn, you take in new content and ideas through listening, reading, observing and conversing with others, and you integrate this new knowledge with your prior understanding of a topic. To improve your practice, however, you need a representation of your current practice in mind in order to know where and when to apply new learning. This is where reflection comes in.
Consider these notions about learning and reflection from “Learning By Thinking: How Reflection Improves Performance,” by Giada Di Stefano, Francesca Gino, Gary Pisano and Bradley Staats:
What’s the best way to reflect?
One of the best ways to intentionally engage in reflection for learning is to ask yourself a series of questions and try to answer them. Reflective questions can start with the obvious:
Once you have considered these, ask yourself the following questions about how the audience reacted to your presentation (or a presentation you attended). When did they:
Analyzing questions from the audience can help.
Capturing the audience’s questions gives you rich feedback about how they received your presentation. It’s also one of the most powerful strategies for deepening your reflection. To get the most out of this exercise, determine the nature of the questions:
Analyzing audience questions in conjunction with the answers to your other reflective questions can result in a potent set of ideas to inform refining your Message, Design and Delivery for future presentations. And of course, I’ll take this opportunity to remind you about the free p2i tools available at www.eval.org/p2i. Imagine the power of revisiting these tools after a hearty session of reflection for learning!
Please contact me at p2i@eval.org and let’s talk! I’m happy to help, offer guidance, or collaborate on any of these.
From Shawna Hoffman and Cindy Clapp-Wincek
This month’s post will share a few brief updates from AEA’s International and Cross Cultural Evaluation (ICCE) Topical Interest Group (TIG), and one from EvalPartners.
News from the ICCE TIG
The 2018 AEA conference was another great one for evaluators who work internationally. With a total of 62 programmed sessions (panels, multi-paper sessions, skill-building workshops, birds of a feather and more), the ICCE TIG was the most represented TIG at the conference. The TIG currently has 632 members, making it AEA’s second-largest.
With another successful AEA conference now complete, and in preparation for the year ahead, a transition is underway within the ICCE TIG. Current TIG Chair Veronica Olazabal is stepping down after a three-year term in a leadership role with the group – thank you, Veronica, for your service! Xiaoxia Newton will be taking on the role of TIG Chair, Shawna Hoffman will be anchoring the 2019 proposal review process, and we are pleased to welcome a new member to the TIG’s leadership: Mishkah Jakoet.
In her new role as co-chair, Mishkah will coordinate the international travel awards and contribute to this newsletter on the TIG’s behalf, as of January 2019. If you are interested in learning more about the TIG or how you can get involved, please reach out to Xiaoxia at xnewton@uncc.edu.
An Update from EvalPartners
EvalPartners – a platform for sharing knowledge on country-led monitoring and evaluation (M&E) systems worldwide – put out a newsletter in October with a few updates. Notably, it announced an innovation challenge seeking proposals on the theme of localizing democracy, rights and governance evaluation practices. The challenge is supported by the U.S. Department of State as part of its grant to strengthen the role of voluntary organizations for professional evaluation (VOPEs) in democratizing learning and evaluation. For more details, including how to submit a proposal, please visit the EvalPartners Innovation Challenge 2018 webpage.
Join President Leslie Goodyear and Executive Director Anisha Lewis for an AEA Virtual Business Meeting, Friday, December 14 at 2 p.m. ET.
For those who were not able to join the AEA Annual Business Meeting in person at the conference, we invite you to join us for this Town Hall. We’ll share updates on this year’s activities and plans for the coming year. And as always, we’ll make time for questions!
When: Friday, December 14, at 2 p.m. Eastern Time (ET)
Register here.
From the AEA Education Team
The Digital Knowledge Hub is an online platform featuring professional development opportunities for evaluators, by evaluators. Check out the latest prerecorded eStudy now available for purchase:
eStudy 083: Introduction to Consulting
Presented by Gail Barrington, an independent consultant, this practical eStudy will teach you what’s needed to be successful at applying management consulting, entrepreneurial and small business skills to the evaluation setting.
Save the date for these upcoming live eStudy courses.
December 4, 6, 18 and 20, 12-1:30 p.m. ET | Presented by Errol Goetsch, Founder, 4Sight Prediction Solutions Pty. Ltd.
This eStudy will introduce evaluators to the world of M&E and give existing professionals new tools and a common language for monitoring and evaluating any project in the international development aid sector. The course is for anyone who interacts with and wants to influence donors, sponsors, regulators, lobbyists, media, beneficiaries, government departments, aid agencies, service providers, program, project or partner managers and staff, or project designers and auditors – and who wants a language and system that elegantly captures the key features of any project in words, numbers and pictures.
eStudy 097: More than two options: How to collect LGBTQ inclusive data
December 11, 13, 12-1:30 p.m. ET | Presented by Ash Philliber, PhD, Senior Evaluator, Philliber Research & Evaluation
This eStudy focuses on training evaluators to ask questions in ways that are inclusive of LGBTQ communities. This is an ever-changing field, and no forms have been developed that can be used nationwide without issue. Here we will discuss common terms, pitfalls to avoid, and how to develop forms that will be appropriate in different places and with different groups.
eStudy 098: Working with Assumptions to Unravel the Tangle of Complexity, Values, Cultural Responsiveness
January 15, 29 and February 12, 12-1:30 p.m. ET | Presented by Jonathan Morell, PhD, Principal, 4.669 Evaluation and Planning, and Editor, Evaluation and Program Planning; Apollo M. Nkwake, CE, PhD, International Technical Advisor, Monitoring and Evaluation, Education Development Center; and Katrina L. Bledsoe, PhD, Research Scientist, Education Development Center, and Principal Consultant, Katrina Bledsoe Consulting
It is impossible for evaluators not to make assumptions that simplify the world in which programs, initiatives and “wicked problems” exist. Simplification—and parsing out to key values—is necessary because without it, no evaluation can reveal relationships that matter. We always need a model that provides a simple and straightforward guide for the construction of evaluation designs and data interpretation. The model may be formal or informal, elaborate or sparse, formally constructed or implicit. But always, there is a model, and always, to be useful, the model must provide a parsimonious explanation of the phenomena—and world—at hand.