Date: Sunday, May 31, 2020
Hello AEA Members! I hope this finds you well. I would like to begin by expressing my deepest sympathies to anyone who may have lost someone close to them during the pandemic; my heart hurts with you. In my continued effort to focus us all on shining our lights this year, and on how I believe each of us has a unique heart frequency, I'd like to share a global effort to create heart energy cohesion: the fascinating research underway with the Global Heart Coherence Pulse Check. These times can feel very overwhelming, and I'd like to draw us all back to our hearts and the love that connects us all.

As you saw in Anisha's April newsletter, and in my letter to the membership, your AEA staff and Board leadership are meeting regularly, tracking details daily, and considering all the best options for AEA programming. Your AEA Board has held extended meetings this month and would traditionally be on-site for in-person meetings in June. We will still hold those meeting times, but virtually; that is about 8-10 hours of meetings in June. Anticipate updates in the June newsletter coming out of those meetings. As I also mentioned in my letter, we are being well served by the association management company, with assistance from attorneys and financial experts in our strategic decision-making. We should collectively be grateful for these resources during this time.
A major goal of the board and staff right now is to be available to you and to offer all of the information we can, as we are able. We are receiving messages nearly daily from AEA members asking about the conference and AEA's plans. We understand everyone's desire to have some information and to regain some control over the seemingly endless out-of-control variables in all of our lives. We get it. I hope that those of you who are reaching out receive the responses you need from the staff or board leadership, and we encourage you to please keep reaching out to us! By the time you receive this, a Town Hall will have been held for you to connect with your board during all of this. Please know that your AEA staff and board leadership care deeply about you and wish to serve members well while shepherding the organization's finances responsibly.
As you've now heard me say a few times, I never knew the level of complexity that goes into running a nearly 7,000-member professional association until I stepped into board leadership. I hope that many of you chose to put your names in for leadership this year; elections are coming up soon! Even in the face of uncertainty, we must push forward with the processes that ensure AEA's leadership well into the future. Traditionally, our voter turnout is low (around 15-18%), and I've been working for years to increase it. Please be on the lookout for the election ballot in late June, and vote! Encourage your friends and colleagues who are AEA members to vote! Let's work collectively to broaden representation at the board level through strong voter turnout. Read about your candidates, and feel free to reach out to ask for more information or look them up on LinkedIn.
Thank you all for your commitment to and membership within AEA. We are proud to have so many dedicated and passionate members focusing on the use of evaluation during a global crisis. We are stronger together!
From Sheila B. Robinson, Potent Presentations Initiative Coordinator
The classic David Bowie song includes the lyrics, “Ch- ch- ch- changes; Turn and face the strange ch- ch- changes; Just gonna have to be a different man.”
By now, we’ve had at least a few months’ experience working online and participating in video conference calls. People have even taken to holding their dissertation defenses, job talks, graduations, and even weddings online!
Last month, I offered some tips for looking and sounding your best online. This month, I’ll share with you some of the changes I make when taking what should have been an in-person presentation online.
We need your help!
Please contact me at p2i@eval.org and let’s talk! I’m happy to help, offer guidance, or collaborate on any of these.
Affiliation: International Technical Advisor, Monitoring and Evaluation at Education Development Center
Degrees: PhD, Social Development, Credentialed Evaluator
Years in the evaluation field: 19
Joined AEA: 2011
Why do you belong to AEA?
I appreciate AEA's mission: to improve evaluation practices and methods, increase evaluation use, promote evaluation as a profession, and support the contribution of evaluation to the generation of theory and knowledge about effective human action. I am a firm believer in the need to apply evaluation theory and knowledge to ultimately improve lives. Therefore, AEA has been my professional home for the past nine years.
To date, AEA has offered me many professional development opportunities and great mentors, including Nathan Morrow, Donna Mertens, Jim Rugh, Michael Bamberger, Jonny Morell, Katrina Bledsoe, and Madhabi Chatterji, to mention but a few! I moved to the United States in 2010 to work as a Design, Monitoring and Evaluation Specialist for World Vision United States, so I was very keen to join AEA for new professional camaraderie. My first AEA conference was in 2011 in Anaheim, California. Since then, the AEA conferences, affiliated journals, and listservs have been at the core of my professional development.
Why do you choose to work in the field of evaluation?
As I stated earlier, I like it when evaluation helps to improve people's lives. The potential for evaluation to contribute to program success fascinates me! That's why I am in this profession. My evaluation career started in 2001, just after I completed my bachelor's in social work and social administration. I was a research assistant on a World Bank-funded project aimed at improving nutrition in northern Uganda. My tasks were diverse, from photocopying workshop materials to writing workshop reports to disseminating research findings. Most of all, I was very fascinated (and I still am) by how insights from data can improve program success. Right then, I decided that program evaluation would be my line of work. And I've never thought otherwise.
What’s the most memorable or meaningful evaluation that you have been part of?
That's difficult to determine, as all evaluations have strengths and constraints. But the 2013-14 evaluation of the Emergency Capacity Building (ECB) Project stands out for me. The sheer scope of the intervention, spanning three continents and over 10 years, presented a worthwhile challenge. This was one of the largest multi-agency capacity development undertakings in the humanitarian sector, and its scope in time and geography also allowed the evaluation to contribute lessons for the broader global disaster management sector. The Active Learning Network for Accountability and Performance in Humanitarian Action (ALNAP) recognized this evaluation as its evaluation of the month because of its "mixed methods approach, in which data was collated beyond the ECB to examine trends in capacity building in the sector," and "because of its 10-year span, this evaluation also captures a considerable scope of dynamism within the sector, offering lessons and potentially reliable directions for the future of humanitarian capacity development" (ALNAP Humanitarian Evaluation and Learning Portal, 2014). In addition, I liked the fact that the evaluation commissioners facilitated access to various stakeholders, allowed for sufficient methodological triangulation, and provided time for reflection on learning.
Is there anything else you would like to add?
Aspects of evaluation that I am most passionate about:
I am very passionate about assumptions-aware evaluation. Carol H. Weiss once stated that "One program's overelaboration is another program's clarification." I have done evaluations for a while, and I have learned that evaluators could do a better job of working with assumptions: the assumptions of stakeholders and evaluators about programs and evaluations. Unexamined assumptions can be a huge risk to program success, and to helpful evaluations. Douglas E. Scates correctly and eloquently stated that "Looking for assumptions in research is something like looking for shadows in the late afternoon. They're everywhere. And once one starts noticing them he suddenly becomes aware that the world is full of them." A number of tools can aid the examination of assumptions. For a while now, my aspiration has been to encourage and contribute to a conversation about how to work with assumptions in program evaluation. A starting point for this conversation is getting to a common understanding of the critical assumptions: what is worth examining and what is not. In the book Credibility, Validity, and Assumptions in Program Evaluation Methodology (2015), I propose a typology of evaluation assumptions structured according to a cycle of decision points in the evaluation process.
This book follows my earlier book, Working with Assumptions in International Development Program Evaluation (2013, 2020; also available in French translation), in which I introduced typologies of program assumptions and tools for examining them. You may also be interested in the Evaluation and Program Planning special issue on working with assumptions, which I guest edited with Nathan Morrow.
I hope that these resources will encourage and support evaluators to examine our own assumptions about the programs we evaluate, and the methods and tools we use.
What advice would you give to those new to evaluation?
Evaluation offers a great avenue for contributing to positive change in society. It gives us tools to think of and engage with change with more clarity of what success should look like. Here are two things that I’ve found to be very helpful to my evaluation career growth:
On March 11, the World Health Organization (WHO) declared the COVID-19 outbreak a global pandemic. Less than two weeks later, leaders from the Disaster and Emergency Management Evaluation (DEME), Translational Research Evaluation (TRE), and Health Professions Education and Evaluation Research (HPEER) TIGs collaborated to convene a Cross-TIG Town Hall for evaluators to connect with one another and identify needed resources and areas of support.
With support from AEA, we are convening another related Town Hall to continue these collaborative discussions. The upcoming Cross-TIG Town Hall, “A Conversation with Evaluators on the Frontlines of COVID-19,” will feature an expert panel of evaluators to provide insights about evaluation practice, education, and research amid the pandemic.
In an effort to give you a sneak preview of the upcoming discussions that will be held during this Cross-TIG Town Hall, I asked several questions of the expert panelists and have highlighted their responses below.
What do you feel is the most important contribution that the field of evaluation can make to the COVID-19 response?
When discussing the most important contribution that evaluators bring to the COVID-19 response, the experts most often cited the adaptability and strategic focus of the evaluator.
Evaluators bring strength and resiliency into focus during the pandemic. Changes are occurring at a rapid pace, and evaluators are able to quickly design pragmatic studies that are also scientifically rigorous.
Patrick Barlow, co-founder of the HPEER TIG, states that “being able to help folks who don’t usually do this kind of research to design and conduct the studies quickly but with a focus on quality is one of our most important contributions during this time.” During the pandemic, Dr. Barlow has not only developed a mental health impact study for medical students, but also worked with the assistant dean of the medical school to design a study to examine the impact of dramatic changes to clinical rotations on student outcomes assessment. He further explains the unique perspective that evaluators bring to the process. “Even though we are deep in this process with our front-line colleagues, we can take a step back and think strategically and see the big picture and connect the activities strategically because we are not in the trenches.”
How has the COVID-19 pandemic affected your work as an evaluator?
The rapid pace of change has affected the role of evaluators in the field of translational research. Clara Pelfrey, chair of the Translational Research TIG, states that "this is translational scientists' finest hour…we are finding out what we can do." Dr. Pelfrey explains that the pandemic has vastly accelerated the translation of COVID-19-related research findings into real-world improvements in health and medical treatments. It can take 20 years to successfully translate something, "but now we are being asked to do this in real-time because we need a cure for this and we need it fast." She notes that the Clinical and Translational Science Award (CTSA) infrastructure at the 60+ academic centers nationwide now includes evaluators at the table to participate in discussions of many of the COVID-19 applications. "It is imperative that we can show Congress that we can ramp up research really fast to demonstrate our value as evaluators and as a translational research consortium." She further emphasizes the importance of creating templates and processes that can be reused for other disasters, so that we have systems in place and are better prepared for the next disaster, should one occur.
How has the COVID-19 pandemic helped to “Shine a Light” on the role of evaluation?
It has been argued that the pandemic has put a spotlight on problems we were already facing, or, in the words of the 2020 conference theme, helped to "shine a light" on existing challenges such as ensuring health equity and resiliency for our vulnerable populations.
Scott Chaplowe, co-founder and former chair of the DEME TIG, states that the pandemic brings "an opportunity for a transformative change in the field of evaluation, how evaluation is conducted, and operates to support the urgent need for transformative change globally." Dr. Chaplowe suggests that, as a profession in the business of assessment and problem-solving, evaluation can play an important role in the essential shift to more environmentally responsible and risk-informed approaches to policy, strategy, and programming. The pandemic provides an opportunity for the continued uptake and practice of complexity theory and systems thinking in evaluation, which he feels "should consider environmental systems on equal par with human systems, as the survival of both are interdependent." Notably, each of the experts stresses that, as we move forward, the pandemic provides an opportunity for change in the field of evaluation that will continue into whatever new normal emerges.
This is just a preview of the questions that our panelists will discuss in more depth during the Cross-TIG Town Hall “A Conversation with Evaluators on the Frontlines of COVID-19.” I, along with the DEME, TRE, and HPEER TIGs, invite you to participate in what promises to be a lively discussion of front-line experiences and insights on the role of evaluators and evaluation in the context of the COVID-19 pandemic.
Do you lead or participate in one of AEA's Topical Interest Groups (TIGs)? We want to hear from you and spotlight your work and actions you're taking amidst the COVID-19 crisis. Send an email to the AEA editor, Cady Stokes (cstokes@eval.org) to share news, updates and articles for consideration in an upcoming AEA newsletter.
Are you new to evaluation? Do you have questions, curiosities, or concerns about the field? Are you debating career opportunities, upcoming goals, or solutions to current issues, or just seeking some friendly advice? AEA welcomes you to participate in our new series: Ask AEA.
The only way to grow in your profession is by asking questions. And we want to provide our members with as many resources as possible. Submit your questions for the chance to be featured in AEA's monthly newsletter. Make sure to stay up-to-date on the latest issues to receive answers to your questions from professionals in the field.
Submit your questions here.
Measures for Clinical Practice and Research, edited by Joel Fischer, Kevin Corcoran, and David W. Springer, is the definitive reference volume on assessment measures for both practice and research in clinical mental health. This new edition includes hundreds of standardized measures, including new instruments for measuring children's clinical conditions, new measures for couples and families, and targeted searches for instruments in health care conditions, personality disorders, and addictions.
Volume 1: Couples, Families, and Children (9780190655792): $89.95
Volume 2: Adults (9780190655808): $99.95
Two-volume set (9780190655815): $160.00
Just as a reminder—and this is of course something you should feel free to promote with your members—all AEA members can receive a 20% discount when they order through the website www.oup.com/academic using the discount code AEA20.
If you are a publisher and would like to participate as an AEA publishing partner, or if you are an author of an evaluation-related text from an alternate publisher that you would like to see participate, please contact the AEA office at info@eval.org.
AEA's top priority at this time is the health and well-being of its members and the evaluation community as a whole. We understand this is a strenuous and difficult time, and are dedicated to providing you with support and resources to help you navigate the evolving effects of the COVID-19 outbreak.
We want to remind you of a few of our resources to help you through this time.
Topics covered include Reflecting on the Role of Evaluator During this Global Pandemic, Tips + Resources for Virtual Gatherings, and Self-Care in the Age of Coronavirus.
Click here to subscribe to AEA365. We will continue to share resources and experiences of our community.
While you are looking to stay connected to your teams, we recommend browsing the AEA Coffee Break library on the Digital Knowledge Hub. These 20-minute webinars are free to all members.
If you have resources you think would be valuable to the evaluation community, share them with us by contacting AEA at info@eval.org.
In this section, we spotlight events of interest to the AEA community, suggested by fellow members. Please note these events are not sponsored by AEA. If you would like to suggest an upcoming event, email Cady Stokes, AEA newsletter editor, at cstokes@eval.org.
From the AEA Education Team
The Digital Knowledge Hub is an online platform featuring professional development opportunities for evaluators, by evaluators. See eStudies available for purchase like the ones below.
While many of us are working from home the next few weeks, we wanted to remind you that AEA membership provides several exclusive resources to expand your knowledge in the comfort of your own home. Discover the online resources that are available:
Upcoming Live E-Learning Opportunities
Learn more about these events.
The COVID-19 outbreak has forced many in-person events to cancel, including the Summer Evaluation Institute. Even though we cannot meet in person, we can still learn and connect with each other in a format that is safe and cost-effective. We are happy to bring you five unique workshops that were scheduled to take place at the Summer Evaluation Institute this June.
AEA staff worked with accepted Summer Evaluation Institute presenters to identify workshops that could be adapted to a digital learning experience. These digital workshops provide opportunities to have meaningful discussions with your peers, ask presenters questions to work through challenges, and learn first-hand from experts in the field.
Digital Workshops:
Adding Costs to Help Your Evaluation Get Used: Cost-Effectiveness and Cost-Benefit Analyses for Health and Human Services
When: Wednesday, June 10, 12:00 pm - 3:00 pm ET
Presenter: Brian T. Yates, Ph.D.
Price: Members: $150; Nonmembers: $200
Including costs of programs can help your evaluation get funded, read, and used.
Evaluating the monetary outcomes (aka "benefits") of programs, such as reduced client use of health services and increased client productivity and income, can further influence decision-makers. Incorporating consumer, provider, and funder costs into cost-effectiveness, cost-benefit, and cost-utility analyses provides the foundation for calculating cost per Quality-Adjusted Life Year (QALY) gained as well as Social Return On Investment (SROI) estimates. This digital workshop will show you what your cost-inclusive evaluation can be and how it can elevate your evaluation. Together, we will review examples from real evaluations of substance abuse prevention and treatment programs, and of health and mental health services, so you can gain real-world understanding of the benefits of cost-benefit analyses.
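For readers curious how these metrics relate, here is a minimal sketch of the basic arithmetic behind cost-effectiveness, cost-benefit, and SROI calculations. All figures are entirely hypothetical placeholders, not drawn from any real evaluation, and the workshop itself covers far more nuance (perspectives, discounting, sensitivity analysis):

```python
def cost_effectiveness_ratio(total_cost, outcome_units):
    """Cost per unit of outcome, e.g. dollars per QALY gained."""
    return total_cost / outcome_units

def net_benefit(monetary_benefits, total_cost):
    """Cost-benefit analysis: monetized benefits minus costs, in dollars."""
    return monetary_benefits - total_cost

def sroi(monetary_benefits, total_cost):
    """Social Return on Investment: dollars of social value per dollar invested."""
    return monetary_benefits / total_cost

# Hypothetical program: $500,000 total cost, 25 QALYs gained,
# $750,000 in monetized benefits (e.g. reduced service use, higher productivity).
cost, qalys, benefits = 500_000, 25, 750_000

print(f"Cost per QALY gained: ${cost_effectiveness_ratio(cost, qalys):,.0f}")  # $20,000
print(f"Net benefit: ${net_benefit(benefits, cost):,.0f}")                     # $250,000
print(f"SROI: {sroi(benefits, cost):.2f} : 1")                                 # 1.50 : 1
```

The point of the sketch is simply that the same two ingredients, total costs and valued outcomes, feed all three framings; which framing persuades depends on the decision-maker's perspective.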
Learn more.
View the full Summer Series schedule.
The Digital Knowledge Hub contains live and recorded eStudies. eStudies offer in-depth lessons on trending evaluation topics, skills, and tools. Expert speakers share their experiences and offer time to answer your individual questions.
Upcoming eStudies:
Student eStudies:
Here are a few Coffee Breaks you might be interested in:
In this section, we spotlight events of interest to the AEA community, suggested by fellow members. Please note these events are not sponsored by AEA. If you would like to suggest an upcoming event or highlight actions members are taking during the COVID-19 crisis, email Cady Stokes, AEA newsletter editor, at cstokes@eval.org.
AEA would like to recognize and thank some of its most longstanding members. Click here to view individuals who are celebrating 5+, 10+, 20+ and 30+ years with the association this month!
AEA would like to welcome those who have recently joined the association. Click here to view a list of AEA's newest members.