Date: Monday, November 23, 2020
Hello AEA Members! This is the final newsletter of my very unusual presidential year! I hope this finds you well and you’re continuing to remain safe during the pandemic. I hope that all of you who attended the 2020 AEA virtual experience enjoyed it, and those who were not able to do so live can review recorded sessions over the next few months. The registration page remains open until the sessions expire in January 2021. This was definitely one of the most interesting conference years I can remember as an AEA member. I sincerely appreciate all whose volunteer time contributed to the success of that experience. There were hundreds of AEA members who worked hard to make this a meaningful experience for us all. Thank you for that.
I hope that the theme of “How Will You Shine Your Light” has been meaningful and inspiring to focus on during what became a challenging year for us all. I will encourage you all one last time to check out the HeartMath Institute and join in the global efforts to bring more love, and more heart intelligence, to the work we all do, regardless of our profession. Given the heightened stress of our times, the work they are doing is critical. We all need to learn how to manage our stress and bring more coherence to our planet (Global Coherence Initiative).
We, at the board level, are rounding out what has been an extremely busy year and are preparing for next year’s 35th anniversary celebration. Thank you to all the AEA Board for the hard work during a very challenging year. For more information about next year’s theme, check here. On our radar for the next year is work on improving the awards processes and our communications processes, and creating greater community among AEA membership with a new and improved AEA website and community groups application. We will be working on our own communications from the board to the membership next year and hope to be more transparent about how we work and why we may make certain decisions. We serve this organization at your election. We are all, collectively, AEA. Thank you for the honor of serving as your president; I have enjoyed it thoroughly despite the challenges!
Finally, thank you to all of the AEA members who voted in this year’s election. We had a nearly 21% voter turnout rate this year, the highest rate in over 10 years! We are hoping to continue this upward trajectory of voting rates for the organization, so please consider getting involved in leadership at any level this upcoming year! It takes all of us to run AEA successfully. Congratulations to the new president-elect and the new board members-at-large. As I have always said, you should feel free to reach out to any of us in leadership. We serve you!
From Sheila B. Robinson, Potent Presentations Initiative Coordinator
The world is changing…fast. And so should your slides, especially for an online presentation! Now of course we know our attention spans are not what some people would have us believe – the goldfish comparison has been debunked. But constant and multiple distractions are a reality. No doubt you have at least one other device running when you’re on a video call or participating in a webinar or course. And like so many of us, you’re probably also working in a place shared with family members, pets, or colleagues.
Online presentations must keep our attention to be effective. One way to do this is to have ever-changing visuals. We’re not talking about a random string of kaleidoscope effects here. We’re talking about intentional slide design that supports your key messages and helps tell the stories you need to tell.
Make Multiple Slides This is perhaps the easiest way to keep things moving. Use photos, graphs, icons, or short snippets of text with just one or two elements on any given slide, and change slides often. People process visuals very fast. If you continue talking long after they’ve processed the element on screen, and without YOU to watch (as they would in person), other visual distractions become powerful temptations.
Use Animation I know, I know. You’ve heard that animations and transitions are passé or unprofessional. Not so! I’m not advocating checkerboards, origami, or objects randomly flying in and out of your slides just for visual effect. Choose subtle effects such as fade and wipe to change the scene and have objects come, go, and fade into softer colors.
Build As You Go If you have multiple objects or must show some text on slides, add these in one at a time. If you have three bullet points on a slide, your audience will be done reading them long before you’ve finished talking about the first one. So why not just show them one at a time? Even better, show them something visual instead of the text! Use photos, icons, graphs, or other visual elements that support what you’re saying rather than repeat it.
The wonderful thing about animations and transitions is that they can be customized in infinite ways. You can change the timing, the effects, the triggers, the paths objects take, and much more. If you’re on a newer version of PowerPoint, you can enjoy the wonders of the Morph transition to support your message. Don’t be afraid to explore and experiment!
It’s all relatively simple to learn, and a very powerful strategy for holding your audience’s attention.
We need your help!
Please contact me at p2i@eval.org and let’s talk! I’m happy to help, offer guidance, or collaborate on any of these.
From Kotoe IKEDA, Ph.D, Assistant Professor, Shigakkan University, Japan
“Today’s number of new confirmed COVID-19 cases is 2,376. It’s the highest number ever.”
Japanese TV news gives viewers updates like this every day. We then use the data to consider what we should do during this catastrophe. As an evaluation professional, I have been committed to the evaluation of diverse efforts.
In such circumstances, Eval20 Reimagined was the first online conference I attended. Participating from Japan, I faced a 14-hour time difference. I was teaching classes and providing thesis guidance over Zoom during the day. At night, I opened my laptop and pondered various things with a cup of strong coffee: Sure, that's important, but it's been overlooked!; Well, this framework has evolved like this! It's amazing!; Oh! Evaluation is so wonderful! However, somewhere in the back of my mind, there was another thought: in the local world around me, there is a completely different reality.
In 2009 in Japan, the Government Revitalization Unit conducted a public screening of the budget formulation for the following year, and a big evaluation movement arose. I was a doctoral student at that time, conducting a school evaluation project with the empowerment evaluation framework.
The progress of the budget screening was reported in the media every day, and public interest was focused on the evaluation. However, what the media reported was evaluators throwing harsh words at the officials, who were left unhappy. A "scary" image of evaluation was created; the public came to the misconception that evaluations only cut budgets and hunt down implementers, and many government offices trembled with anger. The evaluation movement quickly lost momentum.
In 2007, my research field at the university where I work was obliged to implement school evaluation. I began working with schools, which were confused by the new legislation, to implement school evaluations based on the empowerment evaluation model using Getting To Outcomes (GTO). In this work, I saw principals and teachers who were not fearful but joyful about improving school organization and educational activities through evaluation. By disseminating these strategies, we aimed to accomplish autonomous school management, but we could not reach large-scale implementation. I am keenly aware of my powerlessness: the school evaluation act has increased the burden on teachers and created psychological anxiety about receiving complaints from parents or having their educational activities denied. School evaluation has become a dead letter, mere paperwork repeated each year.
Even in my own position as an assistant professor at a university, evaluation is hated. Many universities are constantly busy preparing documents for evaluation. Those appointed to the university evaluation working group feel that they have been punished.
Class evaluations also began to accompany university evaluation, and students are required to answer evaluation questionnaires for each class at the end of the semester. The content was shocking to me. Although there are some differences between universities, most use checkbox questions such as "Was the class easy to understand?" or "Were you satisfied with the class?", plus a space for students to write freely. This evaluation is conducted by the student affairs office. Student affairs staff review the content and return it to the faculty member without any analysis or processing. To be honest, it is uncomfortable and even scary, even when the student responses are not negative.
As I walked through this reality as a psychologist, I first considered this problem from psychological theories of social perception, such as cognitive dissonance theory. How do we resolve the cognitive dissonance when we receive negative evaluation comments, even though we have been desperately thinking about the program stakeholders or participants?
I also thought about theories related to test anxiety. It is well known that test anxiety interferes with performance.
The anxiety and fear of evaluation are further reinforced by these experiences. But still, I am attracted to the world of evaluation because of the reflective efforts of evaluators themselves, who have produced approaches such as utilization-focused evaluation, collaborative evaluation, and empowerment evaluation. I believe that there must be a way to get rid of anxiety and fear and use the evaluation approach to be and to do better.
Thank you for giving me the chance to write this topic! With your help, and working with everyone who has faced similar challenges of anxiety and fear, I would like to break through the state of evaluation in Japan.
Are you new to evaluation? Do you have questions, curiosities, or concerns about the industry? Are you debating career opportunities, upcoming goals, solutions to current issues, or are just seeking some friendly advice? The only way to grow in your profession is by asking questions. In our new series, Ask AEA, we want to provide our members with that opportunity by providing as many resources and guidance as possible. Answers come from our member community. Submit your questions here.
Read below for advice from Laura Peck, PhD, Principal Scientist, Social & Economic Policy Division, on some ways evaluators can adjust to changes in data collection during the COVID-19 pandemic.
Question submitted by: Sondra LoRe
Title: Evaluation Manager
Company: National Institute for STEM Evaluation & Research
Years in the Field: 20
In what ways have you adjusted to changes in data collection during the COVID-19 pandemic?
The COVID-19 pandemic has had a major effect on program evaluation. In addition to the implications for research that I highlight in THIS brief video, I have co-authored THIS paper on the topic. My colleague Michael Link has a two-part video series on collecting data in these times (see HERE and HERE); and I will share a few more thoughts here. Let me address two categories: first, the mode or process of data collection, and second, its content.
Mode. For surveys, we can no longer rely on in-person field interviewers and instead must consider alternatives: relying on telephone, mail, SMS/text, or other online modes. Substantial innovation has taken place around qualitative data collection too. Think virtual focus groups and Zoom-style meeting rooms, for example. People whose experiences we want to capture still want to be able to contribute to research, and it is incumbent on us to figure out how best to enable that. Observational data collection might take place through remote sensing or via key informant reporting, where the safety of research participants (and places) must be our top priority.
Content. Other pandemic-inspired adjustments to data collection have involved changing what questions we ask. For example, in one of my HUD studies, where the long-term follow-up survey was in the field and only halfway complete as of March 2020, we added two questions. Because this study concerned homeownership and financial well-being, of course we collected data on study participants’ assets, savings, and debt. We added a probe to our survey to capture, for example, whether their reported savings reflected any stimulus income from the government. We added another probe, an open-ended question, asking people whether and how their homeownership and financial situation was affected, if at all, by the pandemic.
In closing, let me state that this topic is huge, and I have shared only a couple of examples of how the process and content of data collection have responded to the pandemic conditions that we face. I look forward to additional, forthcoming scholarship and commentary that will help connect all of us around the smart practices being implemented in these challenging times.
*Questions may have been edited for AEA style purposes
Submit your questions for the chance to be featured in AEA's monthly newsletter. Make sure to stay up-to-date on the latest issues to receive answers to your questions from professionals in the field.
From Tessie Catsambas, AEA Past President and Board Secretary
Dear AEA Members,
We hope you all enjoyed the 2020 AEA virtual event! Board members are proud of our executive director, Anisha Lewis, and staff, who managed an excellent event with engaging sessions and opportunities to connect and learn together at a distance. Board members are also proud of our president, Aimee White, who, undaunted by the challenge, put together a strong online presidential strand program.
We are also pleased to report on our well-attended virtual AEA business meeting. After sharing our updates with members, we met in smaller groups for a deeper dive and dialogue on several important topics, where we received excellent input. Here are some highlights:
In the Presidents group (Aimee White, Tom Grayson, and Tessie Catsambas), we discussed our board priorities in the current context and its challenges (COVID-19, elections, the White House Executive Order on Diversity), and we reaffirmed our commitment to the local affiliates network, our strategies for engagement in elections, our support of the Eval4Action campaign, which features a commitment to young and emerging evaluators in support of the Sustainable Development Goals, and the work on evaluation competencies. President-Elect Tom Grayson said a few words on the 2021 theme, AEA at 35: Meeting the Moment.
In the Youth group, Bianca Montrosse-Moorhead offered an update on Young and Emerging Evaluator (YEE) work at the AEA and around the globe where our AEA EvalYouth representative is Zach Tilton. Internationally, EvalYouth is working to develop a toolkit for guidelines on the importance of engaging YEEs in VOPEs. A central focus of our work in this area is the Graduate Evaluation Diversity Initiative (GEDI) program, which focuses on equity and inclusion work with young and emerging evaluators (we have 12 GEDIs this year, housed at University of Illinois, Urbana-Champaign with Rodney Hopson and Brandi Gilbertson as co-Directors). The evaluation of GEDI is coming up soon, and we look forward to learning from that mature program. More work in this area takes place in the GSNE TIG.
In the Finance group, Felicia Bohanon and Tom Kelly discussed our active management of finances given the COVID challenge, and how we find ourselves in a relatively healthy financial condition for FY2020 (especially compared to peer organizations). We also discussed the importance of the board being an effective governing board of a large nonprofit with a now full-time executive director and staff. Participants shared local affiliate innovations, especially in virtual outreach and conferences, that the AEA can learn from.
In the International group, Hanife Cakici, along with our IOCE Board representative Donna Podems, heard member suggestions for making sure that our growing number of international AEA members feel and know that “they belong” in our organization even though the AEA is based in the United States. Participants also offered a vision of breaking out of silos (e.g., TIGs, IWGs, VOPEs), because today's pressing problems are global, with repercussions for everyone wherever they are located. They encouraged the board to stay active in global efforts and initiatives that are already taking place (such as Eval4Action, Blue Marble Evaluators, EvalPartners, etc.). Other issues discussed were the possibility of the AEA forming institutional partnerships with other networks, our role in improving national evaluation capacities, and taking advantage of digital technologies.
In the AEA Mission/Vision group, Jara Dean-Coffey and Karen Jackson shared that we are asking ourselves who the AEA is best positioned to serve today, and welcomed ideas and questions for our inquiry journey forward. Participants asked for transparency in the process and suggested that we examine questions of methods, epistemology, axiology or values, and the purpose of evaluation.
In the Membership Engagement group, Lisa Aponte-Soto, along with our ED Anisha Lewis, hosted. There was interest in our evolving strategies and communication for engaging members, as well as our support of TIGs, particularly for programming. Participants urged the board to be strategic in membership engagement and to offer a clear way to navigate the options for engagement.
In the Awards group, hosted by Eric Barela, participants urged us to make the nomination process less burdensome.
In the Communications group, Libby Smith provided an update on the status of the Communications Task Force and the effort to improve our communications guidelines. Members would like to see even more transparency and timeliness. Members are eager for the return of EvalTalk and hope to see guardrails for members to engage in important conversation in respectful ways. Finally, members are looking for ways to communicate across TIGs and hope the TIG Scan and new website will result in improved communication between staff and TIG leaders.
We have now turned our attention to shaping our agenda for 2021, and all of your excellent input is part of our deliberation.
Respectfully,
Tessie Catsambas
Eval20 Reimagined: A Virtual Experience was an all-out success! Thank you to those who joined us from October 27-30 for several days of learning, connecting and engagement.
Now, attendees have access to session recordings through January 31, 2021, allowing them to relive their favorite moments.
Missed out on the experience? Register by January 8 to access recordings here.
Thank you for taking the time to vote for your 2021-2023 AEA leadership. The race was close and we're certain the choice between all of the talented candidates was not an easy decision to make. It is with great pleasure that we share with you the outcome of the 2020 election for the American Evaluation Association Board of Directors.
The election was open from October 2 through November 2 and received 1253 votes (21.9% response rate). For comparison, recent historical voting rates are noted below. Full election results can be found here.
Recent Historical Voting Percentages:
Congratulations to our newest AEA Board Directors and thank you to those who took the time to vote!
Guilford is happy to offer American Evaluation Association members 30% off the list price of all Guilford titles—plus free shipping (US & Canada) to enhance your research, teaching, and professional development. Just go to our AEA member page to receive your special discount: Guilford Publications AEA member page.
If you are a publisher and would like to participate as an AEA publishing partner, or if you are an author of an evaluation-related text from an alternate publisher that you would like to see participate, please contact the AEA office at info@eval.org.
AEA's top priority at this time is the health and well-being of its members and the evaluation community as a whole. We understand this is a strenuous and difficult time, and are dedicated to providing you with support and resources to help you navigate the evolving effects of the COVID-19 outbreak.
We want to remind you of a few of our resources to help you through this time.
Topics covered include Reflecting on the Role of Evaluator During this Global Pandemic, Tips + Resources for Virtual Gatherings, and Self-Care in the Age of Coronavirus.
Click here to subscribe to AEA365. We will continue to share resources and experiences of our community.
While you are looking to stay connected to your teams, we recommend browsing the AEA Coffee Break library on the Digital Knowledge Hub. These 20-minute webinars are free to all members.
If you have resources you think would be valuable to the evaluation community, share them with us by contacting AEA at info@eval.org.
In this section, we spotlight events of interest to the AEA community, suggested by fellow members. Please note these events are not sponsored by AEA. If you would like to suggest an upcoming event, email Cady Stokes, AEA newsletter editor, at cstokes@eval.org.
From the AEA Education Team
The Digital Knowledge Hub is an online platform featuring professional development opportunities for evaluators, by evaluators. See eStudies available for purchase like the ones below.
While many of us are working from home the next few weeks, we wanted to remind you that AEA membership provides several exclusive resources to expand your knowledge in the comfort of your own home. Discover the online resources that are available and learn more about upcoming events.
The Digital Knowledge Hub contains live and recorded eStudies. eStudies offer in-depth lessons on trending evaluation topics, skills, and tools. Expert speakers share their experiences and offer time to answer your individual questions.
Upcoming Live Courses:
See more live courses here.
Student eStudies:
Evaluation workshops are now open for registration! We will be hosting 10 workshops inspired by Eval20 Reimagined: A Virtual Experience! These workshops provide in-depth lessons on evaluation techniques and best practices.
When: December 4, 12:00 p.m. EST
Presenter: Kimberly Fredericks, PhD
Interest in and use of social network analysis (SNA) as a methodology within evaluation continues to climb, and with it the need for new ways of understanding to support this analysis. This workshop will dive more deeply into when and how to use SNA within your evaluation, data collection methods, and statistical analysis of the findings. The workshop is very hands-on, emphasizing the software and using the concepts and methods to answer research questions. It also covers the use of network analysis in applied settings. UCINET and NetDraw will be the base programs used to analyze data and discuss results.
When: December 8, 12:00 p.m. EST
Presenter: Kirk Knestis, PhD
Typical graphical logic modeling approaches illustrate elements of a program’s theory-of-action in terms of the relationships among inputs, activities or processes, outputs, and short- and longer-term outcomes (GAO, 2012; W.K. Kellogg Foundation, 2004). While particular guidance for how to structure such models may vary slightly (e.g., substituting “impacts” for long-term outcomes), the constraining assumptions and structures associated with conventional logic models can leave evaluators, program designers, and managers hung up on vocabulary (is it an “output” or an “outcome?”) or stuck forcing complex programs into too-simple frameworks.
When: December 10, 12:00 p.m. EST
Presenter: Michael Quinn Patton
Developmental evaluation (DE) guides innovative initiatives in complex dynamic environments. Principles-focused evaluation (P-FE) is one special application of DE focused on evaluating adherence to effectiveness principles for achieving results and guiding adaptive action. Blue Marble Evaluation (BME), addressing global challenges of sustainability and equity, is the latest advance in principles-focused developmental evaluation. The essential principles of DE, P-FE, and BME will be examined and applied. Participants will learn to use the GUIDE framework, an acronym specifying the criteria for high-quality principles: (G) guidance for action, (U) utility, (I) inspiration, (D) developmental adaptation, and (E) evaluable. Participants will apply the GUIDE framework to their own projects. Integrating DE, P-FE, and BME moves beyond project/program evaluation to evaluate strategies, collaborations, diverse interventions, and systems change initiatives. Complex concepts, systems thinking, and AEA Guiding Principles will be incorporated. Participants will also learn DE, P-FE, and Blue Marble Evaluation methods, designs, applications, and uses.
When: December 15, 12:00 p.m. EST
Presenter: Meri Ghorkhmazyan
This workshop will be based on the curriculum of World Learning’s Transforming Agency, Access, and Power (TAAP) Toolkit and Guide for Inclusive Development, phase two, on Social Inclusion Analysis. Thus far in the field of gender equality and social inclusion, many guidelines have been principle-based and have not offered practical tools and templates that would allow practitioners to consistently implement social inclusion analysis studies. The TAAP Social Inclusion Analysis provides a consistent framework, following the steps of general study design, and offers data collection, analysis, and reporting tools to collect, understand, and visually demonstrate the identities of individuals based on their experiences of inclusion and exclusion, to inform development initiatives. This is done against six domains of social fabric:
Workshops require separate registration on the Digital Knowledge Hub.
View complete schedule.
As an AEA member, you have free access to our library of Coffee Breaks. These short, 20-minute webinars are great for sharing lessons with your students or other colleagues while you are apart.
Here are a few Coffee Breaks you might be interested in:
In this section, we spotlight events of interest to the AEA community, suggested by fellow members. Please note these events are not sponsored by AEA. If you would like to suggest an upcoming event or highlight actions members are taking during the COVID-19 crisis, email Cady Stokes, AEA newsletter editor, at cstokes@eval.org.
AEA would like to recognize and thank some of its most longstanding members. Click here to view individuals who are celebrating 5+, 10+, 20+ and 30+ years with the association this month!
AEA would like to welcome those who have recently joined the association. Click here to view a list of AEA's newest members.