Tuesday, February 27, 2018
From Leslie Goodyear, AEA President, with contributions from Lana Rucks, AEA LAWG Chair
As expected, this year is off to a busy start. As you know, we hired our new Executive Director, Anisha Lewis, and her first day on the job was Monday, February 19. We are so happy to have her on board and appreciate Denise Roosendaal for helping Anisha get to know us. Look for a newsletter feature from Anisha in the next edition, and a Virtual Town Hall Forum with her on Friday, March 23.
Don’t forget that conference proposals are due Thursday, March 15. You can find out more about the conference theme, Speaking Truth to Power, here. The recording of the first Virtual Town Hall Forum, which focused on the theme, can be found here.
Speaking of the conference, the AEA board met in Cleveland last month to scope out the conference venue, explore the city, and meet with the Local Arrangements Working Group (LAWG). The LAWG and the folks at Destination Cleveland are thrilled to help us make the most of our time in Cleveland, whether it’s visiting the Great Lakes Science Center and the Rock and Roll Hall of Fame (both only a short walk from the conference); meeting with local evaluators, nonprofits, philanthropies, and universities; or discovering great local landmarks and restaurants. While visiting, we learned that Cleveland is a dynamic city of friendly people and welcoming places – an ideal setting for our 2018 conference.
For this month’s newsletter, I’ve invited Lana Rucks, this year’s chair of the LAWG, to share some of the planning that is underway in preparation for our conference in Cleveland this fall. Lana is also President of the Ohio Program Evaluators’ Group and is principal consultant with The Rucks Group. Take it away, Lana.
Thanks, Leslie! The LAWG and the Ohio Program Evaluators’ Group (OPEG) look forward to welcoming AEA to Cleveland! The theme Speaking Truth to Power resonates with the LAWG and OPEG because Cleveland, and Ohio more broadly, is at the intersection of many of the nation’s challenges and the world’s “wicked” problems. Cleveland’s vibrancy, vast diversity, and rich history, particularly within the evaluation space, create a fitting backdrop for exploring these pressing issues. The LAWG and OPEG are excited to welcome AEA members to what used to be called “the mistake on the lake” but is now a City that Rocks! As we work to create opportunities for attendees to fully experience Cleveland, we want to provide the following contact information for early planning:
Do you lead a Topical Interest Group (TIG)? We have many Cleveland locals who can help make connections for TIGs interested in organizing education tours or learning opportunities. If you’re interested in connecting with local groups, contact Jan Noga, jan.noga@alumni.stanford.edu.
Looking to arrange a social gathering? Cleveland has many venues that can host small and large groups. There are restaurants, of course, but there are also a number of other venues perfect for non-profit groups across a variety of budgets. If you’re interested in learning more, contact Clara Pelfrey, clara.pelfrey@case.edu.
Interested in volunteering? We are looking for individuals to serve as volunteers in advance of the conference or at the conference site in Cleveland. If you’re interested in volunteering, contact Tom Williams, tgwilliams2234@gmail.com.
As we continue to develop local information and resources, we will update contact information. Until then, you can always reach out to me at lrucks@therucksgroup.com with any other questions, suggestions, or comments. See you in Cleveland!
Leslie and Lana met with the LAWG in Cleveland earlier this year. Pictured from left to right: Tom Williams, Data Research Specialist, Alcohol, Drug Addiction, and Mental Health Services Board of Cuyahoga County, LAWG Co-Chair and OPEG Board Member-At-Large; Leslie Goodyear, PhD, Principal Research Scientist at Education Development Center, AEA President; Lana Rucks, Principal Consultant, The Rucks Group, LAWG Chair and OPEG President; Clara Pelfrey, Evaluation Director - Clinical and Translational Science Collaborative, Case Western Reserve University, LAWG Volunteer; Thomas (T.J.) Horwood, Principal, ICF, LAWG Volunteer and OPEG Treasurer
From Ziad Moussa, Co-Chair of EvalPartners, immediate past President (2015-2017) of the International Organization for Cooperation in Evaluation (IOCE)
In January 2012, a group of like-minded evaluation professionals, led by UNICEF and the International Organization for Cooperation in Evaluation (IOCE), came up with the idea of EvalPartners. At that time, the evaluation community was mostly inward-looking and interested in the technical aspects of the professional practice. The idea of breaking down silos and harnessing the energy and talent of evaluators to advocate for evaluation differently and creatively proved to be an important milestone in the growth of the profession.
EvalPartners has already changed the EvalWorld. For the first time in history, an international professional year – the International Year of Evaluation 2015 – was declared by the global evaluation community itself and then endorsed by the U.N. General Assembly in a special resolution that also endorsed the need to build evaluation capacity at national levels. Evaluation was also explicitly mentioned in the resolution concerning the Sustainable Development Goals (SDGs), which states that follow-up and review of the implementation of the SDGs should be “rigorous and based on evidence.”
The 100+ activities around the International Year of Evaluation culminated with the launch of a Global Evaluation Agenda 2016-2020 (EvalAgenda2020), which synthesizes the inputs of thousands of evaluators and clients of evaluation from different countries and organizations who took part in these 100+ activities. Simply put, EvalAgenda2020 constitutes the DNA of EvalPartners and lays down a road map towards the vision for the future of the evaluation profession. It was no surprise that AEA was the first Voluntary Organization for Professional Evaluation (VOPE) to develop “13 different ways to support EvalAgenda2020” and has been “walking the talk” since.
I have been privileged to accompany the development of EvalPartners since its official birth in November 2012, so allow me to highlight three “living” issues that shaped, and are still shaping, EvalAgenda2020:
1. How to address the gap between the potential value and current acceptance of evaluation: While EvalAgenda2020 notes growing success and acceptance of evaluation globally, we should also be realistic in accepting that evaluation has not yet been embraced as widely as it should be. There is still inadequate appreciation of what evaluation is, how it differs from policy research, performance measurement, or performance auditing, and how it can improve policy and programming on a practical level.
Interestingly, the debates around methodological approaches that monopolized the discussion for a number of years have given way to a more systemic discussion of how we can communicate our belief in the utility and use(s) of evaluation to non-traditional evaluation actors, and how we can break away from what Michael Quinn Patton, in a recent publication, calls a “self-limiting project mentality.” While the answer would need multiple volumes of this newsletter, it is very important to keep promoting the value of evaluation beyond the comfort zone of our classical circles, while remaining committed to improving evaluation quality and usefulness.
2. Is evaluation a value-neutral management tool? One of the most passionate debates during and beyond the launch of EvalAgenda2020 was around the statement that “Evaluation is not simply a value-neutral management tool” and how it should be grounded in values of equity, gender equality, and social justice while embedding shared principles of partnership, innovation, inclusivity, and human rights. The multiple facets of this debate are often mirrored in the professionalization strands of the evaluation conferences I am invited to attend, with a clear polarization between those who believe an evaluator should be trained and certified just like a medical doctor or a lawyer, and those who believe in democratizing the profession by minimizing barriers to entry, whether financial, cultural, or linguistic. Again, the answer would need multiple volumes, but AEA’s recent move toward developing AEA evaluator competencies guidelines is an important step in what Robert Picciotto calls “the long evaluation professionalization march.”
3. New kids on the block? The vision set by EvalAgenda2020 envisages that “… evaluation has become so embedded in good governance that no policy maker or manager will imagine excluding evaluation from the decision-making toolbox, dare hold an important meeting or reach an important decision without having reviewed relevant evaluation information…” In order to walk the talk and empower new actors, EvalPartners is steadily developing into a network of networks: catering to young and emerging evaluators (EvalYouth), bringing an equity-focused and gender-responsive dimension to evaluations (EvalGender+), valuing the strengths of indigenous evaluation practices (EvalIndigenous), and adding value and learning through evaluation of the SDGs (EvalSDGs), in addition to a special initiative targeting parliamentarians as key decision-makers through the Global Parliamentarian Forum for Evaluation (GPFE). See more information via the Networks tab at evalpartners.org.
Early lessons show that more than 60 percent of the Young and Emerging Evaluators (YEEs) who signed up for the EvalYouth mentoring program are not yet members of their national VOPE. This is alarming in every sense, as YEEs are the future of the profession. Similarly, the GPFE now has chapters on every continent except North America and Europe (despite AEA’s tireless efforts, one should say). A long march still lies ahead, so let’s keep walking the talk.
From Zachary Grays, AEA Headquarters
AEA is proud to announce and welcome the newly selected fellows of the 17th MSI fellowship class. The pool of applicants was incredibly competitive, and narrowing it to the seven impressive fellows selected was no easy task. Meet the 2018-2019 MSI fellows, pictured below.
You can get to know the newest fellows and read more about their backgrounds here. Please take the opportunity to introduce yourselves to them during the Summer Evaluation Institute 2018 and Evaluation 2018 in Cleveland, OH.
AEA would also like to thank Dr. Art Hernandez, MSI program director, for his continued leadership and exemplary guidance of the MSI program. Visit the AEA website to learn more about the MSI program and this year’s fellows.
Publisher Promo/Discount
Oxford is pleased to publish the Pocket Guides to Social Work Research Methods. Each succinct, user-friendly book in the series serves as a roadmap to established and emerging research and evaluation methodologies for social work.
Recent publications in the series include Participant Recruitment and Retention in Intervention and Evaluation Research by Audrey L. Begun, Lisa Berger, and Laura Otto-Salaj; Data Analysis with Small Samples and Non-Normal Data by Carl F. Siebert and Darcy Clay Siebert; and Developing Cross-Cultural Measurement in Social Work Research and Evaluation by Thanh Tran, Tam Nguyen, and Keith Chan.
All of the titles in the series are 20% off to AEA members when you use the discount code AEA20 at checkout.
From Cheryl Oros, Consultant to the Evaluation Policy Task Force (EPTF)
Last year, the EPTF completed its tenth year of working to influence government evaluation policy. This year, the EPTF is updating its work plan and has added three new members serving three-year terms. As part of your contribution to the field of evaluation, you may want to consider serving on the Task Force in the future, or volunteering to assist with current efforts.
AEA instituted the EPTF to develop an ongoing capability to influence evaluation policies critical to practice, such as those defining evaluation and the methods, implementation, resources, and budgets it requires. The EPTF works with congressional committees, the White House, and agencies to promote sound evaluation policies in the federal government. Its projects have included: informing guidance on evaluation for the Office of Management and Budget (OMB) in the Executive Office of the President; helping draft legislative language relevant to evaluation; exploring state evaluation efforts; and assisting U.S. federal agencies and representatives of foreign governments in setting policies to expand evaluation capacity. Learn more about the EPTF’s work and review policy guidelines in The Evaluation Roadmap for a More Effective Government.
In 2018, the EPTF will be composed of ten members, seven of them returning:
Nick Hart, EPTF Chair; current director of the Bipartisan Policy Center’s Evidence-Based Policymaking Initiative; previously policy and research director for the U.S. Commission on Evidence-Based Policymaking and a senior analyst and special assistant at the Office of Management and Budget (OMB) in the Executive Office of the President, where he served as OMB’s representative on the White House steering committee for President Obama’s My Brother’s Keeper Task Force; past President and board member of Washington Evaluators; and a board member of the Eastern Evaluation Research Society.
Katrina Bledsoe, Director, ThinkShift, DeBruce Foundation, focused on solving the challenge of upward mobility and economic security; former consultant to the Annie E. Casey Foundation, the Office of Juvenile Justice and Delinquency Prevention, and the National Science Foundation; taught evaluation and policy; author of publications on evaluation; on the editorial board for the Journal of Multi-Disciplinary Evaluation; and recipient of AEA Multiethnic Issues in Evaluation Topical Interest Group’s Scholar Award and Eastern Evaluation Research Society’s Invited Author Award.
Katherine Dawes, adjunct Professor, George Washington University Trachtenberg School of Public Policy and Public Administration; former Director of the Environmental Protection Agency Evaluation Division; member of the Environmental Evaluators Network; and recipient of the AEA Gunnar Myrdal Government Evaluation Award.
George Grob, former Chair, member and past consultant of the EPTF; Director, Center for Public Program Evaluation; former Deputy Inspector General for Evaluation at HHS and the Federal Housing Finance Agency; conducted numerous evaluations; taught evaluation; served on the Editorial Advisory Board of the American Journal of Evaluation and the Advisory Board of the Eastern Evaluation Research Society; developed performance management systems; former Executive Director of the Citizens' Health Care Working Group; former Director of Planning and Policy Coordination at HEW; former Co-Chair of the Evaluation and Inspections Round Table of the President’s Council on Integrity and Efficiency; and recipient of the AEA Alva and Gunnar Myrdal Government Award.
George Julnes, Professor, School of Public and International Affairs, University of Baltimore, Maryland; served on AEA Board of Directors; on the editorial boards of the American Journal of Evaluation, New Directions for Evaluation, and Evaluation and Program Planning; authored publications on evaluation theory and methodology; and was recipient of the AEA Lazarsfeld Award for Contributions to Evaluation Theory.
Mel Mark, former EPTF Chair; Head of the Psychology Department at Pennsylvania State University; former President of AEA; author of numerous book chapters and articles on evaluation; former editor of the American Journal of Evaluation; and recipient of the AEA Lazarsfeld Award for Contributions to Evaluation Theory.
Stephanie Shipman, Assistant Director, Center for Evaluation Methods and Issues in the Applied Research and Methods Team, GAO; author of numerous GAO reports on evaluation methods and policies; founder of Federal Evaluators, an informal network of officials in the federal government interested in the evaluation of public programs and policies; recipient of the AEA Alva and Gunnar Myrdal Government Award, and the Eastern Evaluation Research Society’s Invited Author Award.
Three new members will be joining the EPTF this year:
Thomas Chapel, Chief Evaluation Officer at the Centers for Disease Control and Prevention, where he sets guidelines and standards for evaluation and provides hands-on facilitation, training, and practical tools and resources; former vice president of Macro International (now ICF International); AEA Board member; and recipient of the AEA Myrdal Government Evaluation Award.
Diana Epstein, Evidence Team Lead at the Office of Management and Budget (OMB) in the Executive Office of the President, where she collaborates with other OMB offices on setting research priorities and appropriate evaluation methodologies, embedding findings from research and other forms of evidence into program design, and developing agency capacity to build and use evidence; previously a research and evaluation manager at the Corporation for National and Community Service, and a program evaluator and policy analyst at Abt Associates, the American Institutes for Research, and the RAND Corporation.
Demetra Nightingale, Institute Fellow at Urban Institute; former Chief Evaluation Officer at the U.S. Department of Labor; Advisory Board Member of the Comptroller General’s (U.S. Government Accountability Office) Advisory Committee on Government Audit Standards; Fellow of the National Academy of Public Administration; Professorial Lecturer, Trachtenberg School of Public Policy and Public Administration, George Washington University; Director of the Federal Agency Evaluation Workshop Series; author of many books, articles, and evaluation and research reports; and recipient of the Exemplar Award of the Association for Public Policy Analysis and Management and the GWU Trachtenberg School Distinguished Alumni Award.
Kathryn Newcomer, Past President of AEA and liaison to the EPTF from the AEA Board, currently serving as Director of the Trachtenberg School of Public Policy and Public Administration at the George Washington University; Fellow of the National Academy of Public Administration; member of the Comptroller General’s (GAO) Educators’ Advisory Panel; Past President of the National Association of Schools of Public Affairs and Administration; and author of books and articles on evaluation, public administration and leadership.
The EPTF will be supported by the new AEA Executive Director, Anisha Lewis; Denise Roosendaal, the current ED; and Cheryl Oros, Consultant to the EPTF. For further information, or to volunteer to assist the EPTF, contact Cheryl at EvaluationPolicy@eval.org.
From Sheila B. Robinson, Potent Presentations Initiative Coordinator
Presenters are looking for ways to improve their slides and engage audiences with interactive presentations. Thousands of blog articles, dozens of best-selling books, and even a few research studies are devoted to helping them up their game. Presenters want better ways to improve their slide design skills, share data with audiences, and increase their capabilities in working with images and data visualization. They are also looking for ways to capture their audiences’ attention and sustain it throughout a presentation.
Presenters also want to improve their messaging and presentation content. They want to be able to convey complex concepts to their audiences and deliver presentations with confidence. They want to improve their overall public speaking skills and more specifically, increase their storytelling skills and ability to use humor in presentations.
The Survey
In December, I offered an opportunity for evaluators to participate in a brief, informal survey about presenting to help inform future work on our Potent Presentations Initiative (p2i). The survey asked about people’s past and future presentation work and what might help them improve their practice.
I shared the link to “You, the Presenter: What Would Help You Up Your Game?” in an aea365 post and on EvalTalk. Over the course of about three weeks, 190 people responded. Here is a little bit about who they are:
People of all ages responded, the youngest in the 21-30 range and the oldest in the 71+ range; about 75% fall between the ages of 21 and 60. Nearly all (99%) do some sort of evaluation work. Many work in a higher education setting, while others work in a research or evaluation firm, non-profit, or government setting. Internal presentations and conference presentations are the most common types of presentations respondents have given, and the most commonly anticipated types. Only 20% reported having given professional development (PD) workshops in the last one to three years, but nearly two-thirds (64%) anticipate giving one in the next year or so.
Stay tuned! Next month you’ll learn about respondents’ priority learning areas and who their favorite speakers are.
p2i Needs Your Help!
Please contact me at p2i@eval.org and let’s talk! I’m happy to help, offer guidance, or collaborate on any of these.
On Thursday, March 15, AEA will present a special coffee break conducted in Spanish, "Evaluaciones Colaborativas Paso a Paso" ("Collaborative Evaluations Step-by-Step"). Join Dr. Liliana Rodríguez-Campos, professor and director of the Graduate Certificate in Evaluation at the University of South Florida, and Dr. Rigoberto Rincones Gomez, Vice President for Institutional Planning, Research, and Effectiveness at Edward Waters College, as they lead a highly interactive workshop for those looking to engage and succeed in collaborative evaluations.
In clear and simple language, the facilitators explain how to apply the Model for Collaborative Evaluations (MCE) to real-life situations, with an emphasis on elements that facilitate stakeholders' involvement. The facilitators share their experiences and insights on this topic in an easy-to-understand fashion, so the information learned can be put to use immediately.
Participants will learn the fundamental components of collaborative evaluations; the advantages and disadvantages of collaborative evaluations; a framework for planning, executing, and reporting sound collaborative evaluations; and step-by-step resources to guide collaborative evaluations.
For more information and registration details, visit the AEA Coffee Break page.
AEA will offer two eStudy courses in March and April. The first, "Developing Quality Survey Questions," will be presented by Sheila B. Robinson, Ed.D., Greece Central School District, and Kimberly F. Leonard, Senior Research Officer, Oregon Community Foundation. The two-part session, which takes place March 27 and 29, will guide participants through the survey design process via a series of activities and help them develop an understanding of the cognitive aspects of survey response and question design. Participants will increase their ability to craft high-quality survey questions and leave with resources to further develop their skills.
The second eStudy session, "Non-Parametric Statistics: What to Do When Your Data Breaks the Rules," will take place in two parts on April 10 and 12. Presented by Jennifer Catrambone, Director of Quality Improvement & Evaluation, RMR CORE Center, this session will provide a brief overview of parametric statistics in order to contrast them with non-parametric statistics. Data situations that call for non-parametric statistics will be reviewed, and the instructor will demonstrate how to run the appropriate techniques in SPSS, using screenshots of the analysis software.
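The eStudy itself uses SPSS, but the underlying idea can be sketched for readers who work in open-source tools. The snippet below is an illustration only, not course material: it uses Python's scipy (rather than SPSS) and made-up rating data to contrast a rank-based Mann-Whitney U test with the parametric t-test it replaces when normality cannot be assumed.

```python
# Illustrative comparison of a parametric test and its non-parametric
# counterpart on small, skewed samples (hypothetical 1-10 ratings).
from scipy.stats import ttest_ind, mannwhitneyu

group_a = [2, 3, 3, 4, 4, 5, 9, 10]   # skewed sample with high outliers
group_b = [5, 6, 6, 7, 7, 8, 8, 9]

# Parametric: the independent-samples t-test assumes roughly normal data.
t_stat, t_p = ttest_ind(group_a, group_b)

# Non-parametric: Mann-Whitney U compares ranks rather than means, so it
# makes no normality assumption and is robust to the 9 and 10 above.
u_stat, u_p = mannwhitneyu(group_a, group_b, alternative="two-sided")

print(f"t-test:       t = {t_stat:.2f}, p = {t_p:.3f}")
print(f"Mann-Whitney: U = {u_stat:.1f}, p = {u_p:.3f}")
```

With small or visibly non-normal samples like these, the rank-based test is the safer default, which is the broad situation the course title's "when your data breaks the rules" describes.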
To learn more and register for these eStudy courses, visit the AEA eStudy page.