Date: Thursday, April 3, 2025
AEA365, please allow me to introduce myself. I’m Julius Najab, an evaluator in the Research & Data Analytics department at the American Association for the Advancement of Science (AAAS). I serve as the internal evaluator for a program that aligns with one of AAAS’s core values: ensuring that everyone is included in the scientific enterprise.
The program I evaluate is SEA Change, which is designed to empower institutions of higher education in the United States to create environments in which science, technology, engineering, mathematics, and medicine (STEMM) faculty and students can thrive in their fields. In this post, I will share a couple of lessons learned and a hot tip from my perspective as an internal evaluator working on a law-attentive, empirically based systems-change program driven to improve the enrollment and retention of university STEMM faculty and students, all while facing challenges from a rapidly evolving legal and public policy environment.
Clarity in role. Clarity in responsibilities. Clarity in capabilities.
As an internal evaluator at a scientific professional organization largely composed of individuals from the physical sciences, I quickly realized that many (though not all) key personnel needed clarity on what program evaluation entails. Additionally, it was important to communicate that my role as an evaluator was not to judge their professional competencies.
Being an internal evaluator also comes with distinct contractual and resource considerations, such as time, survey software, and data analysis tools. It's crucial to clarify, both to yourself and to the team, what you can realistically accomplish with the resources available (e.g., budget and time).
SEA Change is a program designed to help member institutions use data to self-assess their own work and then develop action plans to create an environment where everyone can succeed to the best of their talent and interest. This requires coordination, buy-in, understanding, and data collection across an entire university—a complex challenge in itself. However, external forces that impede or even outlaw this type of work make the mission even more difficult.
Because SEA Change is a law-attentive and member-driven program, flexibility is essential. The program provides educational resources, training, meetings, and counseling to support members in reaching their goals based on their institutional mission. As external conditions shift, the program must adapt—and so must the evaluation process.
Data collection methods and timelines need to be flexible while still maintaining scientific rigor. Similarly, data reporting and analysis must remain adaptable: the SEA Change team relies on empirical data to make informed decisions, and it cannot always wait until the originally planned reporting period when decisions are required now rather than weeks later.
Communication is a two-way street, whether you’re an internal or external evaluator. I benefitted from actively listening and clearly articulating my role, responsibilities, perspectives, and capabilities.
Listening to team members helped me understand their concerns about evaluation and their expectations. At the same time, as the evaluator, it’s essential to proactively communicate what is feasible given the available budget and resources. Regular communication ensured I stayed informed about program developments, allowing me to anticipate evaluation needs rather than being caught off guard. This, in turn, enabled me to conduct professional and rigorous evaluations.
Clarity, flexibility, and communication are essential to a successful evaluation process—regardless of the client’s education, previous evaluation experience, or area of expertise.
*This post was edited with the assistance of the ChatGPT Large Language Model.
The American Evaluation Association is hosting STEM Education and Training TIG Week with our colleagues in the STEM Education and Training Topical Interest Group. The contributions all this week to AEA365 come from our STEM TIG members. Do you have questions, concerns, kudos, or content to extend this AEA365 contribution? Please add them in the comments section for this post on the AEA365 webpage so that we may enrich our community of practice. Would you like to submit an AEA365 Tip? Please send a note of interest to AEA365@eval.org. AEA365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.