TIG Leadership


Stephen Axelrad
Booz Allen Hamilton
Arlington, VA

Stephen Axelrad is an experienced consultant and technical assistance provider on evaluation topics, working through both informal (e.g., mentoring) and formal (e.g., lecture) learning modes. In his role (2012-2017) as Lead Research/Evaluation Analyst building the evaluation and applied research capacity of Department of Defense policy offices, Dr. Axelrad developed and presented multiple white papers, presentations, and consultations (on evaluation topics such as logic modeling, evaluation planning, stakeholder engagement, data analysis, and needs assessment) for military leaders, policy analysts, and program specialists. The capacity-building outcomes of his project's webinars and consultation events have been presented to senior government leaders and at professional conferences (American Evaluation Association).

He is currently the lead research psychologist and evaluation SME on two DoD contracts with Booz Allen Hamilton, performing needs assessment, formative evaluation, implementation evaluation, and performance monitoring services. In this role, he has consulted on and developed tools and frameworks for logic models, stakeholder maps, outcome measurement, and after-action reports for program evaluations and implementation pilot studies. He develops protocols for consulting on and facilitating evaluation practices to advance the whole-population health, resilience, and effectiveness of DoD's civilian and military workforce.

Dr. Axelrad is an interdisciplinary evaluation expert. He has led strategic planning, performance monitoring, and evaluations across diverse sectors such as human resources, training, public health, and cyber. He holds a PhD and a Master's in Industrial-Organizational Psychology. He is a long-time AEA member and has presented on evaluation topics since 2002. From 2002-2004, he taught psychology and research methods courses to adult learners at Harris-Stowe State and Columbia Colleges. He is the current and founding Chair of the Military and Veteran Evaluation Topical Interest Group.

Program Chair

Julianne Rush-Manchester

Silver Spring, MD


Research and Evaluation Director. Nearly 20 years of leadership roles (Principal Investigator, Project Manager, Lead Evaluation Specialist) in non-profits and on federal contracts (DHHS-HRSA and Department of Defense-DoD). Senior Methodologist on a current DoD contract. Secured over $3 million in federal and state funding (contracts and cooperative agreements) from 2012-2014.

Interdisciplinary Evaluation Expert, with a PhD in Quantitative Research, Evaluation and Measurement in Education, and MAs in Educational Policy and Leadership and in Industrial-Organizational Psychology.

Methodologist. Conducted multiple quantitative and qualitative systematic investigations (research, evaluation, quality improvement) across sectors, including health, public health, education, criminal justice, and accounting/finance. Evaluated programs for organizational value (e.g., evaluated eight Ohio schools on school safety dimensions, including psychological and physical safety; developed an outcomes protocol for evaluating uptake of evidence-based practices across 45 Geriatric Education Centers; developed outcomes evaluation protocols for Traumatic Brain Injury and Psychological Health programs in the government/military health system; conducted a Lean Six Sigma process improvement on a Veteran recovery support program to modify government SOPs). Developed several on-line surveys measuring evidence-based practice outcomes, coalition factors, knowledge and skill gain, use of strategies, and needs assessments in the health, public health, and education sectors. Developed and conducted key informant interviews for Chief of Staff Office evaluations (DoD/Defense and Veterans Brain Injury Center) and health services research (Ohio Commission on Minority Health).

Analyst. Skilled in quantitative analysis and psychometrics (instrument development, factor analysis). Has used a range of specialized analytic techniques, including descriptive statistics, item difficulty indices, correlations, multivariate regression, cross-tabulations, factor analysis, principal components analysis, and reliability (inter-item and inter-rater). For example, tested predictive relationships between capacity factors and uptake of evidence-based practices among providers; created items, recruited reviewers, and computed inter-rater reliability coefficients for Alzheimer's Disease curriculum components (WebMD/DHHS-HRSA-BHPr partnership) and for the American Academy of Dermatology.

Author. Published in peer-reviewed journals as single and lead author on needs assessment in the military health system, systems change in healthcare settings following adoption of provider behaviors, and adapted measurement for evolving provider behaviors in interdisciplinary teams.

Trainer/Facilitator. Proficient in adult learning, on-line and in-person. Developed and presented over 60 on-line learning webinars (on evaluation topics such as logic modeling, evaluation planning, stakeholder engagement, data analysis, and needs assessment) for health providers, educators, and other practitioners.

Educator. Taught psychology, education, and business courses (in-classroom and on-line) in two large community college systems and at an on-line university (2001-2016). Developed her own syllabi, curricula, exercises, and assessments.

Past Program Chair

Pat Clifford

Clifford Consulting

Cincinnati, OH


Outreach Manager
Erika Steele
Department of Veterans Affairs Medical


Erika Steele received her PhD in Science Education with a focus on Curriculum and Instruction from the University of Alabama in 2013. As a graduate student in Science Education, Dr. Steele worked with the National Study of Education in Undergraduate Science (NSEUS) to collect and analyze data evaluating the impacts of the National Aeronautics and Space Administration's Opportunities for Visionary Academics (NOVA) professional development program and to measure the short- and long-term impacts of science education reform in the classroom on undergraduate students' abilities to learn and use science. Quantitative and qualitative data were collected from colleges and universities throughout the U.S. to provide a big-picture view of how and why science education helps or hinders student learning. While working with NSEUS, she developed the skills to help educators improve their teaching, educational programs, and curricula, primarily through professional development workshops, so that they can provide participants with the best possible learning experiences.


Erika is currently a Health Professionals Education Evaluation Research Fellow. In this role, she collaborates with senior evaluators at the Center for Program Design & Evaluation at Dartmouth to help faculty at the National Center for Patient Safety improve the learning experiences of participants in the Chief Resident in Quality and Safety (CRQS) program. Her accomplishments include 1) creating a program logic model and evaluation plan, 2) developing a tool to assess quality improvement projects, 3) writing clear learning competencies, learning objectives, and learning activities for the CRQS program, and 4) assisting faculty and staff in developing, implementing, and evaluating quality improvement projects at their VAMC.


Dr. Steele is a novice program evaluator with a background in curriculum and instruction in higher education. She has experience in education evaluation at the K-12, higher education, and graduate medical school levels. In addition to her PhD in Science Education, Erika has a Master's Degree in Cellular and Molecular Biology and has taught biology courses at Stillman College. Her time in the classroom drives her passion for helping others improve their educational programs. Dr. Steele is a new member of AEA and is the current Outreach Manager of the Military and Veteran Evaluation Topical Interest Group.