Conference workshop program – Tuesday 16 September

>>> DOWNLOAD A PRINTABLE VERSION (WITH PRICING)

View Monday conference workshop program.

The following categories will help you select sessions best suited to your interests: Foundation – Intermediate – Advanced

8–9am REGISTRATION
9am–12:30pm WORKSHOP PROGRAM

Full day workshop:
Cultivating evaluation: Exploring an expanded garden of approaches for diverse contexts

Presented by Bianca Montrosse-Moorhead (KEYNOTE), Daniela Schroeter 

FOUNDATION / INTERMEDIATE 

> Details

Full day workshop:
Practical approaches to evaluating impact without experiments: Contribution analysis, process tracing and more

Presented by Patricia Rogers

INTERMEDIATE / ADVANCED 

> Details

Full day workshop:
Evaluation and value for money: An inter-disciplinary, participatory approach using evaluative reasoning and mixed methods

Presented by Julian King

FOUNDATION / INTERMEDIATE 

> Details

Full day workshop:
Cultivating our bubble: Using developmental rubrics to enhance evaluation

Presented by Amy Gullickson, Ben Lawless

FOUNDATION / INTERMEDIATE

> Details

Full day workshop:
Making it stick: Increasing the power of evaluation reports using easy-to-apply principles, tools, tips and tricks

Presented by Samantha Abbato

FOUNDATION / INTERMEDIATE 

> Details

 

Half day workshop:
Strong stories, strong outcomes: Narrative practice and strength-based evaluation

Presented by Donna-Maree Stephens

FOUNDATION

> Details

 
12:30–1:30pm LUNCH
1:30–5pm WORKSHOP PROGRAM

Montrosse-Moorhead, Schroeter workshop continued

Rogers workshop continued

King workshop continued

Gullickson, Lawless workshop continued

Abbato workshop continued

Half day workshop:
Observation methods for evaluators

Presented by Emma Williams

FOUNDATION

> Details


Workshop details


FULL DAY WORKSHOP

Cultivating evaluation: Exploring an expanded garden of approaches for diverse contexts

presented by Bianca Montrosse-Moorhead (KEYNOTE), Daniela Schroeter  |  FOUNDATION / INTERMEDIATE

Workshop description

Evaluation approaches shape the credibility, utility, and relevance of evaluations. Choosing the right approach is complex due to diverse stakeholder needs, power dynamics, and methodological constraints. This workshop introduces the Garden of Evaluation Approaches, a structured framework first published in the American Journal of Evaluation. Participants will learn to select, adapt, and apply evaluation approaches based on context, values, and goals.

By the end of the workshop, participants will:

  • identify key evaluation approaches and their applications
  • use the Garden Framework to make informed methodological choices
  • evaluate the relevance of various approaches in different contexts
  • apply selected approaches to a real-world case study

Participants will explore the foundations of evaluation approaches and the expanded Garden as it has unfolded to support evaluation decision-making. Purposely selected approaches illustrating different worldviews for evaluation will be examined based on their underlying paradigms, preferred methods, and dimensions of values and valuing, as well as the breadth and depth of stakeholder engagement, the role of approaches in promoting social justice, and power dynamics in evaluation decision-making. Possible approaches include realist evaluation (understanding causality and mechanisms in complex systems), outcome harvesting (using backward mapping to track outcomes and contributions), adaptive evaluation (assessing interventions in dynamic and evolving contexts), inclusive systemic evaluation for gender equality, environments and marginalised voices (a gender-responsive and equity-focused approach) and Made in Africa evaluation (a culturally grounded approach integrating local knowledge and perspectives).

Strategies include:

  • small group activities for peer learning
  • a case study to analyse complex challenges
  • facilitated discussions and polls to encourage reflection and engagement
  • hands-on exercises using the Garden framework

This workshop aligns with Domains 1–3 of the AES Evaluators’ Professional Learning Competency Framework.

About the facilitators

Daniela Schroeter (Associate Professor at Western Michigan University) and Bianca Montrosse-Moorhead (aes25 keynote speaker; Professor, University of Connecticut) have collaborated since 2016 to cultivate the garden of evaluation approaches. With over 20 years of experience each, their expertise spans evaluation theory, methodology, capacity-building, and education across multiple sectors. Bianca directs the PEER lab and is Co-Editor-in-Chief of New Directions for Evaluation. Daniela co-edits Teaching & Learning of Evaluation (American Journal of Evaluation, AJE) and is Associate Editor for the Journal of MultiDisciplinary Evaluation (JMDE). Their research and publications have shaped the understanding of evaluation’s distinct role. Recognised for engaging, high-impact sessions, they have presented extensively at major international conferences (AEA, EES, UNEG), earning positive feedback and repeat invitations. Their collaboration continues to expand the evaluation landscape, fostering innovation and intentionality. Their evolving work on the ‘garden’ metaphor has inspired evaluators worldwide, advancing both theory and application.

> back to overview


FULL DAY WORKSHOP  

Practical approaches to evaluating impact without experiments: Contribution analysis, process tracing and more 

presented by Patricia Rogers   |  INTERMEDIATE / ADVANCED     

Workshop description

There is increasing demand for high-quality impact evaluation, but it is not always possible – or appropriate – to use experimental methods (such as randomised controlled trials) or quasi-experimental designs. This workshop will explore rigorous non-experimental approaches to impact evaluation, focusing on contribution analysis and process tracing.

For evaluators, the workshop will strengthen their ability to design and conduct high-quality non-experimental impact evaluations. For evaluation commissioners, it will strengthen their ability to frame appropriate terms of reference, select and manage appropriate evaluation teams and support use of findings. 

By the end of this workshop, participants will:

  • understand why experimental approaches are not always feasible and how non-experimental methods can still provide strong evidence of impact
  • learn the key principles of contribution analysis and process tracing and how they help answer impact questions
  • apply these methods to practical exercises, improving their ability to assess whether and how a program has contributed to observed changes
  • identify how they might apply these approaches in their work

The workshop will employ adult learning strategies including case studies, group exercises and individual reflection.   

This full-day workshop is intended for people with a good understanding of foundational evaluation concepts and of data collection, analysis and reporting options. It addresses Domain 4 of the AES Evaluators’ Professional Learning Competency Framework.

About the facilitator

Patricia Rogers has over 30 years of experience conducting evaluations, teaching evaluation courses and workshops, conducting research into evaluation and strengthening evaluation capacity with government and non-government organisations in Australia and internationally. 

She is the founder of BetterEvaluation, the global knowledge platform on evaluation methods and processes and has an enduring commitment to supporting appropriate choices in evaluation design. She has presented keynote and plenary addresses at evaluation conferences in Australasia, Europe, Asia, Africa and North America and successful evaluation workshops in Australia and internationally including AES pre-conference workshops. She has written on appropriate methods for impact evaluation and ways of addressing complexity in evaluation. Her publications include Purposeful Program Theory: Effective Use of Theories of Change and Logic Models (with Sue Funnell) and Choosing Appropriate Designs and Methods for Impact Evaluation (with ARTD Consulting) for the Department of Industry, Science and Resources. She is a Fellow of the AES, and recipient of the AES Outstanding Contribution to Evaluation Award and the American Evaluation Association's Myrdal Evaluation Practice Award.

> back to overview


FULL DAY WORKSHOP  

Evaluation and value for money: An inter-disciplinary, participatory approach using evaluative reasoning and mixed methods

presented by Julian King   |  FOUNDATION / INTERMEDIATE    

Workshop description

This workshop provides practical guidance, underpinned by sound theory, for evaluating value for money (VfM). It unpacks a process of explicit evaluative reasoning (using rubrics) and the use of mixed methods. A sequence of steps will be shared to help evaluators and commissioners to develop and use context-specific definitions of good VfM. These definitions provide a system for ensuring the evaluation: is aligned with the design and context of the policy or program; engages stakeholders in evaluation design and sense-making; collects and analyses credible evidence; draws sound conclusions; and answers VfM questions. The approach is intuitive to learn and practical to use.

Participants will learn how to: frame an evaluative question about VfM; develop rubrics setting out VfM criteria and standards; combine multiple sources of evidence; incorporate economic evaluation within a VfM framework where feasible and appropriate; interpret the evidence on a transparent basis; and present a clear and robust performance story, guided by the rubrics.

The workshop involves a mix of PowerPoint presentations, group discussions and examples. Participants will receive optional pre-workshop reading, and a post-workshop take-home pack including a copy of the slides, exercises and links to online resources.

This workshop includes a brief overview of economic methods of evaluation (e.g. cost-benefit analysis) including considerations for determining when to use them in a VfM assessment. It doesn’t provide detailed instruction in the design and implementation of economic evaluations. There are courses already on offer that focus on economic methods of evaluation. 

This workshop aligns with Domains 2, 3, 4 and 7 of the AES Evaluators’ Professional Learning Competency Framework.

About the facilitator

Julian King is an NZ-based public policy consultant. His professional background includes over 20 years evaluating policies and programs in high-, middle- and low-income countries. The Value for Investment (VfI) approach, developed through Julian's doctoral research, is used worldwide to evaluate complex and hard-to-measure policies and programs. Julian received the 2021 AES Evaluation System Award in recognition of the contribution made through the VfI approach. Julian is a member of the Kinnect Group, an Associate of Oxford Policy Management, a member of Verian Group's Centre for Value for Money (VfM), an Honorary Fellow at the University of Melbourne and a University Fellow at the Northern Institute. He is a member of the evaluation societies of Australia, NZ, the UK, the USA, and Europe. Julian specialises in VfI capability-building. He regularly provides training workshops on Evaluation and Value for Money in Australia, NZ and globally.

> back to overview


FULL DAY WORKSHOP  

Cultivating our bubble: Using developmental rubrics to enhance evaluation

presented by Amy Gullickson, Ben Lawless   |  FOUNDATION / INTERMEDIATE 

Workshop description

This full-day workshop brings together best practice in rubrics with best practice in evaluative reasoning: developmental rubrics. A powerful tool for qualitative, growth-oriented program evaluation, developmental rubrics offer deeper insights for monitoring, evaluation, and learning at any point in a program’s lifecycle by replacing traditional static ratings with nuanced performance descriptors. Workshop participants will learn the basics, practise on examples, and create developmental rubrics for their own projects.

Participants will learn to: 

  • distinguish between developmental and traditional rubrics
  • design rubrics that capture stages of progress with clear criteria and descriptive indicators
  • apply these rubrics to real-world program and systems evaluation contexts
  • avoid common pitfalls such as pseudo-counting and overly vague language

Drawing on educational research and established frameworks, the workshop emphasises:

  • structuring effective rubrics with progressive, observable quality markers
  • integrating rigorous evaluative reasoning into rubric design
  • providing meaningful feedback for decision-making and stakeholder engagement

This interactive session uses hands-on activities, small-group discussions, and case studies to guide participants in creating or refining their own developmental rubrics. Templates, sample rubrics, and follow-up materials will be provided to enable immediate application.

The workshop is designed for evaluators, commissioners, and policymakers with foundational to intermediate experience in evaluation; no prior rubric-writing expertise is required. Familiarity with basic evaluation concepts is helpful but not mandatory.

Aligned primarily with Domain 3 (Theoretical Foundations) of the AES Evaluators’ Professional Learning Competency Framework, the workshop also addresses evaluative reasoning and stakeholder engagement domains. It fits both the ‘New Approaches and Ways of Thinking’ and ‘Foundational Evaluation Skills and Capabilities’ categories.

Through practical exercises and expert facilitation, participants will leave with the skills to implement robust developmental rubrics in their evaluation work.

About the facilitators

Associate Professor Amy Gullickson is an AES Fellow who has been doing, researching, and teaching evaluation for more than 20 years. She is in demand internationally as an expert in evaluation-specific methodology, evaluation teaching and learning, and evaluation capacity building for individuals, organisations, and systems. As a member of the AES Pathways committee, she co-leads the ongoing work on evaluator competencies and self-assessment.

Ben Lawless is an educator and researcher at a large P-12 college in northwest Melbourne. He writes textbooks, lectures in assessment at the University of Melbourne, has worked with the VCAA to improve their assessment practices, has written units of work for the National Museum, and has presented widely on developmental learning, assessment, well-written rubrics, and using assessment data to target teaching. He has won a number of awards, including Hume's Graduate Teacher of the Year, and was a finalist for ResourceSmart Teacher of the Year. He has created a number of international political simulation games that put students in the role of world leaders to solve political, environmental and social challenges, one of which was in Australia's Top 10 for the HundrED prize. His personal education passions are using evidence to improve teaching, and learning through games. He is currently developing a PhD proposal, with the thesis "Bringing developmental education rubrics into the field of evaluation".

> back to overview


FULL DAY WORKSHOP  

Making it stick: Increasing the power of evaluation reports using easy-to-apply principles, tools, tips and tricks 

presented by Samantha Abbato   |  FOUNDATION / INTERMEDIATE 

Workshop description

Communicating evaluation findings through effective reporting matters to all of us who commission, plan and do evaluations. This workshop builds on the AES online Making it Stick 1–3 series to introduce practical activities for strengthening participants’ evaluation reporting across the suite of report types, including written reports, infographics, data dashboards, video reports and PowerPoint presentations.

Drawing on the principles of effective communication and evaluation use, this workshop incorporates activities and techniques from various disciplines. Throughout the day, participants can apply the learnings to real-life examples, including the opportunity to volunteer their own reporting projects for small-group work.

Activities include: 

  • crafting the report message using the Rule of Three for a three-part and three-minute story
  • increasing the effectiveness of communication with report audiences using visual empathy mapping
  • use of storyboarding and story-arc tools to effectively ‘show’ the report story through images, drawing and data visualisation, and structure your story for greater impact
  • script-writing techniques for decluttering and creating powerful language, delivering it directly using video, and increasing the report audience’s engagement with written text

Workshop participants will have the chance to use cost-effective digital kits to present their report message on video. Easy-to-use tools, templates and checklists will enable participants to directly apply the workshop learnings to their own reporting projects. 

Workshop learning outcomes include:

  • the ability to use empathy mapping to communicate with evaluation stakeholders effectively
  • skills in crafting and delivering a potent core message
  • practice using storytelling tools to structure the report, create powerful language, and ‘show’ the message through visuals

This workshop is an ideal introduction for those who write or commission reports. It also provides an opportunity for participants in the Making it Stick 1–3 online workshops (2020–2025) to further their reporting skills through new materials and tools that are best delivered face-to-face.

This workshop aligns with Domains 1, 3, 5 and 6 of the AES Evaluators’ Professional Learning Competency Framework.

About the facilitator

Samantha Abbato has completed more than 100 evaluation and research reports and papers for government and non-government organisations and community stakeholders. She has published numerous book chapters and peer-reviewed journal articles and has worked as a freelance journalist for several years. She received the 2015 AES Evaluation Publication Award (Caulley Tulloch Award).

With a passion for communication and maximising evaluation use, and an extensive understanding of the evaluation commissioner perspective through her Visual Insights organisational capacity-building work, Samantha can offer a wealth of case studies of the good, the bad and the ugly of evaluation reporting. As the director of Visual Insights People since 2013, she has introduced a ‘pictures, stories and play’ approach to evaluation reporting and the delivery of workshops. Using cartoons and a range of visual tools, templates and checklists that workshop participants can take away with them, Sam is passionate about participants’ continued use of learnings for sustained improvement of their evaluation reports.

> back to overview


HALF DAY (MORNING) WORKSHOP  

Strong stories, strong outcomes: Narrative practice and strength-based evaluation

presented by Donna-Maree Stephens  |  FOUNDATION 

Workshop description

This interactive workshop will focus on applying Aboriginal and Torres Strait Islander narrative practice methods.

The workshop will cover:

  • defining Aboriginal and Torres Strait Islander narrative practice and storytelling
  • sharing a narrative framework for understanding narrative practice and processes in Aboriginal and Torres Strait Islander communities, particularly drawing from remote NT contexts
  • practising drawing strength-focused messages and metaphor using narrative inquiry

As an interactive half-day session, this workshop is open to all levels of evaluators but may suit early career and First Nations evaluators best.

This workshop aligns with Domains 1, 2 and 3 of the AES Evaluators’ Professional Learning Competency Framework. It is a valuable opportunity for evaluators, especially those early in their careers or from First Nations backgrounds, to enhance their culturally responsive and narrative-based evaluation skills.

About the facilitator

Donna-Maree Stephens is a Muran/Iwaidja woman whose family come from northwest Arnhem Land. She grew up living on Larrakia Country in the Northern Territory. In 1992, Donna began working as a teacher at Belyuen School on the Cox Peninsula. Her early work focused on teacher professional learning, including literacy pedagogy and e-learning in remote settings. In 2010, Donna moved to higher education as a lecturer for the CDU School of Education, during which time she focused on First Nations and international student support, working in the Growing Our Own First Nations remote preservice teacher program and as the NT Teaching Schools Coordinator. In 2014, Donna moved to a research-focused role addressing education pathways and remote community development.

Donna has worked in research and evaluation roles focused on AOD, mental health, social and emotional wellbeing, and workforce development. In 2019, Donna started work as a consultant and has had the honour of being the inaugural Community First Development Research and Evaluation Fellow. She also contributed to the BetterEvaluation Aboriginal and Torres Strait Islander Evaluation Protocol. Donna has more than a decade of experience in research and evaluation project management and administration, reporting, publications and presentations at local, national and international forums. Her research and evaluation interests include remoteness, poverty, mobility, flourishing, organisational change, ecosystems thinking, generative design and narrative inquiry.

> back to overview


HALF DAY (AFTERNOON) WORKSHOP  

Observation methods for evaluators

presented by Emma Williams  |  FOUNDATION

Workshop description

This half-day workshop will build participants’ awareness of the potential of observational data to answer evaluation questions and build skills in rigorous, ethical observational techniques. There are many variants of observational data; it may be quantitative or qualitative, gained through trace, overt or covert observation in controlled or natural situations, and recorded digitally or manually. 

This workshop will explain how each type of observational data suits a different range of evaluation questions and situations; it will focus on data gained through the observation of human behaviour in public places, which is typically of most use to evaluators. Ethical issues, particularly where evaluations include vulnerable populations, will be a special focus of the workshop.

By the end of the session, participants will be able to:

  • distinguish between different types of observational data
  • identify the evaluation situations for which each type of observational data is suited
  • implement strategies to ensure rigour in observational data collection, recording, and analysis
  • formulate strategies to ensure their observations are ethically designed and collected

Five real-life cases will be used to illustrate the range of approaches observational research offers and the issues it can raise, with interactive exercises conducted after each case.

No prerequisites are required; this is a foundational skills workshop targeted to those without a great deal of experience in evaluation, although it may be of use to those with intermediate and even advanced skills in other areas of evaluation, if they do not have observational experience. 

The workshop focuses on the Research Methods and Systematic Inquiry domain of the AES Evaluators’ Professional Learning Competency Framework, although elements of it, particularly the ethical exercises, address the Culture, Stakeholders and Context domain, and all of it is relevant to the Evaluation Activities domain.

About the facilitator

Emma Williams is a Credentialed Evaluator with experience in realist, observational and participatory evaluations on topics including throughcare, family violence, service access, employment, environmental issues, and international development. She was co-editor, with Ana Manzano, of Realist Evaluation: Principles and Practice (2024), and her chapter on observational techniques will come out in Sage’s Data and Research Literacy collection mid-year. Emma is currently undertaking a realist investigation of evaluation ethics for a PhD that also uses Q methodology, and would love to talk about evaluation ethics with anyone at the conference.

> back to overview

Australian Evaluation Society
425 Smith Street
Fitzroy Vic 3065 Australia
Phone +61 3 8685 9906

© Copyright 2024–2025 Australian Evaluation Society Limited. ABN 13 886 280 969 

We acknowledge the Australian Aboriginal and Torres Strait Islander peoples of this nation. We acknowledge the Traditional Custodians of the lands in which we conduct our 2025 conference, the Ngunnawal and Ngambri peoples. We pay our respects to the ancestors and Elders, past and present, of all Australia’s Indigenous peoples. We are committed to honouring Australian Aboriginal and Torres Strait Islander peoples’ unique cultural and spiritual relationships to the land, waters and seas and their rich contribution to society.

Conference logo design: Keisha Leon, Cause/Affect  | Site design: Ingrid Ciotti, Studio 673