Our work

Complexity and Systems Thinking

Pragmatica draws on a range of complexity theory approaches in its work. Collaborating with others at conferences offers opportunities to share new ideas.

This section captures some recent work that helped surface new thinking on complexity theory and systems thinking.

Contracting public health and social services: Insights from complexity theory for Aotearoa New Zealand

Public health and social services are often hard to specify, complex to deliver and challenging to measure. This research uses a complexity theory-informed lens to explore the challenges and opportunities of contracting out for public health and social services in Aotearoa New Zealand. The qualitative study explores the implications of complexity concepts with ten public sector managers experienced in contracting out for these services.

Recently published findings show that public sector managers are experimenting with different ways of contracting out, yet the underlying New Public Management ethos applied in many administrative arms of government can hamper these initiatives. There is a growing impetus to find alternative approaches that contract out more effectively. An alternative, complexity theory-informed framing highlights where changes to the organisation and practice of contracting out may support more effective service provision. This research also provides insights into why achieving change is hard.

2019 American Evaluation Association (AEA) Outstanding Evaluation Award Winners

Waikato Regional Council invested in a Developmental Evaluation as part of a collaborative stakeholder process to agree on ways of managing water quality and use for diverse populations. This high-stakes evaluation was a trailblazing, three-year journey working collaboratively with multiple stakeholders in a highly contested setting. Significantly for the evaluation field, this project demonstrates high-quality, effective Developmental Evaluation using an evaluation-specific methodology.

The evaluation team working on the project included Kate McKegg, Judy Oakden, Debbie Goodwin and Jacqui Henry.

“For Waikato Regional Council, winning the award affirms that our investment in evaluation for the collaborative stakeholder process was absolutely the right thing to do. …The evaluation award demonstrates to the organization, alongside River Iwi (tribes) and all stakeholders involved in the policy process, the value to be gained from evaluation as a key component of any project. We are thrilled to represent the academic and research expertise that is available in Aotearoa/New Zealand and stand proudly on the world stage and be acknowledged in this way.”

Evaluating in uncertainty: The curse of the wicked problem

Predictability and certainty are hallmarks of traditional evaluation, especially at the program level. We used to believe that we should be able to predict, produce, and evaluate outcomes for all funded programs. Today, exceptions to this rule of predict-and-control are all too familiar. Nonprofit and philanthropic sectors tackle increasingly complex, systemic challenges, where predictability is limited.

Many available options aim to support credible evaluation in complex contexts, including developmental evaluation, outcomes harvesting, and contribution analysis. These approaches are effective, but they may require clients to rethink their evaluation approach, and not all clients are ready to do so.

In this session, we offered an alternative. We suggested that when evaluators understand the sources of uncertainty in a complex system, they can adapt their evaluation approaches to be more sensitive to complex dynamics, without introducing an entirely new or radical evaluation strategy. During the session, we:

  • introduced five sources of uncertainty in complex environments
  • explored, with participants, situations where each source might affect the evaluation
  • identified evaluation adaptations to accommodate uncertainty as it arose

Glenda Eoyang and Judy Oakden presented this session at the American Evaluation Association Conference in Minneapolis.

Evaluation in complex situations

Drawing on examples from the Sustainable Farming Fund Evaluation, this presentation offered practical tips for building usability into an evaluation from the start. The presenters showed how they built use into the evaluation at all stages of the process. This included:

  • the scoping, relationship building and contracting stages
  • the way external evaluators engaged with commissioners to design a multi-purpose evaluation approach
  • the stance taken to project management and client engagement
  • planning communication from the outset, which ensured findings were shared with stakeholders.

Judy Oakden (external evaluator) and Clare Bear (commissioner) presented this paper at the ANZEA Conference in Wellington.

Evaluation

Evaluation Rubrics

Pragmatica is well known for using rubrics in evaluation, having used them for over a decade. Judy regularly mentors others in their use. This section includes many of the key publications and presentations developed over time, in collaboration with others.

Use of evaluation rubrics

Understanding the components of evaluative rubrics – new thoughts

In this e-book, Judy Oakden explores the different ways evaluative rubrics can be constructed from three basic components:

  • key aspects of performance
  • levels of performance
  • importance of each aspect of performance

Here she shows some alternative ways she has combined the components in her own practice. She discusses the benefits and challenges of each approach.
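
As a concrete illustration of how the three components fit together, here is a minimal, hypothetical sketch in Python. The aspects, levels and importance ratings are invented for this example and are not drawn from the e-book; in practice the synthesis to an overall judgement is evaluative rather than mechanical, so the sketch shows only the structure.

  # A minimal, hypothetical sketch of an evaluative rubric's structure.
  # The aspects, levels and importance ratings below are invented for
  # illustration and are not taken from the e-book.

  # Levels of performance, shared across all aspects in this simple layout.
  levels = ["poor", "adequate", "good", "excellent"]

  # Key aspects of performance, each tagged with an importance rating.
  aspects = {
      "Reach of the programme": "high",
      "Quality of delivery": "high",
      "Stakeholder engagement": "medium",
  }

  # An evaluator assigns each aspect a level based on the evidence;
  # more important aspects carry more weight in the overall judgement.
  ratings = {
      "Reach of the programme": "good",
      "Quality of delivery": "excellent",
      "Stakeholder engagement": "adequate",
  }

  # Check each rating is one of the defined levels of performance.
  assert all(rating in levels for rating in ratings.values())

  for aspect, importance in aspects.items():
      print(f"{aspect} ({importance} importance): rated {ratings[aspect]}")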

Evaluation building blocks: a guide

With colleagues from the Kinnect Group, we published an e-book on the use of rubrics in evaluation. This free downloadable guide provides a simple nine-step process for planning and undertaking an evaluation. The ideas in this book were developed over ten years of collaboration.

Release Announcement: Evaluation Building Blocks – A Guide

This e-book was recommended reading by Better Evaluation, a website that collects best evaluation practice from around the globe and aims to build “evaluation practice, evaluation capacity strengthening and research and development in evaluation”.

Better Evaluation reviewer Patricia Rogers commented:
[The guide] is particularly strong [for developing] a framework to assess performance. [It]… has detailed examples of using global scales (rubrics) to synthesise both quantitative and qualitative data... [It helps] to avoid the common problems caused by relying only on Key Performance Indicators and targets.... I’d especially recommend it in terms of developing a framework for evaluating performance.

Chapter on evaluation in "Social science research in New Zealand: An introduction"

“Chapter 13: Evaluation” was contributed to the textbook “Social science research in New Zealand: An introduction”, edited by Martin Tolich and Carl Davidson and published by Auckland University Press.

Social science research in New Zealand: An introduction (2018) is a textbook positioned as “the definitive introduction for students and practitioners undertaking social research in New Zealand”. Judy Oakden and Julian King contributed a chapter on evaluation which described the approach used for the Evaluation of the Sustainable Farming Fund. An important part of this chapter was a description of the use of rubrics throughout the project.

The editors observed at the start of this chapter:

“This is an important chapter because evaluation is a discipline that is often misunderstood both among other researchers and the users of research outputs. Evaluation differs from other research approaches because it focusses on understanding the value of something. … Much of what researchers think of as ‘leading-edge’ practices have come from evaluation, and the authors capture much of that excitement and innovation here”.


Evaluation rubrics: how to ensure transparent and clear assessment that respects diverse lines of evidence

This practice example is part of Judy Oakden’s early writing on how evaluators can use rubrics to make evaluative judgements. It details how rubrics support robust data collection and frame the analysis and reporting. Judy wrote this account as part of the first Better Evaluation writeshop, led by Irene Guijt.

Rubrics: A Method for Surfacing Values and Improving the Credibility of Evaluation – Journal of MultiDisciplinary Evaluation

This is a practice-based article by the Kinnect Group members (Julian King, Kate McKegg, Judy Oakden, Nan Wehipeihana). It shares the Group’s learning on using evaluative rubrics to surface values and improve the credibility of evaluation. Evaluative rubrics enable values to be handled more transparently. In the Group’s experience, when evaluators and evaluation stakeholders get clearer about values, evaluative judgments become more credible and warrantable.

Providing mentoring support to develop rubrics

To develop and use rubrics on a large, complex evaluation of the Food Systems Innovation (FSI) Initiative, team members sought mentoring from Pragmatica. This practice note, by Samantha Stone-Jovicich of CSIRO, describes the steps she went through and the challenges she overcame to develop rubrics for the FSI Initiative.

Samantha reflected afterwards:

[This was] a very large, complex project where we were trying to design a monitoring, evaluation and learning (MEL) system to capture the dynamics of innovation and its impacts. Judy helped us adapt and tailor a rubrics approach. Her abundance of knowledge and experience, coupled with her collegiality, collaborative nature, and flexibility and creativity, were instrumental to supporting us incorporating a useful and fit-for-purpose rubrics approach into our MEL system.

Evaluation Projects

Here is a selection of published evaluation projects completed by Pragmatica, in collaboration with others.

Envirolink Evaluation

The Ministry of Business, Innovation and Employment (MBIE) commissioned us to review the Envirolink scheme and advise on:

  • how well the scheme operates, and
  • whether it achieves its intended outcomes and provides value for money.

This was the first review of the scheme since it commenced in 2005.

The results of the review were positive, and we rated the scheme very good overall. The review also provided evidence that the scheme makes a valuable contribution by supporting eligible councils to engage with and use environmental science, and by supporting uptake among users. The report’s recommendations will inform future funding decisions relating to Envirolink.

Regional Growth Programme Evaluation

The Regional Growth Programme (RGP) Evaluation was commissioned by MBIE and the Ministry for Primary Industries (MPI). It assessed whether the RGP worked as intended, reviewed aspects of the programme’s implementation, systems and processes, and assessed the value of the results so far.

This evaluation focused on ways government agencies worked with each other. It also considered ways central agencies liaised with the regional stakeholders and with Māori in the regions. It was not an evaluation of regional economic progress. Lessons learned at central, regional and project levels may inform future regional initiatives.

The evaluation team included Judy Oakden, Kataraina Pipi, Kellie Spee, Michelle Moss, Roxanne Smith and Julian King.

There are four documents linked to this evaluation:

  • The Regional Growth Programme Evaluation Report (103 pages) – full evaluation findings and methodology
  • Three infographics

Sustainable Farming Fund Evaluation

The Evaluation of the Sustainable Farming Fund (SFF) provided an independent, formal assessment confirming the SFF’s value. It also provided information to help ensure the SFF is well-positioned for the future by aligning more closely with Government objectives.

Judy Oakden led the evaluation with team members Julian King and Dr Will Allen.

Three Case Studies: a companion document to the SFF Evaluation (46 pages):

  • Protecting the sustainability of New Zealand vineyards
  • Top of the South: Setting an example for sustainable water quality
  • Sustainable development and podocarp restoration on Tuawhenua lands.

Evaluation of stakeholder perceptions of the implementation of the Waste Minimisation Act

The evaluation of stakeholder perceptions of the implementation of the Waste Minimisation Act looked into waste stakeholders’ perceptions of the Act’s early implementation phase and its short-term outcomes (2009‒2010). This baseline evaluation was undertaken by the Ministry for the Environment with consultants Judy Oakden and Kate McKegg of the Kinnect Group in late 2010. The evaluation used a mix of focus groups, key informant interviews and an online survey of 325 stakeholders.

Research projects

Here is a selection of research projects that have been published by clients.

Rangiātea: case studies and exemplars / Māori education success

Rangiātea is a major research project that draws together a rich array of examples from five high-performing, mainstream secondary schools. The research and case exemplars explain practical ways to raise Māori student achievement, show how to build effective relationships with whānau, and explain how to establish sound, positive leadership in secondary schools in Aotearoa New Zealand. Judy Oakden, Nan Wehipeihana, Kellie Spee and Kataraina Pipi undertook this project.

The five case studies and exemplars can be accessed via the Ministry of Education portal to Rangiātea.

Leadership Practices Supporting Pasifika Student Success

Leadership Practices Supporting Pasifika Student Success is an important research project containing case studies and exemplars from three secondary schools with high numbers of Pacific students: McAuley High School, De La Salle College and Otahuhu College. The case studies show how effective school leadership in mainstream secondary schools supports Pacific student achievement and success. Judy Oakden, Kellie Spee, Dr Ruth Toumu’a, Pale Sauni and Clark Tuagalu undertook this project.

The links to each of the case studies and exemplars from the Ministry of Education Te Kete Ipurangi (TKI) portal can be found here:

McAuley High School Case Study (PDF, 1 MB)
McAuley High School Exemplar (PDF, 289 kB)
De La Salle College Case Study (PDF, 1 MB)
De La Salle College Exemplar (PDF, 420 kB)
Otahuhu College Case Study (PDF, 1 MB)
Otahuhu College Exemplar (PDF, 581 kB)


Wellington’s knowledge economy — coming to grips with technology change

In response to concerns about the state of the economy and employment opportunities in the greater Wellington region, this research drew on the experience of 113 knowledge economy businesses. Judy Oakden had a key role in designing the process that supported Bachelor of Commerce students from Victoria University to conduct the survey between July and September 2013. From this work, Dr Richard Norman of Victoria Business School and Judy Oakden prepared a report of findings for the Wellington Regional Council.

Sensemaking

Sensemaking is a key part of Pragmatica’s process in undertaking evaluation and consulting assignments. Pragmatica uses tools such as rich pictures, pattern spotting and strategic foresight to make sense in uncertain and unpredictable settings.

Rich pictures: The use of Soft Systems Methodology in evaluation

Rich pictures are a Soft Systems Methodology tool that offers a quick and efficient way to work with key stakeholders to better understand their ‘problematical situation’ (Checkland & Poulter, 2006). In a demonstration session called ‘If a picture paints a thousand words: the use of rich pictures in evaluation’, Judy Oakden offered participants a chance to try using rich pictures for themselves.

This session was run at the American Evaluation Association International Conference in Denver.

Two webinars on the use of rich pictures in evaluation

Judy produced two follow-up webinars on how to use rich pictures in evaluation:

  • University of Colorado 2015 webinar series, Practical Application of Systems to Conduct Evaluation: Cases and Examples – Soft Systems Methodology: The Use of Rich Pictures in Evaluation. Visit the webinar here.
  • American Evaluation Association – Rich pictures: using an effective Soft Systems Methodology tool in evaluation (available to AEA members).

Pattern spotting as a sensemaking tool in evaluation

Sensemaking is essential in evaluation design: it promotes deeper stakeholder engagement and can lead to better insights and greater evaluation use. This paper discussed how to design a collective sensemaking process as part of M&E practice and considered ways to navigate the values and needs of different stakeholders. We argued that sensemaking can make evaluation more useful. The presentation addressed five questions:

  • What is ‘collective sensemaking’ in M&E?
  • What forms can sensemaking take?
  • What are the conditions for successful collective sensemaking?
  • What role can sensemaking play to responsibly navigate the values, needs and understandings of stakeholders?
  • Why is collective sensemaking not more prevalent? How can we strengthen this part of M&E practice?

Notable innovation has occurred in qualitative data collection methods and in analytical procedures for quantitative reasoning in M&E. However, innovation in the analytical processes for mixed methods in M&E appears to lag behind. This paper suggested ways forward.

Irene Guijt and Judy Oakden presented this paper at the European Evaluation Society Conference.

Strategic foresight as a tool for envisioning evaluation in the future

I wasn’t expecting THAT! Imagining future scenarios – and the implications for evaluation

Change is upon us – rapid, multi-dimensional change. In evaluation, we often hear people say “what a surprise – I wasn’t expecting that”. Buried in the tyranny of the day-to-day, we have few opportunities to consider the bigger, longer-term picture.

At the AEA conference in Minneapolis we ran a demonstration session where participants explored four possible futures, set in 2051, that were highly relevant to evaluation practice. Drawing on a six-month futuring exercise, the session provided insights and a new way of framing the future of evaluation. We thought about a wide range of issues on the way – from climate change to population pressures, changing migration patterns, growing food and water insecurity, and the impact of accelerating technology.

During the session we shared possible scenarios for evaluation in 2051, considered evaluation practice in these futures, and encouraged participants to explore the options in more depth for themselves.

Judy Oakden ran the session along with Human Systems Dynamics founder Glenda Eoyang and fellow Associates Royce Holladay, Wendy Morris, Stewart Mennin and Claudette Webster.

Archive

Our Archive contains a bibliography of:

  • Peer-reviewed articles and publications
  • Conference presentations
  • Recent published research and evaluation reports which have been made publicly available