Evaluation report for a fair chance for all inquiry

Executive summary

This report offers an independent evaluation of the recent Productivity Commission (the Commission) Inquiry: a fair chance for all – Breaking the cycle of persistent disadvantage. This evaluation provides the Commission and its stakeholders with an independent view on where this inquiry performed well, and where there is room for improvement in future inquiries. This evaluation is the first time the Commission has brought together the traditional evaluation components (expert evaluation, focus groups, survey) into one combined report.1

The approach taken in this evaluation grounds the findings in the purpose and function of the Commission. This provides context to the findings, acknowledging that public sector entities operate in a wider environment where formal and informal guidance and expectations change over time, and may not always align.

Evaluation findings point to an ambitious inquiry into a complex topic. The breadth and depth of research that was commissioned was celebrated, particularly given the novelty of the topic for the Commission. The combined findings in the Final Report create a valuable reference document to inform policy making and social change, covering the drivers behind persistent disadvantage and the public sector mechanisms that can be considered for reducing it. The breadth of engagement was valued highly, including amongst Māori and Pasifika stakeholders. The community sector in particular is already making use of the Final Report and its findings.

Findings identify improvements that could be made around process management and in communicating more clearly the trade-offs inherent in decisions around engagement and focus areas. Some sectors, such as economic and fiscal policy agencies, are not currently using the report as much as the not-for-profit sector. This may point to differences in frames of validity and/or values, an issue that could be considered as part of the Commission’s future work.

The wide approach to engagement, particularly at the beginning of the Inquiry, was valued by many stakeholders, and seen as a particularly effective way to work in an Inquiry that dealt with people who are persistently left out or do without. The new initiatives trialled by the Commission were overwhelmingly supported and considered valuable.

There may be a greater call to action with a report that deals with such topics, given the troubling fact that persistent disadvantage in Aotearoa New Zealand remains. There are a number of ways the Commission could navigate this space, depending on its mandate, its strategic priorities and how it plans to give effect to and continuously improve on its function and purpose.

Recommendations span a range of measures across areas of performance, some small and some large, for the Commission to consider. This report is written in a practical way, to make these findings as useful and as actionable as possible. The report is written by the Evaluation Project Director, Dr Ruth Fischer-Smith.

1. Note the Commission issued the online survey and will publish these results in full, separately from this evaluation.

1. Purpose and intent

1.1 Approach to this evaluation

The New Zealand Productivity Commission (the Commission) commissioned an independent evaluation of the Fair Chance for All Inquiry (the inquiry), which was conducted from June 2021, when scoping of the Terms of Reference began, to July 2023, when the Final Report and quantitative analysis were released. Evaluations are required for all Productivity Commission inquiries, as outlined in the Statement of Performance Expectations.2 Other recent evaluations of inquiries are available online.3

The scope of this evaluation included:

  • A review of the Final Report and an assessment of the key themes within supplementary reports;4
  • Interviews with Commissioners, Inquiry Directors and research consultants or contractors who contributed to the inquiry;
  • Two focus groups with a target of 12-20 participants (actual participation, at 17, was within range);
  • Incorporation of additional data sources, including:
    • the online survey on the inquiry’s performance against the performance measures set out in the 2022-23 Statement of Performance Expectations;
    • the submissions summary; and
    • engagement and media reporting;
  • Feedback on the overall performance of the inquiry against the Commission’s six performance measures;
  • Comment, as appropriate, on the Commission’s three impact measures.

Outside the scope of this evaluation were:

  • Consideration of the Commission’s impact indicators beyond an overview;
  • Consideration of supplemental reports (Interim and Quantitative) beyond an overview; and
  • Additional interviews with past/present Commissioners or with further research contractors beyond those named within scope.

The evaluation of this inquiry is the first time that the findings from all these components have been brought together into one report.5 Previous inquiry evaluations have delivered the review, focus group and online survey components separately. The intent of commissioning the evaluation in this way was to enable greater triangulation and synthesis of the findings across the various data sources, with the view to eliciting richer commentary and more robust and usable recommendations for future quality improvement.

This evaluation is intended to be readable and user-friendly. In addition to reviewing the inquiry according to the Commission’s performance measures, this report provides additional insights around the role of the Commission within the public sector and society more broadly. Both the key findings and these insights are intended to support the Commission in its impact, outcomes and continuous improvement, for future inquiries and for other Commission activities.

2. Statement of performance expectations 2023-24 (productivity.govt.nz). This document also provides reference to the Commission’s performance measures and impact indicators.

3. https://www.productivity.govt.nz/inquiries/immigration-settings/evaluation/ and https://www.productivity.govt.nz/inquiries/frontier-firms/evaluation/

4. The reports and materials considered included the Terms of Reference, Interim Report, Final Report, Quantitative Report, consultation and feedback reports on the Terms of Reference, and a range of research reports commissioned by the Inquiry.

5. The Commission delivered the survey independently of this evaluation and the results were shared and incorporated into this report.

1.2 Inquiry Terms of Reference

The inquiry Terms of Reference included:

  • new insights about the dynamics and drivers of persistent disadvantage;
  • recommendations for actions and system changes to break or mitigate the cycle of disadvantage; and
  • help to raise public awareness and understanding of trends in economic inclusion and social mobility in New Zealand.

Further guidance was provided around methods and approaches that the Commission should use in informing Inquiry recommendations. The referring Ministers for the Inquiry were the Minister of Finance, Minister for Child Poverty Reduction, Minister for Social Development and Employment, Minister of Revenue, Minister for Māori Development, and Minister for Pacific Peoples. The full Terms of Reference and associated guidance can be found online.6

This inquiry used some new approaches for the work, which are discussed in detail within the 'new initiatives' section of this report. These approaches included:

  • Public consultation on shaping the Terms of Reference;7
  • Actively collaborating with particular groups using policy workshops;8
  • Using wānanga and talanoa sessions to gather evidence;
  • A comprehensive analysis and published summary of submissions to the Interim Report; and
  • Taking a systems approach, instead of focusing on specific policy areas (e.g. housing), and using systems-thinking tools and methods.

6. https://www.productivity.govt.nz/assets/Documents/ToR-EISM-inquiry.PDF

7. This approach was taken by the Minister of Finance and agreed by Cabinet https://www.treasury.govt.nz/sites/default/files/2022-06/cab-paper-swc-21-sub-0218.pdf

8. This approach may have been used previously, but this Inquiry used it particularly heavily.

1.3 Purpose and function of the Commission

The purpose and function of the Commission are laid out in the New Zealand Productivity Commission Act 2010 as:

The principal purpose of the Commission is to provide advice to the Government on improving productivity in a way that is directed to supporting the overall well-being of New Zealanders, having regard to a wide range of communities of interest and population groups in New Zealand society.

The functions of the Commission are:

  1. on referral to the Commission by the responsible Minister in conjunction with the relevant portfolio Ministers (collectively, the referring Ministers), to hold inquiries and report to the referring Ministers about productivity-related matters; and
  2. on its own initiative, to—
    • undertake and publish research about productivity-related matters; and
    • promote public understanding of productivity-related matters.9

The Treasury is the policy agency for the Commission and the current responsible Minister is the Minister of Finance. The Commission’s functions and purpose are also governed by the Crown Entities Act 2004.

The role, purpose, and value of the Commission were most recently considered through a comparative analysis of international productivity institutions and subsequent advice from the Treasury to the Minister of Finance.10 The Evaluation Project Director is not aware of any other reviews of the Commission’s legislation, role or purpose.

2. Process and methodology

The primary evaluation frame comes from the Commission’s six performance measures. They are:

  • Good Process Management
  • Effective Engagement
  • Right Focus
  • High Quality Work
  • Clear Delivery of Messages
  • Overall Quality

Additional inputs for this evaluation include:

  • Feedback on new initiatives used in this Inquiry;
  • The quality and depth of drawing on te ao Māori and Pasifika approaches, as specified in the Terms of Reference;
  • Triangulation with further data sources concerning the Inquiry; and
  • Practical insights to contextualise the Inquiry performance within wider public sector and social trends.

References are cited in footnotes throughout this report. All sources are available online and links are provided. A standalone reference section is not provided, in an effort to reduce report length.

2.1 Methodology for qualitative data

The qualitative data that informs this report is primary data collected by the Evaluation Project Director. It consists of nine interviews and two focus groups, with a total of 17 participants across the two groups. Total participation for qualitative inputs was therefore 26 people.

Within the scope outlined in section 1.1, participants for interviews and focus groups were chosen to ensure coverage across a range of domains, specifically across:

  • Topics or content knowledge;
  • Types of data use and familiarity; and
  • Sectors.

A list of participating organisations for the interviews and focus groups, as well as the questions used to facilitate these sessions, is included within the appendices to this report.

2.2 Focus groups

Focus groups were designed to capture different types of conversations. The first group focused on academics/subject experts and community sector representatives. The second group focused on public sector professionals. The themes from the focus groups are included as part of the wider qualitative data throughout this report.

Participants of both focus groups reported value in their attendance. In addition to the opportunity to share their experience of contributing to the inquiry, participants also took the opportunity to connect directly with one another, in order to further their work towards reducing persistent disadvantage. This was a clear example of a secondary benefit from the focus groups, beyond evaluation data collection.

2.3 Further data sources

2.3.1 Online survey findings
There were 146 respondents to the online survey out of 1,231 invitations.11 Survey respondents had the following characteristics:

  • 77% were involved directly with the Inquiry, either attending an event during the Inquiry, speaking to one of the Inquiry team or commissioners, or making a submission on the Terms of Reference or the Interim Report;
  • 32% were from the public sector (central or local government); 18% were from the charitable/social sector; the remainder included business, researchers/academics/think tanks, and private individuals;
  • 2% were from an iwi or Māori organisation; and
  • 66% had never engaged with the Commission on previous inquiries.

2.3.2 Submissions on Interim Report
Submissions on the Interim Report were invited, and a themed analysis of them was published; this was a new initiative for the Commission. The analysis covered 68 submissions across a range of perspectives and sectors, and informed the focus and level of detail of the Final Report. The full report is available on the Commission website.12

Linked to the submission report were further insights into the feedback gathered through wānanga and a talanoa session, to engage more directly with Māori and Pasifika communities.13

2.3.3 Media and engagement report
An internal report covering engagement and media during the Final Report launch period also informed this evaluation. This report was part of ongoing media monitoring internally at the Commission and is not a public-facing document. The report summary demonstrates the breadth of launch-related engagement, and includes findings such as:

  • 65 people attended the online officials’ briefing;
  • 169 people attended the launch event (63 in person, 106 via livestream);
  • There were 43 views of the online recording of the launch event;
  • 68 people attended the post-launch webinar (with 87 views of the recording of this event); and
  • The inquiry received 44 media mentions and/or articles.

3. Findings

In general terms, the Inquiry was welcomed by a broad range of stakeholders across Aotearoa New Zealand. The topic was viewed as important and the Inquiry findings were of high interest to many people. The analytical frames used, and the breadth and depth of research commissioned to support the Inquiry, were overall seen as high-quality, robust and meaningful. This included support for te ao Māori frames and those that place whānau at the centre of analysis and advice. Particularly amongst the community sector, including some Māori and Pacific organisations, the Final Report is already being shared and used.

The wide approach to engagement, particularly at the beginning of the Inquiry, was valued by many stakeholders, and seen as a particularly effective way to work within an Inquiry that dealt with people who are persistently left out or do without. The new initiatives trialled by the Commission were overwhelmingly supported and considered valuable. However, some stakeholders who often make use of Commission inquiry findings did not always understand or appreciate the trade-offs this required, such as not publishing an issues paper because the consultation on the Terms of Reference had broadly served that purpose instead.

Structural and personnel changes at the Commission had a direct effect on the Inquiry, an impediment articulated by nearly every key respondent. This was amplified by other process obstacles, such as technical malfunction and insufficient planning in some areas.

Although some people bemoaned the decision to focus on the public management system for the Final Report, most people supported this, or at least understood the rationale for the decision. Some of the qualms expressed by respondents concerning this focus represented a difference in views around theories of change, in terms of when, where and how policy change can most easily be articulated and/or have impact.

The clarity of message was good or excellent for most people. Even for those who disagreed with the focus on the public management system, the narrative coherency of the Final Report was generally viewed positively.

As context to the following discussion on specific performance measures, it is helpful to remember the basics. The Inquiry delivered an insightful, comprehensive, and well-researched report within the timeframes agreed. The report provides a thorough and robust reference document to inform policy making and social change on the drivers behind persistent disadvantage, and the public sector mechanisms that can be considered for reducing it.

3.1 Performance measures

The following sections concerning performance measures are laid out as follows. First, the key qualitative findings for each performance measure are presented. More detail on these key findings, as well as additional common findings, is then presented according to three types of feedback (intent and methodology, clarity and execution, useability). The coverage of themes differs between performance measures, in order to best capture the relevant data for each measure.

These primary qualitative findings are followed by relevant references to the further data sources described in section 2.3 (online survey, submissions summary report, engagement and media report), to highlight alignment or lack of alignment with the qualitative data. Each performance measure section then concludes with a summary, inclusive of all the data sources, and findings from the Evaluation Project Director. At the end of this report, findings are recast as more general recommendations for the Commission.14

Throughout the report, the term ‘respondents’ refers to those who participated in interviews or focus groups for this evaluation. Where this report discusses further data sources, such as submissions summary report or online survey findings, the term respondents is further qualified.

3.1.1  Right focus

The Right Focus measure is defined as ‘the relevance and materiality of the Final Report in meeting the Terms of Reference’. The key findings in this area were:

  • The ambition to focus on system-level change was broadly supported, although views differed on execution and some stakeholders wished for a deep dive in particular sectors.
  • The dynamics and drivers contributing to persistent disadvantage were captured well across the suite of research, analysis and reports, although views differed on the ultimate impact and effectiveness of recommendations that focused mostly on the public management system.
  • The intention to include both longitudinal and theoretical data (e.g. learning systems) was an ambitious attempt to speak to a range of methodological views and theories of change. The result was questions from all sides, which probably means the balance was about right.

Intent and methodology

Slightly more than half the respondents agreed with the intent and methods behind the focus of the Inquiry. The ambition to cover system-level change, rather than diving too deeply into too many individual sectors, was seen as the right choice by these respondents. The ambition of scope behind the Inquiry was acknowledged and celebrated, with respondents noting the challenge of this accomplishment. One contributing factor to the ‘rightness’ of this choice was the importance of establishing broad coverage to understand persistent disadvantage, before diving too deeply into specific sectors.

Some respondents had concerns about the focus on the public management system. These concerns originated in what respondents perceived as a narrowing between the early Inquiry materials and the Final Report, focusing recommendations on the public management system. Some respondents described this as a change from the Interim Report; however, the Interim Report states in its overview that, ‘Our interim recommendations focus on the overall settings of the “public management system”’15. One concern was a view that focusing recommendations on the public management system minimised the role of other socio-economic drivers of persistent disadvantage. Drivers such as ‘design of the economy’, ‘job creation’ and ‘living wage movement’ came up repeatedly for these respondents. Another concern was a view that the Commission did not dive deeply enough into specific public management sector areas, such as health, housing, or child poverty reduction policies.

A different type of methodology concern came from people who felt the report was not grounded enough in quantitative data and evidence. This included a desire to see more longitudinal datasets as part of the Final Report. Respondents who held this opinion often referred to previous Commission inquiries as examples of what they had hoped to see, identifying a type of information that brings people to the discussion table.

It doesn’t engage enough at the data and evidence level. It is music to some people’s ears but turned off others. Focus Group Participant

Data people were frustrated because it was undercooked. People who wanted to see both systems change and data story integrated didn’t get it. It became more about the systems issues at the end. Interview Respondent

Note that discussions around data limitations are present in the Interim Report (Chapter 3), the quantitative report (Chapter 2) and are mentioned in the Final Report as well. Barriers to including the quantitative data within the Final Report are further discussed in the ‘Good Process Management’ section of this report.

Clarity and execution

The majority of respondents were favourable towards the execution of the focus that was ultimately decided upon. They supported the way the Inquiry was framed, identifying the language around persistent disadvantage at a systems level as accurate and clear. The dynamics and drivers feeding into the focus and recommendations were seen as both broad and deep, providing a robust base of knowledge and information for drawing on. The inclusion and integration of frames such as Mauri ora, He Ara Waiora and whānau-centred frames, contributed to clear and well-positioned execution of the focus.

Respondents recognised the nature of this Inquiry as a different type of topic for the Commission, as well as arguably for the wider public discourse, and acknowledged that finding a focus was always going to be a challenge. They felt that execution of a piece of complex system analysis, particularly in a topic that went beyond traditional economic expertise, had been delivered well.

The Terms of Reference were vast, choices had to be made by the Productivity Commission on finding a coherent topic within a vast territory. Asked to build on many things across many disciplines [they] instead went to system findings and steered into higher level description of systems problems. Focus Group Participant

In terms of what was seen as less effective for execution and clarity of focus, a number of respondents mentioned a ‘disconnect’ between the Interim and Final Reports. In particular, respondents from the public sector felt the discussion around complexity and what it means to tackle complex problems in the public sector was diminished in the Final Report, relative to the Interim Report. Noting that this connection is made in places within the Final Report (pages 36 and 104), it may simply be that the relationship between complex systems and learning systems could have been clarified more directly for readers in the Final Report.

Similarly to the feedback around the intent of the focus, some respondents were disappointed with the lack of economic data in the Final Report. Although this data was provided in the Quantitative Report in July 2023, that approach did not seem sufficient to such respondents, who saw that ‘the intention to do economic data and then the shift to systems change never melded together’. This finding is discussed in more detail in the considerations section of this report.

On ToR basis, I thought it could be a strong report, but I was very surprised by the report that came out the other end - I was expecting specific data and evidence, rather than systemic issues. Focus Group Participant

[There was a lack] of engagement with business … it has very little about material production. The way we produce and distribute economic resources is fundamental to our definition of economic disadvantage. Focus Group Participant

Further data sources concerning right focus

Responses to the online survey aligned with participants who supported the focus of the Final Report. This is best contextualised in Question 6, which tested the likelihood of people using the Inquiry report as a resource and reference in the future. To this question, 76% of survey respondents agreed or strongly agreed, indicating that the focus was right for a majority of stakeholders. In Question 17, which asked whether the report focused on the issues of most significance from the Terms of Reference, 69% of survey respondents agreed or strongly agreed.

Submissions from the Interim Report in the Your Feedback report also ‘endorsed’ system barriers as a focus for persistent disadvantage.16 However, some of the feedback to this focus decision pointed at uncertainty as to whether a focus on the public management system would result in the type of recommendations that lead to real change. For example, some submitters to the Interim Report disagreed that focusing on proposed system barriers would be sufficient to reduce persistent disadvantage, with a small group of submitters suggesting that changes elsewhere (beyond the public management system) were needed instead.

Summary of right focus performance

The inquiry decided on, executed and communicated a focus that worked for many people and entities. The decision to focus on system-level findings provides a platform for future work, whereas a decision to provide more deep dives would arguably have skipped a consistency and frame-establishment step, potentially creating methodological siloes for future work.

The split in the primary data between people who agreed versus disagreed with the focus of the Inquiry is quite even. This indicates that no choice would have satisfied everyone.17 That seems to be the case for the focus of the Fair Chance for All Inquiry.

There was a correlation between people who disagreed with the focus on the public management system and those who felt that economic data was underused in the Final Report. People holding this view were also most likely to feel that a particular sectoral deep-dive would have enhanced the report, often correlating with the sector in which the respondent worked. It is also interesting to note that some of the ‘missing sectors’ people wanted to see in the Final Report, such as job creation or the living-wage movement, do sit at least partially within the public management system. Some of the implications of these views are discussed further in the ‘Insights’ section of this report, regarding which methodological frames are seen as valid by whom.

There may have been an opportunity to communicate more effectively the trade-offs inherent in this narrowing choice to focus on the public management system. The Commission made a choice to focus at the system-level, rather than diving into sector-level analysis or recommendations. Based on respondent feedback, the Commission could have made this decision clearer in the Final Report. The recommendations in this section are therefore not around the focus choices that the Commission made, which were informed by robust feedback, engagement and expert knowledge of those on the Inquiry, but around the way those focus decisions were communicated.

Some of this scope discussion is interconnected with the approach taken to consult on the Terms of Reference, and may have been an implication of that approach. This is discussed in more detail within the 'New initiatives' section of this report.

Finding 1: The Final Report could have included a more proactive discussion around the limitations of public sector management levers and mechanisms alone to reduce persistent disadvantage. This discussion does occur within the Final Report but it could have been simplified and emphasised for readers.

Finding 2: The Commission could have more clearly telegraphed the evolution of ideas and findings from the Interim Report, through submissions, and the subsequent rationale for choices made in the Final Report. Specifically, the themed submission analysis on the Interim Report could have been accompanied by a more exhaustive rationale on all interim findings, recommendations and questions, mapping more clearly how these ideas did or did not find their way into the Final Report. The publishing of submissions provides a valuable record for those working on the topic in the future, and should continue.

3.1.2  Effective engagement

The effective engagement measure is defined as ‘the quality of engagement with interested parties’. The key findings in this area were:

  • Engagement was robust and thorough throughout most of the Inquiry. The Commission was perceived as having an authentic approach to engagement.
  • The inquiry covered an impressive spread of engagement types. This range of approaches created a broad reach to the voices included in the Final Report.
  • The level and quality of Pasifika and Māori engagement was seen as positive overall. Key partners saw the voices they represented incorporated throughout the process and within the Final Report.
  • The diversity of approaches at the beginning of the inquiry was highly valued by a portion of stakeholders. This included a reach into channels not used as much in previous inquiries. However, these stakeholders had wanted to see consultation and engagement continue at the same level as the process used for the Terms of Reference.
  • Respondents raised questions about the sufficiency of engagement at political and decision-making levels, although others saw this as outside the mandate of the Commission. Views were inconclusive on what difference this may have made.

Intent and methodology

A majority of respondents valued the engagement approaches used in the Inquiry, finding them both effective as well as meaningful. This includes the breadth, depth and range of engagement.

It was actually a conversation, rather than ‘tell us what you think and we’ll take it away’. It was a dialogue. I think that was partly what fed into the confidence to go to a draft report rather than an issues paper, they felt they had a good feel. It’s a risk, when you go to report and recommendations, people do tend to focus on recommendations, but they had such deep engagement. Focus Group Participant

They were getting out and making contact with people, feeding back - we heard back, they were really good about it. They were really brave about coming out and talking to everyone, facing them. Focus Group Participant

Because the diversity of approaches, and the breadth of engagement, were so valued by a group of stakeholders, there was disappointment that this approach did not continue at the same level and reach throughout the entire Inquiry. A group of people wanted to see more consultation towards the middle and end segments of the work, mirroring the very wide, consultative approach to the Terms of Reference. Another group of people felt engagement approaches were inconsistent and thought the focus and supporting engagement should have been narrowed much earlier.

Clarity and execution

The authenticity of the Commission’s engagement came up as a theme for many respondents. This applied particularly to wānanga and talanoa, as well as to some of the early policy workshops held around the country. The Chair of the Commission was named several times in this context, as having a particularly authentic form of engagement, which many respondents valued.

[The] Commissioner honouring the lives of people he’s talking about was very well done. I heard him talk again last week and the attempts to honour lived experience came across as genuine. Focus Group Participant

In addition to the wānanga and talanoa, the level and approach of Pasifika and Māori engagement was seen as positive overall. This finding came both from respondents who participated in these processes, noting them as respectful and as embodying types of engagement appropriate for working in te ao Māori and Pasifika, and from those who were observers or recipients of this information.

The team are public sector workers so they’re comfortable with engaging with Te Ao Māori, if anything they were too reserved, worried about getting it wrong when they are actually quite capable. Interview respondent

There was a general intention to listen to feedback provided, you could see your input was captured. Interview respondent

These views included appreciation of the nuance of how Māori and Pasifika voices were incorporated, inclusive of both direct findings as well as contextual evidence.

Useability

A few respondents raised questions about the sufficiency of engagement with political and decision-making levels, views that were often expressed hand in hand with a wish to see clear, decisive action on reducing persistent disadvantage. Lack of engagement with senior decision makers arose as a theme, although some voicing this view noted at the same time that they were unsure what difference it would have made. Some respondents also mentioned Ministers in this context.

Note that the Productivity Commission Act requires the Commission to ‘act independently’ in delivering its role. This means that the Commission would deliberately not engage with Ministers during the course of an inquiry, apart from providing written and verbal briefings on publication of interim and final reports. For this Inquiry, referring Ministers were briefed verbally for the Interim Report. Referring Ministers did not take up the opportunity to be briefed verbally for the Final Report.18

Though not a majority view, some respondents found the tone of the inquiry ‘too political’ and indicated ‘too strong a hand’ from Government. This can be viewed as a perception rather than evidence, as the Government was involved only in setting the Terms of Reference and in providing general guidance to the Commission via the letter of expectations, discussed in section 3.1.5.

Other respondents saw engagement with the political layer as outside the mandate of the Commission. These views are discussed further in the ‘Insights’ section of this report.

[There was] a big gap between consultation and decision makers at agencies with more connection to the appropriate ministers. Focus Group Participant

There was not much from [political layer] leaders - but this is not the Productivity Commission’s fault, it’s hard to get to that level. Focus Group Participant

Further data sources concerning effective engagement

Responses to the online survey aligned with participants who found the engagement process effective and enjoyable. This is best represented in Question 22, where 68% of survey respondents agreed or strongly agreed that they had sufficient opportunity to participate in the Inquiry, and Question 23, where 68% of survey respondents agreed or strongly agreed that the Commission was approachable.

Of the respondents, 66% had never engaged with the Commission on previous inquiries. This is further indication of the breadth of engagement on the inquiry.

The media and engagement report outlined ministerial engagement during the launch phase of the Inquiry, with a written briefing paper provided for all referring Ministers. This report notes that in-person briefings were offered and that no referring Ministers took up this invitation.

Summary of effective engagement performance

The engagement approach for the Inquiry was not only welcomed but celebrated by many. Given the commonality of this view across respondents, this is a point of note within the findings. Although it is likely too early to identify deeper impact from this engagement, current engagement impressions of the Commission amongst many stakeholders are positive.

The approach to engaging with Māori and Pasifika was highly rated. This opinion was held both by those who engaged directly within wānanga, talanoa or other fora, as well as by people from these communities who engaged with the inquiry in other ways.

Finding 3: The Commission may not have sufficiently considered the trade-offs inherent in its approach to engagement, consultation and feedback. Specifically, this could have included:

  1. the degree to which broad engagement approaches and activities were sustainable throughout the Inquiry;
  2. actively managing stakeholder expectations about this level of engagement, particularly during the more intensive analytical phases of the Inquiry; and
  3. actively weighing up the relative merits of different consultation processes, and communicating this clearly as part of engagement.

For example, if the Commission were to repeat the consultation exercise on the Terms of Reference for an inquiry, it should conduct engagement in a way that makes clear that the exercise is the primary (or only) method for feeding into framing the inquiry.

Finding 4: The Commission delivered well on engagement methods specific to Māori and Pasifika. The feedback was positive overall, creating a good platform for future inquiry and work planning. Further investment in these capabilities would be valuable, in order to improve integration and understanding with these frames and lived experiences.

3.1.3  Good process management

The Good Process Management measure is defined as ‘the timeliness and quality of the inquiry process’. The key findings in this area were:

  • The Final Report was delivered on time, including the completion of significant research and engagement work programmes.
  • Staff turnover and changes within the Commission more broadly had direct impact on the Inquiry. This contributed to resourcing challenges and some pockets of inconsistent information passing within the work team.
  • Both role clarity and the recommendation-setting process lacked definition at times. Expectations between directors and commissioners were not always clear, which created some critical pinch points.
  • Impediments at Statistics NZ reduced access to the Integrated Data Infrastructure (IDI), impacting the overall Inquiry timing insofar as they required the Final and Quantitative Reports to be delivered separately.

Clarity and execution

The Final Report was delivered on time, including the completion of significant engagement and research work programmes (e.g. the publication of ten supplementary research reports during the course of the inquiry). This was a significant achievement, particularly considering the challenge of the topic at hand and some of the intervening variables, both internal and external to the Commission.

Staff turnover impacted the inquiry significantly. These changes were occurring across the Commission more generally and were not always specific to the inquiry. The impacts on the Inquiry included direction changes in engagement methods and inquiry focus.

There was a lack of clarity in some domains concerning the respective roles and inputs of directors and commissioners. This muddied communication in some parts of the work and created pinch points, where an unclear process had to be worked through mid-stream rather than settled in advance.

In future, we would put expectations on the table with clear roles and responsibilities, and we’d have that conversation right up front. Interview respondent

The process to set and confirm recommendations potentially suffered from under-planning. Although it was included as a set of scheduled milestones within the work programme, it may not have been given sufficient ‘wiggle room’ to work through the types of analytical and tactical decisions that inform a recommendation-setting exercise. Several internal interview respondents identified this as an area to work on, reflecting that it was potentially an instance of ‘over promising’ what the inquiry team could deliver within the timeframes and allocated resources. Additionally, family emergencies meant the inquiry Director was unavailable during parts of the process to confirm recommendations. These challenges are most sensibly considered as a key-person risk: in a small organisation like the Commission, having back-up or acting duties assigned is not always possible or top of mind. These findings inform Recommendation 5.

A drive to produce both a deep and broad report required a thorough quality assurance process. The time and resource this required was underestimated, which had an impact on planning both internally and externally.

A desire to do everything had an impact on planning. Honestly, we were a bit slow in terms of recognising those limitations and how much that would impact on our plans. Interview respondent

A few comments also arose concerning peer review processes, including significant technical failures of the systems in use. These included template malfunctions and issues with IT systems, leading to many hours of issue mitigation, all of which delayed the production of draft report material. Indirectly, these technical issues may also have affected team morale, although most respondents were not close enough to them to make such observations. Respondents who raised these issues clarified that the technical malfunctions have been remedied since the Inquiry.

Useability

Significant delays from Statistics NZ occurred in releasing IDI data outputs for Commission use. This was a result both of Cyclone Gabrielle affecting the Census process and of more general backlogs in accessing IDI. It meant the Commission had to make a decision about the release of the Quantitative Report, which was then delayed until July 2023 in order to ensure it was robust and thorough. It also meant the quantitative analyses were not available until much later than planned, which limited the Commission's ability to integrate these findings more deeply alongside findings from other research and engagement.

In addition to limiting the quantitative evidence available for the Final Report, this delay also presented a communication and messaging challenge in keeping stakeholders engaged long enough to receive the Quantitative Report. This is mentioned again within the ‘Clear message delivery’ measurement section of this report.

Further data sources concerning good process management

Responses to the online survey roughly aligned with some of these process and timing findings. This is best contextualised in Question 20, which tested survey respondent satisfaction with the Commission’s process. To this question, 34% of survey respondents disagreed, strongly disagreed or did not respond when asked whether they found the process satisfactory. Although 66% of survey respondents agreed or strongly agreed that the process was satisfactory, this is a lower rate of agreement than most of the other survey findings referenced in this report. This may indicate that some of the challenges discussed in this section were felt by stakeholders and participants.

Expert overview and findings

It is important to remember that the Final Report was delivered on time. This is an accomplishment from a process point of view, considering the ambition and scope of the report, as well as some of the internal and external challenges the inquiry team faced along the way.

Many of the process challenges in the inquiry appear to be correlated, insofar as they represent issues that may have some similar causes. Particularly for a small organisation working on an ambitious inquiry topic, these process challenges had an impact on the ease and internal clarity of expectations for the inquiry.

Finding 5: The Commission could have reduced risk and resource pressure by building in mitigations to anticipate disruption. This could have included:

  • articulating role clarity across leadership functions more clearly, to ensure understanding across all relevant parties;
  • revisiting the scheduled process for confirming recommendations once resourcing changes and other delays occurred, to ensure the plan was still fit for purpose; and
  • considering acting arrangements as a mitigation to key-person risk and to anticipate personal events. Lessons learned from business continuity during COVID-19 events could potentially have informed an approach to this.

Finding 6: The delays in accessing IDI had a significant impact on the quantitative component of the inquiry. This required the Final and Quantitative Reports to be published separately, which likely reduced the readership of the Quantitative Report. Considering the criticality of this component for the inquiry, the Commission should anticipate such potential IDI delays in the future.

3.1.4  High quality work

The ‘high quality work’ measure is defined as ‘the quality of the analysis, use of evidence, findings and recommendations in the Final Report’. The key findings in this area were:

  • The overall presentation and analytical frames worked for many people as a relevant and evidence-grounded way to analyse and present information.
  • The breadth and depth of research that informed the inquiry was named as valuable by many. This finding was common across a range of stakeholder demographics and demonstrates participant confidence in the work.
  • The inclusion of longitudinal datasets to improve understanding of persistent disadvantage was a key accomplishment. This is part of the Commission’s core purpose and function but accessing, analysing and presenting such data is not a simple undertaking. This can be seen as a success for the inquiry.
  • The use of te ao Māori frames within the Final Report was generally seen to be integrated authentically and in an analytically rigorous way. This finding was common across many respondents.
  • Although engagement in te ao Māori and Pasifika was viewed positively, the process of integrating the research grounded in these frames could have been better supported. Some te reo terms and concepts, such as mauri ora and mauri noho, required more effort and understanding in order to be utilised accurately. Integration of research and experiences arising from colonisation also required more nuanced conversations and time than the Commission had potentially anticipated.
  • Some respondents identified recommendations that did not feel grounded in evidence and/or analytical frames, to the same extent as other recommendations. This (real and/or perceived) logic gap presented a barrier for them in using the Final Report.

Intent and methodology

The analytical frames used in the Final Report were viewed as well-grounded and rigorous by many participants. The overall presentation of the data according to these frames worked for many people, including those who are experts in the topic.

The breadth and depth of research commissioned to inform the inquiry was named by many as extremely valuable. Not only did this directly enhance the quality of inquiry reports (Interim, Final and Quantitative), but it also provided a public resource to inform future thinking and changes. The commonality of this finding demonstrates confidence in the work, across a range of stakeholders. The online survey gathered similar findings.

The creation and inclusion of longitudinal datasets to improve understanding of persistent disadvantage provided a key contribution to public understanding of the topic. Although some participants wished for more sector-specific data, most agreed that the information created and presented by the Inquiry was valuable. This is the type of work that, according to its purpose and function, would be expected of the Commission. However, it is a challenging task to actually deliver and should thus be viewed as a success for the Inquiry.

Quite impressed with way the team managed data quantitative evidence (close to my heart) with bringing in different databases. Interview Respondent

Just the descriptive statistics showed inequity all over the place. So, there was evidence to suggest a problem that needs to be fixed, but it wasn’t granular enough to target or understand the mechanisms to deal with the issues. This inquiry helped fill in this gap. Interview Respondent

The use of te ao Māori frames, namely the Mauri ora and He Ara Waiora approaches, was named as authentic and applied well. This finding was articulated by a range of respondents. In addition to creating a way of viewing persistent disadvantage that is grounded in the experiences of Māori, the use of these frames creates analytical integration with similar topics of research and advice currently in the public sphere.19 Respondents also commented on the growing capability within the public sector, with the Commission as a good example of this, for understanding, discussing and referencing te ao Māori frames.

Execution and clarity

Some respondents extended their views of authentic application of te ao Māori frames to include the way that data and evidence around Māori and Pasifika people was incorporated into the Final Report. These respondents found the data to be nuanced and accurate, reflecting a wider understanding of the Māori and Pasifika experience. This included strengths-based as well as deficit-oriented data.

One area where room for improvement was identified was the articulation of some concepts from te ao Māori into non-Māori frames and language. For example, terms such as mauri ora and mauri noho may have been used more narrowly in the Final Report than they are in Māori communities.

Additionally, respondents noted that the intent to incorporate frames and experiences around colonisation and institutional racism was laudable, but the process took work to build understanding within the Commission. This work may have been integrated more smoothly if the inquiry had set aside more time to work through these topics with the research provider, considering the challenges inherent in bringing together differing backgrounds and frames of reference. Additional capability investment in te ao Māori, particularly when themes so central to the Māori experience are part of the evidence base, may have improved this process. This is captured in Recommendation 8.

Some respondents questioned whether the discussion around disadvantage versus persistent disadvantage differentiated enough between the two concepts. This distinction was addressed directly on page 25 of the Final Report, but some people may simply have wanted a more extensive discussion of the differentiation between types of disadvantage. Some of these comments may be mapped back to methodological differences, discussed further in the 'Insights' section of this report.

Some respondents identified recommendations that did not feel grounded in evidence and/or analytical frames to the same extent as other recommendations; one example was the Final Report recommendation around a social floor. This (real and/or perceived) logic gap presented a barrier for them in using the Final Report. Other respondents felt that some research was not represented in the way it was intended, although including specific examples would not be appropriate for this report; this finding should therefore be taken as generic rather than specific.

There are some policy recommendations which don’t drop out of the analysis. It’s disconnected. With the public management system, it’s really important to distinguish relevant alternatives. Interview Respondent

There is a disconnect between the thinking and recommendations - we’re conflating two things: inactive behaviour and formal systems - they’re different, one may be much more permissive. The social floor [for example], where did this come from? This didn’t follow from the analysis. The links don’t always match up in the report, maybe because of time limits. Focus Group Participant

Some more general comments by respondents around frames that did or did not work for them may have stemmed from the new methods and approaches used in the inquiry, including the use of te ao Māori frames and experiences. Some of this feedback is further discussed under new initiatives in this report.

Useability

The theme of focus areas came up again within respondents’ discussion around ‘high quality work’, echoing concerns that, whilst the recommendations are impactful and linked to relevant evidence within a public management frame, the Inquiry overall creates a sense that public sector change is the primary way forward. While the Final Report does clarify that causes of persistent disadvantage are wider than the public sector (page 17 and throughout Chapter 2), some respondents were left with the impression that the inquiry advocated change only in this domain.

Further data sources concerning High Quality Work

Responses to the online survey aligned with participants who found the quality of work informing and resulting from the Inquiry to be high. This is best contextualised in Question 7, which tested how logical the flow from analysis to findings was. To this question, 73% of respondents agreed or strongly agreed with the logic flow. Online survey Question 9 presented a similar view, with 90% of survey respondents identifying the quality of analysis as acceptable, good or excellent.

The submissions report focused on the most common key themes. It did not provide detail around topic areas where few or no submissions were received. For example, the submissions report did not provide detail around the questions asking how best to measure persistent disadvantage (Interim Report Chapter 3 findings, recommendations and questions),20 largely because of the small amount of feedback received on these themes. However, the decision not to include an exhaustive ‘question by question’ summary of submissions from the Interim Report may have influenced respondent views around measurement questions in particular. Considering the number of respondents who described datasets and measurement of persistent disadvantage as ‘undercooked’ in the Final Report, taking explicit account of these submissions, few though they may have been, could have been particularly valuable. This is reflected in Recommendation 2.

A more exhaustive approach would have had resourcing and time implications, however, and this is a trade-off the Commission may already have considered. This report also acknowledges that a more exhaustive mapping of submissions analysis from Interim to Final Report would still have been unlikely to satisfy everyone.

Expert overview and recommendations

The breadth and depth of research that informed the inquiry was named as valuable by many: the coverage of research, its quality, and the way it linked through to findings and recommendations. This stood out as a key finding and valuable contribution from the inquiry. Although this is part of the Commission’s mandate, the delivery for this inquiry was particularly strong in establishing coverage across a range of frames, especially where existing information was scant. This can be seen as a key accomplishment for the inquiry, particularly given the aforementioned process obstacles.

Some respondents expressed a desire to see further longitudinal datasets and evidence at the core of the Final Report, specifically wishing for more detailed data in specific sectors or policy areas. The Commission chose to focus its limited quantitative capacity on understanding the disadvantage experience of the same cohort of people through time – in order to address the most critical research gap.

The inclusion and presentation of te ao Māori frames as an analytical lens enhanced the Final Report and was viewed positively. The Commission was seen as having understood and used these frames authentically and was encouraged to do more of this in the future. This finding did not, however, diminish the importance and need of continuing to build te ao Māori capability across the Commission.

Finding 7: The Commission delivered a significant breadth and depth of research in the inquiry, filling a research gap for Aotearoa New Zealand. This was part of the system-level approach to the inquiry, and it provided a key service for current and future users of information relating to persistent disadvantage.

Finding 8: The analysis within te ao Māori and Pasifika frames was delivered and presented to a high standard. Respondents to this evaluation found the use of these frames generally authentic and representative of lived experience. However, there were places where greater capability at the Commission could have improved understanding between frames of experience.

3.1.5 Clear message delivery

The ‘clear message delivery’ measure is defined as ‘how well the work was communicated and presented in the Final Report’. The key findings in this area were:

  • The Final Report overall was seen as coherent, clear and well-articulated. It had a good logic flow that was easy to follow. Most respondents found the narrative and findings clear.
  • The relationship between productivity and wellbeing was clear to most, is clearly laid out in the Productivity Commission Act 2010 and was detailed at the beginning of the Final Report.21
  • So far, the Final Report and wider Inquiry findings are being used and referenced frequently in circles of NGOs, whānau-led or place-based initiatives and community organisations, and some pockets of public sector agencies traditionally associated with social services.
  • So far, the Final Report and wider inquiry findings are not being used nor referenced as much in traditional economist circles, including at public sector agencies traditionally associated with economic policy.

Intent and methodology

The Final Report was viewed as being clear, logical and easy to follow. Recommendations were clear, although some respondents wished for an even shorter, clearer set of messages.

I found sequencing quite helpful, not overwhelming. Focus Group Participant

Many respondents confirmed that the relationship between productivity and wellbeing was clear in the Final Report and broader inquiry materials. Although some respondents identified that the Final Report included ‘too much on wellbeing’, a close look at the Productivity Commission Act 2010 confirms that the relationship between productivity and wellbeing is part of the Commission’s direct mandate. The current Letter of Expectations from the Minister of Finance provides clear guidance on the relationship between productivity and wellbeing.22

The opening discussion on page 17 of the Final Report could perhaps have been emphasised throughout the report, in order to clarify this. It is also possible that even with more overt clarification, some stakeholders would have continued to question the value of wellbeing measures and/or analysis as they relate to productivity.

Clarity and execution

The inquiry products, including the Final Report, were overall seen as well-articulated and concise. Most respondents agreed that the frames fit the evidence and the recommendations, delivering a clear narrative. Not all readers accepted the rationale for the Final Report recommendations focusing on the public management system, but even respondents who disagreed with this choice of focus found the narrative within that frame to be clear. It was noted, however, that the public sector frame might provide a stronger narrative for people engaged in the public service than for the wider public.

The Commission has created a pathways forward diagram which focuses around 3 themes. I think that’s coherent, it’s a strong public sector focus, and goes back to the purpose of Commission … the frame it’s taken, it has achieved coherence. Focus Group Participant

Some respondents identified that the important point about keeping the recommendations together as a package was somewhat lost. Although this was included in the Final Report, many people missed it.

The part of the story that is getting lost is that the recommendations are a package, but this is a small bone to pick really. Focus Group Participant

Most feedback that I’ve heard coming back has been cautiously positive, some very positive, [there are] messages in there that they can pick up, some recommendations they could sign up to. They can understand where recommendations come from. The downside [is that] one message is less clear than others, is [that] the recommendations are a package, you can’t cherry pick. Focus Group Participant

The separate release of the Final Report and the Quantitative Report had an impact on overall clarity of messaging.

But for the quantitative delay it was quite disappointing, we’re a bunch of data geeks and all that valuable stuff has gone. All the glamorous stuff is out there but I’m the only person in the data agencies who read it, and I have a whole team of data scientists and now they are not paying attention [once the Quantitative Report came out]. Focus Group Participant

This is discussed in more detail in the ‘Good process management’ measurement section of this report.

Useability

The immediate use of the Final Report is best illustrated by who respondents found was leveraging and referencing the report already. So far, the sectors that were using it frequently included:

  • NGOs and community organisations
  • Whānau-led initiatives and agencies/organisations that use a whānau-led frame
  • Place-based initiatives and movements

Focus groups and some interviews included clear, direct examples of this use so far.

In my work we call on this report, we write opinion pieces using it, it’s great to see the place-based initiatives recognised. I work from a place of hope, this report can validate that. I see how siloed the work is across [government] departments, and with this report I can draw on that. Focus Group Participant

This report has helped to reinforce some stuff we’ve been trying to land with people, and it’s useful to quote bits of. We need to drive cross-agency response, [to be] generative rather than punitive. Focus Group Participant

[This report] has galvanised people to think more strategically about what the learning system looks like in social paradigm, use the IDI to analyse this and the crisis space. The report has inputted into grappling with this, and adds a data point that we need to think differently around leadership and implementation. There are so many patterns but often when you’re in one agency you can’t see it - so it’s helpful to name it [across the system]. Focus Group Participant

Considering the novelty of this type of research for the Commission, the uptake and use by this range of stakeholders can be viewed as a success. Respondents internal to the Commission identified the focus on findings that would be valuable for these sectors as a deliberate choice. To have those groups and sectors using and celebrating the report as something that accurately and authentically covers, presents and analyses the lived experience they work within, is a real accomplishment for the Commission.

In terms of where the report is being referenced less, respondents identified less uptake amongst:

  • Economics think tanks, academics and experts;
  • Economic and/or fiscal policy agencies;
  • Cross-agency groups and initiatives; and
  • Senior public sector leaders and politicians.23

Focus groups and some interviews included clear, direct examples of this under-use so far.

Other reports have sunk in more than this one. I don’t think it is being paid attention to or will be. People aren’t discussing it. The initiatives I work with are gratified to be featured, but if nobody else is listening then how much does the report validate their responses? Focus Group Participant

It feels like a report for experts and public sector management, to be honest. The whole theme around accountability is written for government, but I wonder if there should be something to all those who participated in the inquiry - so at each stage they do a response. Something that is digestible would be a useful tool for communities struggling in this space. Focus Group Participant

Internal Commission respondents identified that the inquiry had planned to respond more fully to participants, particularly from community organisations, at the end of the Inquiry. However, given some of the challenges identified in section 3.1.3 on process, this response plan became unrealistic.

It is worth noting that this particular inquiry was aimed more at communities and those that represent them than previous inquiries. However, as this is not a common space for the Commission to operate in, it was challenging to shift the organisation in that direction. As with any new approach, skills need to be built and practised in order to be most effective.

This finding can also be contextualised against the pattern from previous inquiries of an ‘adoption curve’ time-lag in the uptake and use of inquiry findings. For the sectors not using the report immediately, at least not as much as the community and place-based sectors, it may be helpful to think in terms of a longer timeframe for uptake. Some previous Commission inquiries were referenced frequently and became key sources for those working in the relevant sectors only as time passed,24 and the same may prove true for Fair Chance for All. Confirming that suggestion is outside the scope of this report, but possible explanations for slower uptake include information overload for those working in the public sector and/or a lack of awareness of current research and analysis concerning wellbeing.

Further data sources concerning ‘clear message delivery’

Responses to the online survey aligned with the view that the message delivery was logical and clear. This is best illustrated by Question 11, which tested how clear survey respondents found the findings and recommendations: 92% of survey respondents agreed or strongly agreed that the findings and recommendations of the Final Report were clear. Question 24 showed a similar view, with 80% of survey respondents agreeing or strongly agreeing that the Commission communicated clearly.

Within the internal media report, some of the feedback from the launch event celebrates and notes the clarity of the Final Report.

Expert overview and recommendations

The stand-out finding in this performance measure is how some sectors are referencing and using the report far more than others. This is discussed further in section 4.3 of this report, as an illustration of different frames of meaning and value.

Finding 9: The Final Report spoke most clearly to community sector organisations, which are already making use of the findings in their work. However, there would have been value in creating accompanying messaging that could be circulated more easily across the community sector.

3.1.6  Overall quality

The ‘overall quality’ measure is defined as ‘the overall quality of the inquiry taking into account all factors’. This section is discussed more generally than the previous five performance measure sections. Overall, the inquiry was viewed positively, welcomed by many for the new information it brought to light, and largely regarded as high quality and analytically sound. The online survey results roughly align with this, with 38% of survey respondents stating that the inquiry had increased their understanding of persistent disadvantage ‘a lot’, and a further 50% stating that the inquiry had increased their understanding of persistent disadvantage ‘a little’. Only 12% of survey respondents did not find their understanding increased through the inquiry.

Many respondents saw this inquiry as a different type of work for the Commission, acknowledging the need for different tools, different approaches, different capabilities to address the complexity and reach of the topic.

It was a new type of inquiry for the Commission, especially on top of this there were the engagement expectations, with a much broader set of stakeholders than previous inquiries. Interview respondent

What I could see with this, and why I was interested from the start, was that they wanted to try something different and see if it uncovered something new. If they went with the traditional way of evidence, causality, they would probably find out what we already knew. So, they wanted to go in at the systems level and say, what is it here that’s stopping us from implementing change? I get the sense around this table that there’s a lot of frustration how it’s been done. Ok, it’s not the report we thought, they tried something new, innovation won’t always work - let’s look for the bits that did work rather than cut the whole thing down. So that when they do this again (I’m grateful they’re doing this review), let’s learn from what didn't work. Focus Group Participant

Overall, the Commission was seen as accomplishing a good result with a broad topic. It took decisions to narrow the scope into a frame that could be presented coherently and informatively to the public. Not everyone agreed with those choices, but many respondents either did agree, or saw the rationale behind the decision.

The Insights section that follows explores context and trends that may have affected the overall reception of the inquiry.

3.1.7  New initiatives

This inquiry used a number of new initiatives in gathering and sharing information for the work. Specifically, that included:

  • Public consultation on shaping the Terms of Reference;
  • Actively collaborating with particular groups using policy workshops;
  • Using wānanga and talanoa sessions to gather evidence;
  • Taking a systems approach, instead of focusing on specific policy areas (eg housing), and using systems-thinking tools and methods, including causal loop diagrams; and
  • Publishing a themed submissions analysis - a comprehensive analysis and published summary of submissions to the Interim Report.

Overall, these initiatives were valued by the majority of stakeholders. They were viewed as widening the pool of people feeding into the Inquiry, making it of greater interest to society at large. Many of these initiatives have been discussed throughout the report already; this section is therefore quite concise, providing only information that has not already been presented directly.

Wānanga and talanoa sessions were named by several respondents as extremely positive.

Collaborations and stakeholders are the key [with wānanga]– we should do this more often going forward, using the strengths of other agencies and organisations in engagement or collaboration, a lot more of this. Interview respondent

The systems approach was supported by many, as detailed in section 3.1.1. Respondents tended to speak about it in general terms rather than in detail, and did not name the specific systems-thinking tools or methods used in the inquiry. However, it should be noted that the interviews and focus groups did not prompt for responses about specific tools or methods.

The themed submission analysis from the Interim Report provided insight into the majority view and a clear summary of what mattered most to submitters for inclusion in the Final Report. However, the inquiry did not provide an overview mapping all Interim Report findings, recommendations and questions to a decision to include or drop them from the Final Report. It is unclear what additional benefit this step would have added, noting that it is time- and resource-intensive. But providing a more exhaustive rationale on scope and focus in the Final Report, as a result of submissions, may have reduced some of the questions around decisions to focus recommendations on the public management system. The Commission regularly publishes submissions and is encouraged to continue this practice, alongside the continued exercise of theming submissions.

The consultation on the Terms of Reference created the strongest views amongst respondents as to the value of this initiative, also discussed in section 3.1.1. Many found it extremely valuable as an initiative that worked well for gathering a wide range of stakeholder input. The process followed was comprehensive.25

However, this activity was new for the Commission. Previous inquiries did not consult on Terms of Reference; instead, they produced an issues paper early in the inquiry, seeking discussion and responses to the initial frames of reference through that mechanism. Consulting on the Terms of Reference, without more communication that this would replace an issues paper, may have created an expectation bind for the Commission. Although stakeholders were pleased to see the breadth of voices represented through the early Terms of Reference consultation, a group of respondents advised that they really missed the issues paper as a way to consolidate and respond to emerging frames. However, some of them reflected in the same breath that they may just be creatures of habit, having become used to an issues paper over years of responding to and working with Commission inquiries.

The implications of these trade-offs may not have been considered as deliberately in advance as they could have been. Should the Commission repeat the early consultation exercise, the implications and lessons from the Fair Chance for All Inquiry should be considered in more depth. Recommendation 1 speaks to this.

14. Note that findings do not always fit neatly under the performance measure section in which they are suggested. This speaks to the interrelated nature of performance measures more generally.

15. Page 17 of Interim Report.

16. https://www.productivity.govt.nz/assets/Inquiries/a-fair-chance-for-all/Summary_of_submissions_Final_21-Feb.pdf

17. There is a cultural saying in the policy sector that ‘if everyone is a little bit disappointed then you have probably got it about right.’

18. The Commission provided a verbal briefing to the incoming Minister of Child Poverty Reduction, Hon Jan Tinetti, when portfolio changes occurred following Prime Minister Jacinda Ardern’s resignation. This was also offered to an incoming Minister for Pacific Peoples, when this portfolio changed hands during the course of the inquiry, but Hon Barbara Edmonds did not take up the offer.

19. Recent examples including the Treasury Wellbeing Report (https://www.treasury.govt.nz/publications/wellbeing-report/te-tai-waiora-2022) and the Future for Local Government Review Final Report (https://www.dia.govt.nz/diawebsite.nsf/Files/Future-for-Local-Government/$file/Te-Arotake_Final-report.pdf).

20. Specifically, Interim Report Findings 3.1-3.5, Recommendations 3.1-3.2 and Question 3.1 concerning the measurement of persistent disadvantage are not present within the Submissions Summary Report. This may be the case for other findings, recommendations and questions as well, but a full evaluation of that report was outside the core scope of this evaluation.

21. This connection was also discussed in depth in one of the supplementary research papers supporting the inquiry: https://www.productivity.govt.nz/assets/Documents/Reducing-persistent-disadvantage-research-note-Sep-2022-FINAL-1.pdf

22. https://www.productivity.govt.nz/assets/Careers/Letter-of-Expectations110521.pdf

23. The timing of the Inquiry launch just before the 2023 election season may impact this.

24. For example, it is the personal experience of this Evaluation Project Director that the inquiry on Regulatory institutions and practices (https://www.productivity.govt.nz/assets/Documents/d1d7d3ce31/Final-report-Regulatory-institutions-and-practices.pdf) was heavily referenced and had direct influence on improving New Zealand’s regulatory environment. However, this is anecdotal observation and may also simply be a result of the policy domains in the Project Director’s experience.

25. https://www.productivity.govt.nz/assets/Documents/Summary-of-public-feedback.pdf and https://www.productivity.govt.nz/assets/Documents/Text-Ferret-report-on-public-feedback.pdf

4. Insights

Throughout this evaluation, several themes arose that did not fit neatly into the six performance measures. These themes all demonstrate the way in which wider social and public sector context impacted the delivery and reception of the Inquiry. They are shared here, contextualised within the Commission’s impact measures.

4.1 Commission impact measures

Beyond performance measures, the Commission must consider three types of impacts that its work will have over the longer term. These impact measures address influence and change that reach further than the immediate period following an inquiry, and they provide a helpful way to articulate the broader themes that arose during this evaluation, illustrating how context affects any piece of public sector policy or research. Broadly, these impact measures capture what might be called system evolution, which will ideally occur as a result of the research and recommendations that Commission inquiries present to the public.

The Commission must consider the following impact indicators:26

  • Policies and behaviours change as a result of the Inquiry work;
  • Discussion and debate is generated on the Inquiry's findings and recommendations;
  • Levels of engagement and response lift the standard of quality analysis and advice.

26. See earlier cited Statement of Performance Expectations.

4.2 Mandate and an ‘expectation of action’

Expectations for action were high for this Inquiry. It may be that a report of this nature, dealing directly with discomfiting evidence around the inequity and disadvantage that some New Zealanders face, creates more of an onus for action than previous inquiries. Respondents named this throughout the evaluation process.

This report is trying to tackle the holy grail, the big issue of inequity around the world. One nation state tries to crack it. I think we need small agile dialogue. A social policy report is a very expensive doorstop … do you think some well thought out thing is actually going to deliver? No. Focus Group Participant

If this is about changing things, a lot of change happens through social movements. Documents are important milestones to articulate, but what’s really important is networks and deep dialogue. There’s a dominant set of constructs in a report, but the real value is the ongoing dialogue and intentional networks trying to make sense of it. This is a really big opportunity for the Productivity Commission - what are the dialogues and networks for ongoing conversation? The advantage of NZ is our small degree of separation, so use it. Focus Group Participant

This call for action relates directly to the impact measure of ‘policies and behaviours change as a result of the inquiry work’, which also relates to the Commission’s function to ‘promote public understanding of productivity-related matters’. The Commission is expected to influence the conversation towards changing policies, but it does not possess a mandate for ensuring action, nor is it resourced to facilitate and convene dialogues and networks for ongoing conversation. It is not a policy agency and it does not have policy levers. One explanation for the frequent ‘calls for action’ heard during this evaluation could be the unfamiliarity of new (for the Commission) stakeholders with the boundaries of the Commission’s role. However, expectations of action came from public sector and academic respondents as well, who are presumably more familiar with the role and purpose of the Commission. This could indicate a general frustration with a perceived slow pace of change, particularly in a topic area like persistent disadvantage, where people are suffering. It may also be that the discussion of accountability mechanisms outlined in the Inquiry itself was at play in some public sector respondents’ comments around the need for action, particularly in cases where people responded that action only happens ‘above our level’.

Furthermore, there is no requirement for the Government of the day to respond directly to Commission inquiries. Although the Government does often issue a formal response to an inquiry report, the onus on Government could be considered further, particularly in regards to any departure from or inaction on Commission recommendations.27

Despite this call to action, with respondents expressing a wish for something to happen with the inquiry, the Commission does not currently have levers to deliver it, beyond influencing discussion and understanding.

Some respondents understood the current mandate and bemoaned it as ineffective:

There’s no formal commitment from Government to respond to recommendations in any way, so that diminishes confidence in the process - what prospect is there for an impact? Focus Group Participant

We need to think about how reports can be more enduring. The answer lies in, reports shouldn’t read like they’ve been commissioned by the Government - they should have durability to live on. Focus Group Participant

Other respondents did not understand where the Commission’s mandate ended with the delivery of the Inquiry:

Curious about how process works from here. How do they go about engaging with incoming Government, is there opportunity to do some collective work? … Would be good for them to let us know what the next steps are and how we could support. Focus Group Participant

There was an initial discussion / reaction when released, but it seems there’s no clear plan going forward. Focus Group participant

Others simply wanted to see some change and were already working towards this:

We’re still in direct connection with the Productivity Commission and presenting at leadership forum on this soon. Still trying to find opportunities to keep the ideas moving, while report is a bit on hold. Protecting the things that are working is so important - we can use this as motivation to keep going. They’re in this job trying to transform the system, at a regional level, we’re trying to work to drive change, and that’s a motivation for them. Focus Group Participant

No matter which way the issue is cut, it was clear that respondents wished for more clarity around next steps, with a strong preference to see policy change and action as a more immediate result of the Inquiry. This report notes that, even if the Commission had possessed the resources to foster further public debate, pre-election period guidelines, including advice from Te Kawa Mataaho, informed a decision not to do so.

In response to these calls for action, the Commission could act within its current mandate to place greater emphasis, and a greater proportion of its resources, on education, promoting understanding, and influencing the use of its findings in the public arena. Within its current purpose, functions and impact measures, the Commission already has a mandate to influence and educate, and could rebalance its work programme to put greater weight on these activities relative to its research and analysis role. Note that this would likely mean a significant review of the Commission’s existing work programme; the recommendation below does not suggest it would be a simple exercise.

Finding 10: Although it is clearly stated in the governing legislation that the Commission does not have a policy design nor implementation role, a significant number of stakeholders called for action from the Commission. This likely arose from the significance of the topic, which created a strong desire to see immediate reduction in persistent disadvantage, as well as the nature of some stakeholders from the inquiry, who may be less familiar with the Commission’s mandate.

Another consideration could be seeking a review of its mandate, either broadly or more narrowly concerning Government response. This could expose the tension inherent in being an entity with a great amount of knowledge of complex social issues (following an inquiry) but without the policy levers or mechanisms to act on this knowledge, or the resourcing to convene ongoing discussion and debate. As discussed in section 1.3, the Treasury commissioned advice in 2020 around the role of productivity institutions, which appears to have been quite narrow. Any mandate review could also present an opportunity to consider more broadly the role and purpose of the Commission, including whether the Government of the day should be required to respond to future Commission inquiries, and the implications of any such change. For example, requiring a direct response from Government could require the Commission to grow more responsive, policy-like functions. Conversely, in an environment where Government is not required to respond, or not required to address all inquiry recommendations, engagement and partnership incentives for the Commission may lean more towards non-government actors, particularly when an inquiry topic is valued highly by them.

It may also be worth considering the recent trend of establishing response units to consider recommendations made by other inquiries. For example, the Ministry of Justice is currently establishing a new response unit to work through findings and recommendations from Waitangi Tribunal inquiries.28 Establishing such a body or specifying a policy unit that should receive and respond to a future inquiry's findings could be one way to mitigate actual and/or perceived risk of no pathway for action. Any consideration of this approach should include consideration of the wider system-cost of such tools, and the degree to which they may inadvertently create more costs (resources) than they confer benefits (increased policy change).

Finding 11: The expectation of action also materialised in frustration with what some stakeholders saw as weak levers on Government to respond to the recommendations of the Inquiry. Given that the mandate of the Commission has not been thoroughly reviewed in recent times, some form of review may be valuable in the near future.

27. For example, the Climate Change Response Act 2019 includes a requirement for a written response from Government, including reasons for any departure from the Climate Change Commission’s advice (part 1B, 5U). https://www.legislation.govt.nz/act/public/2019/0061/latest/LMS183848.html?search=sw_096be8ed8190b20c_response+from+Government_25_se&p=1&sr=9

28. https://www.justice.govt.nz/assets/Documents/Publications/Cabinet-paper_Responding-to-the-Waitangi-Tribunal.pdf

4.3 Competing validity frames

Another theme that emerged during the evaluation was differentiation in the methodology that people use and find valid. This relates directly to the impact measures of ‘generating discussion and debate on inquiry findings and recommendations’ and to ‘levels of engagement and response that lift the standard in quality analysis and advice’.

As evidenced throughout this report, some sectors and types of agencies embraced this Inquiry more than others. Beyond specific views on the focus and scope of the Inquiry, there may be a broader social dynamic at play, in which competing frames of validity talk past each other. This is most starkly demonstrated within the ‘right focus’ and ‘clear message delivery’ sections of this report, where some people disengaged from this Inquiry because it did not use frames they found sufficiently relevant to their work. Other people engaged heavily with the Inquiry because they saw direct relevance to their sector and work, for many of them in a way that they had not seen previously (when they may have been the ones disengaging from an earlier report or Commission inquiry).

It may be that whānau-centred, place-based initiatives and more traditional, economic spheres of work and policy analysis do not speak a common language. Furthermore, these differing groups likely hold different values, although confirming that was beyond the scope of this evaluation. The Final Report named this phenomenon in a Chapter 4 discussion on broadening values.

These different responses to the Final Report point to these differing frames of validity, and highlight the importance of pluralist approaches to research, analysis and policy-making.

Considering this, the nature of this Inquiry meant it operated at the nexus of this ‘talking past each other’ phenomenon. This may point to a current culture of separate ‘bubbles’, perhaps amplified by social media norms, reducing the onus on people to engage constructively with frames or findings they do not agree with.29

My worry is, there’s some really good stuff in there, but [people] being frustrated with parts of it has made us dismiss the whole thing. So how do we give voice to the parts that are good in it, valid and useful? It’s up to us, we can make change, let’s not dismiss the whole lot. I feel a responsibility to take the good bits to my organisation and work to try and push it through the system. I have been doing this. Focus Group Participant

29. These are general comments, which are the Evaluation Project Director’s observed experience and point of view. They are not within the core scope of this evaluation.

4.4 Timing and alignment

Another theme that arose was the timing of the Final Report, particularly the regret some respondents felt about the delay of the Quantitative Report and the missed opportunity this represented. This report has already outlined how those circumstances were beyond the Commission’s control. However, the broader theme of limited attention spans, potentially due to information overload, is worth mentioning. It relates most closely to the impact measure of ‘discussion and debate is generated on the inquiry's findings and recommendations’.

Some respondents expected to be given all the information in a single package. When that did not happen, they quickly disengaged from the conversation. This occurrence is well beyond the Commission’s control or mandate. However, Recommendation 6 of this report identifies an approach to mitigating future disconnects between quantitative findings and the rest of an inquiry.

Another timing consideration was the way the Commission leveraged the proximity of the Inquiry launch to other relevant public conversations. Specifically, the decision to cross-reference common themes with the launch of the Future for Local Government Review’s Final Report may have increased the level of public discussion for the Commission.30 This may be something the Commission considers even more in future inquiries, especially given the feedback from this evaluation on the importance of hitting a ‘window of attention’ for busy people with lots of information to sift through.

30. Internal Commission media and engagement report.

4.5 Focus Group benefits

Beyond generating a source of evaluation data, focus groups were valuable for facilitating reflections between stakeholders. Both Focus Groups for this review stimulated connections and/or reconnections between parts of the system, and participants left with actions to connect across their agencies. Other participants shared that they found the process of sharing their experience and reflections on the Inquiry ‘helpful’ and ‘cathartic’ as a way to process the complexity of the topic.

5. Conclusions

5.1 Summary of recommendations

Findings from this evaluation are framed as recommendations to apply more generally to future inquiries.

Each recommendation below is followed by an illustration from this Inquiry and the performance measure or type it relates to.

1. The Commission should include within final reports proactive discussions around trade-offs and decisions informing the scope and focus of final recommendations.

In the case of this Inquiry, the discussion around the limitations of public sector management levers and mechanisms alone to reduce persistent disadvantage could have been simplified and emphasised for readers.

Clear delivery of message

2. The Commission should provide both a themed submission analysis on interim reports, as well as a clearer, more exhaustive rationale for how interim findings and recommendations translate into final findings and recommendations. The publishing of submissions provides a valuable record for those working on the topic in the future, and should continue.

In the case of this Inquiry, decisions on frames and presentation for quantitative data that evolved from the Interim Report were not communicated directly as part of submissions reporting.

Clear delivery of message

3. The Commission should actively consider the trade-offs inherent in its approach to engagement, consultation and feedback. This should include:

a. the degree to which broad engagement approaches and activities are sustainable throughout the life of an inquiry;

b. actively managing stakeholder expectations about sustainable levels of engagement, particularly when an inquiry is in more intensive analysis phases; and

c. actively weighing up the relative merits of different consultation processes, and communicating this clearly as part of engagement.

In the case of this Inquiry, the trade-offs inherent in consulting on the Terms of Reference, instead of publishing an issues paper, were not clear to some stakeholders. Stakeholders also had expectations that the broad engagement supporting the Terms of Reference consultation would continue at the same level throughout the Inquiry.

Effective engagement

4. The Commission should continue investing in engagement methods specific to Māori and Pasifika as part of future inquiry and work planning.

In the case of this Inquiry, feedback on this capability was positive, with stakeholders identifying that more investment would help the Commission better integrate frames and experiences from these communities.

Effective engagement

5. The Commission could build in more mitigations to anticipate disruption and ensure planning and resourcing is fit for purpose. This could include:

a. clearly articulating role clarity across leadership functions, and stress-testing it to ensure common understanding;

b. revisiting scheduled processes for key milestones, such as confirming recommendations, in the event of any significant changes to resources or timeframes, to ensure that processes are still fit for purpose; and

c. introducing acting arrangements as a mitigation to key-person risk. Lessons learned from business continuity during Covid events could inform this approach.

In the case of this Inquiry, compounding factors across role clarity, key milestone planning for recommendation-setting, and intervening life events created significant resource pressure at the end of the Inquiry.

Good process management

6. Future use of the IDI (Stats NZ’s Integrated Data Infrastructure) should be planned with longer timeframes, to account for high demand and for unexpected delays. The Commission should consider carefully the length of time and scope required for IDI work, as part of the Terms of Reference stage, to ensure it can be delivered on time.

In the case of this Inquiry, IDI access delays required the separate publication of quantitative findings from the rest of the Final Report. This caused confusion for readers and meant that some stakeholders did not engage thoroughly with the Quantitative Report.

Good process management

7. When delivering inquiries that are either system-level, or where there is a significant research gap, the Commission should continue to commission research that is both broad and deep. This is an important part of the purpose and function the Commission serves for Aotearoa New Zealand.

In the case of this Inquiry, the Commission delivered an extensive research programme that reduced the knowledge gap in understanding persistent disadvantage.

High quality work

8. The Commission should invest more into capability around te ao Māori and Pasifika frames and experiences. This recommendation aligns with, but is separate to, Recommendation 4 concerning greater investment in engagement approaches with Māori and Pasifika.

In the case of this Inquiry, te ao Māori and Pasifika frames were used well. However, there were places where greater capability at the Commission could have improved understanding between frames of experience and made the process smoother for all.

High quality work

9. Future inquiries should actively consider which sectors and organisations may use and/or benefit from findings, and orient materials to fit that need.

In the case of this Inquiry, the Final Report spoke clearly to community sector organisations. However, this sector could also have benefitted from simplified messaging that it could share with its own stakeholders.

Purpose and function

10. Consider the balance the Commission places across its purpose and functions. The Commission could explore the benefits and methods of strengthening its current function of promoting public understanding, by introducing or strengthening the influencing tools at its disposal.

In the case of this Inquiry, a misunderstanding of the Commission’s mandate, combined with a general desire to see positive change in the topic area, created an ‘expectation of action’ which the Commission is unable to fulfil.

Purpose and function

11. Consider seeking a review of Commission purpose and function. This could include consideration of requirements on Government to respond to specific inquiries, and what implications any changes would have on the purpose and function and efficacy of the Commission.

In the case of this Inquiry, some stakeholders questioned the efficacy of the current process, where, particularly in a system-level inquiry, there is no clear ‘home’ for ownership over Final Report recommendations.

Purpose and function

5.2 Expert statement in summary

The recommendations laid out above speak to an Inquiry that delivered well against many factors. The highlights included:

  • an ambitious scope and topic delivered on time with clear messaging;
  • a breadth and depth of research that delivered against an existing knowledge gap;
  • a high value placed on Commission engagement methods and authenticity, particularly amongst community organisations and Māori and Pasifika communities; and
  • clear messages that were understandable and celebrated by a wide range of stakeholders.

Areas with room to improve included:

  • processes for ensuring planning and resources remain fit for purpose, particularly when significant personnel changes or events occur;
  • clearer communication of the rationale for scoping and engagement choices; and
  • potential to contextualise the Commission’s mandate more clearly for stakeholders, as well as considering whether any clarification or review would be valuable and why.

There were also impacts from the environmental context of the day, including a disconnect in frames of validity between the community sector and economic organisations, and a call for action from the Inquiry that the Commission is neither resourced nor mandated to pursue.

Appendices

I. Interview and Focus Group details

Beyond internal interviews at the Commission, the following organisations participated in interviews and focus groups.

  • Ministry of Social Development
  • Social Wellbeing Agency
  • Auckland Council
  • Victoria University
  • Ministry of Business, Innovation and Employment
  • Haemata
  • Inland Revenue
  • Department of the Prime Minister and Cabinet
  • Ministry of Pacific Peoples
  • Waikato Wellbeing Project
  • NZ Council of Christian Social Services
  • Wellbeing Economy Alliance
  • Inspiring Communities
  • South Auckland Social Wellbeing Board
  • The Treasury

The primary questions that respondents were asked are detailed below; those in italics were asked in both the focus groups and the interviews, while those in plain text were used only in the interviews.

As with any qualitative data collection, additional probing questions are often asked during the course of an interview. The questions captured here are the basic skeleton and do not include specific probing questions.

Questions by measurement area:
Introductory, demographics
  • What was your role in the inquiry?
  • How did you engage with the inquiry?
Right focus
  • Did you agree with the area the Inquiry focused on? Why or why not? (Prompts for how the view applies to the range of Inquiry reports, with a focus on the Terms of Reference and the Final Report)
  • How well did the Commission communicate the rationale for this focus?
High quality work
  • How did you find the overall quality of research, analysis and frames in the Final Report? (Prompts for breadth, depth, gaps in thinking, and other specifics)
  • How well did the Commission link the research and analysis to the recommendations and findings of the Final Report? What worked well? What could have been improved?
Good process management
  • What were the strengths of the Inquiry process? What worked well?
  • Where were there challenges? What worked less well?
  • How well did the planning process work? How were risks identified and mitigated? What went well? What could have gone better?
  • How flexible were ways of working to support the work?

Effective engagement

  • How did you find the overall level of engagement across the Inquiry? (Prompts for depth, breadth, frequency)
  • What were the direct impacts from engagement onto the work?
  • Were there any gaps or missed opportunities for engagement? Were there any stakeholders or sectors that were over-engaged?
Clear message delivery
  • Comment on how Inquiry findings and summaries told the overall story. Did you find the report coherent as a single narrative? If so, why? If not, why not?
  • For which stakeholders/sectors are messages most relevant and/or clearest?
  • For which stakeholders/sectors are messages less relevant or less clear?
Overall quality
  • End to end, what were the strongest components of the Inquiry process and products?
  • End to end, what were the weakest components of the Inquiry process and products?
  • How well did the inquiry generate new insights?
  • To what degree did the inquiry raise public awareness and discussion of the topic?
Feedback on new initiatives

A range of new initiatives was used on this inquiry (prompts as needed).

  • Can you comment on which ones you found particularly beneficial?
  • What level/type of impact did they generate for the inquiry?
  • Were there any that created drawbacks?
Conclusion
  • Anything I haven't asked that you'd like to share about the inquiry process or outcome?