Step 5: Using the W@S tools to review progress
Key points about reviewing progress
- Change is not immediate. It can take 1-2 years to put new activities in place, and a further 3-5 years for related changes to show in student and teacher data.
- Repeating the W@S tools can provide data to help schools review progress against action plan goals.
- Baseline data is best collected before making changes. This data can be compared with follow-up data to help you review progress over time.
- When comparing data over time it is best to compare like with like (e.g., the same year levels of students at the same time each year).
- When interpreting data, consider common data patterns and the impact of “intervention effects”.
- Sharing progress with the wider school community can assist in developing new goals.
Information for Step 5
Step 5 is about reviewing and reflecting on progress. During Step 5, schools re-use the W@S tools to collect data that helps them review the impact of their actions.
Using the W@S tools to review progress
A data collection process can be used for a range of purposes. Using W@S data for the purpose of awareness raising and for a needs assessment is discussed in Using the W@S tools to collect data (info for Step 2).
Using W@S data as part of a formative review to improve new approaches is discussed in Implementing the action plan (info for Step 4).
W@S data can also be used to chart change over time and review longer term progress, or to assess the impact of new programmes. This is the focus of Step 5 of the self-review cycle. This information sheet offers some ideas about using the W@S tools to review progress against your action plan.
Using the W@S tools to chart change over time
Attempting to change the culture of a school is a complex task. Many studies show that it can take some time for the changes a school is making to be clearly visible in student or teacher data. In relation to health and wellbeing initiatives, research suggests that the “start-up” phase, during which a school decides on new approaches and fully implements them, can take at least 1-2 years. It can then take a further 3-5 years for any related changes to be clearly visible in data.
Repeating the W@S tools
The W@S tools can be repeated annually or every two years to contribute to a formative review of action plan activities. After a year or two, some patterns may start to become visible in your data. Your data may also show areas that need more attention or resourcing. A more intensive review can occur at the end of a 3-5 year cycle, by which time patterns should be more clearly visible in the data. The focus of this more intensive review is on exploring the extent to which the activities in your action plan are making a difference. This review is likely to result in new goals and a revised action plan.
You could use the W@S tools to explore overall short-term or longer-term changes, or to explore the outcomes of a particular initiative. These three uses of the W@S tools are described below.
Short-term monitoring and formative improvements
Schools may wish to use W@S data to contribute to a formative review of action plan activities. To do this, we suggest first using the W@S tools in the middle of Term 2 or 3, before starting the new activities in your action plan (this is the baseline data). The W@S tools can then be repeated at annual or two-yearly intervals. Data is best collected at the same time of year, and from students of the same year levels, as the baseline.

Evaluating the extent of longer-term change
Schools may wish to use the W@S tools to monitor overall longer-term changes. To do this, we suggest collecting data before starting the new activities in your action plan (this is the baseline data). The W@S tools can then be repeated 3-5 years later. To evaluate change over time, data is best collected at the same time of year, and from students of the same year levels, as the baseline.

Evaluating change in relation to particular initiatives
Schools may wish to use the W@S tools to collect baseline and follow-up data to assess the effectiveness of a particular programme or initiative (e.g., restorative practices). Baseline data is best collected immediately before the start of any new initiative or programme (not during the start-up process). To evaluate change over time, follow-up data is best collected at the same time of year, and from students of the same year levels, as the baseline. If you are using the tools for this purpose, there needs to be a good fit between the aspects of school life explored in the W@S tools and the activities or intent of the programme. If the fit is poor, the tools are unlikely to detect changes. You may want to identify the key questions or aspects from the W@S tools that best fit the programme, and in which you would expect to see changes over time. The W@S tools are less suited to assessing short-term changes (e.g., over a period of less than a year), as this is likely to be too short for changes to be clearly visible in data.
How can we compare groups over time?
As much as possible, it is important to compare like with like. If you are repeating the W@S Student Survey, data is best collected from students in the same year groups as the baseline, at the same time each year. There are two reasons for this:
The first is that students’ perceptions of school tend to be more positive at the start of the year, so any changes in school data could reflect the time of year the data was collected rather than changes in school practice. The same pattern is likely for teachers, who may feel differently about school in Term 1 compared with the end of Term 4.
The second is that younger students hold more positive views about school than older students. The W@S national reference data shows that Year 5 and 6 students respond similarly to most questions. Compared with Year 7/8 students, students in Year 5/6 are more likely to select the options “Agree” or “Strongly agree”. A similar difference shows between the Year 7/8 and Year 9/10 data (see the Technical manual for information about the expected patterns). Therefore, when comparing data over time, it is important to compare the same year group (e.g., comparing Year 9 data from 2012 with Year 9 data from 2015).
However, your school may want to compare the results from a cohort of students who have been part of some form of change or intervention. If this is the case, it is important to survey these students at the same time each year, and to be aware of the patterns described above when interpreting the data.
Results can also vary between different cohorts of students; for example, the achievement of Year 9s in one year could be higher than in previous years. Particular cohorts can also show different patterns in their views about school or engagement at school. If data is collected over a number of years, you will start to build a picture of the sort of variation that can be expected at your school.
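For review teams who work with exported survey results in a spreadsheet or script, the like-with-like rule can be sketched in code. The example below is a minimal illustration only, not part of the W@S tools: the record format, field names, and scores are all hypothetical, and a real dataset would need its own preparation.

```python
# Minimal sketch of a like-with-like comparison. All field names and data
# below are hypothetical examples, not the W@S data format.

def mean_score(responses, year_level, term):
    """Average response score for one year level, collected in one term."""
    scores = [r["score"] for r in responses
              if r["year_level"] == year_level and r["term"] == term]
    return sum(scores) / len(scores) if scores else None

# Hypothetical baseline and follow-up data: compare Year 9, Term 2 in both.
baseline = [
    {"year_level": 9, "term": 2, "score": 3.1},
    {"year_level": 9, "term": 2, "score": 3.4},
    {"year_level": 10, "term": 2, "score": 2.9},  # excluded: different year level
]
follow_up = [
    {"year_level": 9, "term": 2, "score": 3.6},
    {"year_level": 9, "term": 2, "score": 3.8},
    {"year_level": 9, "term": 4, "score": 3.0},   # excluded: different time of year
]

change = mean_score(follow_up, 9, 2) - mean_score(baseline, 9, 2)
print(f"Change in mean score: {change:.2f}")  # prints: Change in mean score: 0.45
```

The point of the filter is that the Year 10 baseline response and the Term 4 follow-up response are left out, so the comparison is between the same year group at the same time of year.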
Common longer-term “intervention effects” and data patterns
When comparing data over time, it is important to consider one common “intervention effect”. When a school starts an intensive focus on addressing behaviours such as bullying, this tends to raise awareness of these behaviours, which can result in an increase in reports of aggressive behaviour.
Thus, after one year, school data can show a reported increase rather than a decrease in these behaviours. If this pattern occurs, discussions with students, staff, and the Board of Trustees can help you interpret your data. Because seeing that initiatives are making a positive difference is motivating for staff, it is important to take a longer-term view of change and to let the school community know in advance that an initial increase in reports may occur.
Another common pattern is that an initial burst of activity in the first year or two is followed by a plateau, and then a return to previous practices. Research suggests this is common in schools working to create a safer climate, as over time staffing changes and other priorities take over. Therefore it is important to see self-review as a continuous cycle rather than a one-off change, and to plan for the longer term.
Sharing results with the school community
Change in schools is more successfully managed when the wider school community is fully informed and on board. Therefore, it is important to share W@S results (or a summary of results) with staff, parents, and whānau. You might want to use the School at a Glance reports for this purpose.
Make sure you inform community members about expected data patterns. Holding a staff and community workshop to share and interpret results is one way of creating space to reflect on changes, successes, and barriers to change. This sharing could assist your review team to develop new goals for the action plan.
What next? Starting a new review cycle
Comparing W@S data over time can assist you to explore the extent to which your school is meeting the goals in your action plan. Some other forms of data that could also be used for this purpose are discussed in Using the W@S tools to collect data (info for Step 2).
Once you have reviewed your data and collected feedback from the wider school community, the next step is to revise your action plan, adding new goals for any areas of need that have emerged, and start a new cycle of change. In this way you will keep up the momentum to improve the climate of your school.