Self-assessment report: Why write one?
Date: 18th Sep 2023 | Author: Steven Tucker | Categories: Leadership, Teaching

I have read hundreds of self-assessment reports (SARs) through peer reviews, support and challenge visits, and when leading inspections. Most have taken a huge amount of effort and time to produce, but I have often been left wondering: were they worth the investment of time? And did they have a positive impact for learners?

So, since this is the time of year when top-level SARs are written, it seems a good moment to reflect on how to get the most from self-assessment. Let’s take apart the phrase word by word…

Self

Self: a word associated first and foremost with people. That makes sense - a SAR is assessing people’s work. But who is, or are, the ‘self’ in self-assessment? The author? The senior leadership team? The staff? I have yet to come across a leader who does not want their top-level SAR to represent the whole college, in which case ‘self’ must refer to everyone who works there or has a stake in the college. If the SAR is to represent the staff and stakeholders - the collective ‘self’ - it cannot be the work or perspective of the senior leadership team or a single author alone. If it is, ‘buy-in’ from staff and their faith in the legitimacy of the judgements will be limited. This is a particular issue if the SAR is graded (grading: a topic for discussion in itself) since, unfortunately, many readers do not read far beyond the grades. Nor can it be based solely on the exam results from the summer. These are one indicator of quality but must never be the only indicator.

So maybe a SAR needs many authors to represent the organisation faithfully, or at least an author who is acutely tuned into the nuances within and across the whole college. There is no simple formula for how to produce a SAR. As with so many processes, the self-assessment process reflects the culture and values of the college. A college with an inclusive culture will probably build the SAR from many locally authored course or departmental SARs. Not an easy job, and one that risks inconsistency and places a high level of trust in curriculum leaders. Conversely, a college with significant problems across a range of curriculum areas may pay less attention to course-level evaluation as leaders prioritise systemic change. Whatever approach is taken, a SAR has little value if staff and stakeholders do not recognise their college in its pages.

Assessment

Many of you reading this are, or have been, teachers. You know that to assess, you need evidence, measurable criteria, evaluation schedules, etc. You also know that assessment once a year is of little use - you need ongoing formative assessment to check progress and adjust curriculum and teaching strategies. And so it is with an assessment of quality. A SAR is an assessment that is fixed in time. Validation and publication processes rarely allow for it to be a dynamic, evolving document. This is why so much emphasis should be put on a QIP (or QIMP, as I increasingly think of it - more on this later). It is the QI(M)P that does the heavy lifting in self-assessment; it is the vehicle through which progress is measured and reviewed. It is the QI(M)P that stimulates and guides discussions all year, leads to forensic examination of the learners’ experience, and repeatedly raises the question – ‘Are we making improvements, and are we maintaining (there’s the ‘M’ in QIMP[1]) the good stuff?’

The QI(M)P is part of the quality cycle, which hopefully involves large numbers of people. A strong QI(M)P is based on verifiable evidence and records changes to quality as a year progresses. It is not a ‘one-off’. It is not tied to an academic year. It never ends. As items are completed, they are archived. As new items emerge, they are added. It is the journal of the college, year after year, as opposed to its annual memoir; it is dynamic and of-the-moment rather than frozen in time. The QI(M)P can be part of the engine that drives the college, while a SAR is the review of that engine’s performance. So, the best SARs I have seen are ones that emerge from a well-managed, inclusive QI(M)P. There shouldn’t be unexpected revelations in a SAR that dramatically change a QI(M)P. If the QI(M)P has been effective, it should already contain anything included in the SAR. Let the SAR fall out of the QI(M)P, rather than the other way round.

Report

A well-written report informs, evaluates, illuminates, explains, reflects, celebrates, and criticises. It is not over-burdened with description. It is written with a specific audience in mind. The most effective SARs are written for the staff and stakeholders, the people who can make most use of the judgements they contain. A SAR should not hold long descriptions of evidence or extensive examples; these simply make the report too long and indigestible. A general rule of thumb: the longer a SAR, the fewer people read it. A SAR should be factual and evidence-based, guiding a reader to a clear understanding through careful use of words. A good test, of course, is to find out how many staff and stakeholders actually read the self-assessment report. And, of these, how many recognise their experiences in it.

There is one audience that should not be in your mind when writing a SAR: Ofsted. If you write it for Ofsted, you are not writing a self-assessment report. You’re probably writing a version of the truth that aligns to a grade you want to achieve in inspection. A SAR is very useful for a lead inspector and the leadership and management inspector to learn about your college before an inspection starts. Other inspectors may read parts of it, but they may not. After all, the lead inspector synthesises their pre-inspection research for their team in a pre-inspection briefing (you get a copy of this). Most importantly, the inspection is not an exercise to validate your SAR. Once the inspection is up and running, inspectors make judgements based on the evidence that they collect. It would be a poor lead inspector who was influenced, when deciding the inspection outcomes, by the judgements and grades you put in your SAR. 

So, keep your eye on the main purpose of the SAR and don’t try to second-guess how Ofsted are going to use it. And always remember, as an inspection progresses, someone on the inspection team may look at the SAR to see if it reflects the inspection findings. If not, expect some awkward questions about whether leaders really know their college well enough. 

Finally

All quality processes and activities should focus on securing the best possible experience for learners. That may sound a bit obvious, but too often time-consuming and expensive processes such as self-assessment (observations are another) are weakened because they lose that focus. I am always heartened by a principal telling me that they do something because it is good for their learners ‘and we know it works because…’. Conversely, one of the more depressing phrases is ‘We’ve got to do XXXX, inspectors expect it’. So, as you head deeper into SAR season – keep the ‘self’ of self-assessment in your mind.

[1] To elaborate: some aspects of provision need actions in the QIMP in order to maintain high quality. They may not need to improve significantly, but leaders cannot take for granted that strong provision stays strong without some care and attention. For example, a strong subject area may face a high turnover of staff, a new syllabus may require new curriculum and staff development, or staff may simply need new challenges to maintain their enthusiasm. These areas can be overlooked as leaders focus on the weaker areas identified in a SAR. I have often found that SARs do not identify quickly enough when quality in a curriculum area has not been maintained. As a result, the decline in quality goes unchecked, making the eventual turnaround more arduous. At worst, it is only when inspectors visit that the true extent of the decline is laid bare.

Steven was a full-time further education inspector in Ofsted for 10 years. During his time at Ofsted, he held national lead posts for curriculum and 16 to 19 programmes and led the training of HMI and Ofsted inspectors. He is now a consultant in the FE sector and can be reached on LinkedIn.
