Identifying teacher itches to scratch

Schools in Victoria have many standardised data sets: student learning, student attitudes to school, parent opinion surveys and, finally, staff opinion surveys. This post focuses on the last of these, staff opinion.

Industrial action in the three previous years had stopped this new survey from being conducted until 2014. The survey results cover teachers, education support staff and principal class officers. Following our evidence-based practice, we shared the data with staff, asking for feedback and, where possible, the story behind the data.

Two data sets, both about professional learning, did not sit comfortably with us: Coherence and Applicability.

We had been “talking” about staff development for several years, using older data sets like the two below:

[Slide03]

and

[Slide06]

We spent 18 months or more developing staff understanding of observation (e.g. non-judgemental feedback, staying low on the ladder of inference, agreed protocols including pre- and post-observation conversations) and developed three protocols: walkthroughs, learning walks and instructional rounds.

I admit there was a certain reluctance to just go for it (teachers observing each other in “learning walks”) for fear of wounding, not that we believed any wounds would be intentional.

So when the applicability data showed we were well below the state mean score, it was a shock, because the one recommended strategy for improving applicability was teacher observation. Lesson learned: we had moved too cautiously. That lesson has now been corrected, for we have facilitated teacher observation (revisiting the prior pd on observation so it is inclusive of new staff).

[Slide35]

The other data set that caused discomfort was coherence. When we unpacked this with staff, the issue wasn’t so much the alignment of professional learning with our strategic improvement work, but rather the lack of opportunities for teachers to pursue personal pd on their perceived needs.

[Slide38]


So once again a lesson learnt. Allowing time for individual teacher pd is now part of our overall pd plan. The question we initially posed was: how does one “perceive” that need, and how do we measure its effect?

Much of the literature on personal teacher pd suggests that teachers look at their student data and decide what gaps or instructional adjustments might need to be learned to improve their individual effect, the point being that pd should have some effect on student learning and/or engagement.

The teacher feedback shows that this point is still misunderstood by some; however, being a “hopeful” leader, I trust the point might be learned by going through the personal pd experience. Most, though not all, teachers have formed “supportive groups” (what might be called professional learning communities) and are investigating practices that interest them: student writing, use of iPads in the early years, and a differentiated instruction book club, to name a few.

We are all hopeful that the actions taken scratch the itch teachers identified, and that the 2015 data sets show an improvement.