Here are the links to the other parts: Part 1 and Part 2.
So, this will be the final post in reviewing the academic year. I had wanted to include more stuff, but finding the time is so difficult at the moment! In this post, I’ll look at the development programme as a whole, going into detail about each term’s feedback from teachers and sharing my reflections on the year.
End of term evaluation questionnaire
So, at the end of every term, teachers were asked to complete an evaluation questionnaire which aimed to collect data on their thoughts on the development programme, management, and many other things. In this post, I’ll only focus on the development programme questions, but if you’d like to take a look at the questionnaire, you can find a digital copy for Term 2 here. The questions for the development programme stayed the same all the way through the year, although other sections had questions changed, removed, and added.
Question 1: How relevant did you find the workshops from Term X?
Term 1 average | Term 2 average | Term 3 average | 2021/22 average |
---|---|---|---|
4.75 | 4.75 | 4.75 | 4.75 |
I was really happy to see that the workshops were perceived by teachers as relevant. As mentioned previously, this was the first year that we aimed to meet as many bottom-up needs as possible.
Question 2: How happy are you with the level of support provided regarding workshops, observations and coaching?
Term 1 average | Term 2 average | Term 3 average | 2021/22 average |
---|---|---|---|
4.75 | 5 | 4.75 | 4.83 |
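(In case you’re wondering how the final column is calculated: it’s simply the unweighted mean of the three term averages, so for this question (4.75 + 5 + 4.75) / 3 ≈ 4.83. The same calculation applies to the other tables in this post.)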
Providing support to teachers throughout the year was a major concern for me. Why? Well, mainly because you’re always learning while teaching; no matter how many classes someone has taught, something always comes up, issues present themselves, etc. I wanted to ensure that there was a support network for teachers – something that they could choose to use should they need it.
Question 3: How effective do you believe the term’s development programme has been at helping you develop as a teacher?
Term 1 average | Term 2 average | Term 3 average | 2021/22 average |
---|---|---|---|
4.5 | 4.25 | 4.5 | 4.42 |
Overall, teachers felt that the development programme as a whole helped them develop as teachers. Impact can be hard to measure, but I do feel that teachers’ impressions are a good place to start. This questionnaire no doubt suffered from some kind of response bias, but as it was anonymous, I feel teachers were fairly open, and so I am happy with this score.
Question 4: How can the development programme be improved?
To my surprise, there were not many comments regarding how the programme could/should be improved. Three comments did come up, all of which I thought were interesting and worth commenting on here:
- Observing classes together: Throughout the year, we did a few sessions in which we observed a number of classes (from online videos). Teachers mentioned that they really liked these and wanted more. I’m really happy about this because I felt that these sessions were very ‘catalytic’, and I look forward to including more of them.
- No homework: In Term 1, I set some homework for teachers to do. One of the teachers felt that while the session was worthwhile, the homework should have been avoided as their time to devote to work and development was limited (and they did their own development). This I completely understand, and for this reason from Term 2 onwards I made the ‘homework’ optional. This being said, I really like the idea of enquiry-based learning, and that means teachers going away and trying out ideas in their classes, collecting some data and then bringing it back to the workshops or coaching sessions. This is something I’ll have to consider as I plan the ‘projects’ for this academic year (based on teachers’ and learners’ needs and wants, of course!).
- More experiential sessions: In Term 3, teachers were put into the shoes of learners quite a number of times. They mentioned that these sessions were useful as they got to feel what it was like to be a learner, and also to see ‘the theory in action’. Again, I was really happy to see this reaction, so I can’t wait to include more of these this academic year.
Evaluation questions
I’ve put together a list of questions based on Guskey (2002) and Kennedy (2005) – they are by no means all the questions that could be asked, but I think they cover a lot of ground and push me to really think about the success of the programme.
Were teachers happy/involved/engaged/motivated?
I think they were. The feedback certainly indicated as much, and as a manager I received no real complaints about the development programme. This being said, I was the one leading much of the development, so perhaps teachers felt that providing negative feedback would negatively influence their time with me; I doubt this, but it is something to consider.
What teacher knowledge was affected? What evidence is there of this?
I really like this question, but I also find it a little difficult to answer. I’m going to use Richards’ (1998) classification of content domains.
- Theories of teaching: We did cover quite a lot on TBLT and looked at some theories of learning in a reactive manner; that is, teachers had questions and I answered them as best I could. Teachers were also asked to make explicit their personal teaching theories and principles, and connect these to their practice. Whilst this is not necessarily the creation of ‘new’ knowledge, I feel that having teachers make their theories explicit is important as it helps mediate the rest of the development programme, workshops, etc.
- Teaching skills: Judging from the observation feedback and reflection forms, and from recalling the conversations between teachers regarding peer observations, I feel that teaching skills were certainly developed. I also think the coaching sessions had an indirect impact on teaching skills – mainly because most of the teachers used the coaching sessions to focus on their termly development goal, which was usually focused on teaching skills (e.g., classroom management and assertiveness with teens). As Richards writes, though, there is a difference between knowing the skill and the higher levels of thinking that teachers develop. I don’t think the programme taught any specific new skill that teachers didn’t already have, but I do think that they became more aware of how to navigate certain ‘events’ in classrooms because of the development programme. For example, the workshop that focused on developing our learners’ listening skills presented a number of ideas (that were not ‘new’), but mainly focused on the way in which they can/should be implemented based on research. I can distinctly remember teachers saying ‘ah, this makes sense now’ after the session – I know that this is a very informal piece of data, but it is something of value.
- Communication skills: Hmm, I don’t think this was applicable to all teachers, as none of the workshops focused on it directly, and only one teacher focused on ‘assertiveness and communication with teens’ in their coaching sessions and termly development goals. This being said, the teacher who did focus on that did improve, noting more on-task behaviour from learners and reduced stress. At the start of the year, we focused on peer observation ground rules, and I think that set teachers up to work effectively together, especially when providing feedback. So, whilst the peer observation ground rules session may not have affected their communication skills with learners, I feel that the workshop had an effect on their communication skills with their peers.
- Subject-matter knowledge: Many of the workshops certainly aimed to develop this (e.g., motivation and motivational strategies, or the listening skills workshop). I don’t feel that many teachers chose to work on these areas specifically, but I can’t help but think that my emphasis on TBLT and evidence-based teaching may have stifled many teachers’ normal ‘urges’ to speak about and teach grammar explicitly. I do remember that many teachers noted that they were not confident in many subject-matter areas such as phonology, BUT they also noted that it wasn’t that interesting for them and so they would prefer not to focus on it. Perhaps I should have pushed a little harder here.
- Pedagogical reasoning skills and decision making: Difficult to assess. In some of our post-observation oral feedback sessions, I had teachers run me through their thoughts as we watched the video of the class. Many of their comments focused on reflection-in-action and their thought processes after coming up against something unexpected. As we were looking at how they were dealing with events in the moment, and we did see improvement in these areas, I feel that pedagogical reasoning skills and decision making were affected. We also had a workshop in which we looked at motivation and then teachers observed a lesson and identified why it wasn’t successful due to the decisions the teacher made during the class – perhaps this affected this content domain also.
- Contextual knowledge: OK, so induction week was the big one for contextual knowledge. Not only did the sessions raise awareness of the academy’s goals and objectives, they also looked at learners’ needs and wants (in general) and the Spanish teaching context.
How transmission/transformative were the workshops?
Interesting question, and here we have to go to Kennedy’s (2005) spectrum of CPD models (see below).
[Image: Kennedy’s (2005) spectrum of CPD models]
Whilst little action research was conducted (there were a few mini action-research pieces of homework), I do feel that the majority of the development tools were either transitional (e.g., coaching) or transformative (e.g., workshops that were based on teachers’ needs and wants, and that focused on dialogic training). Cascade training did take place, although in a very minor fashion – we all attended an online conference (not sure if I mentioned this yet), and then teachers were tasked with writing a short paragraph about their takeaways. These were then presented in plenary. Overall, then, if we accept that ‘transformative’ development is better for the context in which I work/train, I feel the programme did well, although more teacher-led action research would certainly strengthen it (and would certainly lead to greater changes in teachers’ practice).
Did teachers’ teaching change in any way? How?
This is a definite yes, and I’ll put down a few clear examples from three of the teachers:
- Teacher A had his perspectives changed on explicit grammar teaching, and started to adopt a more TBLT/Dogme style of teaching. Funnily enough, he got back in contact with me a few weeks ago with the following message. He spent a lot of time talking about his opinions and theories surrounding explicit grammar teaching and asked for a lot of extra things to read and talk about. He was very ‘involved’ in his development, and took advantage of the workshops.
[Image: message from Teacher A]
- Teacher B had his perspectives changed on the need to be assertive with teen classes. After a number of very bad classes, Teacher B spoke to me in one of our coaching sessions. We spoke about the idea of prevention being better than the cure, what work he had put in, and his ‘teacher-ness’ in the class. We did a number of observations that were filmed, and we watched back certain critical incidents together. We worked out a number of strategies to use, and then we team taught the class together for a number of weeks (he would lead one class, and I or my colleague Patrick would lead the other). At the end of Term 2, the teacher felt more comfortable, and by the end of the year had developed a much clearer understanding of why his communication skills with teens needed work, and had made the necessary improvements.
- Teacher C had identified that one of her teen classes was ‘dull’ – learners were not engaged and seemed ‘demotivated’. We looked at the profiles of the teens, how they interacted, and some things that she could try. I showed her a number of resource books, such as Dudley’s (2018) ETpedia – 500 ideas for teaching English to teenagers, which she took home and made plenty of notes from. She actively sought out engaging activities for her class, and while she didn’t notice a huge change in their behaviour immediately, progress was made. Furthermore, she built up a repertoire of engaging activities to use in teen classes.
How did the programme support both individual and collective development?
We tried to support individual development through coaching, observations, peer observations (teachers chose the focus, class or teacher), and through elective workshops. We supported collective development through workshops and incentives to share ideas (e.g., we had an ideas board that teachers would write ideas on; at the end of the month, we would choose the best idea and the teacher would get a prize – usually beer or biscuits). I feel that the programme provided a good mix of both, although perhaps more information could have been gathered to inform Term 1’s bottom-up collective needs – I didn’t do snapshot observations before planning the programme, and so this is something I will be doing in the 22/23 academic year.
Were both top-down and bottom-up needs identified and acted on? In what quantity?
Yes. As mentioned, the goal for this year was to meet both top-down and bottom-up needs. I would say that top-down needs were focused on about 20% of the time, whilst bottom-up needs were focused on 80% of the time. This being said, one of the negatives of the programme was that it lacked input from learners. This is something we aim to rectify in the next academic year by using learner focus groups.
What opportunities were provided to teachers who wanted to take their development further?
So, I have a few things:
- Teachers who wanted to read a certain book were either provided with the book or pointed in the right direction.
- We had one teacher take on the CELTA, and we provided 100 euros in funding towards this.
- We had one teacher who wanted to plan and deliver their own workshop. They were given this opportunity and support along the way.
Was there any noticeable impact on learners and learning? What evidence is there for this?
This is the area that needs the most work – at least in terms of collecting data for the evaluation of the programme. We did collect some data, but I don’t think it was that useful. For example, we track all exam marks, and we did note improvements; however, we have only been tracking exam marks with this system for about a year, so it is hard to tell whether the training and development programme (especially the exam moderation workshops) had any effect on exam results. Also, we asked parents and learners for feedback, but the only feedback we got was in the form of positive comments, which we were happy with, but they didn’t really tell us whether learners were affected by the programme. My gut tells me that the programme, at a minimum, ensured that the quality of classes was kept as high as it could be, and so learners were kept happy. I also feel that it pushed teachers to try out more learner-centred activities, and as such learners may have been more engaged. This being said, this is an area we need to improve in, and so I think learner focus groups may be something that we can do at the end of the year too.
Was there any noticeable impact on parents’ and stakeholders’ perceptions? What evidence is there for this?
Again, there is no real evidence that perceptions changed, although we have changed our website and now advertise the development our teachers are involved in. We have had many leads from the new website, but I’m not sure whether this is because of the development programme itself.
22/23 Academic year
So, what about the 22/23 academic year? What changes need to be made? Let’s note them here:
- We’ll be implementing an observation menu; that is, teachers will be able to choose what kind of observation they get to do (e.g., blind, focused, three-way, etc.).
- We’ll be implementing a reflection menu – after observations, teachers will get to choose their way of reflecting (e.g., audio note, written, etc.).
- More experiential workshops that engage teachers as learners, ideally with as much loop input as possible.
- More classroom observation videos used in workshops.
- Actively advertising more development opportunities for teachers (e.g., running a workshop, getting funding for development).
- Carrying out learner focus groups to find out learner needs and wants.
- Carrying out snapshot observations to get a descriptive account of the teaching in the academy to inform Term 1’s development programme (ideally, leading into an enquiry-based term course).
Currently, the development programme overview for 22/23 is shaping up quite nicely, with induction week done, initial coaching sessions underway, and learner focus groups in the pipeline, so I’m pretty excited. At the end of this year, I’ll be able to look back and see if we actually made all these changes – accountability is important!
Final notes
This review of the year wasn’t as detailed as I had hoped, but I do hope that it gives you a good idea of the thoughts in my head, the things the programme involved, and some of the things that, at least from my perspective, were successful and not so successful. I would welcome any comments, feedback and suggestions from other teacher educators, trainers, managers, etc., so please feel free to comment or get in touch!
References
Guskey, T. R. (2002). Does it make a difference? Evaluating professional development. Educational, School, and Counseling Psychology Faculty Publications, 7.
Kennedy, A. (2005). Models of continuing professional development: A framework for analysis. Journal of In-service Education, 31(2), 235–250.
Richards, J. C. (1998). Beyond Training. Cambridge: Cambridge University Press.