Macdonald, J., & Poniatowska, B. (2011). Designing the professional development of staff for teaching online: an OU (UK) case study. Distance Education, 32(1), 119-134. doi:10.1080/01587919.2011.565481
This article caught my attention because it sits at the crossroads of several of my interests in thinking about online pedagogy: the workplace, blended learning, near-synchronous feedback, and cool, geeky new tools. The authors review a module taught through the UK’s Open University (OU). The module is aimed at online teachers, but at teachers as workplace learners; in this case the workplace happens to be the OU itself.
Drawing on this experience, we therefore set out to design a new online professional development module at the OU (UK), which would act as a guide and introduction to new ways of working with online tools for all staff throughout the university. It was important that this module should be designed in a way that it could be easily updated with changing technologies. We were aware of the need to sustain engagement by using measures such as an activity checklist and certification system, and to consider ways of encouraging peer learning through an online community. Finally, we wished to design this module using a practice-based approach, starting with the job. (Macdonald & Poniatowska, 2011)
The authors spend several pages developing the theoretical structure that informed their case study, which we will happily gloss over here. Their approach was to focus on the common intentions of teaching and supporting learners, shifting from there to strategies and finally to tools; an eminently practical approach, I think. This ordering allowed them to minimize the need for regular revision of the course: new tools could simply be classed by strategy and accommodated.
Learners selected either a self-study route or a cohort program. It sounds as though the cohort program was the easier of the two to manage, since interactive projects were precluded on the self-study route.
Use of the Elluminate tool was new and experimental, so the authors recruited tutors competent with the tool to enrich that part of the experience. Their results with this approach have encouraged them to explore online tutoring further. The authors briefly review some of the quantitative and qualitative data they collected on participants’ experience with the curriculum. They discuss the outcomes of the course as a whole and of each of the two tracks, cohort and self-study. Their conclusions, as with most scholarly projects, are constrained and suggest additional directions for subsequent research.
I particularly like their final observation: “In other words, what the learner actually learns cannot be predicted in advance.” I think this is brilliant. It shows the aleatory quality of learning. We throw a variety of learners and supporting props together and then watch intently to see what is learned. There is no accounting for motivation, curiosity and discovery. A gifted learner can skew a set of course outcomes significantly from those imagined by the teacher. Combine that with a cohort and the outcomes can be profoundly variable.
I liked this inquiry very much. It was not exactly what I was looking for in my thinking about teaching young adults about work at a service desk, but it is closer than many of the articles we have reviewed thus far. I find it rewarding that this article is about teaching teachers. There are a number of interesting facets to that: one is seeing that full-time teachers self-selected for cohort study whereas part-time teachers preferred individual study. It is intriguing, too, to see teachers receive tutoring, and rewarding to see them working to discover an online voice and online techniques for tutors. I suspect that, like teaching labs, tutoring online has its challenges. I also suspect that the learners themselves provide many clues on how to do it well.

I found it valuable that this course showed the collaboration between instructors and instructional designers, and that it brought tutors into that collaboration as well. I like that developing the course is an iterative process.

I also like, and simultaneously struggle with, the fact that it is not a graded course. “Finally, to support engagement, participants are encouraged to complete a choice of activities using an activity checklist that once completed generates an automated completion certificate.” This is something I am wrestling with as I consider the final assignment for the course. I am obviously suspicious of “schooling” and of “grades,” and so writing rubrics is a conflicted task for me. I understand their value in assessment and in connecting outcomes and course work. However, part of me wants to honor the discovery that is unpredictable when learners and tools are thrown together in an aleatory space. Another part of me is a boss: I am driven by finite resources and expected to show a return on investment, so I appreciate focused and measurable outcomes as evidence of learning. Yet I also understand that excellence in service, the kind that creates customer enthusiasm, is a result of motivation, curiosity, and discovery. I suspect I will need both/and in my assessment of learners in order to accomplish rote skills and interpretive skills.