If technology should be integrated, why do we need to talk about it at all?
Recently, I was having a conversation with a colleague, herself a very experienced and knowledgeable technology educator and leader, about technology professional development for our staff. We were mainly discussing how we could help our staff engage more with the vision and purpose underlying our approach to technology integration, but also how we could more deliberately train them in the use of some of the core tools we have in place. My colleague made the point that we tell our teachers that technology should be integrated into learning, yet we train and talk to them about technology in a stand-alone way. Yes, she argued, it is always discussed in the context of learning and practice, but we nevertheless hold conversations and sessions about technology, which somewhat defeats the purpose. If technology is supposed to be truly integrated, then shouldn't we simply talk about "optimal learning", of which technology is a part?
I both agreed and disagreed with her: I absolutely, 100% agree that this is the direction we should be heading, but I disagree that we are there yet. For now, I think we do have to talk about technology and technology integration separately, so that we can reach a point where it is embedded so deeply and effectively in practice that we can shift the professional dialogue to one of "optimal learning".
In classroom teaching we integrate lots of different tools and approaches into learning all the time: sometimes we do it consciously, such as when a tool or approach is relatively new, and sometimes it is so embedded in general teaching and learning that we don't give it a second thought. I hope that one day the phrase "technology integration" will be as redundant as "pencil integration", or "language integration", or "questioning integration". Eventually technology will become just another tool that we use to enhance or facilitate learning, and technology integration will just be an aspect of effective teaching.
However, before we can get to that point, there are a couple of phases I believe educators and schools need to go through. We are already deep in the first phase - and hopefully heading out of it soon! This phase has been all about access to technology: varying technology resources have been introduced, and by now in many places they are even embedded, and access continues to increase, but there is generally little agreement about how and why that technology should be used.
Since the early 1990s, when technology became a part of schooling, it has traditionally been seen and taught as a discrete subject. I personally have vivid, if uninteresting, memories of "keyboard skills", in which our hands were covered by a board and we typed letters in unison as the teacher called them out, and of "computing" class, in which I learned something about bits, bytes and how to use a now-obsolete word processing program. To put that in context, I am 34 years old and went through my high-school education in the late 1990s. According to the OECD, in the US in 2015 the average age of public school teachers was 42, with 31% being over 50 and only 15% being under 30. There are rare international examples where this is not the case, such as the UK, but in general the proportion of teachers over 50 is at least twice as high as the proportion under 30. All this means that the vast majority of teachers internationally will have had similar or even less exposure to technology in their education than I did, and these personal educational experiences shape, to a degree, how we think about and implement educational approaches in the classroom today.
Many schools continue to teach and resource technology in a stand-alone, disciplinary way, with access to technology depending on bookable "IT labs", classroom sets of devices or IT teachers who work on a set timetable. This approach unfortunately forces teachers and students to use technology in a way that is discrete from general classroom learning: you have to do technology, during technology time, in the technology place. Technology is undeniably expensive, and embedding it into every classroom is a privilege not every school can afford, but right now in many places huge sums of money are being invested in resources and infrastructure without a clear, communicated understanding of how that technology can genuinely benefit learning.
Much of the recent research on technology shows that even (or especially?) in these technology-rich environments little is changing in terms of student outcomes and experiences, and many teachers continue to feel overwhelmed, frightened and dubious about the role of technology in education. If we are going to move from where we are right now to a place where that investment genuinely enhances how we teach and how our students learn, in a sustainable and cohesive way, we need to embrace the second phase, in which we develop the professional conversation around how that technology is used, and for what.
In many schools, countries and educational approaches that conversation is beginning. It is even being had at the level of teacher education, for example at the Stanford Teacher Education Program under Professor Peter Williamson, and at the programme level, for example in the International Baccalaureate. To shift how education views technology, and to get to the place where technology is consistently and effectively integrated into wider learning without being seen as a separate element, we as teachers and leaders need to continue to have real, pedagogical, professional conversations and training about the approaches to, purpose of and implementation of technology integration.
With these conversations and this deliberate effort to develop a shared vision and understanding of technology integration, when we reach phase three and technology is simply a part of learning, we can be confident that it will be a meaningful, effective and truly beneficial part of learning.