As I study for my comprehensive exams and finalize edits to the first four chapters of my dissertation, I have found myself perplexed by the evolution of educational technology, so I reached out to Audrey Watters, scholar and author at Hack Education. Since she blogged her initial response to me, I decided to continue the conversation here.
Our conversation began when I asked Audrey why the ed-tech sector seemed to have regressed in its thinking even as the technology itself improved. Early pioneers like Papert, Collins, Scardamalia, Bereiter, Seely-Brown, and Sandoval had designed systems of inquiry intended to create environments that encouraged students to engage in knowledge construction, collaboration, and creative problem solving. Though the technology would be viewed as “clunky” today, its capacity far surpasses that of many of the tools and apps that students currently use. As Audrey explained:
There’s a popular origin story about education technology: that it was first developed and adopted by progressive educators, those interested in “learning by doing” and committed to schools as democratic institutions. Then, something changed in the 1980s (or so): computers became commonplace, and ed-tech became commodified – built and sold by corporations, not by professors or by universities. Thus the responsibility for acquiring classroom technology and for determining how it would be used shifted from a handful of innovative educators (often buying hardware and software with their own money) to school administration; once computers were networked, the responsibility shifted to IT. The purpose of ed-tech shifted as well – from creative computing to keyboarding, from projects to “productivity.”
As the responsibility and vision of ed tech shifted from pioneering academics to the bureaucracy of education, it appears to have been adapted into the existing ecosystem of schools rather than changing that environment. Zhao and Frank (2003) describe this ecological perspective in their seminal work. They found that computers would be assimilated into the ecosystem of schools when they could be used to improve the efficiency or effectiveness of existing tasks – a phenomenon documented time and again by other scholars (Cuban, Kirkpatrick, & Peck, 2001; McLeod, 2015; Reich, Willett, & Murnane, 2012). I understand this conception, and have read the research attributing the lack of “technology integration” to everything from teacher beliefs and teacher comfort, to the pressures of standardized tests, to lack of access and professional development. However, what struck me most in Audrey’s response to my question of what went wrong came in her conclusion. The history of ed tech remains intrinsically linked to the broader history of education; and in that narrative, as historian Ellen Condliffe Lagemann wrote, “Edward L. Thorndike won and John Dewey lost.”
I used that same quote in the first chapter of my dissertation and argued that changing the educational system requires a deep understanding of the system itself. Organizational psychologist Kurt Lewin once wrote, “If you want to truly understand something, try to change it.” So in my quest to understand the history of the educational system, I started with Tyack and Cuban’s book, Tinkering Toward Utopia. They begin their history with the establishment of the Common School in the 1840s by Horace Mann. What began as loosely coupled one-room schools evolved into a decentralized, secular system designed to prepare workers for an industrial economy as well as to create a sense of national identity through the assimilation of immigrants into society. Though this conception ultimately created the notion of a unified public education system,
“[it] was not a seamless system of roughly similar common schools but instead a diverse and unequal set of institutions that reflected deeply embedded economic and social inequalities” (Tyack & Cuban, 1995, p. 22).
In 1916, John Dewey argued that advances in technology resulting from the Industrial Revolution added to the complexity of society, increasing the need for and value of formal education; and yet, he warned that education needed to prepare students not only for learning in school but also for learning in society. Unfortunately, as both Audrey and Lagemann explain, Thorndike and his behaviorist colleagues emerged as the “winners” of the day, asserting that teachers existed to execute the designs of administrators and that learning is accomplished when students produce a desired response to a given stimulus.
Herein lies the rub: as Audrey concludes in her response to my question, ed tech has essentially reinforced the behaviorist principles of the original founders. Most tools support drill and practice – occasionally in the guise of “formative assessment” – or classroom distribution of resources. As I think about it, most paid-for components of these tools also provide district- or school-level data dashboards so that administrators can ensure that standards are being addressed and content is being taught (not necessarily learned). Though we have some of the most powerful computational devices in the history of our world, few tools encourage educators to create networked learning environments like Scardamalia and Bereiter’s Knowledge Forum or Collins and Sandoval’s BGuILE. Instead, as Bruce Dixon from Modern Learners lamented after returning from ISTE, we largely have vendors hawking products that perpetuate the behaviorist principles espoused by Thorndike in 1910.
Researchers Collins and Halverson argue that digital technologies fundamentally challenge the tenets on which schools base their identities: a fixed amount of knowledge, teachers as the purveyors of that knowledge, age-based grading, curriculum sequencing, etc. Though computers – or at least computing devices – have existed within the context of education since the early 1900s (Audrey has a complete timeline available here), they have failed to have a systemic impact on classroom practice in most American schools because of that entrenched history – what Tyack and Cuban refer to as the “grammar of schooling.”
According to Krishnan (2009), history can be defined as the documented evolution of an individual, nation, or organization. The history of school has always linked education to economics and society. Whether considering the role of public education in acculturating individuals into an assimilated view of American society in response to the Industrial Revolution, or the 1983 A Nation at Risk report that linked education to economic success in an Information Age, different eras have shaped the culture of American schools. In a recent blog post, Larry Cuban argues that educational reform neither bounces between fads nor remains completely static. Depending on where you look, and when, schools do change – just not necessarily consistently across geographic areas or socioeconomic strata.
So now I have a new question: if, in this post-industrial era (whether you want to call it the Fourth Industrial Revolution or the Third Wave), technology will increasingly change economics and society at a previously inconceivable rate, then what will be required to fundamentally alter the culture of schools to achieve Dewey’s ideal: an educational system organized around essential questions and ideas, where teachers and researchers collaborate to create communities of inquiry?