Education, Personalisation and Surveillance

Education, Personalisation and Surveillance concludes my coursework at The University of Edinburgh. In some ways, the class was exemplary of the overall experience: an experienced and thoughtful professor, a highly engaged class that took the topic seriously, and a mix of assignments spanning both traditional academic writing and non-academic formats. The course involved a fair amount of preliminary reading, all of it required, some of which I’ve even referred to in my current professional role. Leading up to the intensives (two days of eight-hour lectures), I found the online group discussions on topics the professor chose quite helpful as a “warm-up” for the more critical writing we’d do live in the classroom.

While we covered many topics during the course, the one I delved deepest into was the role of technology in education, specifically the role of AI, with an emphasis on debating the benefits and risks of the so-called promise of “personalized learning experiences.” Surprisingly (and refreshingly), many of the arguments we heard, and certainly the required reading, were pleas against using AI in educational settings. I explored many ideas, but a few stuck with me and have become part of my own opinion on the role of AI in education, specifically for K-12.

  1. Focusing on under-resourced communities is not in itself a benevolent act
    Poor kids are easy to rally behind, but they can be unfairly characterized as instruments for change rather than recognized as people who genuinely deserve support under prioritarianism, the principle of distributive justice that gives precedence to improving the well-being of those who are worst off or most disadvantaged. It is easy to sympathize with underprivileged children in under-resourced schools; however, if the same kinds of experiments were proposed for resource-rich environments, such as expensive private or preparatory schools, the reaction would likely be quite different.
  2. Personalized learning experiences are wholly dependent on harvesting personal data
    Another theme is the rhetoric of individualization, often portrayed as meeting each student “where they are” by creating unique learning pathways. This narrative frequently positions students as consumers of personalized educational experiences tailored to individual needs. However, the promise of personalized learning is fueled by an acquisitive process of data collection, raising concerns about privacy and about what that collection means for students. Beyond the ethical breach inherent in collecting and mining children’s personal data, this harvesting of information exposes a fundamental paradox: the more “personalized” the learning becomes, the more children must surrender their privacy and autonomy to technological systems.
  3. Augmenting teachers’ roles results in their marginalization
    A common refrain, seemingly empathetic to teachers’ struggles, is to aid teachers by automating their responsibilities with technological assistants, thereby freeing them to do what humans do best: provide emotional support, motivate students, and build personal connections. The subversive irony is that while technology seemingly creates more opportunities for teachers to interact with students, it minimizes their role as educators, which is their primary function. This shift raises fundamental concerns about the devaluation of instructional expertise and educators’ judgment.