‘Humanising the machines’ is not the answer (or a plan) for AI in higher education
Professor Kelly Matthews, University of Queensland
‘anthropomorphise’: to attribute human characteristics or behaviour to a god, animal, or object (or, in this case, to AI).
Is anyone else concerned about the tendency to humanise AI and its relationship with students?
Reading too many articles, blogs, and social media posts anthropomorphising new advances in AI (commonly called genAI) prompted me to write this piece. Texts that read:
- keep humans in the loop
- students will partner with AI
- the student-AI partnership
- AI will replace teachers
I wish only consultants and salespeople were invoking these notions, but some scholars in higher education are also framing AI in education in these ways.
‘dehumanisation’: to deprive (someone or something) of human qualities, personality, or dignity.
Read the work of Nick Haslam in psychology (whom I had the pleasure of meeting last year – folks in Melbourne can visit him) to really understand the complexity of dehumanisation and its consequences.
Suggesting that AI has human qualities implies that collaborating with AI is like collaborating with a person.
Any suggestion that the student-AI relationship is a partnership or collaboration is harmful. It risks dehumanising students, academics, and staff with specialist educational acumen while humanising AI. It implies a mechanistic relationship: fixed, unemotional, and technical.
No surprise, then, that I am deeply disturbed by anyone labelling the student-AI relationship a partnership. I am part of a larger group of scholars and practitioners re-asserting the centrality of human interactions and relationships in education, an approach that positions students as agentic contributors who design, implement, analyse, and research alongside academics/faculty/staff in higher education. We use partnership – students as partners – as a metaphor to challenge deficit framings of students.
Partnership embraces messy human interactions that demand an ethic of reciprocity and shared understanding gained through care, collaboration, conversation, debate, disagreement, and consensus over time. The student-AI relationship is not a partnership.
I am curious about the role and place of AI in education. I am worried about its implications for society. In my classes, I encourage students to use it critically, because it has benefits they can leverage in their learning.
Here is my point: AI is not human. The danger of AI-student partnership/collaboration discourse is a subtle dehumanisation whereby student learning relationships are reduced to mechanistic, one-sided, emotionless transactions that inadvertently centre the pursuit of efficiency and conformity.
AI can’t become human, but students can become more AI-like. That is the harm of ‘students as partners with AI’ or ‘human in the loop’ discourse: it implies that students (people) should take on AI-ness so that we can ‘stay in the AI loop’.
New AI advances clearly have educational applications that will shape student learning, behaviour, and beliefs about what a student should spend time doing and learning in higher education. Maybe students (everyone, really) feel ‘safer’ asking a chatbot or ‘agent’ the dumb questions. Uses of AI in therapy are being actively examined. But chatting with AI about learning or mental health does not make AI human, or even human-like.
Here is my call to action.
Language matters. Resist language that humanises AI. In humanising the AI-student relationship, we risk dehumanising students, teachers, and specialist staff in the field of higher education.
Professor Kelly Matthews, Institute of Teaching and Learning Innovation (ITaLI), University of Queensland