A photograph shows a group of children in a classroom in front of a blackboard covered in equations. The five children are using laptops and sitting around a central table. The back of a teacher can be seen in the foreground of the image, supervising the class.

Is AI the future of learning?

Adam Choong discusses the role artificial intelligence may play in educating children and young people with disability. 

This piece draws from an interview with PhD candidate and 2024 Westpac Future Leader, Brynlea Gibson. Read the full transcript here.

Artificial intelligence (AI) is more than your favourite study buddy ChatGPT. Yes, generative chatbots like OpenAI’s flagship have taken the world by storm and changed the way the average person learns and works today. But the AI revolution – particularly in the area of education, and specifically for disabled young people – goes far beyond this.

Brynlea Gibson, a PhD candidate in psychology at the University of Queensland, says ideas such as “artificially intelligent” tutors are already being implemented in university classes to deepen students’ understanding of course content – and could be used to meet the specialised learning needs of intellectually disabled children. This raises the question: what’s the difference between learning from a human versus an AI, really?

Most of us felt the disruption of remote work or study during the pandemic, and disabled children in particular fell through the cracks of our education system, falling behind on knowledge and skill benchmarks. AI has begun to address this through tools such as Blackboard Ally, software that assesses how accessible educational content is using a variety of metrics, including those related to disability, like colour blindness.

In principle, AI also has the capacity to detect “hidden” trends and resolve issues, such as finding the best teaching approaches for individual students. This matters when educating children with intellectual disability, as machine learning (ML) models can refine their teaching methods over time and tailor them to students’ specialised learning needs.

All that said, we can’t ignore the downsides of introducing these futuristic technologies to students. According to Brynlea, these include the risk of over-relying on AI tools – because they are cheap and readily accessible – to the extent that vital human connections between teachers and students are lost. This is especially harmful for disabled children’s development, as forming interpersonal relationships, which often begins in the classroom, is essential to shaping how they interact with society and advocate for themselves.

However, if a balance between human and AI-supported teaching can be struck, guided by the values that foster an effective learning and social environment, AI can increase disabled students’ capacity to engage in intellectually stimulating activities.

For this reason, teachers should embrace AI to enhance disabled children’s educational experiences. Brynlea points out that the landscape of learning is constantly evolving, and so too must teaching practices. From a disability perspective, this means teachers need to find novel and innovative ways to integrate AI in an empowering manner.

A key way to enrich disabled students’ learning experiences is to ensure that teachers supervise how advanced technology is incorporated into the education system, so that students still learn to think critically and use AI as a learning aid rather than a replacement for their own thinking. As disabled students learn to use AI responsibly with appropriate guidance, Brynlea notes, their access to deeper learning experiences increases and they gain more agency over their education.

Today, families of children with speech impairments are already taking matters into their own hands by using AI, with guidance from therapists, to provide real-time feedback on fluency, pronunciation, and other language skills.

But what about the wider ethical implications of embracing AI? Here, there are two main perspectives to consider: those of students with disability and of educators.

For disabled students, one big concern is privacy. Because sensitive data about their educational experiences is required to improve AI services, there is an understandable fear of this data being misused. The Australian government has implemented tight, enforceable laws regulating how data is handled, but whether these laws are sufficient safeguards against unethical data practices remains a genuine source of debate.

For educators and scholars like Brynlea, the main ethical concerns relate to equity. Some students may be more proficient than others at using AI to their benefit, widening the gap in access to educational opportunities. To help prevent this, Brynlea believes AI developers should focus on creating accessible, user-friendly AI services.

I’ll leave you with this: AI has the power to positively shape the lives of disabled people, especially in education, and should be embraced by disabled students and their educators. If implemented responsibly and creatively, this profound technology will help foster inclusivity and help students take ownership of their learning journey.

This article is an extract from CYDA’s The Platform Newsletter. Receive monthly updates by subscribing below.

A photo of a young man with short dark hair and black-rimmed glasses. He is standing against a beige coloured wall and wearing a black jacket featuring the Monash University logo.

About the author:

Adam (he/him) is a person of culturally diverse heritage who uses his lived experience with disability to amplify the voices of disabled CALD communities in his advocacy work. Outside of his advocacy work, Adam is a data science student at Monash University who is actively researching the intersection of artificial intelligence and disability. He is also a 2024 Westpac Asian Exchange Scholar and will travel to the National University of Singapore to study its AI practices.

The Platform Newsletter banner on a green, orange and blue background.

The Platform is our newsletter for young people with disability, featuring interviews, opportunities and news on the issues that matter to you!
