Curtin University students will soon be offered a new way to tackle their mental health problems: a therapeutic chatbot called Manage Your Life Online, or MYLO for short. It was developed by the Led By Experience Mental Health Research Group at the Curtin enAble Institute.
Curtin professor of mental health Warren Mansell said MYLO uses conversation to help people explore the issues underlying the challenges they face. It is set to be released to students next year.

Professor Mansell said: “The advantage of MYLO is that it doesn’t try to be a companion, but it models what it’s like to be listened to and to be curious. This is the most powerful ingredient of therapy.”
Testing of MYLO bore this out, with participants reporting they benefited from being listened to rather than being given advice, Professor Mansell said.
According to a 2024 report from the Massachusetts Institute of Technology, a significant number of users are seeking mental health support from AI to alleviate feelings of loneliness.
A 2023 report by Ending Loneliness Together found 15 per cent of Australians often feel lonely, with one in six reporting severe loneliness. Among the loneliest group, those aged 18 to 24, the rate has skyrocketed from 22 per cent to 40 per cent, according to the organisation's 2025 findings.

Professor Mansell said: “The first thing to unpick is to understand loneliness. Very often, loneliness doesn’t happen in a vacuum. It happens because there’s a reason behind why that person feels lonely or becomes socially isolated.”
Behind loneliness there can be fears, anxieties and unresolved relationship issues, often rooted in early relationships, as well as individual differences, Professor Mansell explained.
He said: “From a psychological perspective, we want people to explore and face those anxieties, concerns and relationship issues that are behind the loneliness.”
MIT’s 2024 report indicated that AI companions such as Character.ai, Snapchat’s My AI, Anima and Pi are popular applications people use to tackle loneliness, among other motivations.

While these services offer social interaction and emotional support, their use carries risks, such as over-reliance, withdrawal from human socialisation and delays in seeking professional help.
eSafety, the government agency responsible for regulating online safety, aims to educate the public about the potential risks of AI use.
In a speech delivered at the National Press Club in June 2025, eSafety Commissioner Julie Inman Grant said emotional attachment to AI companions is hardwired into their design. By displaying human-like qualities and generating human-like responses, they provide constant affirmation and a feeling of deep connection.
An eSafety spokesperson said the agency has continued to monitor the rising popularity of AI chatbots and companions and has taken a number of steps to educate users about the risks, including hosting webinars, issuing an online safety advisory and updating its eSafety Guide.
Professor Mansell said: “The purpose of chatbots for mental health is for people to have an open space to explore and go into depth in ways they wouldn’t have done before.”
He said creating a conversation in which people feel valued and heard takes the burden off those trying to solve their problems alone.