General News
30 January, 2026
WRAD Health cautions on use of AI
WRAD Health is urging south-west residents to take care in how they use artificial intelligence to access health advice.
New Australian research published in the Medical Journal of Australia (MJA) shows a growing number of people are using artificial intelligence tools such as ChatGPT to seek health information, highlighting both emerging opportunities and clear risks for the healthcare system.
The nationally representative study found that 9.9 per cent of Australian adults used ChatGPT for health-related questions in the previous six months, with many seeking information that involved interpretation of symptoms, conditions or next steps.
Worldwide, the number of people using ChatGPT for health-related questions is estimated at around 40 million per day.
WRAD Health CEO Mark Powell said the research reflects what frontline services are already seeing.
“People are using Google and AI tools to make sense of what’s happening for them or someone they care about,” Mr Powell said.
“Used well, AI can help people reflect and make the idea of reaching out feel less daunting, but it should never replace professional care or human judgement.
“We strongly support the message from researchers that AI use in healthcare must be cautious, transparent and ethical,” Mr Powell said.
“When used responsibly, it can complement, not compromise, access to high-quality care.”
The MJA authors noted that while AI tools may improve access to information and support health literacy, there are significant risks if AI-generated information is misunderstood, over-trusted, or used in place of professional care, particularly for complex or high-risk health issues.
Both Australian and international research shows that AI tools can produce inaccurate or incomplete medical information, may respond confidently to incorrect assumptions, are not regulated as medical devices, cannot assess risk, context or safeguarding needs, and can increase the risk of delayed care if relied on in isolation.
However, Mr Powell said WRAD Health believes the rise of AI use also presents an opportunity for health services to improve access and engagement, provided AI is positioned as a bridge into care, not a substitute for it.
“As long as AI is used responsibly, it could help people recognise when to seek help, reduce barriers for families and carers, and support earlier, safer engagement.
“AI can help people ask better questions and feel less alone,” Mr Powell said, “but healing happens in relationship. Our role is to be the safe, human next step once the thinking has started.”
“We need to be clear: AI does not diagnose, treat or hold risk. Safe, ethical use means guiding people towards qualified services earlier, not encouraging self-management in isolation.”