AI and Mental Health: Collaboration, Not Substitution
- Shane Warren
- Aug 8
AWARENESS STATEMENT
For Immediate Release
8 August 2025

Recent commentary from OpenAI CEO Sam Altman has reignited debate on the growing trend of using AI chatbots for therapeutic support. Surveys suggest that nearly half of people experiencing mental health challenges who also use AI are turning to large language models (LLMs) such as ChatGPT for help. These tools are attractive because they are always available, cost-free, and accessible, yet serious concerns remain about privacy, safety, and efficacy.
Altman himself has acknowledged the risks: unlike professional therapy, conversations with AI are not protected by confidentiality or privilege. Sensitive disclosures could, in certain circumstances, be made available to third parties. More critically, AI cannot yet replicate the empathy, attunement, and contextual understanding that are the essence of safe and effective therapeutic care.
At the Vocational Mental Health Practitioners Association of Australia (VMHPAA), we recognise that AI is here to stay and will increasingly be part of the mental health landscape. The question is not whether it should be used, but how it should be used, and by whom.
Striking the Balance
- AI can support, not replace. AI tools may complement mental health care by providing psychoeducation, basic coping strategies, and connection between sessions. But they cannot substitute for professional, person-centred care.
- Privacy and ethics are non-negotiable. Stronger regulatory and ethical frameworks must govern AI in mental health to protect vulnerable users.
- Partnership is key. AI systems must be developed and deployed in collaboration with mental health experts, including vocationally trained practitioners, counsellors, peer workers, psychologists, social workers, and psychiatrists.
- Accessibility matters. The fact that people are turning to AI is also a signal: access to affordable, timely mental health care remains too limited. AI should not become a “band-aid” for systemic underfunding.
As VMHPAA Chair Shane Warren notes:
“Technology can amplify access, but it cannot replace human connection. AI must walk alongside practitioners, not stand in their place.”
And as Secretary Susan Sandy adds:
“People turn to AI because it is available. Let’s learn from that and ensure our mental health workforce, including vocational practitioners embedded in communities, is just as easy to reach and just as responsive.”
Call to Action
VMHPAA calls on policymakers, technology developers, and health leaders to:
- Establish clear policy and privacy safeguards for AI in mental health.
- Ensure expert co-design of AI systems with practitioners across the full spectrum of care.
- Address the structural gaps in access that push people toward unregulated alternatives.
AI holds promise, but without proper integration into a trusted, multidisciplinary system of care, it risks doing more harm than good. The future of mental health must be human-led, AI-assisted, and always centred on safety, dignity, and wellbeing.
Media Contact:
Shane Warren, Chair
Susan Sandy, Secretary
Philip Armstrong, CEO
VMHPAA