Why Clinical Oversight Matters in AI Mental Health Tools: How jAImee Sets a New Standard

Artificial intelligence is rapidly becoming a fixture in mental health care. AI chatbots are now being used by millions of people worldwide to seek support for anxiety, depression, and other concerns. However, two recent articles have raised serious concerns about the risks of using these tools without clinical supervision.
 
A Futurism investigation uncovered a number of disturbing cases in which individuals developed intense paranoia, delusions, and hallucinations after interacting with unregulated AI chatbots. In some situations, these interactions contributed to hospitalisation or even incarceration. The term “ChatGPT psychosis” is now being used to describe this emerging issue.
 
At the same time, a Psychology Today article raised further concerns about the lack of safeguards in the design and use of mental health AI tools. The article calls for clear clinical oversight, ethical use, and a professional framework to ensure these tools support rather than harm vulnerable users.
This is the exact issue that ANTSA was created to address.
 

jAImee: Built with Clinical Oversight

jAImee, the AI therapy chatbot within the ANTSA platform, is the first in the world to be overseen by mental health practitioners. It is not available to the general public; it can only be assigned by a qualified clinician, who can review all interactions. This keeps the practitioner in control of the therapeutic process and ensures that jAImee operates within a safe and professional framework.
 
This model supports clients between sessions without replacing the human element of therapy. It helps clients remain engaged with their treatment and allows practitioners to monitor patterns, identify concerns, and offer timely guidance.
 

Putting Client Safety First

ANTSA prioritises safety and ethics. jAImee operates as part of a broader digital platform designed for use by health professionals, which includes secure data storage, evidence-based psychoeducation, journalling, and mood tracking. All AI use is visible to the practitioner and serves to enhance, not override, clinical care.
 

Australia Leading the Way

ANTSA was developed in Melbourne by Clinical Psychologist Sally-Anne McCormack, who has more than 20 years of experience supporting clients in both private practice and public settings. She saw the need for a tool that could keep clients on track between appointments without compromising clinical standards. Instead of releasing a public-facing chatbot, she created a solution that prioritises professional responsibility.
As concerns about AI in mental health continue to grow, ANTSA offers a safe, responsible, and effective alternative. It is already being used by practitioners across Australia to improve continuity of care and reduce the administrative burden in therapy.
 
To find out more about how jAImee works, contact us to explore how ANTSA could support your practice.