Some people turn to artificial intelligence tools for mental health advice instead of consulting a psychologist. For mental health professionals, AI can be a useful assistant, but it must be regulated.
- Users explain that they turn to artificial intelligence for psychological support.
- According to psychologists, AI can be useful in certain specific cases, but it cannot replace therapy with a professional.
- Some uses of AI could, however, help psychologists in their practice.
Artificial intelligence is everywhere in our daily lives, including when it comes to mental health. On TikTok, users describe the “best psychologist on Earth”, their “new psychological support”, or say they were glad to be able to “speak without feeling judged”, referring to ChatGPT, OpenAI’s conversational tool. Dedicated bots have even been created, such as Psychologist.
Mental health: when AI plays psychologist
On social networks, users highlight the advantages of AI over a professional psychologist: its availability (24 hours a day, 7 days a week) and the fact that it is free. The advice provided is not very personalized, but it can have its uses, according to Sabine Allouchery, psychotherapy practitioner and co-author of the Mentaltech collective’s report on mental health. In an article in 20 Minutes, she recalls that while these tools cannot replace a psychologist and psychotherapy, they can sometimes temporarily reduce anxiety or sadness. “But it will depend above all on the ability to ask the right questions,” she stresses. Olivier Duris, clinical psychologist and specialist in digital uses, adds in an article in Le Monde: “Our work as psychologists goes much further than that, and I don’t think that will be enough to replace us.”
Artificial intelligence: tools for psychologists
In a report published last November, the American Psychological Association points to another benefit of artificial intelligence. For psychology professionals, it could serve as a kind of daily assistant. “AI is being used to streamline administrative tasks, make workflows more efficient and aid clinical decision-making,” the document notes, adding that possible uses of AI are only expected to grow. For example, AI can help set up reminders or handle billing, but it can also generate summaries of health records or draft clinical notes. From a clinical standpoint, the association believes AI could provide support tools, either for diagnosis or to complement face-to-face therapy with specific exercises.
Mental health: a chatbot validated by the British health authorities
In the United Kingdom, the National Health Service has validated a chatbot to “improve access to mental health services”. It is intended for people with symptoms of anxiety and mood disorders of mild to severe intensity. “The chatbot takes the individual through an easy-to-navigate conversation and some short exercises,” a press release specifies. “It automatically fills in the individual’s patient file, which reduces the need to repeat information to several clinicians.” The objective of this tool is to detect immediate risks and to provide patients with useful mental health information before their therapy begins. As the American Psychological Association recalls, it is nevertheless necessary to establish “legal and regulatory frameworks” to “provide suitable guardrails for the development and use of AI”. This concerns, in particular, the protection of medical data.