
Perspectives

Artificial Intelligence is Quickly Emerging—but Will it Help or Hinder Behavioral Health?

Ron Manderscheid, PhD

Artificial intelligence (AI)—a field of computer science focused on building systems that can perform tasks in a human-like manner, such as speech recognition, language translation, and visual perception—recently took 2 giant steps forward.

A revolutionary AI system has just come to market. This system, ChatGPT (Chat Generative Pre-trained Transformer), has the capacity to chat with you in an unstructured manner. It also has the capacity to write documents, including academic papers. These developments are remarkable, as talking and writing are 2 of the communication functions that have defined humanness since our earliest days.

ChatGPT is a chatbot (a computer program that simulates human conversation) developed by a company called OpenAI. It was launched in November 2022. ChatGPT is built on OpenAI's GPT-3 family of large language models and has been trained using both supervised and reinforcement learning techniques to impart knowledge.

Initially, ChatGPT has suffered from 3 problems:

  • Inability to distinguish between a reasonable response to a query and a ludicrous one;
  • Low affect in responses; and
  • Lack of any moral sense.

Each of these issues can be overcome with additional training and exposure to actual human interactions, much as humans learn to behave. Critical for these developments will be careful, deliberate consideration of what knowledge to include, what affects to endow, and what moral values to impart. Clearly, these are complex decisional processes that should include representatives from the community at large.

I report these developments because they have a direct bearing on the future of behavioral healthcare. As the field continues to experience major workforce shortages, which are expected to persist for years, AI clearly will become an option to extend our capacities. This is particularly true because much of behavioral healthcare involves interaction and verbal discourse. How we adapt will depend upon the specific trajectories that we pursue.

One trajectory would involve close collaboration between a committee of field experts and representatives of the AI industry to develop a “ChatTherapist.” Discussions would involve the specific therapeutic content to be incorporated into the system, how to build affective tone into the system because of the key nature of therapeutic alliance, and, of immense importance, the moral values to be incorporated. This type of collaboration could produce a functional system in about 2-3 years.

Another trajectory would not involve collaboration between the field and the AI industry. However, this does not mean that the AI industry would not undertake work on a “ChatTherapist.” The attraction of immense profits would be far too strong. Rather, the system would be developed with inadequate or nonexistent technical and moral input from the behavioral healthcare field. It takes little imagination to discern what types of personal tragedies could result from this outcome. Clearly, this trajectory is not recommended.

A third trajectory also is essential. Federal legislation will be needed to direct the US Department of Health and Human Services (HHS) and its agencies to set up and staff a digital health office that monitors developments and sets standards for these AI chatbot systems. Only systems approved by this office would be able to receive Food and Drug Administration (FDA) approval for prescription to clients.

This brief piece just touches the surface of a very complex topic with tremendous implications for the future of behavioral healthcare. I hope that I have alerted you to the issues involved, as well as to the urgency of our involvement as a field. Chatbots no longer are hypothetical. Primitive versions, such as ChatGPT, already are sweeping the market and disrupting education and other fields of human endeavor.

Ron Manderscheid, PhD, is the former president and CEO of NACBHDD and NARMH, as well as an adjunct professor at the Johns Hopkins Bloomberg School of Public Health and the USC School of Social Work.


The views expressed in Perspectives are solely those of the author and do not necessarily reflect the views of Behavioral Healthcare Executive, the Psychiatry & Behavioral Health Learning Network, or other Network authors. Perspectives entries are not medical advice.
