Eliza: Is something troubling you?
You: Men are all alike.
Eliza: What is the connection, do you suppose?
You: They are always bugging me about something or other.
Eliza: Can you think of a specific example?
You: Well, my boyfriend made me come here.
Eliza: Is it important to you that your boyfriend made you come here?
You: He said I’m depressed much of the time.
Eliza: I am sorry to hear that you are depressed.
You: It's true. I am unhappy.
Eliza: Can you explain what made you unhappy?
ELIZA is not really a therapist but a computer program written in 1966. It simulates a "therapist" conversing with a patient by identifying word patterns in the patient's input, an early form of natural language processing (NLP). Its inventor, Dr. Joseph Weizenbaum, originally sought to demonstrate the artificiality of communication between computer and human, but was surprised by the emotional reactions ELIZA evoked.
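The word-pattern mechanism can be sketched in a few lines. The rules and responses below are hypothetical illustrations in the spirit of ELIZA, not Weizenbaum's actual script: each rule pairs a keyword pattern with response templates that reflect the user's own words back at them.

```python
import random
import re

# A toy ELIZA-style rule table (hypothetical, not Weizenbaum's actual script):
# each rule pairs a regular expression with response templates that may
# reflect the user's own words back at them.
RULES = [
    (re.compile(r"\bi am ([^.?!]*)", re.IGNORECASE),
     ["I am sorry to hear that you are {0}.",
      "How long have you been {0}?"]),
    (re.compile(r"\bmy (\w+) ([^.?!]*)", re.IGNORECASE),
     ["Is it important to you that your {0} {1}?"]),
    (re.compile(r"\b(?:always|all)\b", re.IGNORECASE),
     ["Can you think of a specific example?"]),
]

# Fallbacks when no keyword pattern matches.
DEFAULTS = ["Please go on.", "What is the connection, do you suppose?"]

def reply(text: str) -> str:
    """Return the first matching rule's response, with captured words filled in."""
    for pattern, templates in RULES:
        match = pattern.search(text)
        if match:
            return random.choice(templates).format(*match.groups())
    return random.choice(DEFAULTS)

print(reply("Well, I am unhappy."))
print(reply("My boyfriend made me come here."))
# -> "Is it important to you that your boyfriend made me come here?"
```

The real ELIZA additionally ranked keywords by importance, transformed pronouns ("my" to "your", "me" to "you") before echoing them, and could recall earlier inputs; this sketch omits all of that, but it conveys why purely mechanical pattern matching can nonetheless feel like being listened to.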
So what is the connection between ELIZA and Covid-19? Is the conversation between a modern "Eliza" and its user protected by the duty of medical confidentiality? And what is legally important to know if you develop a tele-psychotherapy product? That is the subject of this post.
According to the Israel Ministry of Health, Covid-19 has led to an increasing number of calls to health clinics because of mental distress, particularly anxiety and depression. The pandemic has stirred significant fears. People have been fired or furloughed, businesses have closed, and employment sectors such as tourism and culture have been "frozen." In addition, there are the pressures and tensions emanating from social distancing, isolation, loneliness, concern about sick or elderly relatives, and the need to care for children staying home. Moreover, there are reports of a rise in family violence. Even under normal circumstances, there is an extensive wait for a psychotherapy appointment – according to a July 2020 survey, the wait time for an initial meeting was a month and a half, and the wait for individual psychotherapy was half a year (The Marker, Sept. 25, 2020).
The pandemic finds the mental health field in Israel strained more than ever. Although responsibility for the field was transferred from the Ministry of Health to the HMOs several years ago, and family physicians assist their patients with drug treatments, psychotherapy remains unavailable to most patients who require it. Treatment for anxiety and depression seems to be aimed at, and available to, those able to spend significant sums on private treatment.
Over the years, mental health budgets have shrunk. Psychiatric hospitals and psychiatrists wage an ongoing struggle to improve patient treatment, and confront the stigma attached to mental illness. Despite growing awareness of the issue, there is still a wall of secrecy surrounding the need for medical help in the field of mental illnesses and disorders.
Given the social distancing required by the coronavirus situation, it was inevitable that many psychologists, social workers and psychiatrists would begin providing their services online. Companies offer online psychotherapy through video conversations between therapist and patient, or by having the patient send text messages or clips to which the therapist responds. Other companies offer, in addition to tele-psychotherapy, a range of complementary services such as support groups and tutorials. Despite the difficulties entailed in conducting intimate conversations in a digital sphere, research suggests that tele-psychology can be effective.
Digital treatment in the mental health field is not feasible or convenient for everybody. Some clinics in the U.S. offer treatment via telephone (audio only) to patients who find it hard to handle technology or who do not have access to internet service. Another company offers a dedicated setting in which people can conduct their online psychotherapy sessions: a comfortable space, a high-quality internet connection and means of privacy protection, for cases where the patient's home does not afford the necessary conditions.
Although it is easier to detect an abnormal medical measurement or a pathological finding in an image than to evaluate human behavior, thinking or emotions, it is evident that Artificial Intelligence (AI) is making giant strides in that direction. Some companies use artificial intelligence to assist physicians in selecting the best medicine for patients suffering from depression. Another company has developed a chatbot that uses AI and NLP to chat with a patient, offering support and CBT tutoring that can complement personal treatment.
We note, though, that the individuals and institutions most in need of innovation in medical treatment – namely patients and psychiatric hospitals – are also the last to benefit from it. There are clearly significant needs that must be addressed. Likewise, in the field of psychological treatment, it is worrisome that elderly patients will be left behind due to a lack of digital literacy.
The sensitivity of the information requires significant care regarding protection of privacy. Thus, we may ask: does the duty of medical confidentiality apply to the discourse between a patient and a program such as ELIZA when no therapist or medical institution is operating the program? In Israel, the Patient's Rights Law imposes the duty of medical confidentiality on the physician and the medical institution. Where no physician or medical institution is involved, we are dealing with sensitive information protected by the Privacy Protection Law. (Regarding privacy and the differences between such situations, see our previous posts on telehealth and medical applications.) A more complicated question is whether human supervision is required when operating such an application.
In addition, the field of mental health and mental health therapists is regulated by various legislation, including the Psychologists Law, the Social Workers Law, and with regard to psychiatrists - the Mental Health Care Law, and the Patient Rights Law.
Accordingly, it is necessary to examine the relevant activities, and to ensure that they conform with the regulations concerning the different therapists’ authorities and duties.
For every therapeutic profession there are different definitions of the duty of medical confidentiality, different requirements to notify authorities in various situations (for example, when a patient endangers himself or others), different requirements regarding the patient's consent to the transfer of information, and other issues.
Likewise, the mental health field carries inherent dangers regarding the consequences of errors in treatment, namely the danger of what a patient might do to himself or to others. These risks should be mitigated through a protocol designed to guide proper treatment and human intervention when needed, and should also be addressed in the insurance arrangements and the agreements regulating the relations between the parties.
The diagnostic power of AI also raises questions. Perhaps in the future AI will aid in identifying risk factors for anxiety and depression, and thus raise the awareness of physicians and patients and facilitate treatment. At the same time, identifying risk factors for mental illness may ultimately lead to invasion of privacy and to discrimination against persons in the employment market, in medical insurance and in society. Further questions arising from AI will be analyzed in a future dedicated post.
Undoubtedly, the meeting of digital health and the mental health field has the potential for a significant breakthrough that could benefit patients who very much need it. Given the legal consequences, it is important that such ventures be accompanied by legal advice.
For questions please contact firstname.lastname@example.org
This post is intended to present general information only. It should not be construed as legal advice or legal opinion and should not be relied upon.