Over the past decade, the development of humanoid robots has remained vague, yet there are plans to integrate this ‘humanoid technology’ into our psychotherapeutic system. What are the implications for our patients? Is this deeper reliance on technology something humanity should really embrace?
The most iconic research papers, including New York University’s, run along the lines of:
The Future Of Psychotherapy
The successful creation of a humanoid is implied when technology starts to convey signs of empathetic understanding in responding to human problems. Alan Turing’s Turing Test, proposed in 1950, established the ideal of technology’s ability to exhibit signs of human intelligence. Some years after the establishment of the Turing Test, ELIZA was created by Joseph Weizenbaum, and it holds the title of the first chatterbot computer program with human qualities. ELIZA is modelled on the behaviour of a Rogerian psychotherapist.
Carl Rogers was a renowned psychotherapist who published his unique method of ‘person-centred therapy’. He claimed this approach to be more effective and efficient for those in unstable mental conditions, especially depression. Rogers’s humanist theory proposes that humans are inherently good and functional, always striving to maximise their potential. The principal human motivation is to self-actualise, parallel to Maslow’s hierarchy of needs. It fundamentally captures the essence of determining where the ‘real self’ stands, as opposed to the ‘ideal self’. According to this theory, mental illness arises when there is incongruence between an individual’s ‘real self’ and the ‘ideal self’ shaped by societal expectations.
What Is The ‘Rogerian’ Solution?
Rogerian psychotherapists need to create a non-judgemental environment. In this ambience, patients formulate their own solutions. Throughout this time, therapists are required only to demonstrate 3 distinct traits:
- Unconditional positive regard
- Empathic understanding
- Congruence (genuineness)
All these traits are encapsulated in the functions of ELIZA, including the ability to listen, reflect and empathise without steering the conversation in a purposive direction. “I understand, and I can’t help you decide, but I will be here until the end” is a characteristically Rogerian phrase.
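ELIZA’s ability to listen and reflect rests on simple pattern matching: the program matches the user’s statement against scripted templates and echoes captured fragments back with first- and second-person words swapped. A minimal sketch of that loop in Python is below; the rules and phrasings here are illustrative stand-ins, not Weizenbaum’s original DOCTOR script.

```python
import re

# Pronoun swaps used to "reflect" the user's words back (illustrative subset).
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your", "am": "are",
    "you": "I", "your": "my", "are": "am",
}

# (pattern, response template) pairs, checked in order; the last is a catch-all.
PATTERNS = [
    (r"i need (.*)", "Why do you need {0}?"),
    (r"i feel (.*)", "Why do you feel {0}?"),
    (r"(.*) mother(.*)", "Tell me more about your family."),
    (r"(.*)", "Please tell me more."),
]

def reflect(fragment: str) -> str:
    """Swap first- and second-person words so the reply mirrors the speaker."""
    return " ".join(REFLECTIONS.get(w, w) for w in fragment.lower().split())

def respond(statement: str) -> str:
    """Return the first matching template, with captured groups reflected."""
    text = statement.lower().strip(".!?")
    for pattern, template in PATTERNS:
        match = re.match(pattern, text)
        if match:
            return template.format(*(reflect(g) for g in match.groups()))
    return "Please go on."
```

For example, `respond("I need my space")` reflects the captured fragment into “Why do you need your space?” — a non-directive echo rather than advice, which is exactly the Rogerian stance the program imitates.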
Problem With Robotic Psychotherapists
- The interpersonal and physical relationship between the therapist and the patient cannot be established
- The results of such therapy are difficult to quantify, another concern psychologists raise about handing the process over to technology
- Many psychotherapists ask… is unconditional positive regard always a good trait to elicit? What if the therapist ends up encouraging the actions of a psychopath? Unfortunately, ELIZA is unable to judge the consequences of a human’s potential actions, nor is it flexible enough to provide appropriate recommendations in certain situations.
So… What Now?
Of course, an online bot such as ELIZA would be cost-efficient and perhaps even well structured; however, to certify its ability to help diagnose mental illness, it must undergo further fine-tuning. This is just the beginning of a new era of medical treatment. Welcome to the age of robotic psychotherapy.
See for yourself what it’s like to talk to a humanoid – Talk to ELIZA.