A cute penguin. A millennial-pink interface. “Allies” and “bad guys”. A new class of therapeutic apps promises not only to help those in emotional distress, but to make therapy a fun and enjoyable experience rather than the beneficial, and sometimes even painful, work it can be. The promise: that we can play our way towards a vague notion of well-being.
How? By removing that pesky person, the therapist, and everything that comes with them, including waiting lists, referrals, fees (and, of course, expert care). Instead, these apps turn the human, their mind and body, into a set of to-do tasks and prompt the user to accomplish them. Once an objective has been reached (whether a task or a module completed), the user is rewarded with badges and streaks, rewards native to video games rather than to the consulting room.
Coupled with customizable avatars and witty dialogue, apps like SuperBetter, Joyable, and MoodMission keep the patient, now called the user, coming back for more (or so they claim). SuperBetter, for example, lets users select the “bad guys” they wish to defeat. These mix diagnoses, like “depression” or “anxiety”, with terms from wellness culture, like “reduce stress”, or even the vague “I’m just getting super better”. The app then gives users “quests” to vanquish the bad guy and “bonuses” for completing simple wellness tasks, like drinking a glass of water.
MoodMission works much the same way, but swaps superheroics for summiting a mountain. Based on a series of algorithmically scored surveys that tailor the app’s content, MoodMission presents the user with five missions ranging from “cleaning your bathroom” to “visiting your favorite website”.
By replacing ongoing interpersonal therapy with a self-guided, bounded quest for health, these apps provide all the game, but none of the play necessary for in-depth therapeutic work to occur. And play is, absolutely, at the center of the therapeutic process. As psychoanalyst DW Winnicott once wrote: “It is in playing and only in playing that the individual child or adult is able to be creative.” Play offers the strongest proof that reality is not fixed, that change is both imaginable and possible.
But games (bounded by rules, often driven by winning) and play (largely unstructured, open-ended) do not necessarily coincide and are sometimes even antithetical. By programming a restricted set of behaviors to be rewarded, these applications exclude the unexpected, the creative. If a solitary training program replaces care then, as sociologist Gregory Bateson suggested, “life would then be an endless interchange of stylized messages, a game with rigid rules, unrelieved by change or humor”. All game and no play makes us dull – even if we feel momentarily relieved.
That interactive machines can make us feel good is nothing new, nor is automating the therapist while pretending mental health care is taking place. In the late 1950s, Dr. Charles Slack rigged tape recorders to count the number of words they captured, then distributed them to “members of a Cambridge teen gang”, paying them at a tiered rate to record themselves speaking.
The more they talked, the higher the counters climbed – as did their reward: cash. Slack noted that “some of the participants said they felt better for having talked this way”. Talking, incentivized, became a game.
A few years later, and across Cambridge at MIT, Joseph Weizenbaum launched his ELIZA program in 1966, taking the idea of the mechanized soliloquy to a new level. The purpose of the ELIZA experiment was to demonstrate that “communication between man and machine was superficial”.
One of the first chatbots, ELIZA was programmed by Weizenbaum to “parody” a “client-centered” therapist conducting an initial intake with a new client. Weizenbaum was shocked: many people did not find the interaction meaningless, even though technically and clinically it was. Emotionally, they appreciated “talking” with ELIZA – despite knowing full well that she was simply programmed to respond – even going so far as to ask for privacy to be alone with “her”.
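ELIZA’s trick can be caricatured in a few lines of code: keyword rules plus pronoun “reflection”, so the program bats the user’s own words back as a question. The sketch below illustrates that general technique only; it is not Weizenbaum’s original DOCTOR script, and the rules and phrasings are invented for the example.

```python
import re

# Minimal ELIZA-style sketch: pattern-match a keyword, capture the rest of
# the sentence, "reflect" its pronouns, and echo it back as a question.
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are",
               "you": "I", "your": "my"}

RULES = [
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"because (.*)", re.I), "Is that the real reason?"),
]

def reflect(fragment: str) -> str:
    # Swap first- and second-person words so the reply points back at the user.
    return " ".join(REFLECTIONS.get(word.lower(), word)
                    for word in fragment.split())

def respond(utterance: str) -> str:
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(*(reflect(g) for g in match.groups()))
    # Content-free fallback when no keyword matches.
    return "Please tell me more."

print(respond("I feel anxious about my work"))
# → Why do you feel anxious about your work?
```

The fallback line is the tell: when no rule matches, the program has nothing to say that is about the user at all – precisely the superficiality Weizenbaum set out to demonstrate.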
Since then, and despite Weizenbaum’s protests, psychologists and computer scientists have worked to develop programs that could help us. Our contemporary apps have taken the fun and novelty of talking to a computer (or tape recorder) and combined it with our habitual self-monitoring and the current fixation on wellness, to the detriment of deeper work and systemic solutions.
As Weizenbaum knew then, and we know now, a fleeting sense of accomplishment is quite different from what depth- and play-based psychotherapies offer. If we don’t think of Slack’s tape recorder as therapy, why should rewarding, say, a breathing exercise as an end in itself be any better? One may feel relieved in the moment, but no in-depth work has taken place; a bandage can stanch the bleeding for a while, but it will not cure a patient in need of surgery.
Rigid rules and attention-demanding in-app notifications (along with quests and badges) have indeed supplanted real play, and thus, for some, therapy as well. There is no human in the program, no human relationship, to ensure continuity of care or to help the user play and replay, especially when play becomes scary or uncomfortable. These games and their outcomes are predictable at best, even if they do secure an individual’s continued participation in the program.
Worse, the gamification of care is not neutral; it exists to train users whether or not a platform delivers its promised results. As game designer and scholar Ian Bogost wrote, gamification was invented by consultants as “a means to capture the wild, coveted beast that is videogames and to domesticate it” for use in “the grey, hopeless wasteland of big business”.
The big business at hand is Big Therapy, which is ever more lucrative. The digital health market was worth billions of dollars before the COVID-19 pandemic; today, online therapy companies are publicly traded, telehealth visits are 38 times more frequent than they were just two years ago, and employees receive a deluge of company-sponsored reminders and wellness initiatives, while competent care has become no more affordable or accessible.
In the shift to gamification, patients are asked to consume their care and to be content with caring only for themselves. The user is now solely responsible for their own well-being; momentary symptom reduction is the only goal. Placing the responsibility for treatment on the person in crisis alone is a problem not just of how care is delivered, but of the care itself.
Everything is fun and games until someone gets hurt: these apps skirt oversight, collect vast amounts of personal data, and can deter users from seeking full-fledged therapeutic support when they need it most. These same users are already the most systemically vulnerable to being counted, data-mined, and predicted, and the least likely to have access to more robust, interpersonal and, yes, playful forms of care.
We cannot praise therapy apps for their claims about expanded access and patient compliance without looking at the games they play with mental health.
The Distance Cure: A History of Teletherapy by Hannah Zeavin is out now (£30, MIT Press).