Koko is a text-based peer support system, moderated by a chatbot. It is intended to help people rethink potential cognitive distortions, and to help people learn to assist others in rethinking potential cognitive distortions. In doing so, it teaches people a key aspect of Cognitive Behavioral Therapy (CBT) and enables people to see additional perspectives on their issues with input from peer supporters.
Recommendations for Use
Koko is best suited for consumers in need of peer support. It may be particularly valuable to isolated individuals who do not have peers to lean on virtually or in person. Unlike some other interventions (e.g. FearFighter, SuperBetter), Koko does not require users to identify their own peer supporter. Given that struggling individuals may have limited peer networks, the automated pairing with peers may be useful. Consumers should use Koko whenever they would like help in rethinking situations in their lives, or whenever they would like to altruistically help others. However, Koko may be harmful to those prone to self-harm, as the thoughts shared by peers may serve as triggers. During testing, a number of peers shared thoughts of self-harm. For this reason, it is likely unsuitable for minors as well. As Koko does not facilitate patient/clinician interactions or allow for follow-up, it is unlikely to be used by clinicians directly. However, clinicians may wish to recommend Koko to adult patients who need frequent guidance and who have the emotional fortitude to help others in serious distress. In addition to being useful for consumers, Koko’s application programming interfaces may be used to help social networks detect crises and abusive content, and to help those posting such content get the support that they need.1
Core Features and Similar Products
Koko is intended for people struggling with issues related to dating, work, friendships, school, family, and other areas of life which are amenable to brief peer support. It is not suitable for people requiring more than a brief response regarding how to handle their issue, nor people in need of comprehensive assistance with coping strategies. Artificial intelligence is used to identify people in serious distress and to refer them to more comprehensive support. Koko is not intended for people in crisis situations, and will recommend country-specific external resources to users in crisis.
Users engage with the system via the Koko website, Facebook Messenger, Kik, Telegram, or Twitter. Koko asks users to describe a situation that they are struggling with currently, and then anonymously sends the description to other users of the system so that they may provide peer support. The other users anonymously provide suggestions on how the described situation might be rethought. The struggling user then receives this feedback, along with the opportunity to thank the supporters and rate the helpfulness of their responses. Thus, the Koko chatbot moderates peer support interactions between anonymous users who are encouraged both to describe their own struggles and to help other users rethink the struggles that they are experiencing. As users may thank each other for the assistance that they have provided, Koko has the potential to boost perceived self-worth.
Koko provides people with relatively immediate feedback that may be used to improve their lives. Feedback is not instant, as the system cannot provide advice until another user has submitted peer support. Users rate the quality of feedback after receiving it. While Koko could be used on a one-time basis, it encourages users to maintain engagement over time by having them accumulate points for engaging with the system and offering rewards (Twitter shoutout, Facebook shoutout, blog interview, stickers, or a prize pack) for doing so. It additionally reaches out to users after a period of inactivity.
Koko is not the only chatbot intended to help improve mental health. Woebot is an entirely automated chatbot that asks users about their moods and thoughts, and then employs techniques from CBT to help them better manage their lives. Unlike Koko, Woebot is not free and does not incorporate crowdsourced human input.2 Woebot has been shown to decrease both anxiety and depression in an unblinded trial.3 In a similar manner, Wysa provides chat-based CBT using artificial intelligence. Wysa focuses on improving behavioral health, and has been advised by Professor Vikram Patel, a psychiatrist affiliated with Harvard Medical School.4
Research performed by both Koko and Woebot suggests that chat-based CBT interventions may be efficacious. There is additionally a long history of these tools being successfully used in other online formats.5 Koko may be more similar to live CBT than many of the interventions developed in the prior decade, as it is interactive. Many early online CBT interventions were not interactive, and focused solely on training people in the fundamentals of CBT. Nonetheless, they have been supported by a large evidence base.7 The National Institute for Health and Care Excellence (NICE) in the United Kingdom has written a technology appraisal suggesting that computerized CBT is an effective treatment for depression and anxiety.8 The appraisal from NICE further reiterates that online CBT interventions—the broad category of interventions to which Koko belongs—have the potential to help people experiencing mental illness.
Content – 4.25/5
While Koko does not attempt to teach its users about any psychological theories, it does train them to provide peer support with cognitive reframing. In doing so, it indirectly teaches users about CBT. Developed by Robert R. Morris, Ph.D., a graduate of MIT’s Media Lab, Koko is grounded in both psychological theory and affective computing research. A predecessor web-based, peer-to-peer cognitive reappraisal platform developed by Dr. Morris was shown to produce significant improvements in depression, reappraisal, and perseverative thinking by users.9 Although the platform tested differed from Koko in some respects, its design and intent were the same. Using paid, crowdsourced workers, Dr. Morris and colleagues demonstrated that structured prompts (like those featured in Koko) are superior to unstructured ones at eliciting empathy and reappraisal from crowdsourced workers. Crowdsourced workers were likewise shown to be able to correctly classify statements as containing cognitive distortions 89% of the time.10 Researchers have examined using crowdsourcing platforms to acquire new creative skills in other contexts, and have shown that it is possible to teach crowdsourced workers to perform complex, creative tasks.11
As Koko is intended to rapidly help users become supportive peers and to receive support, it provides users with structured support tasks to accomplish rather than information or training. It does not teach users about the major types of cognitive distortions (e.g. catastrophizing, overgeneralization), but instead leaves users to their own devices in determining how best to rethink the situations faced by peers. It might be enhanced by options enabling users to learn more about CBT and cognitive distortions, but currently does not have a teaching focus. It is narrowly designed to moderate peer support interactions and follow up on the progress of users in managing their personal struggles.
Ease of Use and User Experience – 4.33/5
Although Koko was formerly an app-based tool, it has since become solely a chatbot accessible from multiple platforms. The chatbot provides users a limited set of menu options at various points during the conversation, and has a linear, dialogue-based interaction with the user. Navigation is limited to the menu options presented by Koko at a given time. When menu options are presented, users select the appropriate option by clicking on it within the Koko chat interface. Notably, there is no option to stop chatting with Koko, and it may send follow-up messages after a user attempts to disengage. Koko is quickly learnable and intuitive to use, as it simulates a human text-based conversation.
Visual Design and User Interface – 4.66/5
The Koko website’s chat interface is minimalist and teal, with no extraneous features. Users interact with it by typing responses or by clicking buttons containing canned responses where applicable. By providing canned responses, the system accelerates the pace of interaction and ensures that it can interpret responses. A similar interface is shown when Koko is used through a client, such as Facebook Messenger. The interface focuses the user on the content, and provides a minimal set of choices to reduce potential user confusion. The text-based interactions are consistent, and at times may seem a bit repetitious. Buttons are consistently shown above the text input box, enabling users to quickly find them. The design is appropriate for anyone comfortable using a chat interface. It may seem particularly natural to Millennials, who may have two decades of experience communicating life events to others through instant messaging applications. Koko requires its users to be able to read, and is inappropriate for children, as the struggles shared by peers may be disturbing.
Overall, Koko is an engaging, well-designed tool for enabling people to give and receive peer support. There is evidence that participating in a similar intervention improved participants’ mental health. Furthermore, providing peer support can help people get in the habit of rethinking thoughts, which may enable them to better help themselves. While it can at times be fulfilling to help other real people, it can also be distressing. Some of the other participants on the platform are in serious crises, and Koko is unable to fully address their needs. Although Koko uses artificial intelligence to screen user comments for suicidality and to provide country-specific resources to suicidal users, descriptions of thoughts of suicide and self-harm (e.g. cutting) are sometimes distributed to peers in the supporting role. Thus, while Koko has the potential to be therapeutic, it also has the potential to harm participants.
Koko is very engaging and persistent in reaching out to its users, even after they have stopped participating in a chat with it. While this may be helpful to the extent that it maintains engagement and motivation, it may also be problematic, in that Koko may cause people to continue to ruminate on struggles that they previously shared with the system. Likewise, it may send users messages at times when they do not have adequate privacy. During testing, Koko reached out to the reviewer while he was at a dinner party and was not actively engaged in a chat with the system. It is unclear how to permanently end a conversation with Koko.