“Talking” Apps from a Talking Cure

Dr. Schueller is the Executive Director of PsyberGuide and an Assistant Professor at Northwestern University’s Feinberg School of Medicine. He is a faculty member of Northwestern’s Center for Behavioral Intervention Technologies (CBITs) and his work focuses on increasing the accessibility and availability of mental health resources through technology.

I’ll admit my bias upfront: I am completely bullish on the potential of artificial intelligence to transform mental health. This is not because I believe that your therapist will be replaced by a machine anytime soon. The automated therapist ELIZA, developed in the 1960s, showed us the limitations of machines in producing meaningful therapy-like conversations. Despite all of the advances in technology since then, we still see “conversational artificial intelligences” making mistakes that are wholly unacceptable in the context of mental health: failures to identify acute emergencies as such, nonsensical replies to sensitive topics, and an inability to understand and adapt to emotional cues. The reason I’m excited about the potential of artificial intelligence, however, is that I believe it will bring a new type of intelligence to mental health treatments, one that can supplement the strengths of human care. I also think artificial intelligence, especially conversational versions that can respond in plain speech, can accelerate the development of digital mental health interventions: they will no longer be confined to visually heavy user interfaces (as websites and apps are) and can instead rely on text- or speech-based interactions that better mimic the processes of therapy.
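
To make the ELIZA point concrete, here is a toy sketch in the spirit of its keyword-and-reflection technique – illustrative only, not the original program or any product’s actual code. A few regular-expression rules reflect the user’s words back, and anything off-script gets a canned fallback, which is exactly the failure mode described above.

```python
import re

# Ordered keyword rules in the spirit of ELIZA: a regex plus a reply
# template that "reflects" the captured text back at the user.
RULES = [
    (r"\bI feel (.+)", "Why do you feel {0}?"),
    (r"\bmy (.+)", "Tell me more about your {0}."),
]

def reply(utterance: str) -> str:
    for pattern, template in RULES:
        match = re.search(pattern, utterance, re.IGNORECASE)
        if match:
            return template.format(*match.groups())
    # No keyword matched: fall back to a canned prompt. There is no
    # understanding here, only pattern matching.
    return "Please tell me more."

print(reply("I feel hopeless"))       # Why do you feel hopeless?
print(reply("This is an emergency"))  # Please tell me more.
```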

Adam Miner and colleagues from Stanford University tackled this issue in a thoughtful piece recently published in JAMA.1 Miner and colleagues note that the gap is closing between what technology is capable of and how we imagine a human might respond in a similar situation. They also point out that closing this gap might not be necessary, as realism – defined as mistaking the conversational artificial agent for a real human – may be overrated. Again, if the notion is that such conversational artificial intelligences should replace therapists, we would want them to act like therapists and do everything that therapists can currently do. However, if we think that artificial intelligences could benefit mental health services in a unique way, then our question should be one of worth and value rather than the human-like quality of the interaction.

Two recent products incorporate conversational artificial intelligences to create digital mental health interventions: Woebot and Wysa. Both products are “chatbots” drawing from principles of cognitive-behavioral therapy. Woebot is accessible through Facebook Messenger, and Wysa is available through Facebook Messenger, Android, and iOS (and reviewed on PsyberGuide!). Each product avoids direct comparisons with a therapist through the use of a non-human persona. Woebot plays the part of a robot developed to provide emotional support. Wysa is a “pocket penguin” designed to compassionately support behavioral health. Cognitive-behavioral therapy is an efficacious treatment delivered face-to-face and in various Internet formats, including websites and mobile apps. As such, both products can be considered “evidence-informed.” Woebot, additionally, has been subjected to a randomized controlled trial in 70 college students comparing the Woebot “chatbot” to a mental health ebook.2 Participants were given two weeks to use their intervention and were then reassessed on depression and anxiety. Participants who used Woebot showed significant reductions in depressive symptoms over the two weeks and scored about 2.5 points lower on the PHQ-9 at the end of the study. It is worth noting that this is not a clinically important difference (which would be 5 points), nor did people on average achieve remission. That should be weighed, though, against the fact that this was a two-week intervention, whereas a typical dose of face-to-face cognitive behavioral therapy would be 16 sessions and Internet cognitive behavioral therapy about 8 weeks. This was a small study with several other limitations, but it nevertheless shows that such resources can have value within mental health, especially if they can be scaled to reach people who might not otherwise receive treatment.

Another reason I’m excited about conversational artificial intelligences specifically is that they can help foster diversity in digital mental health interventions. I’m hoping we see a range of different products developed, guided by science and improved over time, that truly combine best practices in mental health and technology to create engaging and impactful products. One day we might all have our own personal robots, or penguins, or some other member of this new class of conversational artificial intelligences to help support our emotional and behavioral health.

  1. Miner, A. S., Milstein, A., & Hancock, J. T. (2017). Talking to machines about personal mental health problems. JAMA, 318(13), 1217-1218.
  2. Fitzpatrick, K. K., Darcy, A., & Vierhile, M. (2017). Delivering cognitive behavior therapy to young adults with symptoms of depression and anxiety using a fully automated conversational agent (Woebot): A randomized controlled trial. JMIR Mental Health, 4(2), e19.

Carving new paths where there are none: Considerations in eHealth research

Stoyan Stoyanov has a background in psychology research. He works at both Queensland University of Technology and the University of Queensland in Brisbane, Australia, with leading eMental Health experts Prof. Leanne Hides and Prof. David Kavanagh. Stoyan has a keen interest in using technology for wellbeing, prevention, early intervention, mental health support, and suicide prevention for young people. He is currently involved in a project using eHealth to promote the autonomy and self-efficacy of clients of Australia’s largest youth service provider – Kids Helpline.

It was only six years ago, in 2011, that the suicide research group I was involved in at the University of Manchester, UK, discussed developing a suicide monitoring app. I thought, “Automating mental health support? This is science fiction!”

Just a year later, as part of my team’s research project at the Queensland University of Technology, Brisbane, Australia, I was involved in conducting a systematic contextual review and quality evaluation of all health-related apps. Easier said than done. There were already thousands of them on the app stores, with no adequate classification and no guidelines or bespoke prior research to lead this process. Thankfully, I had a team of masterminds: Prof. Leanne Hides and Prof. David Kavanagh. Their expertise in developing technological interventions for mental health and wellbeing served as a beacon to help me understand the complexity of the issue: distilling a set of criteria thorough enough to encompass all components of eHealth apps, and objective enough to withstand the critical approach of rigorous research.

Conducting the systematic contextual review was supposed to be the easier task – we were only going to follow the well-established PRISMA guidelines, as if conducting a systematic literature review. This meant we wanted to find ALL apps in a category – a task somewhat attainable before Google shut down their “Apps” filter. Unfortunately, nowadays there is no good alternative offering thorough yet focused results. Trawling through app stores is the only option, yet the results are convoluted and popularity-based. And let’s admit it – eHealth apps are not quite as popular as we might like.

In the end, we did arrive at a meaningful sample of apps by limiting the number of mental health issues we were interested in, so we could manage the numbers. But how to rate their quality? App store ratings were, of course, meaningless – some can be paid for and boosted by companies, others are the result of an angry mob of users, vengeful after the newest operating system (OS) rendered the app unusable. So this left us sifting through website evaluation criteria, user experience (UX) requirements, and IT benchmarks. After numerous iterations, the MARS (Mobile App Rating Scale) was born!

Since its publication in 2015, the MARS has accrued over 170 citations and has been translated into eight languages. Our team has received hundreds of requests for advice, information, support, or collaboration, including its use on the PsyberGuide website. eHealth is blooming!

By now our team has explored and rated hundreds of apps, and we have learned a few lessons:

  • We need diverse expertise in technology, design, and health in order to provide optimal, accurate ratings
  • Before rating any one app, one should first explore multiple apps in a category, in order to identify the norms, the trends, and who is pushing the envelope
  • App rating requires a critical and objective view, which is difficult to attain despite the MARS’ focus on measuring objective features
  • Rating apps comes with a level of responsibility, as ratings can affect future uptake

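For readers curious how MARS-style ratings become scores, here is a minimal sketch in Python, assuming the published structure of the scale: items rated 1-5 within objective subscales (engagement, functionality, aesthetics, information), with the overall app quality score taken as the mean of the subscale means. The item labels in the example are placeholders, not the exact MARS wording.

```python
from statistics import mean

def mars_scores(ratings: dict[str, list[int]]) -> dict[str, float]:
    """ratings maps each subscale name to its list of 1-5 item ratings."""
    scores = {name: mean(items) for name, items in ratings.items()}
    # Overall app quality: mean of the objective subscale means.
    scores["app_quality"] = mean(scores.values())
    return scores

example = {
    "engagement":    [4, 3, 4, 3, 4],      # e.g., fun, interest, customization
    "functionality": [5, 4, 4, 5],         # e.g., performance, ease of use
    "aesthetics":    [4, 4, 5],            # e.g., layout, graphics
    "information":   [3, 4, 3, 3, 4, 3, 2],
}
print(mars_scores(example))
```
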
What about app research beyond quality evaluation? Do eHealth apps work? New medications require rigorous research trials before they are permitted on the market, but eHealth apps do not go through any such rigorous process. Could some apps be potentially harmful? To date, health apps have rarely been evaluated, and no ‘gold standard’ exists for conducting a proper evaluation. The randomized controlled trial seems to be the common approach, yet it is plagued by low adherence rates, difficulties in managing control conditions and accessibility, and – maybe worst of all – an inability to keep up with the high pace of technology development. Several times I’ve read a publication on a ‘new’ and ‘efficacious’ app only to find that the app itself is already obsolete in OS or UX terms, or even removed from the app stores.

As technology continues to develop and adapt at an exponential rate, we need to keep up. We, researchers in the field of eHealth, need to be flexible, agile, and, let’s admit it, often well financed, in order to even stay in the race with commercial-level products. Early apps developed and reviewed were predominantly text-based, but new apps incorporate passive data collection and biomarkers such as pulse and galvanic skin response. The next generation of eHealth products may increasingly be virtual humans such as conversational agents (like Siri or Alexa) or physical robots capable of supporting our mental health. Within 10 years the iPhone has created this app landscape, and who knows what the next 10 years will hold. Before we know it, even our robots will need eHealth as well.

PsyberGuide and The Mighty Partner to Help People with Mental Illness!

We’re thrilled to announce a new partnership that will put our resources in front of The Mighty‘s wide-reaching readership. We will now have a growing home page on The Mighty and appear on many stories on the site.

The Mighty is a story-based health community focused on improving the lives of people facing disease, disorder, mental illness and disability. More than half of Americans are facing serious health conditions or medical issues. They want more than information. They want to be inspired. The Mighty publishes real stories about real people facing real challenges.

Here’s an example of the kind of stories on The Mighty: 14 Mental Health Apps People Living With Mental Illnesses Recommend.

At PsyberGuide, we’re dedicated to helping people with mental illness in their lives. With this partnership, we’ll be able to help even more people.

We encourage you to submit a story to The Mighty and make your voice heard.

Take a mental health test

Dr. Schueller is the Executive Director of PsyberGuide and an Assistant Professor at Northwestern University’s Feinberg School of Medicine. He is a faculty member of Northwestern’s Center for Behavioral Intervention Technologies (CBITs) and his work focuses on increasing the accessibility and availability of mental health resources through technology.

Last month Google launched a project in partnership with the National Alliance on Mental Illness (NAMI) to provide a depression test to those who search for depression-related terms. I have seen lots of debates about the implications of such a project, including one between Ken Duckworth and Simon Gilbody that was published in BMJ.1 Before I jump into their arguments for and against the use of Google’s online screening tool, I have a couple of points of my own to add. First, the depression test itself is not a creation of Google. Google used the Patient Health Questionnaire 9-item scale (often referred to as the PHQ-9). The PHQ-9 is perhaps the most common depression test used in the United States. Odds are, if you’ve seen a doctor, you’ve been asked a version of the PHQ-9. It asks people to report on the frequency of common symptoms of depression: low mood, lack of interest, sleep difficulties, eating troubles, low self-worth, difficulty concentrating, psychomotor changes, and suicidal ideation. Your responses to these nine questions can strongly predict whether you would be considered clinically depressed. Mental health tests are useful because they can help translate experiences you are having into the clinical language that treatment providers might use. Duckworth notes that online screening tests can be useful when they create a standard metric. I’ve heard lots of people use the term “depression” to refer to times when they’re feeling really down, and I’ve worked with a lot of people who would never use the term “depression” but experience a great deal of distress with significant disruption in their lives. Standard tests, like the PHQ-9, can help reduce the subjective nature of mental health care. As Duckworth says, “If someone calls their doctor to report a PHQ-9 score of 7, or of 17, any professional can triage the person appropriately.”
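
To make concrete why scores like 7 and 17 are immediately meaningful to clinicians, here is a minimal sketch of PHQ-9 scoring, assuming the standard scheme: nine items rated 0-3 by symptom frequency, summed to a 0-27 total and mapped to the conventional severity bands.

```python
def phq9_severity(responses: list[int]) -> tuple[int, str]:
    """Sum nine 0-3 item ratings and label the conventional severity band."""
    assert len(responses) == 9 and all(0 <= r <= 3 for r in responses)
    total = sum(responses)
    bands = [(20, "severe"), (15, "moderately severe"),
             (10, "moderate"), (5, "mild"), (0, "minimal")]
    label = next(name for cutoff, name in bands if total >= cutoff)
    return total, label

# The two scores Duckworth mentions fall in clearly different bands.
print(phq9_severity([1, 1, 1, 1, 1, 1, 1, 0, 0]))  # (7, 'mild')
print(phq9_severity([2, 2, 2, 2, 2, 2, 2, 2, 1]))  # (17, 'moderately severe')
```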

Gilbody, however, has reasonable concerns. He argues that screening programs should exist only when something will come from the results, such as effective treatment and appropriate follow-up. Gilbody argues that Google will not provide these things, which is likely true. PsyberGuide, however, helps identify evidence-based app resources that could be useful if one is experiencing some of the symptoms captured by the PHQ-9. A mental health test could be the first step in selecting which apps might be appropriate for you and the difficulties you’re facing.

We’re beginning to work more closely with Mental Health America (MHA) to collaborate around their mental health tests. You can find the tests here and also in our resources section. MHA offers several tests that ask about a variety of mental health symptoms and can help identify a range of mental health conditions: depression, anxiety, bipolar disorder, PTSD, psychosis, and eating disorders, to name a few. Just like the Google test, the tests on MHA’s website represent gold-standard measures from clinical practice. For depression, they even use the very same test as Google – the PHQ-9. Taking these mental health tests can empower you with the language necessary to better understand your treatment options.

A word of caution about tests: they are not replacements for a diagnosis conducted by a trained professional. If you take a mental health test, you might be encouraged to seek a professional opinion. You might also be informed that you are not experiencing a level of distress that is indicative of clinical levels of a disorder. That does not mean that your distress is not real or not causing impairments in your life. Tests also represent a single point in time. As Gilbody cautions, some transient psychological distress may resolve over time. PsyberGuide also lists symptom trackers that might contain established mental health tests too. These tools can give a better idea of symptom changes over time and might help you know when to consult a professional.
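
As a small illustration of the kind of over-time summary a symptom tracker can provide, here is a minimal sketch using repeated PHQ-9 totals. The flagging rule (the conventional cutoff of 10 for moderate symptoms) is illustrative only, not clinical advice, and the dates and scores are invented.

```python
from datetime import date

# Hypothetical check-ins: (date, PHQ-9 total)
history = [
    (date(2017, 9, 1), 14),
    (date(2017, 9, 15), 11),
    (date(2017, 9, 29), 8),
]

first, last = history[0][1], history[-1][1]
print(f"change since first check-in: {last - first:+d} points")
if last >= 10:
    # 10 is the conventional "moderate" cutoff on the PHQ-9.
    print("score still moderate or above; consider discussing "
          "with a professional")
```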

Overall, I believe that mental health tests play a key role in getting people the help they need and in monitoring progress over time. We hope that the mental health tests provided by MHA can be a useful tool, and we are excited to help people become more aware of them.

  1. Duckworth, K., & Gilbody, S. (2017). Should Google offer an online screening test for depression? BMJ, 358, j4144.


My month with Headspace: Martha Neary

Martha Neary is the Project Coordinator of PsyberGuide. She is based at Northwestern University as a member of Northwestern’s Center for Behavioral Intervention Technologies. She has a strong interest and background in child mental health. Her research interests also include reducing healthcare disparities among minority populations and integrating technology into mental health interventions.

I am not a meditator. I would like to be. I like the idea of meditation, but have little personal experience of regular meditation. My parents, however, are avid meditators; my mother meditates at least 30 minutes nearly every day, so I have decided that this makes me Zen by association (it definitely doesn’t). My month with Headspace was the first time I meditated consistently. Through this practice, I meditated for 4 hours over 30 sessions. There were a lot of things I liked about the app, and some things I didn’t.

From an aesthetic point of view, I love how the app looks. It has nice colors and animations, it was fun to use and easy to navigate through. I liked the voice on the meditation tracks (Andy Puddicombe, Headspace co-founder). I found it soothing, more casual and informal than other guided meditations I’ve heard, and unlike Stephen, I enjoyed the British accent (this may, however, be because I’m Irish, and reluctantly admit that the British accent made me feel at least a little closer to home).

As a beginner meditator, I appreciated how the app eased me into meditating. The first 10 sessions were 3, 5, or 10 minutes. Longer meditations, even 15 or 20 minutes, have seemed somewhat intimidating in the past, so these were a more manageable starting point. After the first sessions, the meditations did get longer (10-20 minutes). But by the time this happened, my Headspace honeymoon period had fizzled. I found the longer meditations repetitive. I couldn’t push myself through the 3 beginner packs the app recommends starting with, so I completed 2 basic packs and moved on to the Happiness pack.

In my limited meditation experience, I always felt that I was doing meditation ‘wrong’ by getting distracted by other thoughts. Headspace assured me that this was OK (and even normal)! One of the core mantras of the app is that it’s not possible for us to block all thoughts. Instead, when our minds wander, we can gently bring them back to focus. This is illustrated by a clever exercise (accompanied by a cute animation in the app): “Imagine yourself sitting by a busy highway. Cars continue to pass by. You can choose to sit there and notice the cars without focusing on any of them, or you can follow a car down the road.” This non-judgmental approach was perfect for a beginner meditator (and self-confessed perfectionist) like me.

Admittedly, I enjoyed being a little part of the Headspace craze. I discovered that many of my friends were also using the app, which was motivating. The Headspace ‘buddy’ network is not something I really tapped into, though; I didn’t discover this feature until near the end of my month, which brings me to what I didn’t like about the app. After a month of using Headspace, I still don’t think I have explored all of its features. The quantity of content is extremely impressive: there are packs for health, happiness, work and performance, and sport, as well as single meditations for rough days, anxious moments, and travel. However, at times I felt overwhelmed by so much content. I avoided exploring new tracks because I didn’t want to sift through the options and didn’t have enough information to choose the most relevant one. Maybe that’s because I wasn’t being as savvy a user as I should have been, and it’s certainly a good complaint to have, but I would have liked more guidance to work through the app and discover the features most relevant for me. In an ideal world, this app would ask me ‘How are you feeling today?’ and, based on my response, guide me to the appropriate tracks.

Being a very evidence-focused person, I would have also liked some feature to track my progress. After a month of meditating, do I actually feel calmer, less stressed, happier? It’s hard to tell. My final qualm with the app is that I think it’s expensive. After the 10-day free trial, a subscription is $12.95 per month or $95.88 per year (USD; the yearly rate works out to $7.99 per month). I’m a huge advocate for investing in your mental health; I’m just not convinced that this is a worthwhile investment for me personally. I take time to destress and take care of my mental health by taking walks, spending time outside, or reading. I find these more enjoyable, more effective, and they’re free. Maybe Headspace is just not for me, or maybe I just didn’t reap all of the potential benefits due to issues around navigation and content overload.

Meditation is a very personal thing. This is a great app – it’s beautifully designed and has an abundance of content – but I struggled to turn my daily meditations into a routine. Once the novelty of the first 10 days of meditating wore off, I found the meditations repetitive and struggled to complete them. I don’t know that any other app would have led to a different outcome. There is a myriad of meditation apps out there that I’m interested to explore (many reviewed on PsyberGuide, like Calm, Potential Project, and Smiling Mind, to name a few). I’m not yet convinced that Headspace is the best one, although it is the best marketed and most widely used. Sure, it was nice to take 10 minutes each day to sit quietly, without my phone or computer or other distractions, and that’s something I’ll try to continue to do. I just don’t know if I necessarily need Headspace, or a paid monthly subscription, to do that.

My month with Headspace

Dr. Schueller is the Executive Director of PsyberGuide and an Assistant Professor at Northwestern University’s Feinberg School of Medicine. He is a faculty member of Northwestern’s Center for Behavioral Intervention Technologies (CBITs) and his work focuses on increasing the accessibility and availability of mental health resources through technology.

Headspace is arguably the most popular and widely used mental health app. Headspace’s website reports having over 15 million users. The TED Talk by Andy Puddicombe, one of Headspace’s founders, has nearly 8 million views. On our PsyberGuide site, the information about Headspace is one of the most visited pages (behind just our home page and product guide). Because of this, we’ve decided to take a deeper dive into Headspace, using it as an example to learn more about digital mental health tools and providing more content for those of you using Headspace or wanting to learn more.

First, though, I have a small confession. Of course, I’ve heard of Headspace, I’ve downloaded Headspace, I’ve looked at Headspace, but admittedly, I had never really used Headspace. That is to say, I had poked around the free content: listening to the basics, watching the videos, playing around with the reminders. But I never committed to practicing with Headspace on a consistent basis. So, for the past month I’ve used Headspace. I logged 3 hours of practice through 28 sessions (sorry, I didn’t make it to 1 session each day; some days I really struggled to find the time and motivation, although it appears I’m not alone in this1). Through this month, I learned a few things, and I wanted to share those thoughts with the PsyberGuide community.

Content. Headspace is one of the deepest mental health apps I’ve seen in terms of quantity of content. The audio tracks address diverse targets, including health issues covering all aspects of life, from pregnancy and cancer to stress and depression. Our professional lives and hobbies are covered in tracks on work & performance and sports. Headspace Pro provides content for those who want to take it to the next level (I definitely did not see myself as a “pro”). Headspace Kids groups content for ages 5 and under, 6-8, and 9-12 (sadly, I couldn’t get my 2-year-olds to sit through any of these, so no opinions here). Content is meant to fit the structure of your schedule, your attention span, and your skill level, with tracks ranging from 1 to 10 minutes. I really liked the shorter meditations, and my average duration ended up being 7 minutes. I would have loved a couple without that British accent though; no offense to the British, it just didn’t work for me.

Making it a practice. Here’s one place I would have liked more help. Headspace has a lot of features to make the app more “sticky”: notifications, streaks, a play button prominently displayed when you start up the app. But I really struggled with transforming my use of the app into a true practice. And this is definitely a skill I have in other places in my life. I’m a runner and put in about 2,000 miles each year. I’ve run 9 marathons and one 50k. Doing this has required me to train my body and my mind to tolerate runs of long duration and distance. However, my early days of running were short distances, alternating periods of running and walking, and I still have to boost my weekly mileage by about 40% to gear up for a race. As I began my month with Headspace, the 10-minute meditations really challenged me. I slogged through the Basic pack before I started to utilize the minis. This isn’t just a feature of Headspace; starting the practice of meditation is challenging no matter how you come to it. If I hadn’t committed to trying this for 30 days, then I probably wouldn’t have been able to do even as much as I did. I would have loved to have someone or something help me figure out ways to better use the tools to slowly build me up – alternating minis, packs, and singles.

Focusing on why I’m doing it. Maybe I missed a feature, but I felt the metrics Headspace reflected back to me (e.g., number of minutes, number of sessions, streaks) were more about what Headspace wanted me to do (continue to use the app) than about what I wanted to get out of Headspace. My goals are to improve my focus, reduce procrastination, and feel more relaxed, along with all the things I set out to accomplish in my life. I would love to track progress on the things I care about, or to connect my use to what I hoped to get out of Headspace.

So for 30 days Headspace simplified the practice of meditation. It put it in my pocket and allowed me to take it with me wherever I went – to my office, on trips. It’s hard to say if I’m more mindful, more productive, or happier as a result of it. But I also think I realized early on that Headspace was not for me. I stuck with it because I wanted to see where it would go. It has a lot to offer, and given its popularity it’s obviously doing a lot right. We’ll have another take on Headspace here in a few weeks, and we’ll put up some more material about the app on PsyberGuide. I hope you find it useful and, as always, feel free to post your own thoughts and experiences in the comments below or on our social media.

  1. Laurie, J., & Blandford, A. (2016). Making time for mindfulness. International Journal of Medical Informatics, 96, 38-50. http://dx.doi.org/10.1016/j.ijmedinf.2016.02.010


Bringing apps into the therapy room

Dr. Schueller is the Executive Director of PsyberGuide and an Assistant Professor at Northwestern University’s Feinberg School of Medicine. He is a faculty member of Northwestern’s Center for Behavioral Intervention Technologies (CBITs) and his work focuses on increasing the accessibility and availability of mental health resources through technology.

Many people get excited about the potential of mobile apps to transform mental health care because current treatment resources are insufficient. Long waiting lists and limited availability in rural areas contribute to difficulties in receiving care. Therefore, a lot of apps, app-based services, and app research studies are intended to be used instead of traditional face-to-face services. However, this overlooks the fact that for many people, a professional relationship with a licensed mental health provider might be necessary to deal with their mental health issues. Furthermore, licensed providers might be able to guide people to effective apps and help them steer clear of bad ones. Thus, mobile apps can play a role in the context of face-to-face practices. As adjuncts to traditional care, they can increase people’s engagement with their treatment and hopefully make such treatments more effective and efficient.

Providers then need guidance as to which apps are effective, what features they offer, and how they might be used in practice. A recent review of mobile apps for mental health attempted to address some of these issues with questions such as: What are the common features of apps that are effective? And what are the implications of such findings for practitioners? Joyce Lui, David Marcus, and Christopher Barry from Washington State University combed the research literature and identified 21 studies evaluating 18 different apps.1 These apps targeted anxiety disorders, mood disorders, post-traumatic stress disorder, schizophrenia, and substance use disorders. Eight of those apps, or roughly half, were designed specifically as adjuncts to traditional therapy, including DBT Coach and The Stress Manager. We’ve identified several other apps on PsyberGuide that are intended for use with providers, like eCBT and the VA’s PE Coach.

Most apps have some form of symptom monitoring and a menu of therapeutic skills. Common skills include cognitive restructuring, relaxation techniques, drink refusal skills, and scheduling pleasurable activities. If you are a provider and these are skills you cover in your practice, then there are several apps that can help your clients practice these skills outside of sessions. If you are a client receiving therapy, apps that help monitor symptoms might provide you the tools necessary to bring objective data into the therapy room when you’re asked, “So, how was your week?”

Very few apps offer testimonials from individuals who have recovered from the same disorder or used the app. This is unfortunate, because learning how other people used the same app to overcome similar difficulties could help both clients and providers determine how that app might be useful for them. Social features within the apps often draw on people’s own phone contacts or share progress on social media. I would strongly encourage clients and providers to be wary of the use of social media within mental health apps. Few apps provide details about how users’ information is gathered or returned to the software developers. Information may be shared to a user’s social media account inadvertently, without the user’s knowledge, or a user may later come to regret information that was shared intentionally. It would be great if more apps had safe and secure ways of connecting users to other people, but currently this does not seem to exist in most mental health apps.

In the end, it appears as though apps that could be brought into the therapy room face many of the same issues that mental health apps do generally. Too many apps with too little research evidence make it hard to separate the good from the bad. Therefore, I would really encourage clients and providers to consult guides like PsyberGuide, to engage with their professional communities (e.g., local and national organizations), and, if you are using apps in your practice, to share what you’re learning. We’re always happy to hear from people at PsyberGuide and will be working to develop more resources to help providers interested in using apps in their practice.

  1. Lui, J. H., Marcus, D. K., & Barry, C. T. (2017). Evidence-based apps? A review of mental health mobile applications in a psychotherapy context. Professional Psychology: Research and Practice, 48(3), 199-210. http://psycnet.apa.org/psycinfo/2017-07848-001/


“How do you use this thing?” Apps and Usability

Dr. Schueller is the Executive Director of PsyberGuide and an Assistant Professor at Northwestern University’s Feinberg School of Medicine. He is a faculty member of Northwestern’s Center for Behavioral Intervention Technologies (CBITs) and his work focuses on increasing the accessibility and availability of mental health resources through technology.


Although many researchers and academics have decried the lack of scientific evidence for many mobile apps available on the public app stores, fewer have brought up the glaring usability issues of such resources. Usability refers to how easy an app is to use. In order for a mobile health app to be useful, it must first be usable. Usability testing – the analysis of how well users can learn and use a product to achieve their goals, and how satisfied users are with that process – has a long history in the development and evaluation of technologies. Engineers, computer scientists, and human-computer interaction specialists usually consider usability guidelines in their designs and evaluate usability as a first step in the evaluation of novel technologies. Usability testing is less frequently used by clinical psychologists or behavioral scientists conducting research on mobile mental health apps, who often jump to evaluations of whether apps lead to intended changes in clinical outcomes (e.g., feeling less depressed or anxious, increased well-being). Therefore, we often learn whether or not an app can be effective without knowing how easy it is for people to use it on their own. Usability is captured in the “Functionality” score on the Mobile App Rating Scale (MARS) that PsyberGuide uses to rate products listed on our site. Functionality on the MARS refers to the functioning, ease of use, navigation, flow logic, and gestural design of the app.

A recent study explored the usability of 4 mental health apps (Depression CBT, Mood Tools, Optimism, and T2 Mood Tracker).1 Twenty-six patients were invited into the laboratory to complete a series of tasks on these apps, and researchers watched, videotaped, and questioned participants to learn more about the usability of these apps. These tasks included data entry tasks, such as entering one’s mood or taking a depression test, and data retrieval tasks, such as viewing a graph or watching a video. Participants were able to complete the data entry tasks without assistance only 51% of the time. Data retrieval tasks were even more challenging, at 43%. Participants reported that their experiences with these apps were frustrating and instilled a lack of confidence that such tools could be helpful for them. Not an extremely positive review of the usability of these apps.
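
For the curious, here is a minimal sketch, with entirely hypothetical observations, of how task-completion rates like the 51% and 43% figures above are tallied from usability sessions.

```python
# Each record: (participant, task, task type, completed without assistance?)
observations = [
    ("P01", "enter mood",      "data_entry",     True),
    ("P01", "view mood graph", "data_retrieval", False),
    ("P02", "enter mood",      "data_entry",     False),
    ("P02", "view mood graph", "data_retrieval", True),
]

for task_type in ("data_entry", "data_retrieval"):
    # Collect the pass/fail outcomes for this type of task.
    outcomes = [ok for _, _, t, ok in observations if t == task_type]
    rate = 100 * sum(outcomes) / len(outcomes)
    print(f"{task_type}: {rate:.0f}% completed unassisted")
```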

I bring up these results as a word of warning both to those developing mobile health apps and to those using them. Developers, we can and should do better. Mobile health apps should be easy to use for a wide range of people. The authors of this paper offered the following advice: (1) explain why each task is helpful, (2) use simple language and graphics, (3) reduce the number of screens, and (4) reduce manual entry as much as possible. And for those who want to use mobile health apps, this is one reason we provide multiple ratings of mental health apps on our website. PsyberGuide ratings address the credibility of a product. MARS ratings combine several aspects of an app, including its functionality. Lastly, expert reviews provide detailed information about why and how an app might be beneficial for you. An app might have a lot of research support, and thus a high PsyberGuide rating, but low scores for engagement, functionality, and aesthetics, and thus a low MARS score. Usability and functionality might be more important for particular people. For designers, we need to better understand the capacities of the people we’re designing for. For users, it’s helpful to have a sense of your own capabilities with mobile technology and to select an app that best fits those capabilities. If you’re having trouble using a mobile app, you’re not alone, and it might be useful to run it by someone you trust – a doctor, family member, or friend – to see if they can help you figure it out. And remember that for you to benefit from a mobile app, you need to use it, and one of our tasks at PsyberGuide is to help you find which apps are most usable.

  1. Sarkar, U., Gourley, G. I., Lyles, C. R., Tieu, L., Clarity, C., Newmark, L., … & Bates, D. W. (2016). Usability of commercially available mobile applications for diverse patients. Journal of General Internal Medicine, 31(12), 1417-1426. https://www.ncbi.nlm.nih.gov/pubmed/27418347


What Can Tooth Brushing Teach Us About Behavior Change in the Era of Digital Health?

by John Torous

John Torous MD is a board-certified psychiatrist with a background in computer science. He co-directs the Beth Israel Deaconess Medical Center’s digital psychiatry program, www.psych.digital, where he also serves as a staff psychiatrist and a clinical informatics fellow. He also serves as the Editor-in-Chief of JMIR Mental Health. You can follow him on Twitter @JohnTorousMD.


All health apps, including mental health apps, face a serious engagement problem. A large observational study of an asthma-monitoring app [1], real-world users of an app for post-traumatic stress disorder, PTSD Coach [2], and even the popular gamified physical activity app Pokémon GO [3] all show the same pattern: many users download the app, most use it a few times, but few persist beyond a few weeks. In short, despite people’s strong interest in using health apps, few people stick with them.

It is useful to check our assumptions about ideal engagement against other healthy behavioral patterns. Take tooth brushing, for example. This is a healthy habit that seems routine and ubiquitous. The American Dental Association recommends that people brush their teeth twice a day. What can tooth brushing teach us about helping people stick with health apps?

A first step is to check the data. Apparently only 69% of Americans brush their teeth twice per day [4]. That means nearly one third are not able to stick with this simple healthy habit! If nothing else, this shows how challenging behavior change is; there are no ‘easy wins.’ But looking at the bright side, 69% is a large share of the population. What can these people teach us about success? What helps them stick with twice-daily tooth brushing? While there are of course many factors, one of the most common is that it is fast and part of their daily routine.

Mental health apps can learn from this. Brushing your teeth takes about two minutes – but how long does it take to use most mental health apps? Many apps offer lessons drawn from traditional face-to-face therapies that take users hours to read! Others create entire digital ecosystems that take hours to master and significant time to navigate. What if mental health apps could be simpler and faster to use? A recent study by Dr. David Mohr and the IntelliCare team, including Stephen Schueller, PsyberGuide’s Executive Director, explored whether a suite of mental health apps called IntelliCare, designed for ultra-brief use sessions, would be engaging and effective [5]. Findings showed that usage of the apps followed from this design consideration: app sessions averaged 1.1 minutes, and people used the apps an average of 3.5 times per day. Furthermore, people experienced significant reductions in depression and anxiety, with over three-fourths of people either in full remission or recovery after 8 weeks. The results of the study suggest that these ultra-brief apps were indeed both effective and engaging – highlighting a new paradigm for mental health apps, and perhaps all health apps.
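
As an illustration of the engagement metrics reported in the study, here is a minimal sketch that computes average session length and sessions per day from usage logs. The log entries are made up for illustration, not IntelliCare data.

```python
from datetime import datetime

# Each entry: (session start, session end) for one user's app sessions.
sessions = [
    (datetime(2017, 5, 1, 9, 0),   datetime(2017, 5, 1, 9, 1)),
    (datetime(2017, 5, 1, 13, 30), datetime(2017, 5, 1, 13, 31)),
    (datetime(2017, 5, 2, 8, 45),  datetime(2017, 5, 2, 8, 47)),
]

# Session lengths in minutes, and the number of distinct days observed.
durations = [(end - start).total_seconds() / 60 for start, end in sessions]
days_observed = len({start.date() for start, _ in sessions})

print(f"average session: {sum(durations) / len(durations):.1f} min")
print(f"sessions per day: {len(sessions) / days_observed:.1f}")
```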

Apps have much to offer mental health, but there is still much that apps must improve on to be more effective mental health tools. The IntelliCare study offers an encouraging solution to engagement. You can learn more about IntelliCare and find where to download here.

Trivia: What percent of people brush their teeth while driving? 0.2%. What percent of teens have used their smartphone while driving? 80%.

  1. Chan YF, Wang P, Rogers L, Tignor N, Zweig M, Hershman SG, Genes N, Scott ER, Krock E, Badgeley M, Edgar R. The Asthma Mobile Health Study, a large-scale clinical observational study using ResearchKit. Nature Biotechnology. 2017 Apr 1;35(4):354-62.
  2. Owen JE, Jaworski BK, Kuhn E, Makin-Byrd KN, Ramsey KM, Hoffman JE. mHealth in the wild: using novel data to examine the reach, use, and impact of PTSD coach. JMIR mental health. 2015;2(1):e7.
  3. https://medium.com/achievemint/can-augmented-reality-alter-reality-quantifying-the-pok%C3%A9mon-go-effect-561bf996d4b9
  4. https://www.deltadental.com/Public/NewsMedia/NewsReleaseDentalSurveyFindsShortcomings_201409.jsp
  5. Mohr DC, Tomasino KN, Lattie EG, Palac HL, Kwasny MJ, Weingardt K, Karr CJ, Kaiser SM, Rossom RC, Bardsley LR, Caccamo L. IntelliCare: An Eclectic, Skills-Based App Suite for the Treatment of Depression and Anxiety. Journal of Medical Internet Research. 2017;19(1):e10.


Can apps reduce anxiety?

Dr. Schueller is the Executive Director of PsyberGuide and an Assistant Professor at Northwestern University’s Feinberg School of Medicine. He is a faculty member of Northwestern’s Center for Behavioral Intervention Technologies (CBITs) and his work focuses on increasing the accessibility and availability of mental health resources through technology.


A recent meta-analysis by Joseph Firth and colleagues [1] explored whether the scientific evidence supports the claim that smartphone apps can reduce symptoms of anxiety. Meta-analyses are often considered one of the most useful forms of scientific information because they combine the results of several studies to produce a more powerful investigation than any single study could provide. The findings of Firth and colleagues’ meta-analysis suggest that smartphone apps are an effective form of treatment for anxiety, although the size of the impact is what researchers would call small-to-moderate. That means we can reliably say that smartphone apps designed to decrease anxiety can do so, but that the impact is less than one would expect from gold-standard face-to-face treatments or medications. This is not surprising: the dose of “treatment” from these smartphone apps was much shorter than what one would typically receive in face-to-face treatment or medications, ranging from 4 to 10 weeks with an average of 6.1 weeks.
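
For readers curious about the mechanics of “combining the results of several studies,” here is a minimal sketch of inverse-variance pooling, the core idea behind meta-analytic effect sizes, shown in its simple fixed-effect form. The effect sizes and standard errors below are invented for illustration, not taken from Firth and colleagues, whose published analysis uses more elaborate (e.g., random-effects) variants of this idea.

```python
import math

# (effect size g, standard error) for three hypothetical studies
studies = [(0.45, 0.15), (0.20, 0.10), (0.35, 0.20)]

# More precise studies (smaller standard errors) get larger weights.
weights = [1 / se ** 2 for _, se in studies]
pooled = sum(w * g for (g, _), w in zip(studies, weights)) / sum(weights)
se_pooled = math.sqrt(1 / sum(weights))
low, high = pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled

print(f"pooled g = {pooled:.2f}, 95% CI [{low:.2f}, {high:.2f}]")
```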

It is useful to dig into this meta-analysis a little more deeply, as it tells us what we know (and don’t know) about the effectiveness of smartphone apps for anxiety and some of the limitations of the current research. First, the authors identified only 9 eligible studies that collectively represented 1,837 total participants. There are many more than 9 apps targeting anxiety available in the Google Play and Apple iTunes stores, which again shows that most of the available apps have no direct scientific evidence confirming their effectiveness. Conversely, only 4 of the studied apps are currently available in those app marketplaces (Flowy, myCompass, SuperBetter, and components of the LivingSMART intervention, which consisted of non-mental-health apps such as Google Calendar, Evernote, StayFocusd, or SimplyNoise with instructions on how they could be applied to mental health issues). We currently only have SuperBetter listed on PsyberGuide, along with an expert review, but will work to get these other resources listed in the near future. The 1,837 people included in this meta-analysis likely represent a small portion of the total number of people with anxiety who have downloaded a mental health app, and it is impossible to know how well they represent people who download apps for anxiety more generally. These participants tended to be adults (average age of 36.1 years) and women (65.2%), and no study used a diagnosed anxiety disorder as a criterion for inclusion. Therefore, it’s hard to say how well apps would work for adolescents or older adults, men, or people with more severe anxiety.

The most impactful apps were those delivered alongside some other treatment, either face-to-face or Internet-based therapy programs. This is an important caution: for many people, apps alone are unlikely to be a sufficient treatment. If you’re not currently receiving treatment, an app can be a good way to introduce you to important treatment skills like deep breathing, exposure, goal setting, or self-monitoring. If you’re in treatment, apps can help support the work you are currently doing, although it might be useful to ask your provider for suggestions on which skills or apps might be the most useful. And lastly, if you’ve received treatment in the past, apps can help reinforce the work you’ve done or keep up some of the skills you’ve learned. If you use an app and don’t feel like you’re getting better, that doesn’t mean you’re beyond help; it might be that you haven’t found the right app or that it would work better for you in combination with something else.

Thus, although this meta-analysis is a useful summary of the current research evidence, it illustrates the need for additional information: evidence produced by researchers on the efficacy of such apps, evidence gained from consumers on the effectiveness of such apps, and evidence from experts, like that provided in PsyberGuide, to better guide decisions regarding the usefulness of smartphone apps for anxiety. The findings provide enthusiasm that smartphone apps can work, but more information is needed about which ones, for whom, and how people can get the most benefit out of such resources.

  1. Firth, J., Torous, J., Nicholas, J., Carney, R., Rosenbaum, S., & Sarris, J. (2017). Can smartphone mental health interventions reduce symptoms of anxiety? A meta-analysis of randomized controlled trials. Journal of Affective Disorders, 218, 15-22. http://www.sciencedirect.com/science/article/pii/S0165032717300150