Review

Human-Computer Interaction in Digital Mental Health

Australian Institute for Suicide Research and Prevention, Department of Applied Psychology, Griffith University, Messines Ridge Road, Mount Gravatt, QLD 4122, Australia
* Author to whom correspondence should be addressed.
Informatics 2022, 9(1), 14; https://doi.org/10.3390/informatics9010014
Submission received: 30 January 2022 / Revised: 10 February 2022 / Accepted: 21 February 2022 / Published: 22 February 2022
(This article belongs to the Special Issue Feature Papers in Human-Computer Interaction)

Abstract

Human-computer interaction (HCI) has contributed to the design and development of efficient, user-friendly, cost-effective, and adaptable digital mental health solutions. However, HCI has not been well integrated into these technological developments, resulting in quality and safety concerns. Digital platforms and artificial intelligence (AI) have strong potential to improve prediction, identification, coordination, and treatment in mental health care and suicide prevention services. AI drives web-based and smartphone apps; mostly it is used for self-help and guided cognitive behavioral therapy (CBT) for anxiety and depression. Interactive AI may support real-time screening and treatment where mental healthcare systems are outdated, strained or lacking. The barriers to using AI in mental healthcare include accessibility, efficacy, reliability, usability, safety, security, ethics, suitable education and training, and socio-cultural adaptability. Apps, real-time machine learning algorithms, immersive technologies, and digital phenotyping are notable prospects. Overall, there is a need for better and faster attention to human factors in combination with machine interaction and automation, higher levels of effectiveness evaluation, and the application of blended, hybrid or stepped care in an adjunct approach. HCI modeling may assist in the design and development of usable applications, help to effectively recognize, acknowledge, and address the inequities of mental health care and suicide prevention, and support the digital therapeutic alliance.

1. Introduction

Human-computer interaction (HCI) involves the design and development of computer technology with a focus on facilitating its use, drawing on accumulated influences [1]. HCI also encompasses the implementation and evaluation of technologies, with an emphasis on user experience (UX).

1.1. Early Evolution of Psychological Science in Human-Computer Interaction

Although HCI emerged in the 1950s, its psychological aspects did not arise until the 1970s, when cognitive engineering interfaced with computer science and technology [2]. HCI prospered in the 1980s, becoming an important factor for computer scientists designing and developing successful computer UXs, notably graphical interfaces via multiple tiled windows with various applications (most commonly text processing and spreadsheets, but also video games and the World Wide Web) [3]. In the early 1980s, there was strong clinical resistance to the use of automation in mental health, stemming from a lack of understanding of the psychological factors of HCI [4]. In the mid-1980s, the human needs of HCI were overshadowed by the technological advances of harder (more technical) sciences and constrained by obstacles related to the potential of its body of knowledge (low impact, scope, affect and user application) [5]. Innovative computer system design in the late 1980s sought solutions to a range of cognitive, social and organizational problems [6].
Developments in computer technology continued to dominate HCI theory development in the 1990s because of the inherently unclear conceptual structure of cognitive psychology [7]. There was a progressive and challenging test domain for technological enhancement and application [8]. Few computer services were used in regular mental health care, mainly because of usability and adoption issues, but randomized controlled trials (RCTs) demonstrated the potential of self-help and clinical integration from the use of desktop or laptop computers for screening and tracking for phobic, anxiety, panic, and obsessive-compulsive disorders, nonsuicidal depression, obesity, and smoking cessation [9].

1.2. Modern Developments in Human-Computer Interaction and Digital Mental Health

Themes of early modern developments in HCI (since 2005) suggest that researchers grappled with human factors. An evaluation of an automated telephony mental health care system informed HCI with the finding that users preferred a system that sounded and spoke like a human professional rather than a machine [10]. Information and communication technologies (ICTs) were primarily designed for commercial purposes, and there were suggestions of bias, misinformation and disinformation because no web-based technologies and designs met the needs of people recovering from a severe mental illness who were seeking employment (codesign for HCI and mental health innovation was generally lacking for marginalized populations) [11]. However, young people were very satisfied with internet-based self-help [12].
A lack of understanding of socio-cultural values and a lack of evaluation of the patterns of use [13] led to a call for culturally sensitive user interfaces and more inclusive theory and evaluation processes [14] to improve access to culturally appropriate mental health care for Indigenous Australians [15,16]. After meaningful community engagement, a conceptual framework of an Indigenous Australian project was changed from ‘digital mental health’ to the culturally relevant ‘digital social and emotional wellbeing’, which led to outcomes mostly through an online therapy program and comprehensive website [17].
Strong evidence was established for technology-based mental health interventions in anxiety and depression treatment [18]. Design and evaluation guidelines described the accumulated HCI knowledge [19]. A review of behavioral intervention technologies (BITs) in mental health discussed the varied levels of evaluation (ranging from basic development and evaluation to implementation studies) for web-based and mobile interventions, social media, and virtual worlds, as well as humans and gaming [20]. There was a call for the integration of theoretical and research concepts from the behavioral and psychological sciences, engineering, computer science, HCI, and communications, and for designs to improve development and evaluation methodologies, as well as to develop implementation models for tests [20]. Computational innovations from natural language processing (NLP) used in machine learning algorithms were not yet captured [21]. Other HCI challenges included design interfaces, sustainable adoption and the unobtrusive use of mobile phones and wearable sensors [22], digital phenotyping for mental health [23], and wearable interfaces, online communities, and positive computing [24].
Digital mental health treatments did not make a successful transition from positive efficacy trials to implementation [25]. Although the potential of digital mental health (including machine learning and gaming) means the field must generally be considered in the context of real-world use [26], a research scan found that only 30% of the papers consisted of clinical studies [27]. It was recommended to separately consider the efficacy and retention of digital tools for interventions, as well as the opportunity for creativity in computer and games design, in order to precisely build and assess preventive or therapeutic tools [27]. An integrative review found a general oversight of HCI by mental health care practitioners in the deployment of e-mental health interventions for therapeutic purposes, resulting in many applications falling short of safety and quality assurance [28]. The use of digital technologies in mental health care was hindered by privacy, trust and UX issues (e.g., substantial cognitive load and a lack of personalization), which negatively affected engagement and retention [29,30,31,32]. Future digital health studies were recommended to give increased priority to the human factors in combination with machine interaction and automation categories of HCI [33]. To our knowledge, there is no literature specifically focusing on these two categories of HCI in relation to digital mental health.
The COVID-19 pandemic overwhelmed mental health care resources [34]. Web-based therapies and apps facilitated access and follow-up (although lacking in evidence from population studies) [35]. Virtual psychiatric care visits became standard practice and there was a marked increase in the use of mental health care apps (e.g., Talkspace) [36]. The strong evidence base for telemental health meant that it was recommended for rapid scale-up and envisaged as an adjunct approach in the long term (especially for youth and indigenous populations) [37]. However, there are inequities stemming from accessibility to, and competency with, digital mental health care, in addition to safety and privacy issues from insecure digital platforms and the lack of a stepped model of care [36]. It was recommended to rectify these issues and focus investment on secure, stable digital platforms and interactive, personalized artificial intelligence (AI)-based apps for real-time monitoring and treatment [36].
Although the use of digital mental health tools has increased, the literature suggests that HCI is among several issues that need to be addressed if these tools are to be effective. Therefore, this integrative review will explore some recent and relevant literature on digital mental health and HCI. The aim is to provide an integrated summary of the potential for various stakeholders to assist HCI in stimulating better and faster digital tools and technology, resulting in safe and higher quality mental health care.

2. Methods

Two authors independently assessed all abstracts against the inclusion and exclusion criteria according to the five-step amendment [38] (see Table 1) of a modified integrative review framework [39]. This methodology was applied to purposively sample, critique and synthesize the empirical and theoretical literature converging “digital mental health” and “human-computer interaction” (i.e., with a focus on effectiveness, feasibility, accessibility, sociocultural inclusion, rigor and readiness for adoption and upkeep). The review followed the HCI categories outlined by Stowers and Mouloua [33] (i.e., usability; safety; security, privacy and trust; automation; training and simulation; information/patient records; virtual mental healthcare; and human factors–machine interaction). The human factors–machine interaction and automation categories were especially focused on in the literature search as recommended by Stowers and Mouloua [33].
The searches were within the Science Direct, Sage, Google Scholar, CrossRef and ACM Digital Library databases as well as reference lists of relevant reviews. Search terms included human-computer interaction, digital mental health, web-based and smartphone technology, artificial intelligence, digital interventions, digital phenotyping, telehealth, phone, email, internet, virtual reality, video games, and combinations of these terms. Journal articles and book chapters informed a summary of the historical period of HCI in mental health (1970 to 2004) and conference proceedings, journal articles, media articles as well as websites were analyzed for relevant contributions in the modern era (2005 to 2022).
The purposive sample results were classified from the main themes of investigation/discussion according to the main groups outlined in our definition of digital mental health: web-based and smartphone technologies; artificial intelligence; digital phenotyping; and immersive technologies [40]. The results summarize the body of knowledge and contextualize the opportunities and challenges of HCI in pertinent digital mental health solutions. The discussion critically elaborates on the key HCI challenges and the potential of digital mental health in an assistive capacity. It also expands upon recently addressed issues—ethics, the therapeutic relationship, and applicable models of care for an adjunct approach to mental health care [40].

3. Results

3.1. Web-Based and Smartphone Technologies

Digital mental health interventions are predominantly delivered through self-guided or clinician-supported (guided) approaches via online programs and apps (web-based and smartphone technologies) providing evidence-based therapy to the user [30]. Although web-based interventions (using text-based didactic information, audio, video and animation) were noted in 2013 as being effective in standalone or coach- or therapist-assisted treatment, little was known about their design and implementation (e.g., the relationship between the quality and design of websites and user retention and outcomes) [20]. A systematic review conducted in the same year found that Internet-based interventions with a cognitive behavioral focus were the most promising in reducing symptoms of depression in young people (with regard to efficacy, adherence and engagement) [41]. Although limited by the high number of heterogeneous web-based interventions and a low number of included studies, a systematic review and meta-analysis found effectiveness in terms of reduced depression and anxiety and enhanced quality of life and mindfulness skills (e.g., in those with clinical anxiety) [42]. An abundance of mental health-related apps via mobile and desktop devices increased the accessibility and use of internet-based mental health screening, treatment and after-care [43], as well as design for mental wellbeing [44]. Meta-analyses of RCTs including computerized tools found positive outcomes for different mental health disorders [45,46], but there was difficulty in translating these findings into clinical practice because of low engagement [27,47,48].
Digital mental health interventions require innovative methods to increase their potential. A mixed-methods study investigated the behavior and experiences of web-based and smartphone intervention users with the intent to increase engagement—the data analysis specified differences between the overall intervention and different aspects of it, with the aim of demonstrating how passive data can help to individualize treatment and improve/assure quality [49]. The limited uptake of some evidence-based services called for higher levels of empirical evidence and adherence to engagement for web-based interventions (including in trials)—comprehensive evaluation was recommended to increase patient safety and fidelity to clinical service guidelines [30]. An RCT of a virtual clinic for university students with psychological distress found utility in the results: although the intervention was not effective in reducing symptoms of depression, anxiety, or psychological distress, there was satisfaction with the virtual approach and suggestions for investigating guided and/or tailored treatments in a stepped model of care [50]. The stepped approach was described as targeting those with mild to moderate anxiety and depression—those people could start with self-help or chatbots, followed by therapist-guided digital therapy where there is no improvement; non-responders are coordinated to face-to-face therapy [36]. An evaluation of Internet-delivered treatment systems noted their clinical effectiveness, but recommended that guidelines be established for testing usability with regard to up-to-date and relevant design and technology [51].
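The stepped model described above reduces to a simple escalation rule: start at a low-intensity tier and step non-responders up. The following is an illustrative sketch only; the tier names and the `next_step` function are our own shorthand, not drawn from the cited studies:

```python
from enum import Enum, auto

class Step(Enum):
    SELF_HELP = auto()       # self-guided programs or chatbots
    GUIDED_DIGITAL = auto()  # therapist-guided digital therapy
    FACE_TO_FACE = auto()    # non-responders coordinated to in-person care

def next_step(current: Step, improved: bool) -> Step:
    """Responders stay at their current tier; non-responders escalate."""
    if improved or current is Step.FACE_TO_FACE:
        return current
    if current is Step.SELF_HELP:
        return Step.GUIDED_DIGITAL
    return Step.FACE_TO_FACE

# A client showing no improvement on self-help is stepped up to guided care.
print(next_step(Step.SELF_HELP, improved=False).name)  # GUIDED_DIGITAL
```

In practice the `improved` signal would come from validated symptom measures, and stepping decisions would remain a clinical judgment rather than an automated rule.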
Varied levels of empirical evidence have been noted for digital platforms, with suggestions for hybrid and stepped models of care to increase usability. A platform that integrates standalone e-mental health services with a global network of face-to-face youth mental health services (MOST+) was noted as being reliable, acceptable, and scalable at the pilot evaluation stage [52]. An RCT with Australian secondary schools established a small but positive effect on help-seeking intentions for mental health from a web-based mental health service that integrates screening with stepped intervention and CBT [53]. Future trials were recommended to determine a threshold for clinically meaningful significance. A scoping review of online preventive (early) interventions for youth found wide-ranging effectiveness, usability, and acceptability but recommended the codesign of clinical trials based on the clinical staging model [54]. A qualitative study recommended that service users with a severe mental illness and mental health workers cooperatively use e-mental health resources (e.g., an interactive website) in community mental health practice [55]. Although an investigation of parents' online fora providing informational support on children's mental health found a significant amount of evidence-based knowledge, the quality of mental health information shared on digital platforms is relatively unknown [56].
A youth-codesigned mobile phone app was proposed as potentially useful for self-monitoring and the management of mood symptoms in young people with depression, suicidal ideation and self-harm [57]. However, various studies have concluded that apps used for self-help for a range of mental health disorders are of poor quality [27]. The efficacy and usability of apps is less than that of digital platforms (a small number of RCTs demonstrated an effect on the primary outcome measure) [58]. It was suggested to evaluate the potential of apps beyond empirical evidence and their usefulness as a monitoring tool for digital phenotyping [58]. A cluster RCT with adolescents evaluated the effectiveness of a mental health self-monitoring mobile app. Efficacy was not found, but there is a potential assistive capacity with regard to providing real-time summaries of client data to therapists. Codesign was noted as important to increase engagement, but digital platforms/websites may be better investments for providing mental health information [59]. A review of the usability of mobile mental health apps found that usability was not the focus of HCI evaluation; there was a suggestion to establish a usability questionnaire to measure prospects beyond effectiveness, efficiency, and satisfaction [60].
HCI methods and knowledge may help to foster the digital therapeutic alliance (e.g., in mental health apps) [61]. A conceptual study proposed that HCI theories (i.e., persuasive system design, affective computing, eudemonic psychology and positive computing, and the human-smartphone connection) may contribute to a befitting, customized measure of the digital therapeutic alliance (as opposed to translating from traditional measures of the therapeutic alliance) [61]. A review found clients were generally satisfied with the therapeutic alliance through videoconference therapy, but computers may assist or interfere with the client-psychologist relationship [62]. A qualitative exploration with university students of the acceptability of a digital mental health platform for delivering CBT by a virtual coach found engagement issues related to a lack of interpersonal factors in the digital therapeutic alliance; it was suggested to improve the platform's functionality and to make the avatar less humanlike to increase usability and effectiveness [63].
Social media use and the associated analysis of behavioral change are proposed to have potential in explaining and intervening with mental health. A meta-review analyzed computer-mediated communication (CMC) via ICTs (e.g., email, mobile texting, instant messenger, and social network sites) in terms of their operationalization (i.e., technology-centered or user-centered) and association with a diversity of mental health outcomes [64]. There was a very small negative effect from the use of social network sites—more comprehensive mental health outcomes are required, as well as a more rigorous understanding of the characteristics of interactions and transmitted messages (rather than just analyzing screen time) [64]. An unsupervised approach (clustering via machine learning algorithms that used NLP) was applied in an analysis to understand users' behavioral features on a social network site (mainly Twitter) and distinguish normal users from at-risk users (the latter were characterized by the scale of change of their use) [65]. The proposed early intervention approach lacked discussion of the potentially hindering human factors–machine interaction, in that a clinical psychologist would need to be resourced and trained to observe and verify findings (determining true and false positive cases) as well as provide counseling and treatment. A similar study that identified patterns of language in social media users aimed to distinguish between users diagnosed with a mental disorder and healthy users with a model of emotion evolution to assist clinicians in diagnosing patients (with depression, anorexia, and self-harm tendencies) [66].
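The at-risk users in the clustering study above were characterized by the scale of change of their platform use. As a greatly simplified sketch of that single behavioral feature (the function names and the threshold are illustrative assumptions; the cited work used NLP-based clustering over many features):

```python
def usage_change(weekly_posts: list[int]) -> float:
    """Relative change in posting activity between the first and second
    half of the observation window (0.0 means no change)."""
    half = len(weekly_posts) // 2
    early, late = sum(weekly_posts[:half]), sum(weekly_posts[half:])
    if early == 0:
        return float(late > 0)
    return abs(late - early) / early

def flag_at_risk(users: dict[str, list[int]], threshold: float = 1.0) -> set[str]:
    """Flag users whose scale of change in use exceeds the threshold."""
    return {u for u, counts in users.items() if usage_change(counts) > threshold}

# A stable poster vs. one whose activity spikes sharply in later weeks.
print(flag_at_risk({"stable": [5, 5, 5, 5], "shifting": [2, 2, 10, 14]}))  # {'shifting'}
```

In the cited approach, features like this feed an unsupervised clustering step, and any flagged case would still require clinical verification of true and false positives.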
A blended approach, possibly using a combination of digital tools in addition to pharmaceuticals, may elicit enhanced insights. A machine learning algorithm applied an NLP method in a content analysis of SMS text messages on a digital mental health service platform (i.e., Talkspace) to identify users' COVID-19 pandemic-related concerns [67]. This study demonstrated the potential to increase understanding of contextual mental ill-health from a blend of internet-based technology with digital phenotyping and AI. The overall significant increase in anxiety was drawn from a blended approach (i.e., unstructured therapy transcript data as well as clinical assessment for anxiety and depression). A bio-affective-digitalism theoretical framework was proposed in a case study noting the potential of mixing human and non-human factors through an ingestible sensor (Abilify MyCite) that is connected to an online portal on the Internet and smartphones, as well as trackers, wearable patches, apps, and programming [68].
A blend of peer-to-peer and digital platform approaches may be a formidable adjunct to clinical care. In 2015, a systematic review of online peer-to-peer support for young people found a lack of studies and effectiveness despite increased use as an active intervention [69]. More recently, there is the suggestion that utilizing (para-)professionals offers good potential to fill gaps in mental health care (i.e., if traditional mental health care systems collaborate effectively). A framework of engagement was applied in the analysis of patterns of use in online peer-to-peer support platforms (Reddit and Talklife), which facilitate mental health support but require users to interact and engage [70]. Increased engagement with online platforms may be derived from the use of mental health subcommunities (e.g., as per Reddit) and mutual discourse for peer-supporter retention (particularly relevant on Talklife) [70]. However, it was acknowledged that encouraging users to report self-disclosures may have potentially negative consequences with regard to their vulnerability. There is increasing use of peer support workers, but it is not yet known if this approach can successfully blend with clinical support and engagement [71]. A scalable peer-to-peer paraprofessional training and supervision program in an American university setting demonstrated reliability, replicability, and adaptability for supporting CBT delivery in a hybrid model of care [72].

3.2. Artificial Intelligence

AI, such as computer vision, NLP, machine learning and reinforcement learning systems, enables machines to perform sophisticated and anthropomorphic functions [73]. Early evidence of HCI through affective NLP found that automated assistive systems have the potential to respond emotionally by interpreting human language and accumulating sentiment from text and speech [74]. In recent years, AI has been applied in multi-faceted ways—predictive tools are being tested and used to determine mental ill-health and suicide risks [75] and to coordinate tailored treatment plans [76]. Furthermore, therapeutic chatbots provide readily available support [77,78] and interventions [79], including through video games [80]. A review of AI and mental health outlined data sources such as electronic health records, mood rating scales, brain imaging data, monitoring systems and social media platforms to predict, organize, or subgroup a range of mental ill-health and suicidality [81]. Machine learning automates processes, analyzes big data, and assists mental health care practitioners with making decisions on an individual's mental ill-health or suicide risk, but there has yet to be any accurate prediction of specific risks across populations [40,82].
AI technology design and use in mental health care has increased quality, accessibility, affordability, convenience, and efficiency [83,84,85,86]. The main advantages of machine learning are that it is scalable and highly accurate in mental ill-health prediction, but it is mostly conceptual and lacking in empirical evidence, which limits its clinical application [87]. The main disadvantages are a lack of information on model building and uncertain accuracy for suicide risk prediction [81,88,89], a lack of external evaluation of population studies [90,91], different evaluation approaches in cohort studies [92,93], and a lack of user-centered design processes that thwart HCI [87].
Usability challenges for machine learning include the substantial skills and time required to develop and run models, users lacking trust in the models, and the struggle rooted in human–machine learning disagreement [82]. Human-centered AI (HAI) was suggested to counter HCI deficiencies—it is a user-centered HCI approach consisting of human factors design, ethically aligned design, and technology that reflects human intelligence [94]. Human factors design can benefit from cost–benefit analyses to provide information on expectations and to clarify the prediction target [82]. In addition, it is important to build trust, decrease disagreement, improve responsibility, explain a model's logic, quantify specific contributions to the prediction, assess the performance metrics, and illustrate historical predictions from previous studies [82].
The adjunct potential of chatbots in mental health care is a good example of how HAI can be applied in research. A review of chatbots and conversational agents used in mental health found a small number of academic psychiatric studies with limited heterogeneity—there is a lack of high-quality evidence for diagnosis, treatment or therapy but there is a high potential for effective and agreeable mental health care if correctly and ethically implemented [95]. A major research constraint is that chatbots and predictive algorithms may be biased and perpetuate inequities in the underserved and the unserved [96,97,98,99]. The ethics of a patient-therapist relationship and the limited skills and emotional intelligence of chatbots requires a solution [100]. NLP and other machine learning algorithms could potentially help solve problems by identifying an ideal digital therapeutic bond [101].
The failure to achieve effectiveness and external evaluation of useful, real problem-solving AI solutions in the first two waves of AI (1950s–1970s and 1980s–1990s) led to the call for optimized and ethical user-centered design (UCD) for explainable, comprehensible, useful, and usable AI [94]. The most notable UCD application is in Explainable AI (XAI); e.g., it may help users to understand the algorithm and psychological theory parameters as well as the outputs (characterized by evaluation of strengths and weaknesses) to assist in increasing decision-making efficiency [94,102]. XAI is mostly applied with predictive technologies—a user interface (UI) provides a holistic understanding of the machine-patient-therapist relationship to improve safety and efficacy and instil responsibility [103]. The concept of Explainable Robotics introduced the potential of human–robot interactions whereby machine learning algorithms give explanations that help robots communicate with humans in a trustworthy and acceptable way [104].
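One widely used, model-agnostic way to quantify a feature's specific contribution to a prediction, as called for above, is permutation importance: corrupt one input feature and measure the resulting drop in accuracy. Below is a minimal, deterministic sketch with a toy rule-based 'risk model' of our own invention (real implementations shuffle the column randomly and average over repeats):

```python
def accuracy(model, X, y):
    return sum(model(x) == t for x, t in zip(X, y)) / len(y)

def feature_contribution(model, X, y, feature):
    """Drop in accuracy when one feature column is scrambled (here simply
    reversed, for determinism). Larger drops mean the model relies on
    that feature more; irrelevant features score near zero."""
    base = accuracy(model, X, y)
    col = [x[feature] for x in X][::-1]
    X_perm = [{**x, feature: v} for x, v in zip(X, col)]
    return base - accuracy(model, X_perm, y)

# Toy 'risk model': predicts elevated risk (1) when reported mood is low.
model = lambda x: 1 if x["mood"] < 3 else 0
X = [{"mood": m, "noise": n} for m, n in zip([1, 2, 4, 5, 1, 5], [9, 1, 4, 7, 2, 3])]
y = [model(x) for x in X]  # labels are driven entirely by mood

print(feature_contribution(model, X, y, "mood"))   # > 0: mood matters
print(feature_contribution(model, X, y, "noise"))  # 0.0: noise is irrelevant
```

Explanations like this are a small part of XAI; communicating them intelligibly to clinicians through the user interface is the harder HCI problem.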
The call for blending clinical and AI approaches in the screening and treatment of psychiatric disorders may potentially boost UCD in AI. The lack of research focusing on the prevention of the sequelae of psychiatric disorders and the vulnerability of post-COVID patients to mental health disorders led to a conceptual analysis that explored blending clinical approaches (i.e., drawn from clinical rating scales and self-rating questionnaires) with state-of-the-art AI-based tools for expedited and thorough diagnosis, prevention, and treatment of psychiatric disorders [105]. A multimodal psychopathology prediction protocol combined AI methods with comprehensive psychological, neurophysiological, semantic, acoustic, and facial/oculometric measurements and features [105]. It was suggested that these different categories of multidisciplinary data be integrated and coordinated with large-scale research efforts. However, the lack of a HAI framework and the global need for more fruitful and efficient strategies for coping with post-COVID mental health deterioration require direction and support. It was suggested that the World Health Organization (WHO) may assist by establishing a multinational interdisciplinary task force for policy design, planning and development of more advanced AI-based innovations in digital psychiatry [105].

3.3. Digital Phenotyping

Digital phenotyping is personal sensing based on captured metadata. It unobtrusively measures how a user interacts with the device and might provide a depiction of cognitive traits and affective states, as well as add precision to mental health diagnoses and outcomes, by combining sensor data, speech and voice data, and HCI [106]. An HCI protocol involving digital biomarkers for cognitive function—a psychometric assessment in conjunction with monitoring of the use of a smartphone app (which ran unobtrusively in the background and captured perceptible user activity, e.g., swipes, taps, and keystroke events)—found that it could possibly act as a continuous ecological surrogate for laboratory approaches [107].
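As an illustration of the kind of unobtrusive metadata such protocols capture, keystroke timestamps alone (no typed content) already yield simple timing features; the feature names below are our own, not those of the cited study:

```python
from statistics import mean, pstdev

def typing_features(timestamps_ms: list[int]) -> dict[str, float]:
    """Summarize a keystroke burst from event timestamps only: inter-key
    intervals are a common digital-phenotyping signal for tracking
    psychomotor and cognitive change over time."""
    intervals = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    return {
        "events": len(timestamps_ms),
        "mean_interval_ms": mean(intervals),
        "interval_sd_ms": pstdev(intervals),
    }

# One short typing burst, timestamps in milliseconds since session start.
print(typing_features([0, 180, 350, 900, 1100]))
```

Aggregated over days or weeks, shifts in such features are what digital phenotyping studies attempt to correlate with clinical state, which is also why the privacy and regulation concerns discussed in this section apply even to content-free metadata.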
The potential of digital phenotyping for assisting with effective care of young people with psychological distress called for research to address the practicalities of its future clinical application [108]. A novel hybrid study addressed unmet needs of HCI—the need to increase the adoption and reuse of clinically relevant apps and platforms, as well as digital phenotyping [109]. The design of a freely available smartphone platform suitable for youth and underserved populations accounted for HCI considerations, e.g., patient demands for trust, control, and community, as well as clinician demands for transparent, data-driven, and translational tools [109]. The large array of smartphone apps and the lack of reproducibility influenced a digital phenotyping design (tracking on a smartphone app) for young adults with a psychotic illness (noting that population-level models are not yet possible) [108]. The individualized actionable clinical insights were noted as feasible but require replication studies, the training of clinicians to effectively use the data, and the integration of apps into clinical care [110].
Digital phenotyping is potentially useful for predicting abnormal behavior, but it does not provide a causal explanation or psychological understanding of it [111]. Digital sensory phenotyping may provide objective and continuous assessment, which may facilitate better clinical interventions, but data security needs to be improved and further research is needed to determine the utility of its data, evaluation and efficiency [112]. Privacy, confidentiality and data sharing concerns were noted about unregulated digital phenotyping and the associated digital neuromarketing, which potentially undermines human-human interaction [113]. The following recommendations were made to deter negative consequences: technical and public evaluation of technologies and media before release; regulatory processes with careful monitoring; public awareness of and education on apps; and taxation of information gathering [113].
The benefits of blending digital phenotyping with clinical assessment have been proposed. For example, digital phenotyping of passive data from smartphones and wearables, combined with questionnaire results, could potentially measure the suicidal ideation process through ecological momentary assessment [114]. Following analysis of a systematic review [115], digital phenotyping was not recommended for random screening of mental disorders because of the risk of false positives, but RCTs were advised for mental health monitoring pilots and for its use as an assistive tool helping clinicians to predict recovery or early relapse [116]. The main challenges of digital phenotyping were noted as reliability, clinical utility, privacy, regulation, and application; it was suggested that clinicians help resolve quality, safety, and data security issues by guiding the use of approved and reliable apps for voluntary monitoring [116].
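The combination of passive sensing with active self-report described above can be sketched in a few lines. This is a purely illustrative toy, not a validated clinical rule: the feature names, thresholds, and flag logic are all assumptions introduced here for demonstration.

```python
# Illustrative sketch: combining passive smartphone features with an
# ecological momentary assessment (EMA) score. Feature names, thresholds,
# and the flag rule are hypothetical, not a validated clinical model.

from dataclasses import dataclass

@dataclass
class DailySignal:
    screen_unlocks: int      # passive: phone-use proxy
    night_activity_min: int  # passive: minutes of activity 00:00-05:00
    ema_distress: int        # active: self-reported distress, 0-10

def review_flag(day: DailySignal) -> bool:
    """Flag a day for clinician review when passive markers of sleep
    disruption or withdrawal coincide with elevated self-reported distress."""
    sleep_disrupted = day.night_activity_min > 60
    withdrawn = day.screen_unlocks < 10
    return day.ema_distress >= 7 and (sleep_disrupted or withdrawn)

# Disrupted night plus high distress is flagged for human review
print(review_flag(DailySignal(screen_unlocks=25, night_activity_min=90, ema_distress=8)))  # True
```

The point of the sketch is that neither data stream decides alone: passive signals contextualize the self-report, and the output is a prompt for human review rather than an automated diagnosis, in line with the assistive role recommended above.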

3.4. Immersive Technologies

HCI for computer and video games was limited to entertainment purposes until the 2000s, when pilot studies of serious video games emerged as an adjunct to psychotherapy for adolescents [117] and as a clinical intervention for schizophrenia [118], anxiety disorders [119], and attention deficit hyperactivity disorder (ADHD) [120]. A controlled longitudinal study analyzed the effectiveness of a video game on the PlayMancer platform [121]. Enhanced HCI included emotion recognition from speech audio data and the integration of user requirements into the game scenario. This alternative therapeutic approach assisted with coping and self-control strategies in patients with diagnosed eating disorders and pathological gambling. However, evaluation did not progress beyond trials. A prototype evaluation study used an adapted eye-tracking device as a novel HCI integration in video games: 73% of users reported that it was easy to fix their visual attention on a point for a couple of seconds to trigger a system action [122]. The design and integration of this system were noted as low-cost: the eye-tracker hardware is affordable, and the software was built from open-source code libraries [122]. Eye-tracking has been established for over a decade with the aim of improving human-factor aspects of HCI in psychological research [123], but it has yet to be outlined for which psychological symptoms or disorders it is useful.
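The dwell-to-trigger mechanism that such prototypes rely on (holding gaze on a point for a couple of seconds to fire an action) can be sketched as follows. The gaze-sample format, tolerance radius, and timing constants are assumptions for illustration, not details taken from the cited study.

```python
# Minimal sketch of a dwell-time trigger: a system action fires when gaze
# stays within a small region for a sustained period. Constants are
# illustrative assumptions, not parameters from the evaluated prototype.

import math

DWELL_SECONDS = 2.0   # required fixation duration
RADIUS_PX = 40        # tolerance around the initial fixation point

def detect_dwell(samples):
    """samples: list of (timestamp_s, x, y) gaze points in time order.
    Returns the time at which a dwell completes, or None if it never does."""
    anchor = None
    for t, x, y in samples:
        if anchor is None:
            anchor = (t, x, y)
            continue
        t0, ax, ay = anchor
        if math.hypot(x - ax, y - ay) > RADIUS_PX:
            anchor = (t, x, y)           # gaze moved away: restart the dwell
        elif t - t0 >= DWELL_SECONDS:
            return t                     # fixation held long enough: trigger
    return None

# A steady fixation near (100, 100) sampled at 10 Hz triggers around t = 2.0 s
samples = [(i * 0.1, 100 + (i % 3), 100) for i in range(26)]
print(detect_dwell(samples))
```

A dwell threshold of roughly two seconds is a common design trade-off in gaze interfaces: long enough to avoid triggering on incidental glances (the "Midas touch" problem), short enough that deliberate selection still feels responsive.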
An assortment of studies demonstrates that video games may be effective as an adjunct to the treatment of ADHD. An evaluation of clinical trials with adolescents with ADHD found that video games help the therapeutic relationship, but the HCI design needs to better account for the target audience (e.g., literacy difficulties) to increase engagement and improve the user experience [124]. A systematic review and meta-analysis of RCTs on the effectiveness of serious games for the treatment of mental disorders found a moderate positive effect on symptoms [125]. A randomized, double-blind, parallel-group, controlled trial using a video-game interface in a digital therapeutic intervention for pediatric patients with ADHD found no adverse effects and suggested the interface is potentially useful for objectively measuring inattention [126]. Effectiveness has also been shown in a medication-treated pediatric ADHD population, with a noted need for pragmatic RCTs to build evidence generation, complemented by qualitative human-centered investigations [80]. An RCT tested the effectiveness of a serious video game (i.e., its development and usability) and found that it may complement the current multimodal approach to treating ADHD [127].
A narrative review [128] and a systematic review [129] found efficacy for stress and anxiety reduction from the use of commercial off-the-shelf video games (i.e., exergames, casual video games, action games, action-adventure games, and augmented reality games used on various gaming platforms, including consoles, personal computers, smartphones, mobile consoles, and virtual reality systems). Although commercial video games have design features that instil a sense of flow [128], the systematic review acknowledged that custom-made games (i.e., for serious purposes such as education, training, or behavior modification) better integrate biofeedback techniques for relaxation and are more appropriate for adults with regard to their stress and anxiety responses [129]. Future studies were suggested to include a diversity of age groups, a variety of video game genres, and the most recent, popular, and widely used gaming platforms; to establish the number of sessions required for optimal effect; to follow methodological guidelines for reporting research findings; and to describe the individual characteristics of users (e.g., personality and cognitive ability) and their preferences for genre and gaming platform.
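The biofeedback integration noted in the review can be illustrated with a minimal sketch: a physiological signal (here, heart rate) is mapped at each update onto a game parameter that might drive, say, in-game lighting or weather. The baseline, range, and linear mapping are illustrative assumptions, not taken from any of the reviewed games.

```python
# Hedged sketch of a relaxation biofeedback loop: heart rate is mapped to a
# 0..1 "calmness" value that a game could use to soften visuals or audio.
# Baseline, span, and the linear mapping are illustrative assumptions.

def calmness(heart_rate_bpm: float, baseline_bpm: float = 70.0,
             span_bpm: float = 30.0) -> float:
    """At or below baseline -> 1.0; at or above baseline + span -> 0.0;
    linear in between."""
    level = 1.0 - (heart_rate_bpm - baseline_bpm) / span_bpm
    return max(0.0, min(1.0, level))

print(calmness(70))   # 1.0  (relaxed)
print(calmness(85))   # 0.5  (midway)
print(calmness(110))  # 0.0  (aroused)
```

Closing the loop this way is what distinguishes biofeedback games from ordinary casual games: the player sees their own physiological state reflected in the game and can practice down-regulating it.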
Virtual reality (VR) has been noted as becoming widely accepted in psychology and neuroscience as the most advanced form of HCI, allowing individuals to act, communicate, and be present in a computer-generated environment [130]. Researchers and VR video game companies are working together on mental health support. The use of a VR biofeedback video game led to a decrease in trait anxiety, but, as in previous studies, the game design did not meet expectations, requiring more spatial opportunities for optimized engagement (no improvements were observed over a control application for guided relaxation) [131]. Commercially available VR games were used to demonstrate the potential to effectively decrease state anxiety and increase positive emotions in users; the pilot study reinforced the emotional design conceptual framework that is widely adopted in designing appealing immersive solutions [132]. VR is marked by preliminary evidence; it can be used to improve some symptoms of psychosis in those with schizophrenia [133], with a clinical trial finding excellent usability of a therapeutic VR human-human interface [134]. However, VR is mostly noted for use in CBT interventions for depressive and anxiety disorders, and there is good potential to expand immersive interventions for psychological symptoms more generally (e.g., exposure therapy) [128].

4. Discussion

4.1. Human-Computer Interaction Challenges and Digital Mental Health as an Adjunct to Care

Logical applications were found to be lacking in digital mental health implementation [25], which in effect called for HCI challenges to be addressed. In 2018, an HCI symposium focused on understanding the users of mental health technology, their context of use, how therapeutic technology is used, and the importance and methods of design [135]. Five key challenges were noted (i.e., entrepreneurship, publishing, funding, theory, and outcomes), and interdisciplinary collaboration was advised to evolve from lessons learned as well as to increase and integrate knowledge and practice to realize faster and better global mental health [135]. HCI was recognized as a key factor for increasing accessibility to diverse communities and decreasing the inequities of mental healthcare through a human-centered approach, as well as for delivering enjoyable experiences. Human-centered design (HCD) aims to connect with and understand the needs of service users while retaining a systems viewpoint. HCD methods include journey mapping, prototyping, and user testing [136].
The traditional practices of psychology and psychiatry have inherent issues: clinical mental health diagnostic systems perform poorly in detecting the early stages of mental disorders [137], and there are inequities arising from constrained healthcare systems, in which those with severe mental disorders take up the bulk of resources, in effect marginalizing the underserved (predominantly those with low to moderate mental distress) [138]. Social and environmental factors are often not sufficiently considered in predicting and explaining mental illnesses and disorders [139]. The integration of predictive models with digital screening tools was proposed for faster and better assessment of the underserved, enabling preventive and early intervention for mental health problems in ‘majority web-delivered and minority in-clinic’ care [138]. The expected rise in the future mental ill-health burden is likely to be compounded by the socio-economic challenges of the COVID-19 response and widening inequalities (putting some people more at risk) [140]. In addition to prevention via advocacy for vulnerable communities with and without pre-existing psychiatric conditions, mental health practitioners and researchers need to prepare for faster and better treatment through the provision, maintenance, and improvement of existing services [140].
Digital approaches are broadly considered to have good potential as an adjunct to mental health care [63,141,142,143]. For example, a noted prospect is merging facial recognition and NLP emotion-detection software to provide a complex picture of mood and mental states [79]. Demand outstripping the supply of mental health resources during COVID-19 led to the increased use of telehealth, but investment is needed through funding, research, policy changes, training, and equity to provide better access and quality [35]. HCI was suggested as useful for uncovering UX and clinical design implications from clinical trials to create patient-centered telehealth solutions [144]. Although digital interventions via common digital tools (e.g., video conferences, social networks, telephone calls, and emails) are effective at the population level for common disorders, mental health care practitioners are grappling with the difficulties of using and connecting with other digital tools (e.g., web-based screening and intervention, AI, smartphone, immersive and wearable technologies, the Internet of Things (IoT), and digital phenotyping) [40]. Specifications are needed for subpopulations [145,146] and for the psychological disorders and symptoms for which different types of digital tools are effective and readily useful [147].

4.2. Ethics, the Digital Therapeutic Alliance and Blended, Hybrid and Stepped Models of Care

Human factors in combination with machine interaction and automation may positively or negatively affect ethical, quality, safe, and secure research and clinical care. A case study highlighted the ethical predicament of promoting a mental health app that uses digital phenotyping to predict negative mood states: the use of anonymized behavioral data (e.g., digital biomarkers) for commercial purposes was argued to provide an incomplete assessment (e.g., it does not address the psychosocial and sociopolitical determinants of mental health or the context in which people experience emotional distress) [148]. There is a range of ethical risks related to the security of sensitive data, socio-cultural adaptability, and appropriate education and training of medical professionals [81,149,150]. In addition, there should be consideration of therapeutic quality (e.g., the need for empathetic and inclusive care), which makes the digital therapeutic relationship important to delivering effective, efficient, and patient-centered care [151] in a blended [84], hybrid [152,153], or stepped model of care [36,50,53].
A relatively new HCI theory, concordance, was applied in a three-month user study that found encouraging evidence that users can develop a therapeutic alliance with an interactive online support system [154]. The small feasibility trial suggested that larger-scale studies employ a design approach involving peer/moderator support as well as automated feedback [154]. A narrative review investigated the psychological aspects of HCI via assessment of the Digital Therapeutic Alliance (DTA) for people with serious mental illnesses and recommended that evidence-based studies facilitate responsible outcomes through a three-tiered approach from the perspectives of patients/users, mental health care practitioners, and machines [155]. The therapeutic relationship is central to driving positive change in mental health care [101].
The use of multimodal digital platforms for Technology Enabled Clinical Care (TECC) [156] and informatics infrastructure [157] is emerging as a way to enable faster and better mental health care pathways, including for severe psychological distress and suicidality. A fundamental issue is how to avoid adding to the array of digital platforms, apps, and electronic medical record systems that effectively operate in isolation from one another [156]. The design and development of TECC using dynamic simulation modeling and health service implementation research is crucial to how these technologies are adopted and implemented if service efficiency gains and clinical outcomes are to be attained, especially for youth mental health services [156]. The building and testing of HCI models of the UI components of different types of digital health interventions (i.e., predictive HCI modeling of applications), together with the development and evaluation of UIs for digital health systems such as electronic health record systems, was proposed to complement HCD processes and heuristic techniques [158]. Integrating predictive modeling with HCD (i.e., adding real humans into the loop of simulations run by computer algorithms executing human-created models) may be useful for deriving evidence-based UI design guidelines to support the development of safer and more effective UIs for digital health interventions [158].
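Predictive HCI modeling of UI components can be illustrated with a classic example, Fitts's law, which estimates pointing time from target distance and width before any user testing takes place. The coefficients below are hypothetical placeholders; in practice they are fitted from observed user data for a given device and population.

```python
# Illustrative predictive HCI model: Fitts's law (Shannon formulation)
# estimates movement time to acquire a target, letting designers compare
# candidate layouts on paper. Coefficients a_ms and b_ms are hypothetical
# placeholders; real values are fitted empirically.

import math

def fitts_time_ms(distance_px: float, width_px: float,
                  a_ms: float = 50.0, b_ms: float = 150.0) -> float:
    """MT = a + b * log2(d/w + 1): larger/closer targets are faster to hit."""
    return a_ms + b_ms * math.log2(distance_px / width_px + 1)

# Comparing two candidate placements of the same action button:
near_large = fitts_time_ms(distance_px=300, width_px=100)  # index of difficulty = 2 bits
far_small = fitts_time_ms(distance_px=700, width_px=50)    # index of difficulty ~ 3.9 bits
print(round(near_large), round(far_small))  # prints: 350 636
```

Even this one-line model captures the kind of evidence-based design guidance the proposal above describes: predicted interaction costs can rank layouts before a prototype exists, with human-in-the-loop testing then validating or correcting the model's fitted parameters.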

5. Conclusions

The adjunct approach of integrating digital mental health solutions into clinical care is promising, but mental health practitioners need to play a larger role in overcoming the challenges of HCI and in collaborating with researchers, policymakers, governing bodies, and developers/entrepreneurs on ways for products and services to be effectively designed, developed, used, strategized, funded, and scaled. HCI research and development has unfulfilled potential in furthering long-term evidence for mental health and suicidality. Web-based interventions require higher levels of empirical evidence (e.g., on quality and design), engagement, and user retention. A significant finding of this integrative review with regard to the synthesis of theoretical and empirical literature (focusing on effectiveness, feasibility, accessibility, sociocultural inclusion, rigor, and readiness for adoption and upkeep) is the potential to design, develop, and use an integrated multimodal digital mental health platform in a stepped, blended, or hybrid approach. The evidence base suggested a lack of high-quality, useful, and usable apps. Digital phenotyping is potentially useful in combination with psychometric assessment (to predict abnormal behavior). Serious video games may serve as a complementary approach to treating ADHD in young people, whereas custom-made games are appropriate for reducing stress and anxiety in adults. VR is an advanced form of HCI; there is potential for using immersive games to help those with schizophrenia as well as in CBT for those with depressive and anxiety disorders.
Although still at a conceptual stage, HCI theories are central to tailoring an apt DTA. The HAI framework is of note (i.e., consideration of human factors design, ethically aligned design, and technology that fully reflects human intelligence). The aspect most relevant to HCI professionals is human factors design. There is a need to increase the capacity of explainable, comprehensible AI and useful, usable AI in a stepped, blended, or hybrid approach. The human factors, machine interaction, and automation categories of HCI are marked by the need for clinical mental health practitioners to observe and verify real-time machine learning findings from screening and treatment. There is potential for peer-to-peer approaches to help fill mental health care gaps. A primarily web-delivered approach to mental health care has the potential to benefit the underserved (often assessed as low to moderate cases of anxiety or depression) as well as to assist in better and faster coordination of individualized care for those at risk of suicide (TECC). There can be severe and compounding consequences of not grasping the window of opportunity for effectively intervening in the sequelae of mental ill-health or suicidality.
It is important that HCI is better incorporated into technological developments for digital mental health (e.g., digital platforms and AI), resulting in higher quality, safety, and usability. HCI modeling may help achieve these results by advancing evidence-based digital health system design. This points towards the integration of predictive modeling with usability and software engineering approaches (e.g., up-to-date and relevant patient/user safety and clinical guidelines). More specifically for digital mental health, pragmatic codesign incorporating the demands of mental health practitioners and users is also needed to strengthen the HCD process and to instil an understanding of how an application achieves real-world effectiveness. Mental health care practitioners may assist in moving towards effective, responsible, and rapid digital mental health tools, and user populations may assist in aiming for excellent, enjoyable UX and consistency in validated practices.
Summary of key points:
  • The integrative review found that HCI has long needed to be better integrated into technological developments for mental health care.
  • The design, development, implementation, and evaluation of digital mental health tools has the potential to help resolve systemic mental health care issues (e.g., through better and faster service for the underserved with low to moderate anxiety and depression as well as TECC for those at-risk of suicide).
  • Digital mental health tools best serve as an adjunct to mental health care—users and mental health practitioners can help improve effective outcomes through codesign of HCI (e.g., the DTA, clinical guidelines on validating machine learning findings as well as stepped models of care that utilize supporting resources—peer workers).
  • There are many web-based or smartphone technology products and services available (especially apps) which serve in telehealth and (self-)guided digital interventions as well as AI, immersive technologies, and digital phenotyping. However, a lack of HCI investment has resulted in unrealized potential (e.g., a secure, trusted, and prominent integrated multimodal digital platform using AI has yet to be effectively designed, developed, used, strategized, funded, and scaled).
  • Future research for enhanced quality, safety and usability may benefit from integrating a predictive model with HCD (i.e., adding real humans into the loop of simulations by computer algorithms that run human-created models).

Author Contributions

Conceptualization, L.B.; investigation, L.B. and D.D.L.; writing—original draft preparation, L.B.; writing—review and editing, L.B. and D.D.L.; supervision, D.D.L.; project administration, L.B. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Conflicts of Interest

The authors declare that they have no conflict of interest.

References

  1. Dix, A. Human-Computer Interaction. In Encyclopedia of Database Systems; Liu, L., Özsu, M.T., Eds.; Springer: Boston, MA, USA, 2009.
  2. Card, S.K.; Moran, T.P.; Newell, A. The Psychology of Human-Computer Interaction; CRC Press: Boca Raton, FL, USA, 2018.
  3. Myers, B.A. A brief history of human-computer interaction technology. Interactions 1998, 5, 44–54.
  4. Johnson, J.H.; Godin, S.W.; Bloomquist, M.L. Human factors engineering in computerized mental health care delivery. Behav. Res. Methods Instrum. 1981, 13, 425–429.
  5. Newell, A.; Card, S.K. The Prospects for Psychological Science in Human-Computer Interaction. Hum.-Comput. Interact. 1985, 1, 209–242.
  6. Booth, P. An Introduction to Human-Computer Interaction (Psychology Revivals); Psychology Press: Hove, East Sussex, UK, 2014.
  7. Morrison, D. The applications of cognitive theory and science to HCI: A psychological perspective. Int. J. Hum.-Comput. Interact. 1992, 4, 3–5.
  8. Carroll, J.M. Human–computer interaction: Psychology as a science of design. Int. J. Hum.-Comput. Stud. 1997, 46, 501–522.
  9. Marks, I. Computer Aids to Mental Health Care. Can. J. Psychiatry 1999, 44, 548–555.
  10. Farzanfar, R.; Frishkopf, S.; Friedman, R.; Ludena, K. Evaluating an automated mental health care system: Making meaning of human–computer interaction. Comput. Hum. Behav. 2007, 23, 1167–1182.
  11. Martin, J.; McKay, E.; Shankar, J. Bias Misinformation and Disinformation: Mental Health Employment and Human Computer Interaction. In Proceedings of the 2006 InSITE Conference, Manchester, UK, 25–28 June 2006.
  12. Burns, J.M.; Davenport, T.A.; Durkin, L.A.; Luscombe, G.M.; Hickie, I.B. The internet as a setting for mental health service utilisation by young people. Med. J. Aust. 2010, 192, S22–S26.
  13. Dearden, A.; Finlay, J. Pattern Languages in HCI: A Critical Review. Hum.-Comput. Interact. 2006, 21, 49–102.
  14. Keating, B.; Campbell, J.; Radoll, P. Evaluating a new pattern development process for interface design: Application to mental health services. In Proceedings of the 34th International Conference on Information Systems (ICIS 2013), Milano, Italy, 15–18 December 2013; Chau, M., Baskerville, R., Eds.; Association for Information Systems (AIS): Atlanta, GA, USA, 2013; pp. 1–10. Available online: https://aisel.aisnet.org (accessed on 13 December 2021).
  15. Dingwall, K.M.; Puszka, S.; Sweet, M.; Mills, P.P.J.R.; Nagel, T. Evaluation of a culturally adapted training course in Indigenous e-mental health. Australas. Psychiatry 2015, 23, 630–635.
  16. Radoll, P.J.; Campbell, J. Editorial for the Indigenous use of Information and Communication Technologies Section. Australas. J. Inf. Syst. 2015, 19.
  17. Bennett-Levy, J.; Singer, J.; Rotumah, D.; Bernays, S.; Edwards, D. From Digital Mental Health to Digital Social and Emotional Wellbeing: How Indigenous Community-Based Participatory Research Influenced the Australian Government’s Digital Mental Health Agenda. Int. J. Environ. Res. Public Health 2021, 18, 9757.
  18. Griffiths, K.M.; Farrer, L.; Christensen, H. The efficacy of internet interventions for depression and anxiety disorders: A review of randomised controlled trials. Med. J. Aust. 2010, 192, S4–S11.
  19. Doherty, G.; Coyle, D.; Matthews, M. Design and evaluation guidelines for mental health technologies. Interact. Comput. 2010, 22, 243–252.
  20. Mohr, D.C.; Burns, M.N.; Schueller, S.M.; Clarke, G.; Klinkman, M. Behavioral intervention technologies: Evidence review and recommendations for future research in mental health. Gen. Hosp. Psychiatry 2013, 35, 332–338.
  21. Valencia-García, R.; García-Sánchez, F. Natural Language Processing and Human–Computer Interaction. Comput. Stand. Interfaces 2013, 35, 415–416.
  22. Olff, M. Mobile mental health: A challenging research agenda. Eur. J. Psychotraumatol. 2015, 6, 27882.
  23. Dinakar, K.; Chen, J.; Lieberman, H.; Picard, R.; Filbin, R. Mixed-Initiative Real-Time Topic Modeling & Visualization for Crisis Counseling. In Proceedings of the 20th International Conference on Intelligent User Interfaces, Atlanta, GA, USA, 29 March–1 April 2015; pp. 417–426.
  24. Calvo, R.A.; Dinakar, K.; Picard, R.; Maes, P. Computing in Mental Health. In Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA’16), San Jose, CA, USA, 7–12 May 2016; Association for Computing Machinery: New York, NY, USA, 2016; pp. 3438–3445.
  25. Mohr, D.C.; Lyon, A.R.; Lattie, E.G.; Reddy, M.; Schueller, S.M. Accelerating Digital Mental Health Research From Early Design and Creation to Successful Implementation and Sustainment. J. Med. Internet Res. 2017, 19, e153.
  26. Zhang, M.W.B.; Ho, R.C.M. Enabling Psychiatrists to Explore the Full Potential of E-Health. Front. Psychiatry 2015, 6, 177.
  27. Khazaal, Y.; Favrod, J.; Sort, A.; Borgeat, F.; Bouchard, S. (Eds.) Computers and Games for Mental Health and Well-Being. Front. Res. Top. 2018, 9, 141.
  28. Søgaard Neilsen, A.; Wilson, R.L. Combining e-mental health intervention development with human computer interaction (HCI) design to enhance technology-facilitated recovery for people with depression and/or anxiety conditions: An integrative literature review. Int. J. Ment. Health Nurs. 2018, 28, 22–39.
  29. Torous, J.; Nicholas, J.; Larsen, M.E.; Firth, J.; Christensen, H. Clinical review of user engagement with mental health smartphone apps: Evidence, theory and improvements. Evid.-Based Ment. Health 2018, 21, 116–119.
  30. Batterham, P.J.; Calear, A.L.; O’Dea, B.; Larsen, M.E.J.; Kavanagh, D.; Titov, N.; Gorman, P. Stakeholder perspectives on evidence for digital mental health interventions: Implications for accreditation systems. Digit. Health 2019, 5, 205520761987806.
  31. Blandford, A. HCI for health and wellbeing: Challenges and opportunities. Int. J. Hum.-Comput. Stud. 2019, 131, 41–51.
  32. Scholten, H.; Granic, I. Use of the principles of design thinking to address limitations of digital mental health interventions for youth. J. Med. Internet Res. 2019, 21, e11528.
  33. Stowers, K.; Mouloua, M. Human Computer Interaction Trends in Healthcare: An Update. Proc. Int. Symp. Hum. Factors Ergon. Health Care 2018, 7, 88–91.
  34. Necho, M.; Tsehay, M.; Birkie, M.; Biset, G.; Tadesse, E. Prevalence of anxiety, depression, and psychological distress among the general population during the COVID-19 pandemic: A systematic review and meta-analysis. Int. J. Soc. Psychiatry 2021, 67, 892–906.
  35. Torous, J.; Jän Myrick, K.; Rauseo-Ricupero, N.; Firth, J. Digital Mental Health and COVID-19: Using Technology Today to Accelerate the Curve on Access and Quality Tomorrow. JMIR Ment. Health 2020, 7, e18848.
  36. Gratzer, D.; Torous, J.; Lam, R.W.; Patten, S.B.; Kutcher, S.; Chan, S.; Yatham, L.N. Our Digital Moment: Innovations and Opportunities in Digital Mental Health Care. Can. J. Psychiatry 2020, 66, 5–8.
  37. Malla, A.; Joober, R. COVID-19 and the Future with Digital Mental Health: Need for Attention to Complexities. Can. J. Psychiatry 2020, 66, 14–16.
  38. Balcombe, L.; De Leo, D. Athlete Psychological Resilience and Integration with Digital Mental Health Implementation Amid Covid-19. In Anxiety, Uncertainty, and Resilience during the Pandemic Period—Anthropological and Psychological Perspectives; IntechOpen: London, UK, 2021.
  39. Whittemore, R.; Knafl, K. The integrative review: Updated methodology. J. Adv. Nurs. 2005, 52, 546–553.
  40. Balcombe, L.; De Leo, D. Digital Mental Health Amid COVID-19. Encyclopedia 2021, 1, 1047–1057.
  41. Rice, S.; Goodall, J.; Hetrick, S.; Parke, R.A.; Gilbertson, T.; Amminger, G.; Davey, C.; McGorry, P.; Gleeson, J.; Alvarez-Jimenez, M. Online and Social Networking Interventions for the Treatment of Depression in Young People: A Systematic Review. J. Med. Internet Res. 2014, 16, e206.
  42. Sevilla-Llewellyn-Jones, J.; Santesteban-Echarri, O.; Pryor, I.; McGorry, P.; Alvarez-Jimenez, M. Web-Based Mindfulness Interventions for Mental Health Treatment: Systematic Review and Meta-Analysis. JMIR Ment. Health 2018, 5, e10278.
  43. Price, M.; Yuen, E.K.; Goetter, E.M.; Herbert, J.D.; Forman, E.M.; Acierno, R.; Ruggiero, K.J. mHealth: A Mechanism to Deliver More Accessible, More Effective Mental Health Care. Clin. Psychol. Psychother. 2013, 21, 427–436.
  44. Thieme, A.; Wallace, J.; Meyer, T.D.; Olivier, P. Designing for mental wellbeing. In Proceedings of the 2015 British HCI Conference, Lincoln, Lincolnshire, UK, 13–17 July 2015; Association for Computing Machinery (ACM): New York, NY, USA, 2015.
  45. Spek, V.; Cuijpers, P.; Nyklicek, I.; Riper, H.; Keyzer, J.; Pop, V. Internet-based cognitive behaviour therapy for symptoms of depression and anxiety: A meta-analysis. Psychol. Med. 2007, 37, 319.
  46. Andersson, G.; Cuijpers, P.; Carlbring, P.; Riper, H.; Hedman, E. Guided Internet-based vs. face-to-face cognitive behavior therapy for psychiatric and somatic disorders: A systematic review and meta-analysis. World Psychiatry 2014, 13, 288–295.
  47. Gilbody, S.; Littlewood, E.; Hewitt, C.; Brierley, G.; Tharmanathan, P.; Araya, R.; White, D. Computerised cognitive behaviour therapy (cCBT) as treatment for depression in primary care (REEACT trial): Large scale pragmatic randomised controlled trial. BMJ 2015, 351, h5627.
  48. Gilbody, S.; Brabyn, S.; Lovell, K.; Kessler, D.; Devlin, T.; Smith, L. Telephone-supported computerised cognitive–behavioural therapy: REEACT-2 large-scale pragmatic randomised controlled trial. Br. J. Psychiatry 2017, 210, 362–367.
  49. Chen, A.T.; Wu, S.; Tomasino, K.N.; Lattie, E.G.; Mohr, D.C. A multi-faceted approach to characterizing user behavior and experience in a digital mental health intervention. J. Biomed. Inform. 2019, 94, 103187.
  50. Farrer, L.M.; Gulliver, A.; Katruss, N.; Fassnacht, D.B.; Kyrios, M.; Batterham, P.J. A novel multi-component online intervention to improve the mental health of university students: Randomised controlled trial of the Uni Virtual Clinic. Internet Interv. 2019, 18, 100276.
  51. Yogarajah, A.; Kenter, R.; Lamo, Y.; Kaldo, V.; Nordgreen, T. Internet-delivered mental health treatment systems in Scandinavia—A usability evaluation. Internet Interv. 2020, 20, 100314.
  52. Alvarez-Jimenez, M.; Rice, S.; D’Alfonso, S.; Leicester, S.; Bendall, S.; Pryor, I.; Gleeson, J. A Novel Multimodal Digital Service (Moderated Online Social Therapy+) for Help-Seeking Young People Experiencing Mental Ill-Health: Pilot Evaluation Within a National Youth E-Mental Health Service. J. Med. Internet Res. 2020, 22, e17155.
  53. O’Dea, B.; Subotic-Kerry, M.; King, C.; Mackinnon, A.J.; Achilles, M.R.; Anderson, M.; Christensen, H. A cluster randomised controlled trial of a web-based youth mental health service in Australian schools. Lancet Reg. Health-West. Pac. 2021, 12, 100178.
  54. van Doorn, M.; Nijhuis, L.A.; Egeler, M.D.; Daams, J.G.; Popma, A.; van Amelsvoort, T.; McEnery, C.; Gleeson, J.F.; Öry, F.G.; Avis, K.A.; et al. Online Indicated Preventive Mental Health Interventions for Youth: A Scoping Review. Front. Psychiatry 2021, 12, 580843.
  55. Williams, A.; Fossey, E.; Farhall, J.; Foley, F.; Thomas, N. Impact of Jointly Using an e–Mental Health Resource (Self-Management and Recovery Technology) on Interactions Between Service Users Experiencing Severe Mental Illness and Community Mental Health Workers: Grounded Theory Study. JMIR Ment. Health 2021, 8, e25998.
  56. Mertan, E.; Croucher, L.; Shafran, R.; Bennett, S.D. An investigation of the information provided to the parents of young people with mental health needs on an internet forum. Internet Interv. 2021, 23, 100353.
  57. Hetrick, S.; Robinson, J.; Burge, E.; Blandon, R.; Mobilio, B.; Rice, S.; Simmons, M.; Alvarez-Jimenez, M.; Goodrich, S.; Davey, C. Youth Codesign of a Mobile Phone App to Facilitate Self-Monitoring and Management of Mood Symptoms in Young People with Major Depression, Suicidal Ideation, and Self-Harm. JMIR Ment. Health 2018, 5, e9.
  58. Faurholt-Jepsen, M.; Kessing, L.V. Apps for mental health care: The raise of digital psychiatry. Eur. Neuropsychopharmacol. 2021, 47, 51–53.
  59. Kenny, R.; Fitzgerald, A.; Segurado, R.; Dooley, B. Is there an app for that? A cluster randomised controlled trial of a mobile app–based mental health intervention. Health Inform. J. 2019, 26, 1538–1559.
  60. Inal, Y.; Wake, J.D.; Guribye, F.; Nordgreen, T. Usability Evaluations of Mobile Mental Health Technologies: Systematic Review. J. Med. Internet Res. 2020, 22, e15337.
  61. D’Alfonso, S.; Lederman, R.; Bucci, S.; Berry, K. The Digital Therapeutic Alliance and Human-Computer Interaction. JMIR Ment. Health 2020, 7, e21895.
  62. Cataldo, F.; Chang, S.; Mendoza, A.; Buchanan, G. A Perspective on Client-Psychologist Relationships in Videoconferencing Psychotherapy: Literature Review. JMIR Ment. Health 2021, 8, e19004.
  63. Venning, A.; Herd, M.C.; Oswald, T.K.; Razmi, S.; Glover, F.; Hawke, T.; Redpath, P. Exploring the acceptability of a digital mental health platform incorporating a virtual coach: The good, the bad, and the opportunities. Health Inform. J. 2021, 27, 146045822199487. [Google Scholar] [CrossRef]
  64. Meier, A.; Reinecke, L. Computer-Mediated Communication, Social Media, and Mental Health: A Conceptual and Empirical Meta-Review. Commun. Res. 2020, 48, 1182–1209. [Google Scholar] [CrossRef]
  65. Joshi, D.; Patwardhan, D.M. An analysis of mental health of social media users using unsupervised approach. Comput. Hum. Behav. Rep. 2020, 2, 100036. [Google Scholar] [CrossRef]
  66. Uban, A.-S.; Chulvi, B.; Rosso, P. An emotion and cognitive based analysis of mental health disorders from social media data. Future Gener. Comput. Syst. 2021, 124, 480–494. [Google Scholar] [CrossRef]
  67. Hull, T.D.; Levine, J.; Bantilan, N.; Desai, A.N.; Majumder, M.S. Analyzing Digital Evidence from a Telemental Health Platform to Assess Complex Psychological Responses to the COVID-19 Pandemic: Content Analysis of Text Messages. JMIR Form. Res. 2021, 5, e26190. [Google Scholar] [CrossRef]
  68. Flore, J. Ingestible sensors, data, and pharmaceuticals: Subjectivity in the era of digital mental health. New Media Soc. 2020, 23, 2034–2051. [Google Scholar] [CrossRef]
  69. Ali, K.; Farrer, L.; Gulliver, A.; Griffiths, K.M. Online Peer-to-Peer Support for Young People with Mental Health Problems: A Systematic Review. JMIR Ment. Health 2015, 2, e19. [Google Scholar] [CrossRef] [Green Version]
  70. Sharma, A.; Choudhury, M.; Althoff, T.; Sharma, A. Engagement Patterns of Peer-to-Peer Interactions on Mental Health Platforms. In Proceedings of the International AAAI Conference on Web and Social Media, Atlanta, GA, USA, 6–9 June 2020; Volume 14, pp. 614–625. [Google Scholar]
  71. Shalaby, R.A.H.; Agyapong, V.I.O. Peer Support in Mental Health: Literature Review. JMIR Ment. Health 2020, 7, e15572. [Google Scholar] [CrossRef] [PubMed]
  72. Rosenberg, B.M.; Kodish, T.; Cohen, Z.D.; Gong-Guy, E.; Craske, M.G. A novel peer-to-peer coaching program to support digital mental health: Design and implementation. JMIR Ment. Health 2022, 9, e32430. [Google Scholar] [CrossRef]
  73. American Psychological Association. The Promise and Challenges of AI. Monit. Psychol. 2021, 52. Available online: http://www.apa.org/monitor/2021/11/cover-artificial-intelligence (accessed on 2 November 2021).
  74. Calix, R.A.; Javadpour, L.; Knapp, G.M. Detection of Affective States from Text and Speech for Real-Time Human–Computer Interaction. Hum. Factors J. Hum. Factors Ergon. Soc. 2011, 54, 530–545. [Google Scholar] [CrossRef] [PubMed]
  75. Bantilan, N.; Malgaroli, M.; Ray, B.; Hull, T.D. Just in time crisis response: Suicide alert system for telemedicine psychotherapy settings. Psychother. Res. 2020, 31, 289–299. [Google Scholar] [CrossRef] [PubMed]
  76. Iorfino, F.; Occhipinti, J.-A.; Skinner, A.; Davenport, T.; Rowe, S.; Prodan, A.; Hickie, I.B. The Impact of Technology-Enabled Care Coordination in a Complex Mental Health System: A Local System Dynamics Model. J. Med. Internet Res. 2021, 23, e25331. [Google Scholar] [CrossRef]
  77. Fitzpatrick, K.K.; Darcy, A.; Vierhile, M. Delivering Cognitive Behavior Therapy to Young Adults with Symptoms of Depression and Anxiety Using a Fully Automated Conversational Agent (Woebot): A Randomized Controlled Trial. JMIR Ment. Health 2017, 4, e19. [Google Scholar] [CrossRef]
  78. Denecke, K.; Abd-Alrazaq, A.; Househ, M. Artificial Intelligence for Chatbots in Mental Health: Opportunities and Challenges. In Lecture Notes in Bioengineering; Springer: Cham, Switzerland, 2021; pp. 115–128. [Google Scholar] [CrossRef]
  79. Malgaroli, M.; Hull, T.D.; Wiltsey Stirman, S.; Resick, P. Message Delivery for the Treatment of Posttraumatic Stress Disorder: Longitudinal Observational Study of Symptom Trajectories. J. Med. Internet Res. 2020, 22, e15587. [Google Scholar] [CrossRef]
  80. Kollins, S.H.; Childress, A.; Heusser, A.C.; Lutz, J. Effectiveness of a digital therapeutic as adjunct to treatment with medication in pediatric ADHD. Npj Digit. Med. 2021, 4, 1–8. [Google Scholar] [CrossRef]
  81. Graham, S.; Depp, C.; Lee, E.E.; Nebeker, C.; Tu, X.; Kim, H.-C.; Jeste, D.V. Artificial Intelligence for Mental Health and Mental Illnesses: An Overview. Curr. Psychiatry Rep. 2019, 21, 116. [Google Scholar] [CrossRef] [PubMed]
  82. Zytek, A.; Liu, D.; Vaithianathan, R.; Veeramachaneni, K. Understanding the Usability Challenges of Machine Learning in High-Stakes Decision Making. IEEE Trans. Vis. Comput. Graph. 2021, 28, 1161–1171. [Google Scholar] [CrossRef] [PubMed]
  83. Luxton, D.D.; Anderson, S.L.; Anderson, M. Ethical Issues and Artificial Intelligence Technologies in Behavioral and Mental Health Care. In Artificial Intelligence in Behavioral and Mental Health Care; Elsevier BV: Amsterdam, The Netherlands, 2016; pp. 255–276. [Google Scholar] [CrossRef]
  84. Fiske, A.; Henningsen, P.; Buyx, A. Your Robot Therapist Will See You Now: Ethical Implications of Embodied Artificial Intelligence in Psychiatry, Psychology, and Psychotherapy. J. Med. Internet Res. 2019, 21, e13216. [Google Scholar] [CrossRef]
  85. Carr, S. “AI gone mental”: Engagement and ethics in data-driven technology for mental health. J. Ment. Health 2020, 29, 125–130. [Google Scholar] [CrossRef] [PubMed]
  86. Robert, L.P., Jr.; Bansal, G.; Lütge, C. ICIS 2019 SIGHCI workshop panel report: Human–computer interaction challenges and opportunities for fair, trustworthy and ethical artificial intelligence. AIS Trans. Hum.-Comput. Interact. 2020, 12, 96–108. [Google Scholar] [CrossRef]
  87. Thieme, A.; Belgrave, D.; Doherty, G. Machine Learning in Mental Health. ACM Trans. Comput.-Hum. Interact. 2020, 27, 1–53. [Google Scholar] [CrossRef]
  88. Shatte, A.B.R.; Hutchinson, D.M.; Teague, S.J. Machine learning in mental health: A scoping review of methods and applications. Psychol. Med. 2019, 49, 1426–1448. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  89. Moon, S.J.; Hwang, J.; Kana, R.; Torous, J.; Kim, J.W. Accuracy of Machine Learning Algorithms for the Diagnosis of Autism Spectrum Disorder: Systematic Review and Meta-Analysis of Brain Magnetic Resonance Imaging Studies. JMIR Ment. Health 2019, 6, e14108. [Google Scholar] [CrossRef] [Green Version]
  90. Chen, Q.; Zhang-James, Y.; Barnett, E.J.; Lichtenstein, P.; Jokinen, J.; D’Onofrio, B.M.; Fazel, S. Predicting suicide attempt or suicide death following a visit to psychiatric specialty care: A machine learning study using Swedish national registry data. PLoS Med. 2020, 17, e1003416. [Google Scholar] [CrossRef]
  91. Jiang, T.; Nagy, D.; Rosellini, A.J.; Horváth-Puhó, E.; Keyes, K.M.; Lash, T.L.; Gradus, J.L. Suicide prediction among men and women with depression: A population-based study. J. Psychiatr. Res. 2021, 142, 275–282. [Google Scholar] [CrossRef]
  92. Machado, C.D.S.; Ballester, P.L.; Cao, B.; Mwangi, B.; Caldieraro, M.A.; Kapczinski, F.; Passos, I.C. Prediction of suicide attempts in a prospective cohort study with a nationally representative sample of the US population. Psychol. Med. 2021, 1–12. [Google Scholar] [CrossRef] [PubMed]
  93. García de la Garza, Á.; Blanco, C.; Olfson, M.; Wall, M.M. Identification of Suicide Attempt Risk Factors in a National US Survey Using Machine Learning. JAMA Psychiatry 2021, 78, 398. [Google Scholar] [CrossRef]
  94. Xu, W. Toward human-centered AI: A perspective from human-computer interaction. Interactions 2019, 26, 42–46. Available online: https://interactions.acm.org/archive/view/july-august-2019/toward-human-centered-ai (accessed on 21 December 2021). [CrossRef] [Green Version]
  95. Vaidyam, A.N.; Wisniewski, H.; Halamka, J.D.; Kashavan, M.S.; Torous, J.B. Chatbots and Conversational Agents in Mental Health: A Review of the Psychiatric Landscape. Can. J. Psychiatry 2019, 64, 456–464. [Google Scholar] [CrossRef]
  96. Obermeyer, Z.; Powers, B.; Vogeli, C.; Mullainathan, S. Dissecting racial bias in an algorithm used to manage the health of populations. Science 2019, 366, 447–453. [Google Scholar] [CrossRef] [Green Version]
  97. Straw, I.; Callison-Burch, C. Artificial Intelligence in mental health and the biases of language based models. PLoS ONE 2020, 15, e0240376. [Google Scholar] [CrossRef]
  98. Bigman, Y.E.; Yam, K.C.; Marciano, D.; Reynolds, S.J.; Gray, K. Threat of racial and economic inequality increases preference for algorithm decision-making. Comput. Hum. Behav. 2021, 122, 106859. [Google Scholar] [CrossRef]
  99. Brown, J.E.H.; Halpern, J. AI chatbots cannot replace human interactions in the pursuit of more inclusive mental healthcare. SSM-Ment. Health 2021, 1, 100017. [Google Scholar] [CrossRef]
  100. Denecke, K.; Abd-Alrazaq, A.; Househ, M.; Warren, J. Evaluation Metrics for Health Chatbots: A Delphi Study. Methods Inf. Med. 2021, 60, 171–179. [Google Scholar] [CrossRef] [PubMed]
  101. Darcy, A.; Daniels, J.; Salinger, D.; Wicks, P.; Robinson, A. Evidence of Human-Level Bonds Established With a Digital Conversational Agent: Cross-sectional, Retrospective Observational Study. JMIR Form. Res. 2021, 5, e27868. [Google Scholar] [CrossRef]
  102. Gunning, D.; Aha, D. DARPA’s Explainable Artificial Intelligence (XAI) Program. AI Mag. 2019, 40, 44–58. [Google Scholar] [CrossRef]
  103. Barredo Arrieta, A.; Díaz-Rodríguez, N.; Del Ser, J.; Bennetot, A.; Tabik, S.; Barbado, A.; Herrera, F. Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI. Inf. Fusion 2020, 58, 82–115. [Google Scholar] [CrossRef] [Green Version]
  104. Setchi, R.; Dehkordi, M.B.; Khan, J.S. Explainable Robotics in Human-Robot Interactions. Procedia Comput. Sci. 2020, 176, 3057–3066. [Google Scholar] [CrossRef]
  105. Ćosić, K.; Popović, S.; Šarlija, M.; Kesedžić, I.; Gambiraža, M.; Dropuljić, B.; Mijić, I.; Henigsberg, N.; Jovanovic, T. AI-Based Prediction and Prevention of Psychological and Behavioral Changes in Ex-COVID-19 Patients. Front. Psychol. 2021, 12, 782866. [Google Scholar] [CrossRef]
  106. Insel, T.R. Digital phenotyping: A global tool for psychiatry. World Psychiatry 2018, 17, 276–277. [Google Scholar] [CrossRef] [Green Version]
  107. Dagum, P. Digital biomarkers of cognitive function. Npj Digit. Med. 2018, 1, 1–3. [Google Scholar] [CrossRef]
  108. Huckvale, K.; Venkatesh, S.; Christensen, H. Toward clinical digital phenotyping: A timely opportunity to consider purpose, quality, and safety. Npj Digit. Med. 2019, 2, 1–11. [Google Scholar] [CrossRef] [Green Version]
  109. Torous, J.; Wisniewski, H.; Bird, B.; Carpenter, E.; David, G.; Elejalde, E.; Keshavan, M. Creating a Digital Health Smartphone App and Digital Phenotyping Platform for Mental Health and Diverse Healthcare Needs: An Interdisciplinary and Collaborative Approach. J. Technol. Behav. Sci. 2019, 4, 73–85. [Google Scholar] [CrossRef] [Green Version]
  110. Wisniewski, H.; Henson, P.; Torous, J. Using a Smartphone App to Identify Clinically Relevant Behavior Trends via Symptom Report, Cognition Scores, and Exercise Levels: A Case Series. Front. Psychiatry 2019, 10, 652. [Google Scholar] [CrossRef] [Green Version]
  111. Stanghellini, G.; Leoni, F. Digital Phenotyping: Ethical Issues, Opportunities, and Threats. Front. Psychiatry 2020, 11, 473. [Google Scholar] [CrossRef]
  112. Dai, J.; Chen, Y.; Xia, C.; Zhou, J.; Liu, C.; Chen, C. Digital Sensory Phenotyping for Psychiatric Disorders. J. Psychiatry Brain Sci. 2020, 5, e200015. [Google Scholar] [CrossRef]
  113. Akbarialiabad, H.; Bastani, B.; Taghrir, M.H.; Paydar, S.; Ghahramani, N.; Kumar, M. Threats to Global Mental Health from Unregulated Digital Phenotyping and Neuromarketing: Recommendations for COVID-19 Era and Beyond. Front. Psychiatry 2021, 12, 713987. [Google Scholar] [CrossRef] [PubMed]
  114. Braciszewski, J.M. Digital Technology for Suicide Prevention. Adv. Psychiatry Behav. Health 2021, 1, 53–65. [Google Scholar] [CrossRef]
  115. Dogan, E.; Sander, C.; Wagner, X.; Hegerl, U.; Kohls, E. Smartphone-Based Monitoring of Objective and Subjective Data in Affective Disorders: Where Are We and Where Are We Going? Systematic Review. J. Med. Internet Res. 2017, 19, e262. [Google Scholar] [CrossRef] [Green Version]
  116. Müller, L.; De Rooy, D. Digital biomarkers for the prediction of mental health in aviation personnel. BMJ Health Care Inform. 2021, 28, e100335. [Google Scholar] [CrossRef]
  117. Coyle, D.; Matthews, M.; Sharry, J.; Nisbet, A.; Doherty, G. Personal Investigator: A Therapeutic 3D Game for Adolescent Psychotherapy. Int. J. Interact. Technol. Smart Educ. 2005, 2, 73–88. [Google Scholar] [CrossRef] [Green Version]
  118. Morris, S.; Dickinson, D.; Bellack, A.S.; Tenhula, W.N.; Gold, J.M. The development of a computer-based cognitive remediation program for schizophrenia. Schizophr. Res. 2003, 60, 326. [Google Scholar] [CrossRef]
  119. Walshe, D.G.; Lewis, E.J.; Kim, S.I.; O’Sullivan, K.; Wiederhold, B.K. Exploring the Use of Computer Games and Virtual Reality in Exposure Therapy for Fear of Driving Following a Motor Vehicle Accident. CyberPsychol. Behav. 2003, 6, 329–334. [Google Scholar] [CrossRef]
  120. Arns, M.; de Ridder, S.; Strehl, U.; Breteler, M.; Coenen, A. Efficacy of Neurofeedback Treatment in ADHD: The Effects on Inattention, Impulsivity and Hyperactivity: A Meta-Analysis. Clin. EEG Neurosci. 2009, 40, 180–189. [Google Scholar] [CrossRef]
  121. Fernández-Aranda, F.; Jiménez-Murcia, S.; Santamaría, J.J.; Gunnard, K.; Soto, A.; Kalapanidas, E.; Penelo, E. Video games as a complementary therapy tool in mental disorders: PlayMancer, a European multicentre study. J. Ment. Health 2012, 21, 364–374. [Google Scholar] [CrossRef]
  122. Cáceres, E.; Carrasco, M.; Ríos, S. Evaluation of an eye-pointer interaction device for human-computer interaction. Heliyon 2018, 4, e00574. [Google Scholar] [CrossRef] [Green Version]
  123. Mele, M.L.; Federici, S. Gaze and eye-tracking solutions for psychological research. Cogn. Process. 2012, 13, 261–265. [Google Scholar] [CrossRef] [PubMed]
  124. Coyle, D.; Doherty, G.; Sharry, J. An Evaluation of a Solution Focused Computer Game in Adolescent Interventions. Clin. Child Psychol. Psychiatry 2009, 14, 345–360. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  125. Lau, H.; Smit, J.; Fleming, T.; Riper, H. Serious games for mental health: Are they accessible, feasible, and effective? A systematic review and meta-analysis. Front. Psychiatry 2017, 7, 209. [Google Scholar] [CrossRef] [Green Version]
  126. Kollins, S.H.; DeLoss, D.J.; Cañadas, E.; Lutz, J.; Findling, R.L.; Keefe, R.S.E.; Faraone, S.V. A novel digital intervention for actively reducing severity of paediatric ADHD (STARS-ADHD): A randomised controlled trial. Lancet Digit. Health 2020, 2, e168–e178. [Google Scholar] [CrossRef]
  127. Rodrigo-Yanguas, M.; Martin-Moratinos, M.; Menendez-Garcia, A.; Gonzalez-Tardon, C.; Royuela, A.; Blasco-Fontecilla, H. A Virtual Reality Game (The Secret Trail of Moon) for Treating Attention-Deficit/Hyperactivity Disorder: Development and Usability Study. JMIR Serious Games 2021, 9, e26824. [Google Scholar] [CrossRef]
  128. Kowal, M.; Conroy, E.; Ramsbottom, N.; Smithies, T.; Toth, A.; Campbell, M. Gaming Your Mental Health: A Narrative Review on Mitigating Symptoms of Depression and Anxiety Using Commercial Video Games. JMIR Serious Games 2021, 9, e26575. [Google Scholar] [CrossRef]
  129. Pallavicini, F.; Pepe, A.; Mantovani, F. Commercial Off-The-Shelf Video Games for Reducing Stress and Anxiety: Systematic Review. JMIR Ment. Health 2021, 8, e28150. [Google Scholar] [CrossRef]
  130. Riva, G.; Serino, S. Virtual Reality in the Assessment, Understanding and Treatment of Mental Health Disorders. J. Clin. Med. 2020, 9, 3434. [Google Scholar] [CrossRef]
  131. Weerdmeester, J.; van Rooij, M.M.J.W.; Maciejewski, D.F.; Engels, R.C.M.E.; Granic, I. A randomized controlled trial assessing the efficacy of a virtual reality biofeedback video game: Anxiety outcomes and appraisal processes. Technol. Mind Behav. 2021, 2. [Google Scholar] [CrossRef]
  132. Pallavicini, F.; Pepe, A. Virtual Reality Games and the Role of Body Involvement in Enhancing Positive Emotions and Decreasing Anxiety: Within-Subjects Pilot Study. JMIR Serious Games 2020, 8, e15635. [Google Scholar] [CrossRef] [PubMed]
  133. Rus-Calafell, M.; Garety, P.; Sason, E.; Craig, T.; Valmaggia, L. Virtual reality in the assessment and treatment of psychosis: A systematic review of its utility, acceptability and effectiveness. Psychol. Med. 2018, 48, 362–391. [Google Scholar] [CrossRef] [PubMed]
  134. Brander, M.; Egger, S.T.; Hürlimann, N.; Seifritz, E.; Sumner, R.W.; Vetter, S.; Magnenat, S. Virtual Reality Human–Human Interface to Deliver Psychotherapy to People Experiencing Auditory Verbal Hallucinations: Development and Usability Study. JMIR Serious Games 2021, 9, e26820. [Google Scholar] [CrossRef] [PubMed]
  135. Calvo, R.A.; Dinakar, K.; Picard, R.; Christensen, H.; Torous, J. Toward Impactful Collaborations on Computing and Mental Health. J. Med. Internet Res. 2018, 20, e49. [Google Scholar] [CrossRef] [PubMed]
  136. Flood, M.; Ennis, M.; Ludlow, A.; Sweeney, F.F.; Holton, A.; Morgan, S.; Moriarty, F. Research methods from human-centered design: Potential applications in pharmacy and health services research. Res. Soc. Adm. Pharm. 2018, 17, 2036–2043. [Google Scholar] [CrossRef]
  137. Hickie, I.B.; Scott, E.M.; Cross, S.P.; Iorfino, F.; Davenport, T.A.; Guastella, A.J.; Naismith, S.L.; Carpenter, J.S.; Rohleder, C.; Crouse, J.J.; et al. Right care, first time: A highly personalised and measurement-based care model to manage youth mental health. Med. J. Aust. 2019, 211, S3–S46. [Google Scholar] [CrossRef] [Green Version]
  138. Davenport, T.A.; Cheng, V.W.S.; Iorfino, F.; Hamilton, B.; Castaldi, E.; Burton, A.; Scott, E.M.; Hickie, I.B. Flip the Clinic: A Digital Health Approach to Youth Mental Health Service Delivery During the COVID-19 Pandemic and Beyond. JMIR Ment. Health 2020, 7, e24578. [Google Scholar] [CrossRef]
  139. Compton, M.T.; Shim, R.S. Mental Illness Prevention and Mental Health Promotion: When, Who, and How. Psychiatr. Serv. 2020, 71, 981–983. [Google Scholar] [CrossRef]
  140. Simon, F.A.J.; Schenk, M.; Palm, D.; Faltraco, F.; Thome, J. The Collateral Damage of the COVID-19 Outbreak on Mental Health and Psychiatry. Int. J. Environ. Res. Public Health 2021, 18, 4440. [Google Scholar] [CrossRef]
  141. Rocheleau, J.N.; Shaughnessy, K.; Courtice, E.L.; Reyes, R.; Huijbregts, T.; Gahir, R.; Fischler, I. Evaluating a Mobile Digital Health Platform as an Adjunct to Patients’ Mental Healthcare Using a Longitudinal, Mixed-Methods Study Design; SAGE Publications Ltd.: Thousand Oaks, CA, USA, 2018. [Google Scholar]
  142. Malhotra, S.; Chakrabarti, S.; Shah, R. A model for digital mental healthcare: Its usefulness and potential for service delivery in low- and middle-income countries. Indian J. Psychiatry 2019, 61, 27–36. [Google Scholar] [CrossRef]
  143. Menon, V.; Varadharajan, N. Digital approaches for mental health service delivery in low- and middle-income countries like India: Key implementational challenges and recommendations. Asian J. Psychiatry 2020, 50, 101962. [Google Scholar] [CrossRef] [PubMed]
  144. Jalil, S.; Myers, T.; Atkinson, I.; Soden, M. Complementing a Clinical Trial with Human-Computer Interaction: Patients’ User Experience with Telehealth. JMIR Hum. Factors 2019, 6, e9481. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  145. Balcombe, L.; De Leo, D. An Integrated Blueprint for Digital Mental Health Services Amidst COVID-19. JMIR Ment. Health 2020, 7, e21718. [Google Scholar] [CrossRef] [PubMed]
  146. Balcombe, L.; De Leo, D. Psychological Screening and Tracking of Athletes and Digital Mental Health Solutions in a Hybrid Model of Care: Mini Review. JMIR Form. Res. 2020, 4, e22755. [Google Scholar] [CrossRef] [PubMed]
  147. Balcombe, L.; De Leo, D. The Potential Impact of Adjunct Digital Tools and Technology to Help Distressed and Suicidal Men: An Integrative Review. Front. Psychol. 2022, 12, 796371. [Google Scholar] [CrossRef] [PubMed]
  148. Cosgrove, L.; Karter, J.M.; Morrill, Z.; McGinley, M. Psychology and Surveillance Capitalism: The Risk of Pushing Mental Health Apps During the COVID-19 Pandemic. J. Humanist. Psychol. 2020, 60, 611–625. [Google Scholar] [CrossRef]
  149. Aung, Y.Y.M.; Wong, D.C.S.; Ting, D.S.W. The promise of artificial intelligence: A review of the opportunities and challenges of artificial intelligence in healthcare. Br. Med. Bull. 2021, 139, 4–15. [Google Scholar] [CrossRef]
  150. Lee, E.E.; Torous, J.; De Choudhury, M.; Depp, C.A.; Graham, S.A.; Kim, H.-C.; Jeste, D.V. Artificial Intelligence for Mental Health Care: Clinical Applications, Barriers, Facilitators, and Artificial Wisdom. Biol. Psychiatry Cogn. Neurosci. Neuroimaging 2021, 6, 856–864. [Google Scholar] [CrossRef]
  151. Torous, J.; Hsin, H. Empowering the digital therapeutic relationship: Virtual clinics for digital health interventions. Npj Digit. Med. 2018, 1, 1–3. [Google Scholar] [CrossRef]
  152. Balcombe, L.; De Leo, D. Digital Mental Health Challenges and the Horizon Ahead for Solutions. JMIR Ment. Health 2021, 8, e26811. [Google Scholar] [CrossRef]
  153. Wies, B.; Landers, C.; Ienca, M. Digital Mental Health for Young People: A Scoping Review of Promises and Challenges. Front. Digit. Health 2021, 3, 91. [Google Scholar] [CrossRef]
  154. Lederman, R.; Gleeson, J.; Wadley, G.; D’Alfonso, S.; Rice, S.; Santesteban-Echarri, O.; Alvarez-Jimenez, M. Support for Carers of Young People with Mental Illness. ACM Trans. Comput.-Hum. Interact. 2019, 26, 1–33. [Google Scholar] [CrossRef]
  155. Tremain, H.; McEnery, C.; Fletcher, K.; Murray, G. The Therapeutic Alliance in Digital Mental Health Interventions for Serious Mental Illnesses: Narrative Review. JMIR Ment. Health 2020, 7, e17204. [Google Scholar] [CrossRef] [PubMed]
  156. Iorfino, F.; Piper, S.E.; Prodan, A.; LaMonica, H.M.; Davenport, T.A.; Lee, G.Y.; Hickie, I.B. Using Digital Technologies to Facilitate Care Coordination Between Youth Mental Health Services: A Guide for Implementation. Front. Health Serv. 2021, 1, 745456. [Google Scholar] [CrossRef]
  157. Blitz, R.; Storck, M.; Baune, B.T.; Dugas, M.; Opel, N. Design and Implementation of an Informatics Infrastructure for Standardized Data Acquisition, Transfer, Storage, and Export in Psychiatric Clinical Routine: Feasibility Study. JMIR Ment. Health 2021, 8, e26681. [Google Scholar] [CrossRef] [PubMed]
  158. Paton, C.; Kushniruk, A.W.; Borycki, E.M.; English, M.; Warren, J. Improving the Usability and Safety of Digital Health Systems: The Role of Predictive Human-Computer Interaction Modeling. J. Med. Internet Res. 2021, 23, e25281. [Google Scholar] [CrossRef] [PubMed]
Table 1. Five-step integrative review literature search method.
(1) Problem identification
(2) Literature search
 ▪ Participant characteristics
 ▪ Reported outcomes
 ▪ Empirical or theoretical approach
(3) Author views
 ▪ Clinical effectiveness
 ▪ User impact (feasibility/acceptability)
 ▪ Social and cultural impact
 ▪ Readiness for clinical or digital solutions adoption
 ▪ Critical appraisal and evaluation
(4) Determine rigor and contribution to data analysis
(5) Synthesis of important foundations or conclusions into an integrated summation
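The five steps in Table 1 can be read as a screening-and-synthesis pipeline. The sketch below is purely illustrative of that workflow; the record fields, keywords, and rigor weighting are assumptions for the example, not part of the review's actual method.

```python
from dataclasses import dataclass, field

@dataclass
class Record:
    """A candidate study retrieved in the literature search (step 2)."""
    title: str
    participants: str                              # participant characteristics
    outcomes: list                                 # reported outcomes
    approach: str                                  # "empirical" or "theoretical"
    appraisal: dict = field(default_factory=dict)  # step 3 author views

def screen(records, problem_keywords):
    """Steps 1-2: keep only records relevant to the identified problem."""
    return [r for r in records
            if any(k in r.title.lower() for k in problem_keywords)]

def appraise(record):
    """Step 3: tag each record with the review's appraisal dimensions
    (illustrative dimension names taken from Table 1)."""
    for dim in ("clinical_effectiveness", "user_impact",
                "sociocultural_impact", "adoption_readiness"):
        record.appraisal.setdefault(dim, "not reported")
    return record

def synthesize(records):
    """Steps 4-5: rank by an assumed rigor weighting and fold the
    retained findings into one integrated summary list."""
    rigor = {"empirical": 2, "theoretical": 1}     # illustrative weights
    ranked = sorted(records, key=lambda r: rigor.get(r.approach, 0),
                    reverse=True)
    return [r.title for r in ranked]

# Hypothetical records for demonstration only.
records = [
    Record("HCI in digital mental health apps", "adults", ["usability"], "empirical"),
    Record("A theory of digital therapeutic alliance", "n/a", [], "theoretical"),
    Record("Crop yields under drought", "n/a", [], "empirical"),
]
kept = [appraise(r) for r in screen(records, ["mental health", "therapeutic"])]
summary = synthesize(kept)  # off-topic record is screened out
```

The point of the sketch is the ordering of the steps, not the scoring: screening precedes appraisal, and rigor is judged only on what survives screening.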
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.