Google and Apple need to do more to protect children with depression from accessing unsuitable apps on their marketplaces, say researchers.
Depression is a major affective disorder that affects more than 300 million people worldwide, and the number of mobile apps offering treatment for depression is growing rapidly.
Many of these apps offer beneficial features for users and could be useful for hard-to-reach groups, such as adolescents.
However, a review of the 29 highest-rated apps returned by searches for “depression” on Google Play and Apple’s App Store revealed that current marketplace regulations and categories are insufficient to ensure data protection and age suitability for mental health apps.
All the apps in the study are rated by Google Play and Apple’s App Store as suitable for children, and most are classified as suitable for pre-school-age children.
Researchers believe there is a mismatch between age ratings used by app stores, which were created to target material such as violent content in video games, and the need to regulate health-related apps. These mental health apps, which are designed and rated in line with marketplace guidelines, should have clear age restrictions due to the personal and sensitive nature of the content, and the associated risk of harm.
The review, led by researchers from Lancaster University and Trinity College Dublin, found that two of the 29 apps contain negative emotional content, such as images or quotes used to capture negative thoughts. This kind of content, researchers argue, could be harmful to vulnerable individuals, particularly children.
Lancaster University Research Associate Chengcheng Qu said: “Surprisingly these apps with potentially disturbing content are rated as PEGI 3 or PEGI 12 on the app marketplaces, which indicates to potential users, or parents, that the apps’ content merely includes bad language.
“Prior studies have shown that adolescents’ exposure to negative content may trigger negative behaviour, such as self-harm. There is therefore a clear need to look at how to protect vulnerable app users, such as those at risk of self-harm or suicide.”
Data protection was another concern for researchers. Fewer than half of the apps, 12 of the 29, provide privacy policies to protect children’s data. Of those that do, just over half claim to restrict users to specific age groups – however, these restrictions are inconsistent with the low age ratings given on the marketplaces.
Although many of the apps are free to download, a third, 10 of the 29, contain adverts. Of these, 80 per cent state in their privacy policies that users’ information will be captured and shared with third parties, including advertisers. However, this information is not included in the app descriptions on the marketplaces, potentially leading parents to download inappropriate apps for their children.
The research found that many apps provide features that can benefit users across a range of age groups. However, it also revealed that evidence for the science underpinning many mental health apps is often lacking.
While most apps in the study claim to be informed by evidence-based treatments, only two provide direct peer-reviewed evidence on their effectiveness for reducing the symptoms of depression.
Professor Corina Sas from Lancaster University, Principal Investigator of the AffecTech project, which funded this research, said: “The potential of these types of apps is promising, especially for reaching groups of people, such as adolescents, who are less likely to seek professional support offline. However, there is a real and urgent need for Google and Apple to regulate their marketplaces to safeguard users and ensure these mental health apps have a positive impact.
“Greater regulation and transparency would help mitigate ethical risks around missing, inadequate or inconsistent privacy policies, sharing data with advertisers, child data protection and the safeguarding of vulnerable users as well as providing clarity about the level of scientific validation behind individual apps.”
Dr Gavin Doherty, from Trinity College Dublin, adds: “Introducing a more appropriate set of age ratings that takes into account the sensitivity of the content and data handled by health apps would be a relatively straightforward and helpful step to take, and would give clarity to app developers.”
Some of the changes researchers call for include:
· A clearer definition of age restrictions on the marketplaces to help users, especially younger users and parents, select age-appropriate apps.
· Developers should make their privacy policies easier to read.
· Marketplace guidelines should clearly communicate the level of scientific evidence underpinning each app.
· App developers should consider the presence of negative content within their apps when selecting an age rating on the marketplace.
· Apps should include safeguards for users viewing highly negative content, such as limiting the amount of negative content they can see.
· Apps not specifically designed for children and adolescents should consider introducing customisable designs for non-adult users.
· Designers of apps should consider ways to include parental support or supervision while children are using these apps.
The research is outlined in the paper ‘Functionality of Top-Rated Mobile Apps for Depression: Systematic Search and Evaluation’, which has been published by JMIR Mental Health.
The research was conducted as part of the AffecTech project, a Marie Sklodowska-Curie Innovative Training Network funded by the European Commission’s Horizon 2020 programme.
The paper’s authors are Chengcheng Qu, Professor Corina Sas and Claudia Dauden Roquet from Lancaster University and Dr Gavin Doherty of Trinity College Dublin.