Abstract
There are many apps available for parents that are designed to help them monitor their pregnancy or their child’s development. These apps require parents to share information about themselves or their children in order to utilise many of the apps’ features. However, parents remain concerned about their children’s privacy, indicating a privacy paradox between concerns and actions. The research presented here combined an analysis of parenting apps with a survey of parents to determine whether their concerns about sharing information about their children were at odds with their use of parenting apps.
A survey of 75 parents found that they had strong concerns around the availability of information about their children but were nevertheless using apps within which they shared this information, and were giving little consideration to the information requested by those apps. This should concern developers, given the growing awareness of users’ rights in relation to managing their data.
We propose new guidelines for app developers to better protect children’s privacy and to improve trust relationships between developers and users.
1 Introduction
Jack’s mum shares information about her son’s ADHD using a child development tracker and social network app. Eight years later she wants to enrol her child in an exclusive private school. Unbeknown to her, the school searches online for evidence of behavioural issues before deciding whether or not to admit him. It finds the original posting, and the ensuing discussion in which others gave her behavioural advice. The school decides not to admit her child.
Pregnancy and the experience of becoming a parent is a life-changing event, and in this digital age it is not surprising that parents look to online resources and mobile applications to provide them with information and support during this period (Prior 2016). Mobile application developers have responded to this, and there are now apps covering a wide variety of parenting issues, from conception and pregnancy development to contraction monitoring and baby development tracking. Parents are becoming used to sharing information about their children before they are even born (Lupton and Pedersen 2016).
However, by sharing information about their children through these apps, parents are unwittingly creating a digital footprint for their child and potentially compromising their child’s privacy. Recent privacy breaches, such as that affecting Sitter (an app used for hiring babysitters), in which information including addresses, credit card details and details of users’ children was leaked, highlight the potential risks of having this information stored online, even in apparently trustworthy apps (Abel 2018).
At the same time, parents report being concerned about what information is available about their child online (Madden et al. 2012). This research seeks to build on previous work on the sharing of children’s information on social media, and on the security and privacy of mobile health applications, to examine how parents’ views of privacy align with the mobile apps they install and use on their phones and the permissions they grant to these apps.
By better understanding how parents’ concerns may impact on their use of these apps, designers and developers of parenting apps have the potential to reach out to parents by including usable security and privacy measures which are clear and straightforward for users. These could be applied in a similar manner to other HCI guidelines, such as those for accessibility.
2 Related Work
2.1 Parenting and Privacy
In recent years there has been an increasing interest in parenting and data privacy within the HCI community (Ammari et al. 2015; Moser et al. 2017).
Previous work in the field has looked at the information teenagers and children share about themselves on social media, and the implications this has for their privacy (Marwick and Boyd 2014; Silva et al. 2017). There has been a growing awareness in recent years, particularly as the digital native generation has aged and become parents of young children themselves, that parents are increasingly sharing private information about their children with online audiences.
Much of this work has been conducted from a sociological perspective and examined sharenting – a term used to describe parents who overshare information about their children online. There have been concerns over how children of so-called “mummy bloggers” (professional bloggers who post regularly about parenting and updates on their children) may feel in the future when reading posts about themselves (Blum-Ross and Livingstone 2017; Orton-Johnson 2017). One large study of Facebook sharenting was conducted by Marasli et al. (2016), who examined the Facebook profiles of 94 parents and the information shared through these profiles about their children. The study considered the social implications of this information being shared but also briefly touched on the potential for the children to become victims of identity theft. However, hiding this information from so-called “big data” companies can be a considerable challenge: Vertesi (2014) examined the steps necessary to prevent corporations from discovering a pregnancy and likened them to those taken by people wishing to commit criminal acts.
There has also been research into the data theft implications of this information being shared. Brosch (2016) examined the Facebook profiles of participants and noted the significant number of parents uploading photos of birth certificates or sharing their child’s date of birth. The risks of sharing this sort of information were discussed by Minkus et al. (2015), who crawled a large number of adult profiles on Facebook for evidence of children in their public profiles and then combined this with public records to identify information on the children. Minkus highlights the value of this information to a data broker. It is possible that parents are not aware of the dangers in sharing information about children online. Steinberg (2017) offers suggestions for protecting children’s privacy and suggests that this could be approached in a similar manner to the “back to sleep” and second-hand smoking campaigns of the 1990s and early 2000s. Most of the work on sharenting to date has focussed on social media; however, the growing availability of parenting and pregnancy apps means that these are increasingly becoming another avenue for parents to share information about their children with others.
2.2 Mobile Apps and Privacy
The number of apps providing information and guidance on topics ranging from ovulation tracking through pregnancy and childbirth to parenting continues to grow at a rapid rate (Lupton et al. 2016). According to market research from 2013, pregnancy apps are more popular than fitness apps (Dolan 2013), and while there has been less research into this recently, their popularity is thought to be continuing to grow (Haelle 2018). It is possible that parents feel that by searching for information, sharing images and monitoring the development of their children they are performing good parenthood (Lupton et al. 2016), and there is an increasing awareness that users are sharing a large volume of information through these mobile apps.
There has been interest in the amount of information being shared in mHealth apps (mobile phone apps related to health) in general for several years. One concern since health apps began to appear was the trustworthiness of the information; this has now grown into concerns regarding the security of users’ health information (Adhikari and Richards 2014). It has been suggested that data breaches in mHealth apps are more common than might have been thought (Adhikari and Richards 2014), and this may be due to a limited understanding of security and privacy in mHealth apps and of the risks associated with this information being leaked (Dehling et al. 2015). Plachkinova et al. (2015) created a taxonomy of mHealth apps in order to investigate their security and privacy concerns, and suggest that information on privacy should be available in an app’s description so that users can read it before downloading.
It is argued that the rush to produce mHealth apps has led to some aspects of privacy and security not being considered (Martínez-Pérez et al. 2014). At the same time there appears to be a paradox: users have high concerns about their privacy online but are also willing to trade their personal information freely when they feel there is a benefit to them (Wilson and Valacich 2012). It is still not clear whether this paradox is due to users’ desire for instant gratification, a behavioural mechanism which cannot be altered (Acquisti and Gross 2006), or is a case of learned helplessness. Learned helplessness describes a situation in which users feel that it is inevitable that at some point their data will be compromised and, as a result, feel there is no point in taking privacy-protecting actions (Shklovski et al. 2014).
Research into privacy concerns surrounding parenting apps has been more limited than that into general mHealth apps; however, this is now a growing area of concern. An Australian study found that many women using pregnancy apps were not concerned by the privacy of the information they shared or the accuracy of the information they received (Lupton and Pedersen 2016). Lupton (2016) argues that monitoring apps related to conception and pregnancy may have been created for the purpose of acquiring data for data breaches. There is a risk that by sharing information in order to use apps, parents and their children could effectively become recruited as unpaid contributors to the “digital labour workforce” (Lupton and Williamson 2017).
This study examines the links between the security concerns of parents when considering sharing information about their children, and the information they are willing to provide in order to use parenting apps. It differs from previous studies such as Lupton and Pedersen’s work (2016) in that it is targeted specifically at users of parenting and pregnancy apps and looks at a wider range of privacy issues.
In this study we investigate the extent to which parents are aware of the interaction between data sharing on apps and their privacy, and propose the following hypotheses:
- H1. Parents are conscious of security and privacy dangers in sharing information about their children.
- H2. Parents with privacy and security concerns install, use and grant permissions to apps on their mobile phone without considering the security implications.
- H3. Parents do not consider data sharing implications when selecting and installing apps.
Using these hypotheses, we seek to answer the research question: what consideration do parents give to security and privacy concerns when installing parenting apps?
3 Methodology
In the present study, we explore the extent to which parents consider the security and privacy implications of providing data about themselves and their children to mobile apps, and whether this influences the decisions they make about installing apps related to pregnancy and parenting.
A two-stage approach was taken within this study. Firstly, a poll was conducted with members of online parenting groups to discover popular parenting and pregnancy apps. This was combined with an analysis of trending Apps for Parents from the App Store.
The initial poll involved asking members of parenting Facebook groups which apps they used, and collating their responses. This was followed by a survey of users of these apps to determine the extent to which they consider the security implications when installing apps.
3.1 App Analysis
Eleven of the most popular parenting apps were selected for this study. These apps were then analysed to ascertain which data types they requested during registration and to review their terms and conditions. All of the apps were available through Google Play or the Apple App Store.
The identified apps are shown in Table 1, along with a description of the purpose of the app.
We installed these apps onto our own personal mobile devices (one Android and one iOS), and logged the data that each app requested during the registration process. Table 2 shows the data that each app requested from the user.
Data Collected by Apps.
It is evident from Table 2 that there is a very broad range of data collected by the various apps. Some of the data requested relates exclusively to the user, for example name, email address and photo, while other data types relate to the user’s child(ren), for example the child’s date of birth, medical history and place of birth. To appreciate the sensitivity of the different data types, we further categorised the data according to its longevity. Each data type stored by an app was categorised as static (i.e. it would not change during the course of the owner’s lifetime), flexible (it may change over time) or dynamic (it is likely to change frequently). In the event of a data breach, static and flexible data would be of more value in constructing a further attack on a victim, as it is more likely to remain accurate and useful in generating an attack hook that appears authentic to the victim. In Table 2, static data types are coloured red, flexible orange and dynamic green. Some of the data could be considered publicly available (i.e. already in the public domain, for example via websites such as 192.com), while other data would be considered private. Data types considered private (i.e. not generally shared online or via social media) are denoted by (P). Some of this information is treated as private by certain organisations and used to confirm identity (e.g. date of birth, place of birth).
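As an illustration of this categorisation, the sketch below records a longevity label and a privacy flag for a handful of the collected data types and picks out those that would be of most value to an attacker. The field names and example classifications are ours and are illustrative only; they are not the full coding applied in Table 2.

```python
# Illustrative sketch only: the data types and classifications below are
# examples, not the authors' full coding of Table 2.
from dataclasses import dataclass

@dataclass
class DataType:
    name: str
    longevity: str   # "static" (never changes), "flexible" (may change) or "dynamic"
    private: bool    # True if not generally shared online or via social media

collected = [
    DataType("user's name",             "static",   False),
    DataType("email address",           "flexible", False),
    DataType("child's date of birth",   "static",   True),
    DataType("child's place of birth",  "static",   True),
    DataType("child's medical history", "flexible", True),
]

# Static and flexible private data is the most useful for constructing a
# convincing attack hook, because it remains accurate over time.
high_value = [d.name for d in collected
              if d.private and d.longevity in ("static", "flexible")]
print(high_value)
```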
Privacy Threats from Apps.
Privacy threats relating to the use of parenting apps include a breach of user confidentiality, failure to protect the data, and client- or server-side bugs that could lead to a security breach. Given the sensitivity of some of the data collected, it is important that users are fully aware of what data is being collected, how and where it is being stored, and how it is going to be used. We reviewed the privacy policies of each of the apps to ascertain where data was stored and how it was used. Typically, the paid apps had the best level of privacy protection, with the user’s data generally stored only on the user’s phone. This ensures that a breach of the organisation’s infrastructure would not compromise the user’s data, and that the data is not passed to third parties for marketing purposes.
For these apps, any threats to the user’s privacy are limited to app security on the device itself, and the ease with which malicious apps installed on the device could access the data. The free apps stated various levels of data sharing within their privacy policies, with all of them requesting permission to store user data on the organisation’s servers and to pass that data to third parties for marketing purposes. Typically, the social media apps required the user to agree to share their data with other users.
Based on the information stored by these apps, a list of potential attacks that could be conducted using this information was compiled. These potential attacks, and how the information could be used to help orchestrate them, are described below:
Spear phishing is an email spoofing attack that targets a specific user or organisation, using information that has been harvested to make the email look more authentic. Spear phishing emails are generally designed to gain unauthorised access to sensitive information, either by persuading the user to click on a link in the email or open a malicious attachment. If someone’s email address was leaked from a parenting app, along with other personal information, this information could be used to construct an authentic looking spear phishing attack.
Identity fraud can occur if personal information has been stolen, for example following a breach of user confidentiality. Information such as name, address, date of birth can be used to gain access to existing accounts, or used to obtain goods or services by impersonating the user to open new accounts.
If a user’s details are breached, this information could be used to construct a password guessing attack, or to reset a user’s password if the answers to ‘security questions’ can be guessed. If a user’s password hash has been stolen, the user’s details could be used to generate a password dictionary unique to that particular user, which could then be used to orchestrate a hybrid password attack.
It would be relatively easy for someone seeking access to children, or to vulnerable mothers, to use these apps to reach this demographic for grooming purposes.
App Reviews.
The reviews of the apps in Table 1 posted in both the Google Play and Apple App stores were examined for any mention of privacy or security. Over 500 reviews were examined; eleven related to security or privacy, spread across seven apps.
Three reviews mentioned issues with deleting data: either users could not work out how to delete their data, or, having thought they had deleted it, they discovered that it was still stored by the app provider. Four reviews, relating to two apps, expressed concern over the amount of data being collected. While many reviews revealed users’ frustration at being forced to create a Facebook account to log into some apps, only one review related this back to a concern about personal privacy. Of the remaining three reviews, one raised concern about the other users of the app and how they were vetted, one commented that it was not easy to change the privacy settings of photos stored on the app, and the final review stated that the user liked not having to grant special privacy permissions to install the app.
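A screening of this kind could be partially automated with a simple keyword pass over the review text. The sketch below is illustrative only: the review strings and keyword list are hypothetical, and this is not the procedure used in the study.

```python
# Illustrative keyword screen over app-store reviews; the reviews and keywords
# below are placeholders, not the data examined in this study.
import re

KEYWORDS = ["privacy", "private", "security", "secure", "permission", "delete my data"]

reviews = [
    "Love this app, tracks feeds perfectly!",
    "Why do I need a Facebook account? Worried about my privacy.",
    "Couldn't find a way to delete my data after uninstalling.",
]

pattern = re.compile("|".join(re.escape(k) for k in KEYWORDS), re.IGNORECASE)
flagged = [r for r in reviews if pattern.search(r)]

print(f"{len(flagged)} of {len(reviews)} reviews mention privacy or security")
```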
3.2 Survey
Following analysis of the parenting apps selected for the study, we conducted a survey of parents who used parenting apps to determine what consideration they gave to security concerns when using the apps.
Participants and Design.
An online survey was used to gather parents’ views on security, privacy and app use. The use of an online survey enabled us to reach a wider geographical area than would have been possible through other methods. The survey questions were trialled with seven participants prior to launch, to validate the survey and ensure there were no barriers to completion. The survey was designed to be completed in under ten minutes.
Recruitment.
Participants were recruited to the survey over a one-month period between April and May 2018. Recruitment took place through a variety of online media, including Twitter, Facebook groups targeted at parents, and parenting websites (such as NetMums and BabyCentre). We were aware that some parents who use parenting apps may not use social media, and so leaflets about the study were placed in playgroups, parenting cafes and children’s recreation centres in the local area.
Following the informed consent process, the survey comprised four sections.
Demographics.
Participants were asked for their age, gender, level of education, whether they had children, whether they used parenting apps and their country of residence. Participants who answered no to either having children or using parenting apps were excluded from the study. As the survey was examining the amount of personal information which participants are willing to share, we deliberately avoided asking for any information by which participants could be identified. This meant that the demographic information elicited was limited, but still ensured that any themes in participant groups could be identified.
Use of Apps.
Before participants were asked about their privacy concerns, they were asked to identify which of the apps in Table 2 they used. There were eleven apps available for them to select, plus the option ‘none of the above’.
Privacy Concerns.
The questions on privacy concerns were separated into two sections. The first section asked questions relating to general attitudes to security and privacy on a five-point Likert scale ranging from Strongly Disagree to Strongly Agree.
The second section looked specifically at the information participants were willing to share in order to use an app. The types of information requested by the apps were listed, and participants rated their level of comfort with sharing each on a five-point Likert scale ranging from Very Uncomfortable to Very Comfortable.
Reaction to Apps.
The final stage of the survey presented a grid showing the types of information stored by each of the apps. Once participants had examined this grid they were asked whether they were comfortable or uncomfortable with this information being collected. Depending on their answer, they were directed either to a question asking why they felt comfortable sharing this information, or to a question asking what, given their discomfort, they were likely to do with these apps in the future.
Ethical approval was sought and obtained from the authors’ institution prior to the commencement of the survey. The survey asked for no identifiable information and, at the end, provided details of websites offering participants further information on keeping their data private if they wished.
4 Results
4.1 Descriptive Statistics
During the month that the survey was available, 101 participants began the study; 26 were excluded because they were either not parents or did not use parenting apps. In total 75 participants completed the survey. Of those participants, 97% (n = 73) were female. Participants were predominantly of the millennial generation, with the majority aged between 26 and 35 (see Table 3).
The participants were mainly from the United Kingdom, with the remainder coming from the USA, Australia, Singapore and Ukraine. Participants had a variety of educational experiences, ranging from not completing high school to having a PhD; the majority had at least an undergraduate degree. All participants in the study had used at least one app, with the average being 2.6 (SD = 1.6). The most popular app was Baby Center (52%, n = 39).
4.2 Levels of Concern
When asked how concerned they were about the data being collected and shared about themselves and their children, participants were above neutral in their level of concern for every category (see Table 4). Participants were more concerned about the data being gathered about their children than about themselves. The highest average level of concern related to information being collected about children for advertising purposes (M = 4.04), followed closely by concerns around who could see information about their children (M = 4). When considering information being shared about their children, only three participants (4%) were completely unconcerned about the volume of information stored about their children online and about who could see it. One participant (1.33%) was completely unconcerned about the information stored about their child for marketing purposes (Fig. 1).
4.3 Levels of Comfort
Examination of the information participants would be comfortable sharing revealed that there were only two categories of information which participants, on average, rated as comfortable or very comfortable to share: their own name (M = 3.45) and their email address (M = 3.45). By contrast, the average level of comfort for their child’s name was 2.56. Participants were least comfortable sharing their child’s medical history (M = 1.56), followed closely by their own medical history (M = 1.6). The median for both these forms of data was 1 (Fig. 2).
4.4 Continued Use of Apps
Having been presented with details of what information is stored by the apps, participants were asked if they were happy to continue using them; 81.3% of participants (n = 61) were not happy.
Participants who were unhappy were asked about their future plans for use of these apps and the majority (65%, n = 39) said that they would consider altering the settings on the apps. Only two participants (3.3%) planned to continue using the app as before and 31.7% (n = 19) would consider deleting the app.
The participants who were happy with the information being stored on the app were asked for the reason behind this. For the majority (78.6%, n = 11) this was because they trusted the app providers to use the data appropriately, while the remaining three participants (21.4%) were not concerned about how this data was used.
By splitting the participants into these two groups, it is possible to examine differences between them in their levels of concern around their data being shared and in their degrees of comfort in sharing specific forms of information.
In both cases the data was not normally distributed, and application of the Mann-Whitney U test showed that participants who were unhappy with the data being stored on the apps reported higher levels of concern around data sharing issues than participants who were happy with the data stored on the apps (z = 3.01, p = 0.003). However, there was no significant difference between the two groups’ levels of comfort in sharing specific items of information (p = 0.14).
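For illustration, a group comparison of this kind could be run as below. This is a minimal sketch, not the authors’ analysis code, and the scores are placeholder values rather than the study data.

```python
# Minimal sketch of a Mann-Whitney U comparison between two groups;
# the concern scores below are hypothetical placeholders, not the study data.
from scipy.stats import mannwhitneyu

# Mean concern scores (1-5 Likert) for participants unhappy vs. happy with
# the data stored by the apps -- example values only.
unhappy_concern = [4.2, 3.8, 4.5, 4.0, 3.9, 4.4, 4.1]
happy_concern = [3.1, 3.4, 2.9, 3.6, 3.0, 3.2]

u_stat, p_value = mannwhitneyu(unhappy_concern, happy_concern,
                               alternative="two-sided")
print(f"U = {u_stat:.1f}, p = {p_value:.3f}")
```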
5 Discussion
5.1 H1. Parents’ Consciousness of Security and Privacy Dangers
Parents in the study were, on average, concerned about the information stored about them and their children. Areas of particular concern were who could have access to this information and how it could be used for marketing purposes. This suggests that parents are conscious of the dangers in sharing information about their children.
5.2 H2. Installation and Use of Apps
There were no significant differences in the amount of information parents with high security concerns were potentially sharing through apps compared to those with lower security concerns. This indicates that despite having privacy and security concerns, parents are installing apps which request large amounts of information. It should be noted that this study did not specifically ask what information was being shared, but from our analysis of the apps we can conclude that this information would be required in order to use the app effectively.
5.3 H3. Sharing of Information
When participants were shown the amount of information that could be stored by the different apps they were using, 81.3% (n = 61) stated that they were uncomfortable with this, and of these the majority (n = 39, 65%) would be looking to change the settings on these apps. This indicates that parents are installing apps which request information about their children without fully registering that, in doing so, they are sharing that information.
The results suggest that although the majority of parents are concerned by online security and privacy issues surrounding their children, they give little to no consideration to this when installing mobile parenting apps.
5.4 Parents’ Views and Actions
Parents in the study reported being highly concerned about the information being shared about themselves and their children, and uncomfortable with many types of information being shared. However, this appeared to be in contrast with their behaviour of using many apps which stored personal information about them. This privacy paradox has been noted in previous studies (Norberg et al. 2007), which observed that despite their stated intentions users frequently share personal data. Much of this previous work has looked at social media; however, as this study has shown, this behaviour can also be found in the use of apps. This study’s results contrast with those of Lupton and Pedersen (2016), who found that participants were not overly concerned with the data being stored or used by parenting apps.
One potential reason why this contrast can be seen so clearly in this study could be stories in the media during the period the survey was available. The Cambridge Analytica/Facebook scandal was first covered in the UK press in March 2018, shortly before we deployed this survey, and during the study it was being discussed on many popular parenting websites (Babycentre 2018; Mumsnet 2018). This heightened awareness that users’ data is a commodity may explain the difference.
5.5 Guidelines for Developers
Those participants who responded that they were unhappy with the apps having their information indicated that they were likely to have been unaware of the amount of information being stored before the survey – only 3.3% would continue to use the apps with no changes. As some commentators have pointed out, raising awareness of how data is processed presents an opportunity to build trust between app providers and their users. If user awareness in this area continues to rise, app providers may find themselves at risk of losing users if this trust is not present. Parenting app providers could be at additional risk, as this study shows parents are more guarded about their children’s data than their own.
There are existing guidelines for developers regarding how much data they should request from and store about users. However, our study suggests that parents are particularly concerned about the amount of information being stored about their children, and that should they become aware of this it may impact upon their use of the app.
We suggest that in addition to complying with current data legislation and only storing the minimum required amount of information, developers of parenting apps should work with parents to determine the levels of information they are comfortable in sharing with specific applications. Developers should also give consideration as to how they present the terms and conditions of their apps. If parents feel confident that they understand the digital footprint they are creating for their child then there is the possibility that they may be more inclined to continue to use the app.
In addition, there may be benefits in allowing parents to use the apps without providing specific information about their child. For example, would the advice given by the app be affected if parents did not provide their child’s gender or exact date of birth? Are there less identifying ways to gather this data, such as asking for the child’s age rather than their date of birth, or using an icon instead of a photo of the child? Developers should give consideration to these issues when designing parenting apps; one possible shape of such a minimised registration model is sketched below.
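The sketch below contrasts a typical registration data model with a minimised alternative along these lines. The field names are hypothetical and are intended only to illustrate the design choice; they do not describe any real app.

```python
# Hypothetical data models (field names are illustrative, not from any real app)
# contrasting a data-heavy registration request with a minimised alternative.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ChildProfileTypical:
    """The kind of detail many parenting apps currently request."""
    name: str
    date_of_birth: str         # exact DOB: static, private data
    gender: str
    photo_url: str

@dataclass
class ChildProfileMinimal:
    """A privacy-preserving alternative that still supports age-based advice."""
    nickname: str              # need not be the child's real name
    age_band: str              # e.g. "6-12 months" rather than an exact DOB
    avatar_icon: str = "bear"  # icon chosen by the parent instead of a photo
    gender: Optional[str] = None   # optional; advice can default to neutral wording
```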
5.6 Limitations
There are several limitations to this survey. Firstly, by asking adults to self-report on their concerns around data privacy there is a risk of social desirability bias, and we were not able to validate how accurate this reporting was. Secondly, we did not investigate whether parents who are concerned about the data being stored are actually sharing their child’s data or using dummy data.
Participants were asked about their comfort in sharing specific pieces of information only in very general terms; there would be benefit in future studies examining the situations in which they would be comfortable sharing these different pieces of information.
Finally we have not been able to follow up on the study to determine if the participants took any action after discovering the amount of data they were sharing.
6 Conclusion and Future Work
There has been a large growth in recent years in pregnancy and parenting apps, and data on parents is of interest to many corporations. At the same time, parents are becoming increasingly concerned about what data is available online about their children and who can access it. This study has shown that, despite these concerns, many parents are using apps which store this information without considering who may have access to it.
This survey showed that despite being unhappy with the information stored by app providers, the majority of participants were still planning to continue using the apps. However, many of those planning to continue said they would look to alter the apps’ settings; it should be noted that this is a behavioural intention and it is not clear whether it translated into actual behaviour.
Future studies should investigate whether, after being informed about the data stored by apps, participants make any changes to the apps’ settings or to the information they share, and whether this is influenced by the recent press stories regarding the commoditisation of data. It would also be beneficial to work with parents who are non-users to discover any privacy concerns that have stopped them from using these apps.
References
Abel, R.: Babysitting app Sitter exposed the data of 93,000 customers, SC Media (2018). https://www.scmagazine.com/home/network-security/babysitting-app-sitter-exposed-the-data-of-93000-customers/. Accessed: 19 Nov 2018
Acquisti, A., Gross, R.: Imagined communities: awareness, information sharing, and privacy on the facebook. In: Danezis, G., Golle, P. (eds.) PET 2006. LNCS, vol. 4258, pp. 36–58. Springer, Heidelberg (2006). https://doi.org/10.1007/11957454_3
Adhikari, R., Richards, D.: Security and privacy issues related to the use of mobile health apps. In: 25th Australasian Conference on Information Systems (ACIS 2014), Auckland, New Zealand, pp. 1–11 (2014)
Ammari, T., et al.: Managing children’s online identities: how parents decide what to disclose about their children online. In: Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems (CHI 2015), pp. 1895–1904. ACM, New York (2015). https://doi.org/10.1145/2702123.2702325
Babycentre: Cambridge Analytica and your social media use (2018). https://community.babycentre.co.uk/post/a31540953/cambridge-analytica-and-your-social-media-use. Accessed 4 June 2018
Blum-Ross, A., Livingstone, S.: “Sharenting”, parent blogging, and the boundaries of the digital self. Popul. Commun. 15(2), 110–125 (2017). https://doi.org/10.1080/15405702.2016.1223300
Brosch, A.: When the child is born into the internet: sharenting as a growing trend among parents on facebook. New Educ. Rev. 43(1), 225–235 (2016). https://doi.org/10.15804/tner.2016.43.1.19
Dehling, T., et al.: Exploring the far side of mobile health: information security and privacy of mobile health apps on iOS and Android. JMIR mHealth uHealth (2015). https://doi.org/10.2196/mhealth.3672
Dolan, B.: Report finds pregnancy apps more popular than fitness apps, mobihealthnews (2013) http://www.mobihealthnews.com/20333/report-finds-pregnancy-apps-more-popular-than-fitness-apps. Accessed 4 June 2018
Haelle, T.: Pregnancy Apps: Your Patients Use Them—Are You Up to Speed? Medscape (2018). https://www.medscape.com/viewarticle/892945. Accessed 4 June 2018
Lupton, D.: “Mastering your fertility”: the digitised reproductive citizen. In: McCosker, A., Vivienne, S., Johns, A. (eds.) Negotiating Digital Citizenship, pp. 81–93. Rowman & Littlefield Publishers, United States (2016)
Lupton, D., Pedersen, S.: An Australian survey of women’s use of pregnancy and parenting apps. Women and Birth 29(4), 368–375 (2016). https://doi.org/10.1016/j.wombi.2016.01.008
Lupton, D., Pedersen, S., Thomas, G.M.: Parenting and digital media: from the early web to contemporary digital society. Sociol. Compass 10(8), 730–743 (2016). https://doi.org/10.1111/soc4.12398
Lupton, D., Williamson, B.: The datafied child: the dataveillance of children and implications for their rights. New Media Soc. 19(5), 780–794 (2017). https://doi.org/10.1177/1461444816686328
Madden, M., et al.: Parents, Teens, and Online Privacy, Washington, USA (2012). https://files.eric.ed.gov/fulltext/ED537515.pdf
Marasli, M., et al.: Parents’ shares on social networking sites about their children: sharenting. Anthropologist 24(2), 399–406 (2016). https://doi.org/10.1080/09720073.2016.11892031
Martínez-Pérez, B., de la Torre-Díez, I., López-Coronado, M.: Privacy and security in mobile health apps: a review and recommendations. J. Med. Syst. 39(1), 181 (2014). https://doi.org/10.1007/s10916-014-0181-3
Marwick, A.E., Boyd, D.: Networked privacy: how teenagers negotiate context in social media. New Media Soc. 16(7), 1051–1067 (2014). https://doi.org/10.1177/1461444814543995
Minkus, T., Liu, K., Ross, K.W.: Children seen but not heard: when parents compromise children’s online privacy. In: Proceedings of the 24th International Conference on World Wide Web, Florence, Italy, pp. 776–786. International World Wide Web Conferences Steering Committee (WWW 2015) (2015). https://doi.org/10.1145/2736277.2741124
Moser, C., Chen, T., Schoenebeck, S.Y.: Parents’ and children’s preferences about parents sharing about children on social media. In: Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI 2017), pp. 5221–5225. ACM, New York (2017). https://doi.org/10.1145/3025453.3025587
Mumsnet: MN, FB, marketing companies and our data (2018). https://www.mumsnet.com/Talk/site_stuff/a3199266-MN-FB-marketing-companies-and-our-data. Accessed 4 June 2018
Norberg, P., Horne, D., Horne, D.: The privacy paradox: personal information disclosure intentions versus behaviors. J. Consum. Aff. 41(1), 100–126 (2007). https://doi.org/10.1111/j.1745-6606.2006.00070.x
Orton-Johnson, K.: Mummy blogs and representations of motherhood: “bad mummies” and their readers. Soc. Media Soc. 3(2), 2056305117707186 (2017). https://doi.org/10.1177/2056305117707186
Plachkinova, M., Andrés, S., Chatterjee, S.: A taxonomy of mHealth apps – security and privacy concerns. In: 2015 48th Hawaii International Conference on System Sciences, pp. 3187–3196 (2015). https://doi.org/10.1109/hicss.2015.385
Prior, S.: The millennial mum – technology use by new mothers. In: Proceedings of British HCI 2016 - Fusion, Bournemouth, UK, pp. 20.1–20.3 (2016)
Shklovski, I., et al.: Leakiness and creepiness in app space: perceptions of privacy and mobile app use. In: Proceedings of the 32nd Annual ACM Conference on Human Factors in Computing Systems (CHI 2014), pp. 2347–2356. ACM, New York (2014). https://doi.org/10.1145/2556288.2557421
Silva, C.S., et al.: Privacy for children and teenagers on social networks from a usability perspective: a case study on facebook. In: Proceedings of the 2017 ACM on Web Science Conference (WebSci 2017), pp. 63–71. ACM, New York (2017). https://doi.org/10.1145/3091478.3091479
Steinberg, S.B.: Sharenting: children’s privacy in the age of social media. Emory Law J. 66(4), 839–884 (2017)
Vertesi, J.: My Experiment Opting Out of Big Data Made Me Look Like a Criminal, Time (2014). http://time.com/83200/privacy-internet-big-data-opt-out/. Accessed 11 June 2018
Wilson, D.W., Valacich, J.: Unpacking the privacy paradox: irrational decision-making within the privacy calculus. In: International Conference on Information Systems, ICIS 2012, vol. 5, pp. 4152–4162 (2012)