Abstract
Smart homes – residences with innovative, interconnected, and automated technologies – can enhance residents’ quality of life and well-being. Despite these potentials, users may have concerns about the increased automation that negatively influence their technology acceptance. A lack of trust in automated technologies and privacy concerns have been identified as crucial barriers to smart home adoption. Still, privacy and trust perceptions in smart homes are not yet deeply understood, and the effect of different automation levels has not been studied so far. In a qualitative empirical approach, we examine perceptions of privacy and trust in smart home technologies depending on the level of automation (using two juxtaposed scenarios: semi-automated vs. fully-automated). Ten adults (20 to 87 years) were interviewed. Trust in smart home technologies comprises multiple dimensions: not only trust in the functionality of the technology but also trust in the human stakeholders involved and in connected technologies. Privacy in smart homes concerns not only informational privacy (data protection) but also the often neglected physical, social, and psychological dimensions of privacy. The results show that privacy and trust in smart homes are interdependent. The degree of automation strongly influences privacy and trust perceptions, with higher automation leading to more concerns. Our results contribute to a deeper understanding of privacy and trust in smart homes. The negative impact of the level of automation on privacy and trust perceptions is a guide for the development of smart home technologies that meet users’ acceptance.
1 Introduction
Increasing digitization entails global progress and change that affect many areas of life [12], including domestic environments that become “sensitive, adaptive, and responsive to human needs” [30]. Smart homes offer great potential for the control of energy usage and the improvement of security, comfort, entertainment, and communication, but also for health management with regard to aging and vulnerable residents [22]. Though research and development concerning home automation is growing rapidly, the use of holistic environments is still rare in practice compared to single technologies that have already become popular (e.g., surveillance cameras, smart speakers) [22]. From the users’ perspective, a possible reason is that automation may lead to uncertainties, particularly related to privacy and trust concerns. In line with this, previous research showed that, although users appreciated smart home advantages (especially energy management), they also feared perceived disadvantages in terms of being dependent on technology along with the feeling of losing control [3, 44]. Future scenarios of smart homes with more advanced automation may provide even more benefits but may also increase concerns and feelings of loss of control. Therefore, technology acceptance by the users is a decisive factor for the successful roll-out of smart home environments. User requirements, especially regarding privacy and trust concerns, need to be understood and considered in the development of smart home technologies.
To better understand perceptions of privacy and trust in this context, further research is needed to identify future users’ needs, demands, and concerns regarding life in a smart home under consideration of different automation levels – especially as previous studies are limited to either specific technologies and systems or generic smart home descriptions as a basis for evaluation – and to derive acceptance-relevant factors as implications for technical development.
1.1 Smart Home Environments
Smart homes are provided with integrated sensor and actuator networks for data collection and task performance based on wireless communication to support residents in their personal living environment for improved quality of life [9, 11]. Smart home technologies and applications are manifold, including monitoring systems (video tracking, biometrics control, etc.), integrated pressure pads (e.g., in the floor or furniture), light and temperature sensors, smart meters, emergency buttons, and home robots, appropriately designed for security, entertainment and communication, convenience and comfort, energy efficiency, ambient assisted living (AAL) and health care [3]. To give one example, lighting and temperature in a room can be regulated using (floor) sensors, such as for motion or pressure, that detect the presence or absence of a person, allowing the lights and heating systems to be switched on or off accordingly [32].
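The occupancy-based regulation described above can be sketched as a simple sense-decide-actuate loop. This is a minimal illustration only; the class and function names (`Room`, `update_room`) are hypothetical and do not correspond to any actual smart home API:

```python
# Minimal sketch of the occupancy-based control loop described above:
# a presence sensor (motion or pressure) drives lights and heating.
# All names here are hypothetical illustrations, not a real API.
from dataclasses import dataclass


@dataclass
class Room:
    occupied: bool = False
    lights_on: bool = False
    heating_on: bool = False


def update_room(room: Room, motion_detected: bool) -> Room:
    """Switch lights and heating on or off according to detected presence."""
    room.occupied = motion_detected
    room.lights_on = motion_detected
    room.heating_on = motion_detected
    return room


room = update_room(Room(), motion_detected=True)   # resident enters: actuate on
room = update_room(room, motion_detected=False)    # resident leaves: actuate off
```

In a real system the decision step would typically include debouncing and timeouts (e.g., lights stay on for some minutes after the last detected motion), but the basic coupling of sensor input to actuator state is as shown.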
So far, users are provided with intuitive control of smart home technology using speech, gestures, and familiar interfaces, such as smartphones or tablets, to easily operate their living environment [18]. For example, if residents receive visitors, they can recognize and communicate with them through a monitoring system (camera, microphone, speaker) and remotely open the door (e.g., at the push of a button or via voice control) [32]. With regard to advanced home automation, the idea is to have fully networked appliances that operate without human involvement [39]. To stay with the example of home visitors, the security system automatically performs access control by identifying (un)welcome visitors (e.g., via face recognition and pre-set information) to open or lock the door accordingly [31]. Consequently, when the active performance of tasks is taken over by highly automated systems, users take a more passive role. The repercussions on users’ acceptance of smart homes, especially regarding privacy concerns and trust in the technology, have not yet been investigated.
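The fully automated access control in the visitor example reduces to a decision rule over recognized identities and pre-set information. The following sketch abstracts away the face recognition step itself; `WELCOME_VISITORS` and `door_action` are hypothetical names introduced for illustration:

```python
# Hedged sketch of the automated access control described above: the system
# matches a recognized visitor identity against pre-set information and opens
# or locks the door without human involvement. Face recognition itself is
# abstracted to an identity string (or None if the visitor is unknown).
from typing import Optional

WELCOME_VISITORS = {"alice", "bob"}  # pre-set information configured by the resident


def door_action(recognized_identity: Optional[str]) -> str:
    """Return 'open' for welcome visitors, 'lock' for unknown or unwelcome ones."""
    if recognized_identity in WELCOME_VISITORS:
        return "open"
    return "lock"
```

Note how this sketch makes the shift in roles visible: in the semi-automated case the resident would make the open/lock decision, whereas here the rule executes it automatically, leaving the user in a passive role.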
1.2 Technology Acceptance
Research on technology acceptance engages in the comprehension of factors that influence the intention to adopt and use technologies. A very influential model of technology acceptance is the TAM (Technology Acceptance Model [10]) and its extensions [21]. Here, the central factors influencing acceptance are perceived usefulness and perceived ease of use. Acceptance is measured as the behavioral intention to use a technology, based on the Theory of Reasoned Action [1]. The TAM and most of its extensions were originally developed for the use of Information and Communication Technologies (ICT) in the workplace. For consumer technologies, the UTAUT2 (Unified Theory of Acceptance and Use of Technology 2 [42]) has been developed, which includes additional factors important to consumers, such as hedonic motivation and price value.
Still, research shows that, regarding digital technologies like the Internet of Things (IoT) and special contexts like smart homes, these models do not consider the whole diversity of factors that influence acceptance [2, 29, 38]. In order to increase users’ confidence and acceptance, Wilson et al. [44] found that smart home technology needed to be reliable and guarantee privacy, corresponding to the findings of Balta-Ozkan et al. [3], who revealed information disclosure (data security) and hacker attacks (the system’s physical security) as major barriers to use. In addition, users tended to distrust governments and industries out of concern about profit-making when it comes to automated smart home technology [3].
This shows that when technologies are integrated into the intimate space of the own home, collect manifold data, and take over control of actions – as it is the case in smart homes – factors like privacy concerns and trust become decisive [36].
1.3 Privacy
Privacy is a key element of personal freedom, autonomy, and well-being [33]. Privacy can be defined as the control over access to the self [7]. Control being the core element, privacy is not about absolute withdrawal but about having the ability to achieve the optimal level of withdrawal and disclosure in each situation [26].
In regard to digital technologies, the aspect of informational privacy is often the focus, meaning the control over access to personal information [7]. But especially regarding smart home technologies, the other three dimensions of privacy must not be neglected: Physical privacy describes control over access to the physical self, such as freedom from surveillance and unwanted intruders; social privacy consists of control over the who, what, when, and where of personal encounters with others; and psychological privacy is the protection from intrusions upon one’s thoughts, feelings, and values [7]. All four privacy dimensions may be infringed by smart home technologies and thus need to be considered to understand privacy requirements regarding the acceptance of smart homes.
In various studies, concerns about privacy have been identified as an important barrier to the use of ICT in general [4] and smart home technologies in particular [43, 46]. Potential users see risks and concerns regarding access to and misuse of personal information, invasion of personal space, feelings of surveillance, obtrusiveness, and more [5, 14, 17]. Influencing factors on privacy concerns are, e.g., the perceived sensitivity of data that is collected by the technology [41], characteristics of the technology like the type or location of sensors [15], who has access to the data [37], and the ability to control the technology [13].
1.4 Trust
Trust is the key to situations characterized by uncertainties [19] that may arise when different stakeholders are involved (Can I rely on the other?). This applies to interactions between humans with varying roles, needs, and demands as well as to those involving humans and machines [20].
Considering trust in automated technology, previous studies have already identified a wide range of influencing factors, including the environment (e.g., organizational and cultural setting), situation (e.g., tasks and related risks), previous experiences, as well as user characteristics (demographics, personality, attitudes, etc.), cognition (e.g., ability and expertise), and emotion and affect (e.g., technology liking) [16, 24, 25, 34]. Besides, trust in automation was found to be application-sensitive. Brell et al. [6] revealed that trust in home automation was significantly higher in a contextual comparison with autonomous driving, with participants also showing a higher willingness to use smart homes. Against the background of increasing home automation, it is of great interest to what extent these findings differ across automation levels.
As home automation concerns a particularly private and intimate application field, it is not perceived without risks [22]. Two strong research directions focus on aging in place (given the demographic change) and energy management (as a major motive for use). Steinke et al. [40] identified reliability and ease of use as trust predictors in AAL, while trust also depended on the type of technology and varied with regard to user diversity. Wilson et al. [44] found that, in general, dependencies on infrastructure (technologies and networks) and third parties (experts) were perceived as risky. With special focus given to energy management, Paetz et al. [28] and Balta-Ozkan et al. [3] revealed concerns about being betrayed and manipulated by politics and companies for economic reasons. In line with this, Yang et al. [45] identified trust in the service provider as a central acceptance factor for smart home technologies.
Understanding the role of trust in the decision-making whether or not to use home automation requires further research to explore user requirements in balancing perceived (dis)advantages with regard to multiple applications of use, but also in-depth knowledge of conditions and aspects that can create (dis)trust.
1.5 Research Questions
For the development of smart home technologies that fit users’ needs and wishes, it must be understood what influences potential users’ perception and acceptance of these technologies. Much research on technology acceptance in smart homes focuses on the confirmation and extension of existing models, using quantitative approaches [22]. This research shows that privacy concerns and trust in automation are important factors that indeed need to be included in the modeling of technology acceptance in smart homes (e.g., [6, 38]). But this quantitative and confirmatory research still falls short in two important aspects.
Firstly, technology acceptance, trust, and privacy are context-dependent [6, 26, 35]. Still, not much is yet known about privacy and trust perceptions in the special context of smart homes. Therefore, further exploratory examinations are needed to study how this context shapes perceptions of privacy and trust. Only then can technological developments be fitted to users’ wishes.
Secondly, privacy and trust are mostly examined as two independent variables. Trade-offs between barriers, such as privacy concerns, and benefits, such as energy savings and quality of life, are a core element of the genesis of acceptance [27]. First results show that there are interactions between trust and privacy [6, 35]: trust can mitigate privacy concerns [35], but privacy concerns also lower trust in automation [6]. To better understand how these two important factors interact and influence each other regarding smart home acceptance, further qualitative research is needed.
Also, an important factor for privacy and for trust in automation, respectively, is the perceived control over the technology [37] and, correspondingly, the level of automation [16]. The level of automation has been considered neither for the acceptance of smart homes nor for users’ perceptions of trust and privacy in this context. It may help to understand key aspects of smart home acceptance.
This study aims to close these research gaps using an exploratory, qualitative research approach to answer the following research questions:
- What are perceptions of and influences to privacy and trust in smart homes?
- How does the level of automation influence perceptions of privacy and trust?
- How do privacy and trust perceptions interact and depend on one another?
2 Method
The study aim was to explore privacy and trust perceptions of smart homes with regard to two automation levels (semi-automated vs. fully-automated). To this end, a qualitative research approach was designed with \(N=10\) participants using guided group interviews (two or three participants each) to enable a reciprocal sharing of opinions among the respondents and maximize the diversity of perspectives. The participants were recruited from the personal environment of the research team on a voluntary basis without compensation. As interview location, a familiar environment was chosen, usually one of the participants’ own homes.
The sample consisted of \(n=5\) women and \(n=5\) men. The participants’ age ranged from 20 to 87 years (\(M=43\); \(SD=24.05\)). The interviews were audio-recorded and transcribed verbatim. Data analysis was based on qualitative content analysis [23]. The categories were developed theory-guided and inductively supplemented based on the research material.
Prior to the interviews, the participants were given an introduction to the topic, procedure, and privacy policy. The interview guideline consisted of three parts with open questions and interactive elements. PART I provided a deeper understanding of the content by discussing the use of technology and experiences with home automation. Each participant received a ground plan in which they were asked to draw technologies they regularly used at home (see Fig. 1).
PART II explored perceptions of and influences to privacy and trust in smart homes with special regard given to the degree of automation. Two scenarios were designed, based on the state of the art (see, for example, [8, 22, 30]), contrasting “semi-automated” (Scenario A) and “fully-automated” (Scenario B) home environments, whose order of evaluation was randomized within each interview group. First, the participants were provided with clearly illustrated text descriptions of an everyday situation for detailed but easy-to-understand scenario presentations. The integration of smart home technology into daily routines was outlined by demonstrating the active role of users in the semi-automated versus the passive role of users in the fully-automated smart home. The comprehensibility of the scenarios was ensured prior to the study in pretests with participants of different age groups and technology generations.
Next, pictures of smart home technologies were handed out, selected under consideration of diverse application fields (e.g., comfort, entertainment, security, health care, and energy efficiency), including explanations on how they work for each automation scenario (listed in Table 1). A rating system was used for evaluation to ensure comparability. The assessment of perceived privacy was based on a traffic light system with the colors green (= “I have no privacy concerns”), yellow (= neutral), and red (= “I have privacy concerns”). To assess the participants’ trust, three emoticons were used with a positive (= “I trust the technology”), neutral, and negative (= “I distrust the technology”) facial expression. Subsequently, the participants were asked to discuss further issues of privacy and trust related to the presented technologies, such as: What aspects and conditions are particularly important to you in terms of privacy and trust? What are exclusion criteria for use? Do you think you have control over your data? Do you think the technologies presented are reliable?
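For analysis, the two three-point scales above can be coded numerically to compare ratings across technologies and automation scenarios. The following is a minimal sketch; the numeric mapping and names (`PRIVACY_SCALE`, `TRUST_SCALE`, `mean_rating`) are our own illustration, not specified in the study:

```python
# Hypothetical numeric coding of the two three-point rating scales used in
# PART II, allowing per-technology comparison across scenarios.
PRIVACY_SCALE = {"green": 1, "yellow": 0, "red": -1}        # traffic light system
TRUST_SCALE = {"positive": 1, "neutral": 0, "negative": -1}  # emoticon faces


def mean_rating(ratings: list) -> float:
    """Average a list of coded ratings for one technology in one scenario."""
    return sum(ratings) / len(ratings)


# e.g., three participants rating one technology's privacy in one scenario
scores = [PRIVACY_SCALE[c] for c in ["green", "yellow", "red"]]
```

Such a coding makes statements like “trust is higher in the semi-automated smart home on average” directly computable per technology and scenario.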
PART III concluded the interview, in which the participants were asked to decide and reason on which smart home scenario (semi-automated vs. fully-automated) they would like to use in practice.
3 Results
In the following, trust perceptions, influences on them, and their dependence on the automation level are described. Thereafter, we turn to privacy. The results are illustrated with participant statements presented at the end of each section. These are indicated in italics and referenced by number; each speaker is identified by gender (f/m) and age (in years).
3.1 Composition of Trust in Smart Home
Trust in a smart home is based on multiple trust perceptions (see Fig. 2). One factor is trust in the functionality, with trust in the reliability of the technology as a decisive part (see Q1). But trust in data protection by the technology is also important (see Q1, Q2).
Also in the foreground is trust in the human stakeholders. This is, on the one hand, trust in the developers and providers (Q3). On the other hand, trust in good technical service (Q3) was very important to many participants, as they did not trust the technology to be totally reliable.
As smart home technologies rely on technical infrastructure like networks and the internet, trust in connected technologies and IT is also important for trust in smart home (Q4). Related to trust in smart home is also trust in technologies in general (Q5).
- Q1: “I think trust refers to two things: how reliable the technology is and how much I trust the company to handle my data.” (f, 27)
- Q2: “[What conditions are needed for you to trust these technologies?] Data security. That it is ensured that my data cannot be accessed by unauthorized persons.” (f, 53)
- Q3: “Any technology is only as good as the people who are behind it and who make the technology work or who are responsive in case of failure.” (f, 53)
- Q4: “I would not necessarily trust that the Wi-Fi will always work. So I think it’s more about trusting other things that are not the technology itself.” (f, 28)
- Q5: “If the standard technology, i.e. a mobile phone wouldn’t break after two years, [...] then I would be happy to look into smart home.” (m, 57)
3.2 Influences on Trust in Smart Home
Influences on trust in smart home can be structured into three aspects: the technology, the user, and the context (Fig. 3).
Technology. Multiple characteristics of the technology influence trust. Again, reliability plays the decisive role. This perceived reliability is influenced by the maturity of the technology (Q6) and the level of connectivity with other technologies (Q7). However, as most participants do not trust the technology to be absolutely reliable, they perform a risk evaluation based on the application type: the less risky the task taken over by the smart home technology, the more they trust that technology (Q8). Therefore, it also matters how they perceive privacy risks: the more severe potential privacy infringements, the lower the trust in the technology. Additionally, the participants search for a safety net in case the technology fails. Here, particularly a good technical service and manual control over the technology or task at hand are important (Q9). Moreover, the ease of using the technology influences trust.
User. Individual users also vary in their trust perceptions of smart homes. An important user factor is experience with the technology. On the one hand, personal experiences with the smart home build trust (Q10). On the other hand, experiences of peers (Q10) and shared experiences in (online) reviews (Q11) are considered. Besides experiences with the technology in question, other experiences influence trust perceptions. Here, especially familiarity with similar technologies (Q12) and technologies in general (Q13) is important. Also, having experienced a threat like a break-in may impact users’ trust perceptions (Q14).
Additionally, the participants state that user diversity factors like age may have an impact on trust in smart home (Q15).
Context. Besides technology characteristics and user factors, the context also influences trust in smart homes. For one, users do not regard trust independently of the benefits of the technology (Q16). Similarly, the necessity to use a technology can override distrust (Q17).
- Q6: “I also think it’s important to say that all the systems you just introduced are in absolute, not even beta, but alpha phase, which is not and will not work in any way feasible.” (m, 20)
- Q7: “Before it didn’t work. And now think about upgrading from three to seven or eight technologies. That means there’s a better chance that something won’t work.” (m, 57)
- Q8: “I trust such systems more the less tasks they take over. So I don’t trust a system that takes over driving my car while I’m on the highway at 160 miles per hour. But if the system is only there to play music in my car, I trust it 100%, because if the music doesn’t work, it’s not that bad.” (m, 20)
- Q9: “But as I said, always with the option of manual operation in case I don’t like it or want to have it otherwise. The individual adjustment must still be possible.” (f, 53)
- Q10: “What would create trust would be if a friend bought it, recommended it to me, and then I would try it at his home. So with people who are close to me, who I trust 100%, if they tell me it’s safe.” (m, 20)
- Q11: “[Is there something that would create trust?] Customer reviews” (m, 21)
- Q12: “Yeah I think that’s more familiar to what we have right now. That makes it easier for me to move to smart technologies.” (f, 28)
- Q13: “That’s the worst, because when I think about what technologies are not working for me right now ...” (f, 53)
- Q14: “I would also say that everyone’s trust limits are different. If you have had an incident, for example a burglary, you will have more confidence in such a system.” (m, 20)
- Q15: “Because for us younger people that would not be too much change. But older people may be more concerned regarding technologies or changes.” (f, 28)
- Q16: “I trust the big picture. This smart meter is super useful.” (f, 27)
- Q17: “[What would create trust for you?] Once it becomes vital.” (m, 20)
3.3 The Impact of the Automation Level on Trust
All participants agreed that, averaged across all ten technologies, trust is higher in the semi-automated smart home than in the fully-automated one. Risks are perceived as more severe (Q19), and giving up control increases distrust in the fully-automated smart home (Q18). Still, some applications are perceived positively in the fully-automated smart home, and some are perceived negatively in the semi-automated one (see Fig. 4).
- Q18: “I think I would trust things here [in the semi-automated smart home] more. I think it’s the fact that you have more control here.” (f, 28)
- Q19: “[Which of the two smart homes would you prefer?] That is quite clear. The second [semi-automated] one. It has the least technical risk.” (m, 87)
3.4 Composition of Privacy in Smart Home
In smart homes, all four dimensions of privacy [7] are important. Informational privacy is the most often mentioned aspect describing the control of access to personal data. Informational privacy in smart home means no unwanted access (without consent) to the data collected by smart home technologies (Q20), no misuse and no user profiling of that data (see Fig. 5). Physical privacy is centrally about being alone. In smart home, this particularly means no visual surveillance and no unwanted intrusion to the home (Q21), e.g., no break-ins.
Also, social privacy, i.e., control over conversation partners and listeners, can be constrained in a smart home, e.g., by recordings of conversations (Q22). Psychological privacy was not mentioned directly by the participants. However, their statements show that they do not want technologies that are insufficiently adjusted to personal needs, as this does not correspond to individual wishes and preferences (Q23).
- Q20: “Because this smart home knows almost everything about me and what I’m doing, my privacy is no longer protected when someone can access this data.” (f, 27)
- Q21: “[Is there something that is very important for your privacy?] That there are not suddenly strange people in my home.” (m, 87)
- Q22: “[What does privacy mean for you in a smart home?] That conversations are not overheard.” (m, 21)
- Q23: “This is like a prison for me, because the technology always does what others think is right for you.” (m, 20)
3.5 Influences on Privacy in Smart Home
Influences on the perception of privacy in smart homes again stem from three sources: the technology, the user, and the context (Fig. 6).
Technology. Technology-wise, data handling and data protection are important. Users again perform a risk evaluation and want to prevent misuse and selling of data (Q24). Important criteria for this risk evaluation are who has access to the data (Q24, Q25) and trust in data protection in general. In more detail, it is also important where the data is stored and whether the devices are connected to the internet or other technologies. Data provision with consent is not perceived as a privacy infringement.
The type of technology and application is decisive for the privacy perception. In the participants’ statements, we again observe the risk evaluation they perform (Q26, Q27). The riskier a technology is for personal privacy, the less it is accepted. Decisive criteria besides the application are the location of the technology in the home and the sensitivity of the collected data.
A feeling of control over the technology can mitigate privacy concerns. Also, reliability is mentioned as an influence on privacy perceptions. Moreover, one participant explains that there is a difference between a human knowing data and a machine doing so (Q28).
User. Also for privacy perceptions, user diversity is influential. On the one hand, users differ in their personality and thereby disposition to value privacy (Q29). On the other hand, experiences with privacy infringements and technologies play again an important role.
Context. Context-related influences on privacy perceptions stem from the benefit of data collection (Q30) and how privacy-infringing the alternatives are (Q31), e.g., moving into a nursing home.
- Q24: “Fear of abuse of this data would be a big problem. If the information is for me it is okay. If it is used in the medical field, in the physiotherapy field also good. But if it is used for advertising purposes, marketing strategies development, then I am not willing to provide this data.” (f, 53)
- Q25: “I think it depends on whom you refer to for privacy. To my neighbors and friends I am still as private or not private as before. But the question is where the data is stored and who can access this data.” (f, 27)
- Q26: “So with the fridge I think to myself, who cares what I eat. Blinds too, I don’t know if they interfere with my privacy. I think the front door is super critical. If my front door just lets people in, it’s not cool.” (f, 27)
- Q27: “Well, I think it is important to know what the data is used for, because it brings with it dangers.” (m, 30)
- Q28: “When a milkman analyses my daily routine and then stands outside my door at the exact time of day to bring me milk when I’ve just been awake for ten minutes, I’m happy to see that milkman. But if you wake up and suddenly he is standing in front of your door because the computer has told him to, I find that strange. I would not feel comfortable with that.” (m, 20)
- Q29: “I think it depends on your personality regarding privacy.” (f, 27)
- Q30: “If the information is for me it is okay. If it is used in the medical field also good. But if it is used for advertising purposes, I don’t want that.” (m, 57)
- Q31: “With the blinds, every neighbor can also see when I do this.” (m, 20)
3.6 The Impact of the Automation Level on Privacy
All participants agreed that perceived privacy is higher in the semi-automated smart home when considering all technologies together. The main argument is that the fully-automated smart home involves even more surveillance, which infringes on privacy, as well as less control (Q32). Still, the ratings vary between applications: every rating (positive, neutral, negative) occurs for each automation level (see Fig. 7).
- Q32: “[Do you feel like you’d get your privacy in this [fully-automated] apartment?] No, even less. There is even more data being collected, and even more is controlled by the technology.” (m, 57)
4 Discussion
In an exploratory group interview approach, users’ notions and perceptions of privacy and trust in smart homes were investigated with special regard to increasing automation – which has been identified as an important factor in other application fields [16] but not yet investigated in this context. Using two juxtaposed scenarios of a semi-automated vs. fully-automated smart home, the impact of the automation level on trust and privacy was studied. To this end, interactive empirical research methods were used to sharpen the participants’ understanding of the research subject (through ground plan drawings) and to discuss results at a detailed level (by means of comparative rating systems). As privacy and trust perceptions are highly context-dependent [6, 26, 35], in-depth knowledge of both concepts in the sensitive context of home automation is needed. Only then can a user-tailored development of smart home technology be provided.
4.1 Understanding Trust and Privacy in Smart Home
Trust: The results confirm reliability as a core element of trust from a technological perspective, similar to the acceptance of smart home technology [22]. Trust in data protection is also an important element. However, because users do not fully trust the reliability and data protection, two aspects are important: On the one hand, the risks of failure should not be too high; here, the participants perform a risk evaluation based on the application of the technology. On the other hand, users search for a safety net in case of failure, wishing for good technical service and manual control over the technology or task at hand. For developers and providers, reliability as well as a good safety net should be in the foreground. This is further stressed by the results regarding user diversity, where personal experience with smart home and other technologies is most influential. Correspondingly, time is needed to convince users of the reliability via good experiences.
Our results highlight the high importance of trust in human stakeholders such as developers, providers, and particularly the technical support, thereby confirming previous results [45]. This human-centered perspective needs to be integrated into trust models. For providers, trust-building measures regarding their performance and technical support are key factors for users’ trust and thereby their acceptance.
As a third aspect, trust in the connected technological infrastructure can be confirmed as an important factor [44]. Here again, experiences with internet failures, complex installation processes, and the like diminish trust in a technology as dependent and interconnected as the smart home. Consequently, the extension of the infrastructure needs to be driven forward for a successful roll-out of advanced technologies like smart homes. Likewise, compatibility between technologies and simple installation processes are important requirements for smart home technology development.
Privacy: In smart homes, informational privacy and data protection are core elements of privacy perceptions, closely linked to trust in data protection. However, physical, social, and psychological dimensions of privacy are also important in the smart home. Feeling alone and free from unwanted intruders, having unobserved conversations, and controlling the environment to adjust it to personal preferences are key aspects. Smart home technologies should correspondingly be as unobtrusive as possible in the personal space of the home. As for trust, a risk evaluation is a core element of perceived privacy. This highlights the importance of the principle of data minimization: collect and store data only when it is needed.
Both privacy and trust were not regarded independently of the benefits. Here, trade-offs between barriers such as privacy and trust concerns on the one hand and benefits on the other become prevalent; such trade-offs are key to the genesis of technology acceptance [27]. These trade-offs are not yet fully understood and should be the focus of future research.
4.2 The Relation of Privacy and Trust
Interdependence of Privacy and Trust: Our study shows first indications that privacy and trust are more interconnected than is often assumed. Comparing the composition of and influences on privacy and trust in the smart home, several similarities and interdependencies emerge.
a) Privacy and trust perceptions are both based on a risk evaluation. The more severe the potential consequences, the less trust and the less privacy users perceive.
b) The perception of privacy and trust in the technology influence each other. Trust in data protection is important for the perceived privacy in the smart home, and the perceived privacy influences the perceived technological risks and thereby trust.
c) Multiple factors influence both privacy and trust, for example, reliability, connection to the internet, application type, control over the technology, and perceived benefits. Regarding user diversity, experiences play an important role in both cases.
We observed in the interviews that the participants had a hard time separating privacy and trust. Participants who had first answered questions on privacy often referred to trust in data protection when subsequently answering questions about trust. Similarly, when trust had been discussed beforehand, participants often directly mentioned reliability when asked about privacy.
Follow-up studies should investigate this relationship to advance the comprehension of privacy and trust for technology acceptance.
4.3 The Impact of the Automation Level
Faced with the decision to choose one of the two presented smart home scenarios for living, the participants commonly agreed on the semi-automated environment, often motivated by trust- and privacy-related issues.
Especially for trust perceptions, the level of automation has previously been identified as an important influencing factor [16]. Our results confirm that privacy and trust perceptions in the smart home are fundamentally based on feelings of control and are strongly influenced by the degree of automation. All participants preferred less automated smart home technologies over fully automated ones regarding privacy and trust. These results can be explained by the influencing factors on privacy and trust: higher automation is accompanied by less (felt) control over the technology, resulting in higher risk perceptions, more data collection and interconnection between several data sources, and more dependence on the reliability of the smart home and the connected technologies and infrastructure. As one participant plainly put it: “because one trusts oneself more than one trusts the technology”.
This strong, negative impact of the level of automation has to be confirmed and quantified in future studies. It gives a clear direction for the development of smart home technologies: there are levers to improve privacy and trust in fully-automated smart homes, e.g., data minimization and data protection, reliability of the technology and the connected infrastructure, and always providing users with fail-safe technical support and manual control options. However, the use of fully-automated technologies remains critical for users and should be accompanied by a high benefit to convince potential users.
4.4 Limitations
Our results contribute to a deeper understanding of privacy and trust in smart homes, their dependence on each other, and the influence of the level of automation. Still, methodological limitations have to be considered when interpreting the results. The qualitative approach was suited to identify perceptions and influencing factors for trust and privacy in the smart home and to indicate important research directions. These have to be further confirmed and quantified in quantitative empirical approaches. The sample size of 10 interviewees was sufficient for this exploratory study. Still, the impact of user diversity on privacy and trust perceptions needs to be further studied; hence, more diverse samples are needed regarding age, technical affinity and expertise, and experiences with smart homes. Moreover, the study was conducted in Germany with a German sample. Perceptions of privacy, trust, and technology acceptance vary between cultures and countries. Therefore, the perspective should be widened to analyze cultural differences as well as to better understand further influencing factors.
To engage participants in the discussion and promote in-depth reasoning about privacy and trust in the smart home, an interactive approach was chosen, with drawings of ground plans as an introduction to the topics as well as ratings of privacy and trust for each technology. For the results, this ensured that each participant considered and evaluated every technology. At the same time, it made the interview more engaging and enjoyable for the participants, resulting in good cooperation.
Another important aspect is the approach of interviewing potential smart home users about hypothetical technologies. The participants were neither experienced with smart homes nor could they try out the technologies. Therefore, the results are based on perceptions and images of smart homes that stem mostly from hearsay and from experiences with other, partly similar technologies. Still, this does not diminish the meaningfulness of our results, as in the current early phase of the roll-out of smart home technologies this is exactly what guides most potential users and buyers.
References
Ajzen, I., Fishbein, M.: Understanding attitudes and predicting social behaviour (1980)
Al-Momani, A.M., Mahmoud, M.A., Ahmad, M.S.: A review of factors influencing customer acceptance of Internet of Things services. Int. J. Inf. Syst. Serv. Sector 11(1), 54–67 (2019). https://doi.org/10.4018/IJISSS.2019010104
Balta-Ozkan, N., Davidson, R., Bicket, M., Whitmarsh, L.: Social barriers to the adoption of smart homes. Energy Policy 63(2013), 363–374 (2013). https://doi.org/10.1016/j.enpol.2013.08.043
Bélanger, F., Crossler, R.: Privacy in the digital age: a review of information privacy research in information systems. MIS Q. 35(4), 1–36 (2011). https://doi.org/10.1159/000360196
Boise, L., Wild, K., Mattek, N., Ruhl, M., Dodge, H.H., Kaye, J.: Willingness of older adults to share data and privacy concerns after exposure to unobtrusive home monitoring. Gerontechnology 11(3), 428–435 (2013). https://doi.org/10.4017/gt.2013.11.3.001.00
Brell, T., Biermann, H., Philipsen, R., Ziefle, M.: Trust in autonomous technologies. In: Moallem, A. (ed.) HCII 2019. LNCS, vol. 11594, pp. 371–384. Springer, Cham (2019). https://doi.org/10.1007/978-3-030-22351-9_25
Burgoon, J.K.: Privacy and communication. Ann. Int. Commun. Assoc. 6(1), 206–249 (1982)
Cook, D.J.: How smart is your home? Comput. Sci. 335, 1579–1582 (2012)
Cook, D.J., Das, S.K.: How smart are our environments? An updated look at the state of the art. Pervasive Mob. Comput. 3, 53–73 (2007)
Davis, F.D.: Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Q. 13(3), 319–340 (1989)
Dengler, S., Awad, A., Dressler, F.: Sensor/actuator networks in smart homes for supporting elderly and handicapped people. In: 21st Conference on Advanced Information Networking and Applications Workshops (AINAW 2007), vol. 2, pp. 863–868. IEEE Computer Society (2007)
European Commission: Digitalisation research and innovation - transforming European industry and services. Technical report (2017). https://doi.org/10.2777/858080
van Heek, J., Himmel, S., Ziefle, M.: Helpful but spooky? Acceptance of AAL-systems contrasting user groups with focus on disabilities and care needs. In: Proceedings of the 3rd International Conference on Information and Communication Technologies for Ageing Well and e-Health (ICT4AWE 2017), pp. 78–90 (2017). https://doi.org/10.5220/0006325400780090
van Heek, J., Himmel, S., Ziefle, M.: Privacy, data security, and the acceptance of AAL-systems – a user-specific perspective. In: Zhou, J., Salvendy, G. (eds.) ITAP 2017. LNCS, vol. 10297, pp. 38–56. Springer, Cham (2017). https://doi.org/10.1007/978-3-319-58530-7_4
Himmel, S., Ziefle, M.: Smart home medical technologies: users’ requirements for conditional acceptance. I-Com 15(1), 39–50 (2016). https://doi.org/10.1515/icom-2016-0007. http://www.degruyter.com/view/j/icom.2016.15.issue-1/icom-2016-0007/icom-2016-0007.xml
Hoff, K.A., Bashir, M.: Trust in automation: integrating empirical evidence on factors that influence trust. Hum. Factors 57(3), 407–434 (2015). https://doi.org/10.1177/0018720814547570
Kirchbuchner, F., Grosse-Puppendahl, T., Hastall, M.R., Distler, M., Kuijper, A.: Ambient intelligence from senior citizens’ perspectives: understanding privacy concerns, technology acceptance, and expectations. In: De Ruyter, B., Kameas, A., Chatzimisios, P., Mavrommati, I. (eds.) AmI 2015. LNCS, vol. 9425, pp. 48–59. Springer, Cham (2015). https://doi.org/10.1007/978-3-319-26005-1_4
Kleinberger, T., Becker, M., Ras, E., Holzinger, A., Müller, P.: Ambient intelligence in assisted living: enable elderly people to handle future interfaces. In: Stephanidis, C. (ed.) UAHCI 2007. LNCS, vol. 4555, pp. 103–112. Springer, Heidelberg (2007). https://doi.org/10.1007/978-3-540-73281-5_11
Lee, J., See, K.: Trust in automation. Designing for appropriate reliance. Hum. Factors 46(1), 50–80 (2004). https://doi.org/10.1518/hfes.46.1.50_30392
Madhavan, P., Wiegmann, D.: Similarities and differences between human-human and human-automation trust: an integrative review. Theor. Issues Ergon. Sci. 8(4), 277–301 (2007)
Marangunić, N., Granić, A.: Technology acceptance model: a literature review from 1986 to 2013. Univ. Access Inf. Soc. 14(1), 81–95 (2015). https://doi.org/10.1007/s10209-014-0348-1
Marikyan, D., Papagiannidis, S., Alamanos, E.: A systematic review of the smart home literature: a user perspective. Technol. Forecast. Soc. Change 138, 139–154 (2019). https://doi.org/10.1016/j.techfore.2018.08.015
Mayring, P.: Qualitative Inhaltsanalyse. Grundlagen und Techniken (Qualitative Content Analysis. Basics and Techniques), 12th edn. Beltz, Weinheim (2015)
Merritt, S.M.: Affective processes in human-automation interactions. Hum. Factors 53(4), 356–370 (2011). https://doi.org/10.1177/0018720811411912
Moran, S., Oonk, H., Alfred, P., Gwynne, J., Eilders, M.: Developing a context framework to support appropriate trust and implementation of automation. In: Karwowski, W., Ahram, T. (eds.) IHSI 2019. AISC, vol. 903, pp. 738–743. Springer, Cham (2019). https://doi.org/10.1007/978-3-030-11051-2_112
Nissenbaum, H.: Privacy in Context: Technology Policy and the Integrity of Social Life, p. 304 (2010)
Offermann-van Heek, J., Ziefle, M.: Nothing else matters! Trade-offs between perceived benefits and barriers of AAL technology usage. Front. Public Health 7(June), 134 (2019). https://doi.org/10.3389/fpubh.2019.00134
Paetz, A.G., Dütschke, E., Fichtner, W.: Smart homes as a means to sustainable energy consumption: a study of consumer perceptions. J. Consum. Policy 35(1), 23–41 (2012). https://doi.org/10.1007/s10603-011-9177-2
Peek, S.T.M., Wouters, E.J.M., van Hoof, J., Luijkx, K.G., Boeije, H.R., Vrijhoef, H.J.M.: Factors influencing acceptance of technology for aging in place: a systematic review. Int. J. Med. Inform. 83(4), 235–248 (2014). https://doi.org/10.1016/j.ijmedinf.2014.01.004
Rashidi, P., Mihailidis, A.: A survey on ambient-assisted living tools for older adults. IEEE J. Biomed. Health Inform. 17(3), 579–590 (2013). https://doi.org/10.1109/JBHI.2012.2234129
Robles, R.J., Kim, T.: A review on security in smart home development. Security 15, 13–22 (2010). https://doi.org/10.1145/1764810.1764818
Robles, R.J., Kim, T.: Applications, systems and methods in smart home technology: a review. Int. J. Adv. Sci. Technol. 15, 37–48 (2010)
Rössler, B.: Der Wert des Privaten. Suhrkamp Verlag, Frankfurt, erste aufl edn (2001)
Schaefer, K.E., Chen, J.Y., Szalma, J.L., Hancock, P.A.: A meta-analysis of factors influencing the development of trust in automation: implications for understanding autonomy in future systems. Hum. Factors 58(3), 377–400 (2016). https://doi.org/10.1177/0018720816634228
Schomakers, E.M., Lidynia, C., Ziefle, M.: Listen to my heart? How privacy concerns shape users’ acceptance of e-health technologies. In: 2019 International Conference on Wireless and Mobile Computing, Networking and Communications (WiMob), pp. 306–311. IEEE (2019)
Schomakers, E.-M., Offermann-van Heek, J., Ziefle, M.: Attitudes towards aging and the acceptance of ICT for aging in place. In: Zhou, J., Salvendy, G. (eds.) ITAP 2018. LNCS, vol. 10926, pp. 149–169. Springer, Cham (2018). https://doi.org/10.1007/978-3-319-92034-4_12
Schomakers, E.M., Ziefle, M.: Privacy perceptions in ambient assisted living. In: Proceedings of the 5th International Conference on Information and Communication Technologies for Ageing Well and e-Health (ICT4AWE 2019) (2019)
Shuhaiber, A., Mashal, I.: Understanding users’ acceptance of smart homes. Technol. Soc. 58(January), 101110 (2019). https://doi.org/10.1016/j.techsoc.2019.01.003
Singh, H., Pallagani, V., Khandelwal, V., Venkanna, U.: IoT based smart home automation system using sensor node. In: 4th International Conference on Recent Advances in Information Technology (RAIT), pp. 1–15. IEEE (2018)
Steinke, F., Fritsch, T., Brem, D., Simonsen, S.: Requirement of AAL systems: older persons’ trust in sensors and characteristics of AAL technologies. In: Proceedings of the 5th International Conference on PErvasive Technologies Related to Assistive Environments (PETRA 2012), Article No. 15, pp. 1–6. ACM (2012). https://doi.org/10.1145/2413097.2413116
Valdez, A.C., Ziefle, M.: The users’ perspective on the privacy-utility trade-offs in health recommender systems. Int. J. Hum.-Comput. Stud. (2018). https://doi.org/10.1016/j.ijhcs.2018.04.003
Venkatesh, V., Walton, S.M., Thong, J.Y.L., Xu, X.: Consumer acceptance and use of information technology: extending the unified theory of acceptance and use of technology. MIS Q. 36(1), 157–178 (2012). https://doi.org/10.1111/j.1540-4560.1981.tb02627.x. (Forthcoming)
Wilson, C., Hargreaves, T., Hauxwell-Baldwin, R.: Smart homes and their users: a systematic analysis and key challenges. Pers. Ubiquitous Comput. 19(2), 463–476 (2014). https://doi.org/10.1007/s00779-014-0813-0
Wilson, C., Hargreaves, T., Hauxwell-Baldwin, R.: Benefits and risks of smart home technologies. Energy Policy 103, 72–83 (2017). https://doi.org/10.1016/j.enpol.2016.12.047
Yang, H., Lee, H., Zo, H.: User acceptance of smart home services: an extension of the theory of planned behavior. Ind. Manag. Data Syst. 117(1), 68–89 (2017)
Yusif, S., Soar, J., Hafeez-Baig, A.: Older people, assistive technologies, and the barriers to adoption: a systematic review. Int. J. Med. Inform. 94, 112–116 (2016). https://doi.org/10.1016/j.ijmedinf.2016.07.004
© 2020 Springer Nature Switzerland AG
Schomakers, E.-M., Biermann, H., Ziefle, M.: Understanding privacy and trust in smart home environments. In: Moallem, A. (ed.) HCI for Cybersecurity, Privacy and Trust. HCII 2020. LNCS, vol. 12210. Springer, Cham (2020). https://doi.org/10.1007/978-3-030-50309-3_34
Print ISBN: 978-3-030-50308-6
Online ISBN: 978-3-030-50309-3