Article

An Inquiry into the Evolutionary Game among Tripartite Entities and Strategy Selection within the Framework of Personal Information Authorization

China Academy of Information and Communications Technology, Beijing 100191, China
* Author to whom correspondence should be addressed.
Big Data Cogn. Comput. 2024, 8(8), 90; https://doi.org/10.3390/bdcc8080090
Submission received: 8 June 2024 / Revised: 2 August 2024 / Accepted: 6 August 2024 / Published: 8 August 2024
(This article belongs to the Special Issue Research on Privacy and Data Security)
Figure 1. The relationship of users, App providers and the government.
Figure 2. Users' dynamical replication phase diagrams.
Figure 3. App providers' dynamical replication phase diagrams.
Figure 4. Government's dynamical replication phase diagrams.
Figure 5. Example diagram of the impact of x changes on tripartite evolution strategies.
Figure 6. Example diagram of the impact of y changes on tripartite evolution strategies.
Figure 7. Example diagram of the impact of z changes on tripartite evolution strategies.
Figure 8. Example diagram of the impact of R1 changes on users' evolution strategies.
Figure 9. Example diagram of the impact of C1 changes on users' evolution strategies.
Figure 10. Example diagram of the impact of C2 changes on App providers' evolution strategies.
Figure 11. Example diagram of the impact of W2 changes on App providers' evolution strategies.
Figure 12. Example diagram of the impact of C3 changes on the government's evolution strategies.

Abstract

Mobile applications (Apps) serve as vital conduits for information exchange in the mobile internet era, yet they also engender significant cybersecurity risks because they handle vast quantities of data in real time. To illuminate a pragmatic pathway toward orderly information circulation in the App marketplace and sustainable industry development, this manuscript constructs a tripartite evolutionary game model of users, App providers, and the government. It then scrutinizes the evolutionary process and the emergence conditions of the stable equilibrium strategies and verifies them through simulation analysis in MATLAB. The findings reveal that (1) the strategic selections of the three parties are tightly coupled: any alteration in one actor's decision-making trajectory affects the evolutionary course of the other two. (2) The initial strategies significantly influence both the pace of evolutionary progression and its outcome. Broadly speaking, the higher the initial probabilities of users opting for information authorization, App providers adopting compliant data solicitation practices, and the government enforcing stringent oversight, the more readily the system attains the evolutionarily optimal solution. (3) The strategic preferences of the three stakeholders are shaped jointly by their respective costs, benefits, and losses. Users' perceived benefits serve as the impetus for their strategic decisions, while privacy concerns act as a deterrent; App providers' decisions hinge on factors including corporate reputation and fines levied by the government; and regulatory costs are the main barrier to the government's adoption of strict supervision. Drawing upon these analytical outcomes, we posit several feasible strategies.

1. Introduction

In the era of the Internet and big data, the rapid advancement of information technology is continuously giving rise to novel business forms and innovative models. Against this background, data has solidified its position as a fundamental constituent in societal evolution and industrial transformation, concurrently demonstrating substantial potential in the realm of commercial competition. Nonetheless, while the revaluation of information is reshaping the socio-industrial ecosystem and altering business operation paradigms, it also poses profound challenges to the privacy and security of information suppliers. Mobile applications (Apps), pivotal mediums for information exchange in the contemporary mobile network era, facilitate the transmission and utilization of vast quantities of critical and sensitive data [1,2], thereby rendering Apps a focal point, or the "epicenter", for privacy incidents, with their compliant supervision evolving into a societal quandary [3]. This dilemma is highlighted by a recent finding from IBM Security's Cost of a Data Breach Report: while a staggering 95% of businesses have had multiple data breach incidents, a startling 57% of these businesses have chosen to pass the financial cost of these breaches onto their customers [4]. Since the beginning of 2023, there has been a global upsurge in technical innovation driven by the development of generative artificial intelligence, of which ChatGPT is a prime example [5,6]. This innovative upsurge is supported by the large datasets amassed during model training, content generation, and algorithmic enhancement. However, the wide scope of data collection combined with undisclosed processing techniques has presented significant obstacles to the inviolability of personal data privacy [7]. In June 2023, OpenAI was confronted with its inaugural class-action lawsuit, in which the plaintiffs alleged that the enterprise, amidst the developmental and operational phases of its AI products, surreptitiously aggregated, exploited, and disseminated the personal information of hundreds of millions of internet users without obtaining their explicit consent, thereby imperiling humanity with the specter of potentially ruinous outcomes [8]. Users, being the primary providers of this crucial data component, find themselves constantly occupying a disadvantaged position within the digital ecosystem, and the pervasive extraction and exploitation of their personal information give rise to severe privacy and security issues [9,10,11], including unauthorized access [12], leakage of personal information [13,14], misappropriation of sensitive data [15], and financial deception [16], among others. Over time, the assurance of user well-being and security becomes increasingly tenuous, and fears surrounding the disorder of privacy norms may lead individuals to continuously limit information disclosure or fabricate personal details [17], in turn affecting the fluidity and accuracy of data at the societal level [18]. This issue not only significantly impairs businesses' precision marketing and service innovation capabilities but also hinders the healthy progression of the App industry ecosystem [19,20]. It severely disrupts the derivative benefits and value transformation of data elements, thereby posing challenges and threats to the sustainable development of the digital economy.
Therefore, understanding the mechanisms that influence users' willingness to authorize information and fostering collaboration among the multiple stakeholders are crucial for both enterprises and industry regulators.
Given that user authorization of personal information is the primary means of facilitating information disclosure in mobile applications, this paper defines personal information authorization as the act wherein mobile App users voluntarily provide their personal information to service providers to meet functional requirements, thereby permitting the collection, storage, and utilization of said information in accordance with predetermined methods. Scholars have achieved a great deal in the field of privacy disclosure intentions or behaviors [21,22,23,24,25,26]. As for the subjects of study, an overwhelming amount of research on privacy decision-making concentrates on either individual users or the dyad of users and corporations. In this context, users typically appear either as the recipients of superior service features offered by App providers [22,23,25] or, conversely, as perennial suppliers of raw data subject to excessive collection and misuse by these same entities [21,23,24,25]. Scarce studies portray users as targets for regulatory protection, and even fewer involve the government as a pivotal player in the supervisory role within the research domain. Given the inherently subjective and dynamic nature of user decision-making behaviors, empirical studies based on questionnaire surveys are the predominant methodological approach in current user privacy research. However, this approach tends to treat user decisions as static endpoints, neglecting the dynamics and adjustments that occur throughout the decision-making process. To address these limitations, our study maintains the traditional roles of users and operating entities while incorporating the government as a regulatory safeguarding agent into our analytical framework. This paper situates users within the web of interests and relationships linking them to App providers and the government and employs evolutionary game theory to analyze the dynamic evolution of decision-making adjustments. The purpose of this research is to supply evidence-based answers to the following queries:
(1) What dynamic interplay exists between the behavioral decisions of users as information providers and the strategic selections of App providers and government regulators?
(2) How do these three actors mutually influence one another to achieve a stable, system-optimal solution?
Considering the complex interaction among information flow, value creation and privacy protection in the App information market, this paper designs a triadic framework involving users, App providers, and the government as primary actors. Specifically, it develops an evolutionary game model within the context of personal information authorization, extracting key elements that influence the decision-making processes of these stakeholders. The study explores the reciprocal interactions and inherent correlations among these actors, analyzing their respective evolutionary trajectories and stable equilibrium strategies. Additionally, MATLAB R2018a simulations are used to dynamically demonstrate the impact of initial strategy changes, as well as variations in each stakeholder’s benefits, costs, and losses, on the evolutionary process.
The significance of this research is two-fold. First, by employing a tripartite evolutionary game theory framework, it conducts a comprehensive analysis of the major parameters affecting the strategic choices of users, App providers, and the government, along with the dynamic progression thereof. Through simulation and emulation, it visually illustrates the evolutionary process and stable strategies of each stakeholder, thereby exploring optimal solutions geared towards maximizing societal benefits. This study enriches the research paradigms of App privacy authorization issues and expands the methods for data handling and analysis. Second, the research furnishes actionable recommendations for all stakeholders to make proactive and constructive strategic decisions. By doing so, it contributes materially to fostering a cooperative triadic relationship, facilitating industry progress, and ultimately steering the App market towards an ideal state of orderly information circulation. Thus, this study holds significant practical value in advancing collaboration and shaping a beneficial ecosystem within the digital economy.
The remainder of this paper is structured as follows: Section 2 provides an extensive review of the prior research pertinent to this topic. Section 3 presents the game model and research assumptions. Section 4 explores the stable strategies of the three-party evolutionary game. Section 5 presents the numerical simulation analysis. Section 6 discusses the findings, and Section 7 concludes.

2. Theoretical Background

In this section, we elucidate the evolutionary and developmental trajectory of evolutionary game theory, integrating its contemporary applications within the privacy domain. We further encapsulate the multifaceted variables that govern corporate compliance policies, government regulations, and user privacy choices. This comprehensive overview furnishes a robust theoretical and bibliographic foundation for the subsequent construction of our models.

2.1. Research Overview of Evolutionary Game Theory

Evolutionary Game Theory (EGT), with roots in biological evolution, was initially proposed by Maynard Smith and Price in 1973 [27], concurrently giving rise to the concept of the Evolutionarily Stable Strategy (ESS). An ESS is a static concept denoting the predominant strategy adopted by the majority of individuals within a population in a specific ecological context; because it yields higher payoffs than its alternatives, it cannot be displaced by other strategies. The ESS effectively embodies the principle that populations failing to adopt an evolutionarily stable strategy are likely to be outcompeted and ultimately eliminated from the evolutionary process [28]. Additionally, the ESS is a strict refinement of the Nash equilibrium [29], indicating that, given that the other participants adhere to their established strategies, no individual participant has an incentive to unilaterally alter their current strategy [30]. Building on this base, Taylor and Jonker introduced the idea of Replicator Dynamics (RD) by incorporating the mechanism of selection [31]. At its core, this theory postulates that as evolutionary game players continually adjust their strategies, the share of individuals adopting the most advantageous strategy gradually increases within the population. Furthermore, they employed differential equations, or systems thereof, to depict the dynamic adjustment process of players' strategy choices. Together, the two fundamental constructs of EGT, the ESS and RD, depict the dynamic evolutionary route by which players arrive at a stable strategy. EGT posits that individual socio-economic behavior unfolds over an extended period, with agents endowed with bounded rationality and initiative. Single-instance games seldom lead all players to an optimal state; therefore, agents must continuously improve their performance based on real-world situations and other players' activities. By comparing the total benefits of varying strategies, a "survival of the fittest" process takes place, leading to convergence upon stable solutions that are individually advantageous.
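For reference, the single-population replicator dynamic that underlies Equations (4), (8), and (12) below takes the standard form
$$\dot{x}_i = x_i\left[f_i(x) - \bar{f}(x)\right], \qquad \bar{f}(x) = \sum_{j} x_j f_j(x),$$
where $x_i$ is the population share playing strategy $i$ and $f_i(x)$ its expected payoff: strategies earning more than the population average grow in share, while those earning less decline.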
Game Theory is esteemed as one of the most effective approaches for studying participants’ motivations, responses, and actions [32,33]. EGT, by factoring in the learning and adaptive capacities of players, goes beyond traditional, static game theory, offering deeper explanatory and predictive insights into intricate social phenomena [34,35]. Presently, the theory has proven to be flexible and applicable in various domains of collective decision-making analysis, including environmental conservation [36,37,38], food safety [39,40], and cultural tourism [41,42], among others.

2.2. Research Overview of EGT in the Context of Privacy

In the realm of privacy decision-making research, numerous scholars have examined participant behavior under the premise of complete rationality. For instance, Wang et al. [43] investigated the incorporation of a compensation mechanism to motivate users to engage with targeted advertisements of interest, treating users, advertising agents, and advertisers as fully rational and sophisticated entities driven by self-interest. Kumari and Chakravarthy [44], on the other hand, employed cooperative game theory with an emphasis on collective rationality to ascertain the optimal strategy for attaining “coalition” privacy security. Nevertheless, recent years have witnessed a growing consensus among researchers that interactions between users and digital service providers typify evolutionary game scenarios, where users aspire to gain services while minimizing privacy concerns, and digital service providers seek additional advantages from information exchanges with users. Consequently, evolutionary game theory has gained traction in privacy research, prominently featuring in contexts such as platform commerce [9,45], online healthcare [46,47], and social networking [48,49,50], where privacy decision-making dynamics are pivotal.
From a bilateral game perspective, Chorppath and Alpcan [51] investigated the dynamics between mobile commerce applications and users concerning the acquisition of location information, where users attain more convenient mobile commerce services by disclosing their geolocation data, and mobile commerce enterprises leverage this information to achieve more targeted marketing returns. Xu et al. [52] constructed a bilateral game model between users and online health communities, with the strategic choices of both sides significantly influenced by the level of medical service support received by users, community incentives, and the privacy protection costs and reputational damages incurred by online health communities. Moreover, third-party regulation emerges as a salient factor in the bilateral game studies of “users-enterprises”, exemplified by Gu et al. [53], who regarded users and social networking sites (SNS) service providers as interacting entities and deemed the government’s regulatory intensity over privacy leakage a pivotal variable in bolstering user security and trust. Wang et al. [33] contended that a game-like relationship exists between e-commerce platforms and consumers with bounded rationality concerning consumer privacy protection. Model derivations and simulation analyses confirmed that artificial intelligence technologies facilitate both parties adopting proactive strategies, and enhancing the supervision level can augment consumer trust, thereby improving consumer privacy protection.
With the deepening exploration of evolutionary game theory in the domain of user behavior, treating third parties as principal actors and examining the interplay among users, businesses, and these third parties has progressively evolved into the prevailing research paradigm in privacy decision-making. For instance, Wang et al. [43] established a triadic game model involving users, context-aware Apps, and malicious adversaries, with single-round and iterated game analyses confirming that context-aware Apps can foster user trust and, by embracing societal normative oversight, effectively hinder privacy breaches. Li et al. [32], in the context of privacy transactions, devised a three-party game model encompassing users, service providers, and competitors, utilizing rigorous mathematical analysis to derive data trading schemes that conform to the Nash equilibrium. Given the prominence of mobile App privacy and security concerns as a widely debated societal issue, research focusing on mobile Apps has proliferated. Guo et al. [54], in addressing information security challenges pertaining to mobile Apps, constructed a collaborative regulatory model with users, App distribution platforms, and local governments as key stakeholders, revealing that regulatory costs play a pivotal role in shaping strategic choices. However, intensifying penalties for both platforms and local governments alongside fostering mutual benefits can spur both entities towards proactive strategies. Xie et al. [55] applied evolutionary game theory to the study of user strategy selection in mobile learning Apps, devising a tripartite privacy protection model involving users, M-learning platforms, and competitors, thereby aiding users in identifying optimal and efficient privacy preservation strategies post-platform interactions.

2.3. Research Overview of Factors Influencing Tripartite Decision-Making

While privacy concerns can be a significant nuisance for users, the majority do not take a “zero-tolerance” approach to these issues [56]. This is primarily due to their expectations of receiving precise and intelligent service functions in exchange for personal information. Privacy calculus theory, one of the most classic paradigms in user privacy research, elucidates that individuals who disclose personal information will stand to gain anticipated benefits from information collectors, concurrently bearing varying degrees of privacy risk [57,58]. In this setting, privacy decision-making emerges as a holistic weighing of information costs against information value by the user [59]. Academic studies like [19,25,26] have shown that an individual’s willingness to disclose their personal information is substantially impacted by perceived risks as well as benefits. Additionally, government regulation refers to the restraints that industry supervisory authorities apply to market participants in compliance with legal and regulatory requirements [60]. Effective government regulation may mitigate users’ privacy concerns [61,62] and augment their willingness to disclose [63], which makes it a crucial guarantee element in influencing users’ privacy decisions.
Within the realm of corporate management, data are esteemed as an indispensable element in the production process [45,64], capable not only of augmenting operational efficiency within enterprises [65] but also of refining business operations and reinforcing market competitiveness through the exploitation of vast quantities of user-provided data [66]. Consequently, driven by their profit-maximizing nature, firms, particularly those wielding market dominance, often opt to exploit data to their fullest extent [67]. Conversely, corporate strategies concerning data acquisition and utilization are also swayed by government regulation and other safeguarding mechanisms. According to Liu et al. [68], hefty fines levied by the government are an effective means of curtailing data misuse since they eat away at the net profitability of businesses. The notion that such sanctions are an important deterrent against corporate malpractices involving data handling is supported by much pertinent research [45,47,69]. Moreover, reputation, functioning as an intangible asset for businesses, embodies the extent to which the public perceives the company’s integrity, attentiveness toward consumers, and commitment to fulfilling agreements [70]. A favorable corporate reputation bears a notable positive correlation with the company’s financial performance [71,72] and market capitalization [73] while also fostering the cultivation of a favorable public image among users, which in turn effectively bolsters their engagement in privacy-disclosure behaviors [19,74], indirectly empowering the conversion of corporate value. In summary, the direct incentive for corporations to strive for a sterling reputation lies in its capacity to boost profitability. Extensive scholarly investigations have revealed that the threat of reputational damage exerts a constraining influence on corporations’ adoption of unethical information-gathering and handling practices [9,69,75].
As a formidable institutional safeguard, stringent government regulation serves to promptly penalize illicit conduct [76], as well as to compel violating firms to compensate harmed individuals in compliance with legal and regulatory requirements [77]. In developing countries, where insufficient legal frameworks and lax enforcement frequently exacerbate user privacy risks, the government’s credibility and dependability can be severely damaged by the ineffective application of the law [78]. Wang et al. [45] posited that government supervision prevented platform companies from misusing data, thereby augmenting societal benefits and reinforcing the credibility of the government. Nevertheless, a failure to promptly implement supervisory measures could entail adverse consequences, including a deterioration of the government-public relationship and a negative impact on the government’s reputation. Similar conclusions are supported by the studies that Sun et al. [9], Zhu et al. [47], Gao et al. [50], and Shi et al. [79] undertook.

3. Problem Description and Model Building

As was previously noted, information authorization via Apps is a process that involves interaction between stakeholders such as users, App providers, and the government. A plethora of factors facilitate and constrain these parties’ privacy decisions. In this section, we construct an evolutionary game model based on relevant assumptions. This model serves as the foundation for subsequent integrated analyses of equilibrium strategy selections and numerical simulations concerning the tripartite stakeholders.

3.1. Problem Description

The main participants in the personal information authorization, flow, and management of Apps are users, App providers, and the government. Users are information sources who engage with App providers in order to access relevant features and services. Personalized and basic services are supplied to users by App providers, who also act as information recipients and value converters. Robust government regulation is necessary because App providers, upon encountering data segments with significant derivable value, often seize the opportunity to engage in non-compliant actions, crossing lines and harvesting user data excessively. User information is currently the most valuable kind of data, and users' active participation will hasten the conversion of information value, significantly increasing business revenue and facilitating social and economic advancement. Furthermore, all three parties are boundedly rational in their strategic choices, constrained by informational asymmetry, limited decision-making capacity, and exogenous variables. Each, in the relentless pursuit of optimizing their individual benefits, inevitably encounters conflicts with other participants involved. As such, the strategies they adopt are invariably shaped and reshaped by the actions and reactions of these counterparts. It follows, therefore, that the decision-making process among these tripartite entities does not result in an instant resolution but, rather, evolves through a series of iterations, gradually progressing towards a state of equilibrium, where the dynamics of their interaction reach a degree of stability. The aforementioned considerations indicate that the analytical premises of the evolutionary game are consistent with the three parties' strategic interaction in the context of App personal information authorization. The three parties' relationship is shown in Figure 1.
The end-users, functioning as the primary interactants with mobile Apps, are compelled to grant a portion of their personal data to App providers in exchange for leveraging the specialized features and services. However, in the actual exercise of this consent, corporate entities, under the influence of profit-driven incentives, frequently deviate from the principle of “minimal necessity” in data gathering, resorting to practices that either compel or excessively solicit permissions for accessing user information. Given that the provision of personal details enables access to a service experience imbued with positive utilities while unreasonably extensive data requisition amplifies user anxieties regarding privacy, we contend that users engage in a strategic dilemma, navigating between the poles of disclosing information (“authorizing”) and withholding it (“not authorizing”). The decision of a user to acquiesce to the disclosure of their personal information is intimately linked to a constellation of considerations: the functional benefits accrued to the App through this authorization, the latent detriments tied to privacy infringements, and the macro-environmental privacy protection facilitated by the government.
App providers, in their endeavor to harness multidimensional user data for the extraction of supernormal profits, may opt to enforce and excessively solicit unnecessary information permissions. Should such practices evade detection and rectification by regulatory authorities and users acquiesce, completing the authorization process without dissent, these providers stand to gain additional profits. Conversely, rigorous oversight by governmental agencies poses the peril of monetary penalties, delisting, and reputational harm to these enterprises. Additionally, the act of non-compliance erodes confidence amongst a broad user demographic, triggering a deterioration in product or corporate reputation and subsequent user attrition. Consequently, App providers are confronted with a strategic alternative between “compliance” and “non-compliance”, a decision intricately influenced by the intensity of governmental regulation, the potential surplus gains achievable through non-compliance, and the repercussions on user sentiment and loyalty.
Acting as the steward of market order, the government's stringent regulatory measures significantly augment the sense of security and well-being among users, concurrently bolstering its own credibility, a phenomenon equally pertinent to the realm of personal information protection. Nonetheless, given the colossal scale of the mobile application market, rigorous regulation enforcement entails substantial expenditures of human and material resources. Simultaneously, if App providers can maintain effective self-governance, adhering meticulously to legal norms in the collection of user data, a more permissive regulatory stance on the part of the government is unlikely to provoke widespread public dissatisfaction or engender significant societal issues while also economizing on oversight expenditures. This scenario suggests the plausible adoption of a light-touch regulatory strategy by the government. Accordingly, the government confronts a strategic fork between "strict supervision" and "loose supervision", with its policy selection intricately intertwined with factors such as the credibility garnered through enforcement, potential pitfalls arising from relying solely on market self-discipline, and the fiscal implications of regulatory endeavors.

3.2. Assumptions and Parameters

Hypothesis 1 posits that the probability of users authorizing personal information is denoted as x. When users opt to grant personal information, they reap the benefit of functions and services from the App, quantified as R1. For the purpose of simplicity, it is assumed that users who decline authorization do not access any of the App's features, implying a functionality benefit of zero. If the App provider adheres to compliant data solicitation practices, the cost of privacy concerns incurred by users is C1. However, if the App provider engages in non-compliant conduct, users incur an additional privacy concern cost, ΔC1. In the scenario where the government enforces strict oversight, users authorizing personal information experience a sense of stability and order derived from governmental safeguarding, represented as U1. Should the App provider engage in non-compliant permission requests and the government, under its strict regulatory stance, intervenes effectively, the enhanced security and assurance imparted to App users by this regulatory action become notably accentuated, leading to an additional perceived boost in government safeguarding effectiveness, ΔU1. Conversely, under lenient government oversight, users who have authorized personal information may feel let down due to the absence of governmental protection. Let W1 denote the minimum expected psychological loss for these authorized users. If the enterprise further engages in unreasonable data solicitation, users' psychological distress and dissatisfaction escalate beyond this base level, with an additional increment of ΔW1.
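Written out as a payoff schedule, Hypothesis 1 assigns an authorizing user the following payoff in each provider/government scenario (our own compact restatement of the assumptions above, with non-authorizing users normalized to zero):
$$\pi_{\mathrm{user}} = \begin{cases} R_1 - C_1 + U_1, & \text{provider compliant, government strict},\\ R_1 - C_1 - W_1, & \text{provider compliant, government loose},\\ R_1 - C_1 - \Delta C_1 + U_1 + \Delta U_1, & \text{provider non-compliant, government strict},\\ R_1 - C_1 - \Delta C_1 - W_1 - \Delta W_1, & \text{provider non-compliant, government loose}. \end{cases}$$
These four entries reappear, weighted by yz, y(1 − z), (1 − y)z, and (1 − y)(1 − z), in Equation (1) below; Hypotheses 2 and 3 enter Equations (5)–(6) and (9)–(10) in the same fashion.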
Hypothesis 2 postulates that the probability of App providers engaging in reasonable information permission solicitation is denoted as y. In instances where users authorize normally and App providers collect user information appropriately, it is assumed that the provider accrues an informational value gain, R2. Should an App provider opt to collect user information excessively and exploit it unreasonably, under the premise of user authorization, the provider could secure additional informational gains, ΔR2, through avenues like targeted marketing or third-party data transactions. Nevertheless, such non-compliant solicitation practices would concurrently lead to potential losses in the form of diminished brand reputation and user attrition, encapsulated in a baseline cost, C2. With users having granted authorization, App providers face intensified public censure, leading to augmented reputational damage, ΔC2. In the event of strict government regulation, non-compliant providers risk facing fines or even removal from platforms if they fail to rectify. We posit that these consequences would impose a combined toll of reputational and economic losses, denoted as W2, upon the offending App providers. Conversely, under lenient governance, App providers evade punitive measures from regulatory bodies.
Hypothesis 3 posits that the probability of the government adopting a stringent regulatory strategy is represented by z. Upon choosing to diligently enforce regulatory responsibilities, the government incurs auditing and surveillance costs (C3) yet simultaneously garners credibility benefits (U3) from robust oversight and prompt maintenance of market order. In this scenario, if App providers exhibit transgressions such as excessive or non-compliant information solicitation, the government’s timely imposition of corrective measures and penalties accentuates the pivotal role of sectoral authorities in upholding market order, ensuring citizens’ rights, and fostering industry health, thereby yielding an incremental enhancement to government credibility, ΔU3. Alternatively, adopting a lax regulatory approach spares the government the expense of oversight. However, inadequate supervision breeds manifest and latent cybersecurity hazards, resulting in a credibility toll W3. Should providers engage in non-compliant data collection under this relaxed regime, the government faces public scrutiny over its lackluster regulatory efforts, suffering an additional credibility decrement ΔW3. Moreover, in the event a considerable proportion of users grant authorization to information amidst App providers employing illegal collection practices, it triggers heightened collective privacy panic, exposing more profoundly the societal issues stemming from insufficient government oversight, leading to a further aggravation of credibility loss, quantified as V3. A comprehensive list of these parameters, alongside their definitions, is presented in Table 1.

3.3. Model Building

Based on the aforementioned assumptions, the matrix representing the mixed strategy game among users, App providers, and the government is presented in Table 2.

4. Analysis of the Evolutionary Game Model

4.1. Analysis of Replication Dynamics

As previously expounded, assuming the expected returns for users when authorizing and withholding personal information are respectively denoted as E11 and E12, with the average expected return represented as E1, as illustrated by Equations (1)–(3):
$E_{11} = yz(R_1 - C_1 + U_1) + y(1 - z)(R_1 - C_1 - W_1) + (1 - y)z(R_1 - C_1 - \Delta C_1 + U_1 + \Delta U_1) + (1 - y)(1 - z)(R_1 - C_1 - \Delta C_1 - W_1 - \Delta W_1)$ (1)
$E_{12} = 0$ (2)
$E_1 = xE_{11} + (1 - x)E_{12}$ (3)
Drawing upon the Malthusian population dynamics model [80], and noting that $E_{12} = 0$, so that $x(E_{11} - E_1) = x(1 - x)(E_{11} - E_{12}) = x(1 - x)E_{11}$, we derive the replication dynamics equation for users, designated as Equation (4):
$U_1(x) = \dfrac{dx}{dt} = x(E_{11} - E_1) = x(1 - x)\left[-yz(\Delta U_1 + \Delta W_1) + y(\Delta C_1 + \Delta W_1) + z(U_1 + \Delta U_1 + W_1 + \Delta W_1) + (R_1 - C_1 - \Delta C_1 - W_1 - \Delta W_1)\right]$ (4)
In a similar vein, assuming the expected returns for App providers operating compliantly and non-compliantly are respectively designated as E21 and E22, with the average expected return being represented as E2, as illustrated by Equations (5)–(7):
$E_{21} = xzR_2 + x(1 - z)R_2$ (5)
$E_{22} = xz(R_2 + \Delta R_2 - C_2 - \Delta C_2 - W_2) + x(1 - z)(R_2 + \Delta R_2 - C_2 - \Delta C_2) - (1 - x)z(C_2 + W_2) - (1 - x)(1 - z)C_2$ (6)
$E_2 = yE_{21} + (1 - y)E_{22}$ (7)
The App provider’s replication dynamics equation is designated as Equation (8):
$U_2(y) = \dfrac{dy}{dt} = y(E_{21} - E_2) = y(1 - y)\left[x(\Delta C_2 - \Delta R_2) + zW_2 + C_2\right]$ (8)
Presuming that the expected returns for the government under strict and loose regulatory regimes are respectively designated as E31 and E32, with the average expected return being represented as E3, as illustrated by Equations (9)–(11):
$E_{31} = xy(U_3 - C_3) + x(1 - y)(U_3 + \Delta U_3 - C_3) + (1 - x)y(U_3 - C_3) + (1 - x)(1 - y)(U_3 + \Delta U_3 - C_3)$ (9)
$E_{32} = -xyW_3 - x(1 - y)(W_3 + \Delta W_3 + V_3) - (1 - x)yW_3 - (1 - x)(1 - y)(W_3 + \Delta W_3)$ (10)
$E_3 = zE_{31} + (1 - z)E_{32}$ (11)
The government’s replication dynamics equation is designated as Equation (12):
$U_3(z) = \dfrac{dz}{dt} = z(E_{31} - E_3) = z(1 - z)\left[-xyV_3 + xV_3 - y(\Delta U_3 + \Delta W_3) + U_3 + \Delta U_3 + W_3 + \Delta W_3 - C_3\right]$ (12)

4.2. Stability Analysis of Individual Subjects

4.2.1. Users’ Evolutionary Stability Analysis

Inferred from Equation (4), the evolutionary game strategies for users can manifest in several configurations, including
(1)
Under the premise that $y = y^{*} = \dfrac{(C_1 + \Delta C_1 + W_1 + \Delta W_1 - R_1) - z(U_1 + \Delta U_1 + W_1 + \Delta W_1)}{\Delta C_1 + \Delta W_1 - z(\Delta U_1 + \Delta W_1)}$, for an arbitrary value of x, we perpetually have U1(x) ≡ 0, indicating that regardless of the strategic selection chosen by users, it inherently embodies an ESS;
(2)
Under the premise that $y \neq y^{*}$, and guided by the stability theorem, there prevail two equilibrium strategies, specifically x = 0 and x = 1, which fulfill the criterion that U1(x) = 0. Differentiating U1(x) yields
$U_1'(x) = (1 - 2x)\left[-yz(\Delta U_1 + \Delta W_1) + y(\Delta C_1 + \Delta W_1) + z(U_1 + \Delta U_1 + W_1 + \Delta W_1) + (R_1 - C_1 - \Delta C_1 - W_1 - \Delta W_1)\right]$
In circumstances wherein $y > y^{*}$, we have $U_1'(0) > 0$ and $U_1'(1) < 0$, rendering x = 1 the stable strategy. This signifies that the act of users adopting an "authorizing" strategy aligns with the criteria for an ESS. Conversely, when $y < y^{*}$, the converse holds: $U_1'(0) < 0$ and $U_1'(1) > 0$, so x = 0 emerges as the stable strategy, underscoring that users' choice to forgo authorization, effectively embracing a "not authorizing" strategy, conforms to the tenets of an ESS.
Figure 2 displays the replication dynamic phase diagram of users. The strategic selection of users in this game-theoretic framework is shaped by the interplay of two key determinants: the likelihood that App providers opt for the "compliance" strategy and the government's propensity toward the "strict supervision" strategy. When App providers access user information compliantly with probability $y > y^{*}$, users will prefer the "authorizing" strategy (x → 1); conversely, when $y < y^{*}$, users will choose the "not authorizing" strategy (x → 0).

4.2.2. App Providers’ Evolutionary Stability Analysis

Inferred from Equation (8), the evolutionary game strategies for App providers can manifest in several configurations, including:
(1)
Under the premise that $z = z^{*} = \dfrac{x(\Delta R_2 - \Delta C_2) - C_2}{W_2}$, for an arbitrary value of y, we perpetually have U2(y) ≡ 0, indicating that regardless of the strategic selection chosen by App providers, it inherently embodies an ESS;
(2)
Under the premise that $z \neq z^{*}$, and guided by the stability theorem, there prevail two equilibrium strategies, specifically y = 0 and y = 1, which fulfill the criterion that U2(y) = 0. Differentiating U2(y) yields
$U_2'(y) = (1 - 2y)\left[x(\Delta C_2 - \Delta R_2) + zW_2 + C_2\right]$
In circumstances wherein $z > z^{*}$, we have $U_2'(0) > 0$ and $U_2'(1) < 0$, rendering y = 1 the stable strategy. This signifies that the act of App providers adopting a "compliance" strategy aligns with the criteria for an ESS. Conversely, when $z < z^{*}$, the converse holds: $U_2'(0) < 0$ and $U_2'(1) > 0$, so y = 0 emerges as the stable strategy, underscoring that App providers' choice to excessively solicit user information permissions, effectively embracing a "non-compliance" strategy, conforms to the tenets of an ESS.
Figure 3 displays the replication dynamic phase diagram of App providers. Their strategic selection in this game-theoretic framework is shaped by the interplay of two key determinants: the likelihood that users opt for the "authorizing" strategy and the government's propensity toward the "strict supervision" strategy. When the government supervises strictly with probability $z > z^{*}$, App providers will prefer the "compliance" strategy (y → 1); conversely, when $z < z^{*}$, they will choose the "non-compliance" strategy (y → 0).

4.2.3. The Government’s Evolutionary Stability Analysis

Inferred from Equation (12), the evolutionary game strategies for the government can manifest in several configurations, including
(1)
Under the premise that $x = x^{*} = \dfrac{y(\Delta U_3 + \Delta W_3) + C_3 - (U_3 + \Delta U_3 + W_3 + \Delta W_3)}{(1 - y)V_3}$, for an arbitrary value of z, we perpetually have U3(z) ≡ 0, indicating that regardless of the strategic selection chosen by the government, it inherently embodies an ESS;
(2)
Under the premise that $x \neq x^{*}$, and guided by the stability theorem, there prevail two equilibrium strategies, specifically z = 0 and z = 1, which fulfill the criterion that U3(z) = 0. Differentiating U3(z) yields
$U_3'(z) = (1 - 2z)\left[-xyV_3 + xV_3 - y(\Delta U_3 + \Delta W_3) + U_3 + \Delta U_3 + W_3 + \Delta W_3 - C_3\right]$
In circumstances wherein $x > x^{*}$, we have $U_3'(0) > 0$ and $U_3'(1) < 0$, rendering z = 1 the stable strategy. This signifies that the act of the government adopting a "strict supervision" strategy aligns with the criteria for an ESS. Conversely, when $x < x^{*}$, the converse holds: $U_3'(0) < 0$ and $U_3'(1) > 0$, so z = 0 emerges as the stable strategy, underscoring that the government's choice to discharge its regulatory duties lightly, effectively embracing a "loose supervision" strategy, conforms to the tenets of an ESS.
Figure 4 displays the replication dynamic phase diagram of the government. Its strategic selection in this game-theoretic framework is shaped by the interplay of two key determinants: the likelihood that users opt for the "authorizing" strategy and App providers' propensity toward the "compliance" strategy. When users authorize their personal information with probability $x > x^{*}$, the government will prefer the "strict supervision" strategy (z → 1); conversely, when $x < x^{*}$, it will choose the "loose supervision" strategy (z → 0).

4.3. Analysis of Equilibrium Solution and Its Stability

The integration of replication dynamic Equations (4), (8), and (12) forms a three-dimensional system whose equilibrium points, obtained by setting U1(x), U2(y), and U3(z) to zero, are E1(0,0,0), E2(0,0,1), E3(0,1,0), E4(0,1,1), E5(1,0,0), E6(1,0,1), E7(1,1,0), E8(1,1,1), and E9(x*,y*,z*). Here $x^{*} = \dfrac{y(\Delta U_3 + \Delta W_3) + C_3 - (U_3 + \Delta U_3 + W_3 + \Delta W_3)}{(1 - y)V_3}$, $y^{*} = \dfrac{(C_1 + \Delta C_1 + W_1 + \Delta W_1 - R_1) - z(U_1 + \Delta U_1 + W_1 + \Delta W_1)}{\Delta C_1 + \Delta W_1 - z(\Delta U_1 + \Delta W_1)}$, and $z^{*} = \dfrac{x(\Delta R_2 - \Delta C_2) - C_2}{W_2}$. If an equilibrium point is asymptotically stable, it must be a pure-strategy Nash equilibrium solution. Therefore, excluding the mixed Nash equilibrium solution E9(x*,y*,z*), it becomes necessary to conduct stability analysis on only eight equilibrium points: E1(0,0,0), E2(0,0,1), E3(0,1,0), E4(0,1,1), E5(1,0,0), E6(1,0,1), E7(1,1,0), and E8(1,1,1) [81]. As per the discriminant method proposed by Friedman in 1991 [82], if all eigenvalues of the Jacobian matrix are negative, the equilibrium point is an asymptotically stable point of the evolving system. The Jacobian matrix of the tripartite game is
$J = \begin{pmatrix} J_{11} & J_{12} & J_{13} \\ J_{21} & J_{22} & J_{23} \\ J_{31} & J_{32} & J_{33} \end{pmatrix}$
Specifically,
$J_{11} = (1 - 2x)\left[-yz(\Delta U_1 + \Delta W_1) + y(\Delta C_1 + \Delta W_1) + z(U_1 + \Delta U_1 + W_1 + \Delta W_1) + R_1 - C_1 - \Delta C_1 - W_1 - \Delta W_1\right]$
$J_{12} = x(1 - x)\left[\Delta C_1 + \Delta W_1 - z(\Delta U_1 + \Delta W_1)\right]$
$J_{13} = x(1 - x)\left[U_1 + \Delta U_1 + W_1 + \Delta W_1 - y(\Delta U_1 + \Delta W_1)\right]$
$J_{21} = y(1 - y)(\Delta C_2 - \Delta R_2)$
$J_{22} = (1 - 2y)\left[x(\Delta C_2 - \Delta R_2) + zW_2 + C_2\right]$
$J_{23} = y(1 - y)W_2$
$J_{31} = z(1 - z)(V_3 - yV_3)$
$J_{32} = z(1 - z)\left[-xV_3 - (\Delta U_3 + \Delta W_3)\right]$
$J_{33} = (1 - 2z)\left[-xyV_3 + xV_3 - y(\Delta U_3 + \Delta W_3) + U_3 + \Delta U_3 + W_3 + \Delta W_3 - C_3\right]$
The eigenvalues and stability of eight equilibrium points are delineated in Table 3.
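Because every off-diagonal entry of J carries a factor x(1 − x), y(1 − y), or z(1 − z), the matrix is diagonal at each pure-strategy corner, and the eigenvalues in Table 3 are simply the diagonal entries J11, J22, and J33 evaluated there. A minimal MATLAB sketch of this check, under illustrative parameter values of our own choosing (the paper's table values are not reproduced here), is as follows:

```matlab
% Stability check of the eight pure-strategy equilibria (cf. Table 3).
% At a corner (x,y,z) in {0,1}^3 the off-diagonal Jacobian entries vanish,
% so the eigenvalues are the diagonal entries J11, J22, J33.
% All parameter values below are illustrative assumptions.
R1 = 2.0; C1 = 1.0; dC1 = 0.5; U1 = 0.6; dU1 = 0.3; W1 = 0.4; dW1 = 0.3;
dR2 = 0.8; C2 = 0.6; dC2 = 0.4; W2 = 0.8;
C3 = 0.7; U3 = 0.6; dU3 = 0.3; W3 = 0.5; dW3 = 0.3; V3 = 0.4;

corners = dec2bin(0:7) - '0';                 % the eight (x,y,z) corners
for k = 1:8
    x = corners(k,1); y = corners(k,2); z = corners(k,3);
    lam = [ (1-2*x)*( -y*z*(dU1+dW1) + y*(dC1+dW1) ...
                      + z*(U1+dU1+W1+dW1) + R1-C1-dC1-W1-dW1 ), ...
            (1-2*y)*( x*(dC2-dR2) + z*W2 + C2 ), ...
            (1-2*z)*( -x*y*V3 + x*V3 - y*(dU3+dW3) ...
                      + U3+dU3+W3+dW3 - C3 ) ];
    if all(lam < 0), tag = '  <- asymptotically stable (ESS)'; else, tag = ''; end
    fprintf('E(%d,%d,%d): eigenvalues [%6.2f %6.2f %6.2f]%s\n', x, y, z, lam, tag);
end
```

With these illustrative values, only E(1,1,1) yields three negative eigenvalues, consistent with the discussion of E8 below.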
Owing to the influence of multiple parameters, the theoretical game model converges to distinct stable points under differing parameter configurations. As evidenced by Table 3, the eigenvalues associated with equilibria E1(0,0,0) and E2(0,0,1) always include a positive one, thereby disqualifying E1 and E2 as asymptotically stable points. Consequently, only E3(0,1,0), E4(0,1,1), E5(1,0,0), E6(1,0,1), E7(1,1,0), and E8(1,1,1) can attain evolutionary stability under specific conditions. Given that information flow serves as the fundamental premise empowering the development of the digital economy, yet the vast scale of the App market and its elongated business chain necessitate macroscopic governance by the government and compliant practices by information handlers for sustainable health, the tripartite strategy converging at E8(1,1,1) signifies an ideal state of synergistic development and compatible progress for App market governance. Particular emphasis is therefore placed on examining the conditions conducive to the stable emergence of this equilibrium and its resultant implications.
The equilibrium point E8(1,1,1) embodies the optimal solution where users actively engage in information authorization, App providers adhere to norms in collecting user data, and the government exercises rigorous supervision. Its realization hinges upon three premises:
(1) C1 < R1 + U1, indicating that users' strategic choice of "authorization" is subject to the combined effects of privacy concerns, the sense of security provided by macroscopic safeguards, and the value received from App functionalities. Only when the sum of the utility gained from advanced App functionality unlocked by information provision and the heightened security afforded by government regulation of corporate conduct outweighs the perceived costs of privacy intrusion does the users' strategy gravitate toward, and consolidate around, this particular course of action;
(2) ΔR2 < W2 + ΔC2 + C2, signifying that App providers' strategic choice of "compliance" is determined by a combination of factors, including the excess profits from over-collection, the harm to their reputation arising from non-compliance, the additional user dissatisfaction under information authorization conditions, and the punitive measures enforced by strict government oversight. Critically, providers settle on the "compliance" strategy only when the costs of non-compliant privacy practices outweigh their benefits;
(3) C3 < U3 + W3, illustrating that the government's adoption of a "strict supervision" strategy is intimately linked to the social dangers lurking beneath an unbridled liberalization of the information market, the credibility boost garnered from rectifying corporate non-compliance, and the tangible expenditures of manpower, material resources, and finances that active App governance necessitates. Strict oversight is economically rational, and hence sustainable as a strategy, only when the total benefits of these interventions exceed their costs.
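These three premises can be read directly off the Jacobian: at E8(1,1,1) every off-diagonal entry vanishes, and substituting x = y = z = 1 into J11, J22, and J33 gives the eigenvalues
$$\lambda_1 = -(R_1 - C_1 + U_1), \qquad \lambda_2 = -(\Delta C_2 - \Delta R_2 + W_2 + C_2), \qquad \lambda_3 = -(U_3 + W_3 - C_3),$$
which are simultaneously negative exactly when premises (1)–(3) above hold.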

5. Simulation Result and Analysis

In pursuit of a more visual illustration of the evolutionary trajectories of the three parties' stable strategies, and to validate the efficacy of the game-theoretic model, we employed MATLAB R2018a to conduct numerical simulation analyses of the triadic evolutionary game model set against the backdrop of personal information authorization. After assigning values to the parameters, we input the replication dynamic equations and simulate the evolutionary process by which the three parties reach equilibrium solutions under changes in initial strategies and parameter values, with t denoting the system runtime. Additionally, the impact of the core elements of the replicator dynamic system on the progression of the game is explored.
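A minimal MATLAB sketch of this procedure, integrating Equations (4), (8), and (12) with ode45, is given below. The parameter values are illustrative assumptions chosen only to satisfy the three convergence conditions of Section 4.3 (the actual values in Tables 4–7 are not reproduced here), so the resulting trajectories are indicative rather than reproductions of Figures 5–12:

```matlab
% Replicator dynamics of the tripartite game, Eqs. (4), (8) and (12).
% s = [x; y; z]: shares playing "authorize", "comply", "strict supervision".
% All parameter values are illustrative assumptions.
R1 = 2.0; C1 = 1.0; dC1 = 0.5; U1 = 0.6; dU1 = 0.3; W1 = 0.4; dW1 = 0.3;
dR2 = 0.8; C2 = 0.6; dC2 = 0.4; W2 = 0.8;
C3 = 0.7; U3 = 0.6; dU3 = 0.3; W3 = 0.5; dW3 = 0.3; V3 = 0.4;

f = @(t, s) [ ...
    s(1)*(1-s(1)) * ( -s(2)*s(3)*(dU1+dW1) + s(2)*(dC1+dW1) ...
        + s(3)*(U1+dU1+W1+dW1) + (R1-C1-dC1-W1-dW1) ); ...  % Eq. (4)
    s(2)*(1-s(2)) * ( s(1)*(dC2-dR2) + s(3)*W2 + C2 ); ...  % Eq. (8)
    s(3)*(1-s(3)) * ( -s(1)*s(2)*V3 + s(1)*V3 ...
        - s(2)*(dU3+dW3) + U3+dU3+W3+dW3 - C3 ) ];          % Eq. (12)

[t, s] = ode45(f, [0 10], [0.1; 0.1; 0.1]);  % initial strategies x = y = z = 0.1
plot(t, s, 'LineWidth', 1.5);
legend('x (users authorize)', 'y (providers comply)', 'z (strict supervision)');
xlabel('t'); ylabel('strategy share');
```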

5.1. Analysis of Simulation for Initial Strategy Modification

We utilize x, y, and z to represent the initial strategy proportions of users authorizing personal information, App providers soliciting permissions compliantly, and the government supervising strictly, respectively. As previously articulated, the App market is driven by the need for orderly information flow and the sustainable growth of the industry. This section delves into how alterations in the initial strategies affect the outcomes of the three-party evolution as these subjects converge at the equilibrium point E8(1,1,1).

5.1.1. The Impact of Changes in Users' Initial Strategy (x) on Tripartite Evolution Strategies

Based on the conditions for convergence to the equilibrium point E8(1,1,1) derived in Section 4.3, namely C1 < R1 + U1, ΔR2 < W2 + ΔC2 + C2, and C3 < U3 + W3, we examined the impact of changes in users' initial strategy on the outcomes of the tripartite evolution. For this analysis, we assigned the initial values listed in Table 4, with the initial strategies established at x = 0.1, y = 0.1, and z = 0.1. When x assumes values of 0.1, 0.4, 0.7, and 0.9 in succession, the influence of these changes on users' strategic choices, App providers' decisions, and government policies is illustrated in Figure 5. As shown in the figure, the more likely users are to choose the "authorization" strategy, the faster they evolve toward it. On the other hand, a higher initial x slows the evolution of App providers toward the "compliance" strategy and speeds up the evolution of the government toward the "strict supervision" strategy. The rationale is that when users provide their personal information, App providers gain an opportunity to obtain additional information gains and are therefore reluctant to adopt compliant privacy practices. For the government, facing widespread user authorization, the loss of credibility and the potential losses caused by non-regulation become important considerations, so strict supervision becomes its main choice.
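The following lines, reusing f and the assumed parameters from the sketch above, reproduce the shape of this experiment (the x values 0.1, 0.4, 0.7, and 0.9 follow the text; the y and z sweeps of Figures 6 and 7 are obtained analogously by varying the corresponding initial component):

```matlab
% Sweep of users' initial share x0 with y0 = z0 = 0.1, as in Figure 5.
figure; hold on;
for x0 = [0.1 0.4 0.7 0.9]
    [t, s] = ode45(f, [0 10], [x0; 0.1; 0.1]);
    plot(t, s(:,1), 'DisplayName', sprintf('x_0 = %.1f', x0));
end
hold off; legend('show'); xlabel('t'); ylabel('x (share authorizing)');
```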

5.1.2. The Impact of Changes in App Providers' Initial Strategy (y) on Tripartite Evolution Strategies

To investigate the influence of a shift in App providers' initial strategy on the tripartite evolutionary outcomes, we first establish the baseline values for each parameter as presented in Table 5. Herein, the initial strategies are designated with x = 0.4 and z = 0.4. The trend effects on the strategic decisions of users, App providers, and the government are depicted in Figure 6, where y assumes values of 0.1, 0.4, 0.7, and 0.9 in sequence. As illustrated by the graph, the greater the probability that App providers choose the "compliance" strategy, the faster users evolve toward the "authorization" strategy. Similarly, the providers themselves also evolve more rapidly toward the "compliance" strategy. While the government's evolution is less pronounced, it is still discernible that the speed at which it moves toward "strict supervision" increases. The probable reason for these dynamics is that the App providers' choice to adopt compliant privacy practices reinforces users' determination to authorize personal information. Moreover, the government's strategy selection is more significantly influenced by users' strategy choices than by those of the providers. As such, the user authorization behavior spurred by the App providers' compliant information collection becomes the primary factor driving the government's quicker adoption of strict supervision.

5.1.3. The Impact of Changes in the Government's Initial Strategy (z) on Tripartite Evolution Strategies

To investigate the effects of changes in the government's initial strategy on the results of the three-way evolution, we first set the starting values of each parameter as shown in Table 6, with the initial strategies established at x = 0.1 and y = 0.1. The trend effects on users', App providers', and the government's strategy choices as z takes values of 0.1, 0.4, 0.7, and 0.9 in turn are shown in Figure 7. The graph illustrates that an increased likelihood of the government opting for the "strict supervision" strategy results in a swifter evolution toward the "authorization" strategy among users. Concurrently, App providers evolve more rapidly toward the "compliance" strategy, and the government itself progresses more quickly toward "strict supervision". Users feel more comfortable authorizing personal information because of the external assurance that strict government oversight provides, which also deters businesses from engaging in non-compliant privacy practices. As a result, there is a greater chance that all parties will reach the optimal strategy E8(1,1,1), which represents the optimal equilibrium state.

5.1.4. Brief Summary of the Results

From the aforementioned analysis, it is clear that the initial strategy proportions significantly influence how the three subjects evolve and that a shift in any party's initial strategy affects both its own strategy stabilization solution and those of the other two subjects. Nevertheless, when the initial parameter values satisfy the conditions for convergence to the equilibrium point E8(1,1,1), the initial strategy proportions have no effect on the final evolutionary outcome. Apart from the slower evolution of App providers toward the optimal solution as users' initial strategy grows, overall, as the initial strategy proportions of all three parties increase, users, App providers, and the government converge more quickly on the optimal asymptotic equilibrium point, at which users authorize their personal information, App providers solicit permissions compliantly, and the government regulates strictly. Stated otherwise, as the initial strategy values rise, the realization of the ideal policy combination becomes more straightforward.

5.2. Analysis of Simulation for Parameter Modification

The replication dynamic Equations (4), (8), and (12) of the tripartite subjects reveal that alterations in each subject's costs, benefits, and losses influence its strategy selection and evolutionary rate. We conducted a numerical simulation analysis of the parameters, including users' perceived benefits from functionalities and services (R1), the privacy concern costs incurred by users (C1), the user attrition and reputation decline costs borne by App providers for non-compliant data solicitation (C2), the potential penalty losses incurred by App providers for non-compliance (W2), and the regulatory enforcement costs borne by the government (C3), with a view to delving into the impact of variations in these parameters on each stakeholder's strategic selection. The initial strategies of the three parties were assigned as listed in Table 7.
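Each of the analyses below follows the same recipe: fix the initial strategies, vary one payoff parameter, and re-integrate the system. A hedged sketch for the R1 sweep of Figure 8 follows; the R1 values 0.3, 1.6, 2.0, and 2.4 are taken from Section 5.2.1, while the initial strategies and remaining parameters are our own assumptions, and the C1, C2, W2, and C3 sweeps are obtained by varying the corresponding parameter instead. Note that a MATLAB anonymous function captures workspace values when it is created, so the system must be rebuilt inside the loop:

```matlab
% Parameter sweep for Figure 8: vary R1 and plot the users' trajectory x(t).
% Other parameters and initial strategies are illustrative assumptions.
figure; hold on;
for R1 = [0.3 1.6 2.0 2.4]
    f = @(t, s) [ ...
        s(1)*(1-s(1)) * ( -s(2)*s(3)*(dU1+dW1) + s(2)*(dC1+dW1) ...
            + s(3)*(U1+dU1+W1+dW1) + (R1-C1-dC1-W1-dW1) ); ...
        s(2)*(1-s(2)) * ( s(1)*(dC2-dR2) + s(3)*W2 + C2 ); ...
        s(3)*(1-s(3)) * ( -s(1)*s(2)*V3 + s(1)*V3 ...
            - s(2)*(dU3+dW3) + U3+dU3+W3+dW3 - C3 ) ];
    [t, s] = ode45(f, [0 20], [0.5; 0.5; 0.5]);
    plot(t, s(:,1), 'DisplayName', sprintf('R_1 = %.1f', R1));
end
hold off; legend('show'); xlabel('t'); ylabel('x (share authorizing)');
```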

5.2.1. The Impact of Changes in R1 on Users’ Evolutionary Strategies

Figure 8 illustrates, with other parameters held constant, the simulated impact of variations in users' perceived benefit from functionalities and services, R1, on users' own strategy. When R1 is 0.3, the meager benefit, falling far short of the perceived risk of providing information, prompts users to adopt the “non-authorization” strategy. When R1 is raised to 1.6, user decision-making initially fluctuates, with a lingering tendency toward “non-authorization” at first; after a period of evolution, users' decisions stabilize toward authorizing their information. As R1 rises further to 2.0 and 2.4, the time users need to evolve toward the “authorizing” strategy shrinks markedly, indicating accelerated convergence.
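These switch points are consistent with the replicator form reconstructed above. Early in a run, while y and z remain small, the users' payoff advantage for authorizing is approximately R1 − C1 − ΔC1 − W1 − ΔW1 = R1 − 2.0 under the Table 7 baseline, which is negative for R1 = 1.6 and explains the initial drift toward “non-authorization”; once the other two parties approach compliance and strict supervision, the advantage becomes R1 − C1 + U1 = R1 − 0.7, which is positive for R1 = 1.6, 2.0, and 2.4 but still negative for R1 = 0.3.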

5.2.2. The Impact of Changes in C1 on Users’ Evolutionary Strategies

Figure 9 simulates, with the other variables held constant, the impact of differences in users' perceived privacy concern cost C1 on users' own strategy. When C1 fluctuates between 0.8 and 1.0, the perceived risk cost is low and users rapidly settle on the “authorizing” strategy. When the privacy concern rises to 1.2, users' enthusiasm for authorizing information wavers; although they eventually stabilize on the “authorizing” strategy, the evolution takes slightly longer than it did when C1 was 0.8 or 1.0. When C1 escalates to 2.4, information authorization costs users significantly more than it benefits them, so they ultimately settle on the “not authorizing” strategy.
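Under the same reconstructed dynamics, the long-run tipping point for users sits where the authorization advantage R1 − C1 + U1 = 1.5 − C1 changes sign: values of C1 below 1.5 (here 0.8, 1.0, and 1.2) still end in “authorizing”, only more slowly as C1 grows, whereas C1 = 2.4 leaves the advantage negative and pushes users to “not authorizing”.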

5.2.3. The Impact of Changes in C2 on App Providers’ Evolutionary Strategies

Figure 10 depicts, with the values of other parameters unchanged, a simulation of the effects of variations in the reputational cost C2 that App providers incur through non-compliant privacy practices. If the negative feedback from users regarding corporate misconduct or overreaching solicitation is negligible, that is, when the cost of reputational and word-of-mouth decline stands at 0.3, providers are little inhibited by cost considerations and opt for the “non-compliance” strategy in pursuit of maximum economic benefit. When C2 rises to 0.6, the pressure of reputation erosion intensifies, and providers initially move toward “compliance”; after a transitional phase, however, the strategy reverts to “non-compliance”, albeit over a notably longer period than when C2 was 0.3. This signals the rising weight of reputational risk in App providers' operational decision-making. Further increases in the reputational cost to 0.9 and then 1.2, values that approach the benefits derived from the “non-compliance” strategy, prompt App providers to anchor their strategy swiftly and definitively on “compliance”.
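In the reconstructed provider dynamics, compliance is favored once C2 + zW2 exceeds x(ΔR2 − ΔC2), whose upper bound is ΔR2 − ΔC2 = 1.2 − 0.1 = 1.1 under the Table 7 baseline. With the fine held at its baseline W2 = 0.3, this asks roughly for C2 > 0.8, which matches the simulated split: C2 = 0.3 and 0.6 end in “non-compliance”, while C2 = 0.9 and 1.2 lock in “compliance”.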

5.2.4. The Impact of Changes in W2 on App Providers’ Evolutionary Strategies

Figure 11 simulates, with all other parameters unchanged, what happens to App providers' own strategy when they confront various levels of government-imposed fines W2. Under stringent regulatory oversight, the penalty mechanism subjects providers who overstep boundaries in soliciting user information to differentiated fines. When the fine is modest, represented here by values of 0.3 and 0.5, it exerts a certain deterrent effect on App providers and momentarily raises the likelihood of compliant privacy practices, but the insufficient severity of punishment fails to deter opportunistic behavior in the long term. Conversely, as W2 escalates to 0.8 and 1.2, App providers face mounting financial penalties, prompting a more cautious approach and a gradual transition toward the “compliance” strategy. Moreover, the higher the penalty, the faster App providers evolve toward “compliance”, highlighting the intensifying influence of financial losses on strategy adjustment.
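Viewed through the same reconstructed condition, with the reputational cost at its baseline C2 = 0.3, compliance requires roughly W2 ≥ 0.8 once supervision is strict, which is why fines of 0.3 and 0.5 fail to hold providers to compliance while 0.8 and 1.2 succeed, the larger fine converging faster.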

5.2.5. The Impact of Changes in C3 on the Government’s Evolutionary Strategies

Figure 12 presents the simulated outcomes of government strategy selection, assuming that all other parameters remain constant while the administrative cost C3 varies. Given the expansive scale of the App market, which implicates a multitude of App providers and third-party entities, the rigorous implementation of policy standards, the deployment of technical measures, and the execution of targeted initiatives inevitably incur substantial manpower and material costs. When regulatory expenditure is lower than the social utility gained from rigorous regulation and can effectively mitigate potential losses from privacy incidents (C3 = 0.6 and C3 = 0.9), the government adopts stringent regulatory measures against App providers, and the lower the cost, the faster the government evolves toward “strict supervision”. However, as regulatory costs escalate (C3 = 1.2 and C3 = 1.5), the expenditure pressure on the government rises proportionally; the government then begins to transition toward the “loose supervision” model and eventually converges on that strategy.
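The same reading applies to the government: in the reconstructed dynamics, strict supervision is favored while C3 < U3 + W3 + (1 − y)(ΔU3 + ΔW3) + x(1 − y)V3, and as providers converge on compliance (y → 1) this bound tightens to C3 < U3 + W3 = 0.3 + 0.8 = 1.1 under the Table 7 baseline. Accordingly, C3 = 0.6 and 0.9 sustain “strict supervision”, whereas C3 = 1.2 and 1.5 eventually tip the government into “loose supervision”.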

6. Discussion

6.1. Results Analysis

Drawing upon the aforementioned model derivation and simulation analysis results, we arrive at the following conclusions:
First, the three stakeholders’ choices of strategies are marked by a high level of interconnectedness. Any decision taken by one of the three has an evident effect on the other two. Consequently, in order to find solutions that maximize the total utility of the App market, it is imperative to take into account the strategic decisions and changing trends of all three actors in a comprehensive manner. This comprehensive strategy offers a practical route to direct the App ecosystem toward structured operations and collaborative growth.
Second, with other factors held constant, varying any entity's initial strategy ratio can significantly change the rate at which both it and the other two parties converge upon stable strategies. Thus, the best course for promoting simultaneous advances in privacy protection and information exchange within the App market is to actively encourage users to voluntarily authorize their personal information, reduce instances of non-compliant data collection by App providers, and strengthen government supervision and inspection.
Third, the triadic stakeholders' strategic choices are profoundly affected by an assortment of factors: users' perceived utility from functional benefits, the costs of privacy concerns, the efficacy of government regulation, the extra profit providers stand to earn from gathering irrelevant information, the magnitude of penalties imposed by the government, corporate reputation, government-incurred regulatory expenditures, credibility enhancements, and the public skepticism associated with inadequate oversight. The simulation section pays special attention to how the five most important cost, benefit, and loss parameters affect the results. To be specific:
(1) The primary reason users consent to the disclosure of information is to benefit from the more personalized services and enhanced recreational possibilities that come with providing Apps with their personal data. The greater the perceived overall benefit, the more likely users are to opt for the “authorizing” strategy; this can occasionally give rise to the “privacy paradox”, a contentious issue in academic circles. Conversely, users have no motivation to trade information access for services they consider superfluous. Meanwhile, privacy concerns impede adoption of the “authorizing” strategy, and once risk perception reaches a certain threshold, users may ignore benefit considerations altogether and opt for “not authorizing”. Hence, to alleviate user apprehension and trust deficits, App providers should concentrate on enhancing the user experience, optimizing product functions, and putting in place reasonable and scientifically sound privacy protocols. Such measures are pivotal to attracting a larger user base to authorize information and engage more deeply with the App.
(2) App providers may adopt a “non-compliance” strategy to capitalize on the surplus value behind user information when non-compliant collection provokes no significant reaction from users and does not seriously damage the corporate brand. As the reputational damage such behavior inflicts grows, unauthorized information collection is curbed. Likewise, the impulse to resort to opportunistic tactics, including non-compliant data harvesting, heightens when providers face only minor governmental penalties for anomalies in data gathering; as the severity of those penalties escalates, economic self-preservation drives these entities toward a “compliance-first” stance. Furthermore, the speed at which companies adopt compliant behavior correlates strongly with the severity of reputational damage and the escalation of penalties.
(3) Implementing stringent monitoring is hampered by the government's regulatory costs. Considering the cost-effectiveness ratio, regulators may even adopt a “free-riding” mindset when the cost of supervision is disproportionately high. When the financial cost of regulation lessens, the government is better positioned to enforce strict monitoring, since active administration fosters an environment conducive to information exchange and thereby promotes the App market and industry advancement. The administrative sector can sustain strict oversight through a variety of effective measures: minimizing regulatory costs for government agencies, shifting inward-looking regulatory thinking through cooperation with industry self-discipline organizations, instituting surveillance and whistle-blowing mechanisms for non-conforming Apps to broaden sources of violation intelligence, and promoting regulatory effectiveness through smart-technology inspections.

6.2. Theoretical and Practical Significance

From a theoretical standpoint, prior research on user privacy decisions has predominantly centered on users' intentional behaviors and attitudinal dispositions, with empirical studies employing questionnaire surveys as the prevailing methodology. Studies examining government regulation of corporate behavior and protection of user information rights, by contrast, have largely been descriptive analyses, with scant quantitative work utilizing multi-party game models. This paper treats users, App providers, and the government as stakeholders in the mobile App ecosystem, meticulously investigates the diverse roles involved in App information authorization, builds a three-party evolutionary game model, and studies how each stakeholder progresses toward ideal strategies amid ongoing adjustment. This approach provides a theoretical framework for research on multi-party strategy selection in Apps, as well as a dynamic and interactive perspective for studies of App user privacy decision-making.
This study has crucial practical significance for corporate compliance management, government regulation and user rights protection. By highlighting the significance of personal information, increasing awareness of privacy protection, and cautioning against overclaiming and blind authorization, it facilitates users to comprehend the pros and cons of granting permission to access personal information. From the perspective of App providers, this paper delves into the internal mechanism of users’ personal information authorization in mobile applications, aiding App providers in comprehending user necessities and alleviating their anxieties. This paper proposes a strategy to actively promote user consent by improving application functions, reinforcing user trust and mitigating privacy concerns so as to achieve a win–win situation between commercial interests and privacy protection, which is integral for the sustainable development of enterprises over the long haul. Regarding government entities, this paper underscores that effective supervision is crucial to bolster user satisfaction and security. This paper provides a reference for government regulators to improve privacy protection regulations, strengthen monitoring in key areas and increase publicity on the security of personal information in Apps. Furthermore, it holds significant value for industry regulators in guiding App providers toward compliant privacy practices, so as to create a healthy digital economy.

6.3. Limitation and Future Direction

Although this study is a novel attempt in the field of App privacy research, it is unavoidably subject to a variety of subjective and objective constraints. These limitations, which mainly appear in the following areas, call for further improvement and refinement in subsequent studies:
In terms of model construction, although we have synthesized the principal factors influencing the privacy decisions of users, enterprises, and the government based on a comprehensive review of prior studies, thereby formulating our hypotheses and parameters, it is plausible that additional factors may impact the strategic choices of these stakeholders in real-world contexts. Furthermore, certain assumptions made during the theoretical exploration were grounded in idealized states. Yet, real-world situations frequently involve more intricacy. Consequently, subsequent studies could integrate a broader range of variables tailored to specific research contexts and undertake multifaceted analyses that align with practical applications.
In terms of simulation and emulation, given the inherently subjective nature of user-perceived benefits, corporate reputation, governmental credibility and other related factors, the parameter assignment in this study can only represent approximate conditions, lacking robust support from empirical data derived from actual cases. To enhance the objectivity of our findings, future studies should place greater emphasis on empirical research methodologies.
In terms of participant selection, the ecosystem of App information circulation and privacy protection encompasses other stakeholders, such as third-party App stores and industry self-regulatory bodies, which also influence the interaction between users and enterprises. The complex network of interactions significantly impacts the evolutionary game dynamics among participants, representing a pivotal area for future exploration.

7. Conclusions

Considering the research gap in existing studies on privacy decision-making and privacy governance, this paper constructs an evolutionary game model involving three principal actors, namely users, App providers, and the government, in the context of personal information authorization. The aim is to explore feasible strategies for maximizing the benefits of the App market, ensuring the orderly flow of information, and promoting the healthy development of the industry. The model employs a system of replication dynamic equations to analyze the evolutionary trajectories and stable strategies of each actor, and MATLAB numerical simulations demonstrate the impact of changes in initial strategies and parameter values on the evolution process of the three parties. The results show a high degree of interdependence among the strategic choices of these actors. Modulating the initial strategy values of each participant can significantly alter the rate at which they approach their stable strategies. The strategy choices of the three parties are shaped jointly by the costs, benefits, and losses associated with each actor; holding the other parameters constant, altering a single parameter value intuitively illustrates the resulting changes in the strategies of users, App providers, and the government, thus enabling targeted guidance toward the ideal state in which users proactively authorize their personal information, App providers collect user data compliantly, and the government supervises strictly.

Author Contributions

Conceptualization, J.T. and Z.P.; methodology, J.T.; software, J.T.; validation, J.T., Z.P. and W.W.; formal analysis, J.T. and Z.P.; investigation, J.T.; resources, J.T., Z.P. and W.W.; data curation, J.T.; writing—original draft preparation, J.T.; writing—review and editing, Z.P. and W.W.; visualization, J.T.; supervision, Z.P. and W.W.; project administration, Z.P. and W.W.; funding acquisition, Z.P. and W.W. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data are contained within the article.

Conflicts of Interest

The authors declare no conflicts of interest.

Figure 1. The relationship of users, App providers and the government.
Figure 2. Users' dynamical replication phase diagrams.
Figure 3. App providers' dynamical replication phase diagrams.
Figure 4. Government's dynamical replication phase diagrams.
Figure 5. Example diagram of the impact of x changes on tripartite evolution strategies.
Figure 6. Example diagram of the impact of y changes on tripartite evolution strategies.
Figure 7. Example diagram of the impact of z changes on tripartite evolution strategies.
Figure 8. Example diagram of the impact of R1 changes on users' evolution strategies.
Figure 9. Example diagram of the impact of C1 changes on users' evolution strategies.
Figure 10. Example diagram of the impact of C2 changes on App providers' evolution strategies.
Figure 11. Example diagram of the impact of W2 changes on App providers' evolution strategies.
Figure 12. Example diagram of the impact of C3 changes on the government's evolution strategies.
Table 1. Parameter symbols and definitions.

R1: The benefit obtained by users from the functions and services provided by the App when they choose the “authorizing” strategy, where R1 > 0.
C1: The cost of privacy concern borne by users when they choose the “authorizing” strategy and the App provider chooses the “compliance” strategy, where C1 > 0.
ΔC1: The additional cost of privacy concern users must bear when they choose the “authorizing” strategy and the App provider chooses the “non-compliance” strategy, where ΔC1 > 0.
U1: The benefit of government assurance felt by users when they choose the “authorizing” strategy, the App provider chooses the “compliance” strategy, and the government chooses the “strict supervision” strategy, where U1 > 0.
ΔU1: The additional benefit of government assurance felt by users when they choose the “authorizing” strategy, the App provider chooses the “non-compliance” strategy, and the government chooses the “strict supervision” strategy, where ΔU1 > 0.
W1: The negative impact on users due to the absence of government assurance when they choose the “authorizing” strategy, the App provider chooses the “compliance” strategy, and the government chooses the “loose supervision” strategy, where W1 > 0.
ΔW1: The additional negative impact on users due to the absence of government assurance when they choose the “authorizing” strategy, the App provider chooses the “non-compliance” strategy, and the government chooses the “loose supervision” strategy, where ΔW1 > 0.
R2: The informational value benefit gained by the App provider when they choose the “compliance” strategy and users choose the “authorizing” strategy, where R2 > 0.
ΔR2: The additional informational benefit gained by the App provider when they choose the “non-compliance” strategy and users choose the “authorizing” strategy, where ΔR2 > 0.
C2: The reputation loss incurred by the App provider when they choose the “non-compliance” strategy, where C2 > 0.
ΔC2: The additional reputation cost incurred by the App provider when they choose the “non-compliance” strategy and users choose the “authorizing” strategy, where ΔC2 > 0.
W2: The penalty loss suffered by the App provider when they choose the “non-compliance” strategy and the government chooses the “strict supervision” strategy, where W2 > 0.
C3: The auditing and monitoring costs incurred by the government when they choose the “strict supervision” strategy, where C3 > 0.
U3: The credibility and trust benefit gained by the government when they choose the “strict supervision” strategy, where U3 > 0.
ΔU3: The additional credibility benefit gained by the government through timely maintenance of market order when they choose the “strict supervision” strategy and the App provider chooses the “non-compliance” strategy, where ΔU3 > 0.
W3: The loss of credibility suffered by the government under the “loose supervision” strategy, where W3 > 0.
ΔW3: The additional credibility loss suffered by the government due to inadequate supervision when they choose the “loose supervision” strategy and the App provider chooses the “non-compliance” strategy, where ΔW3 > 0.
V3: The deepened credibility challenge faced by the government due to regulatory absence when they choose the “loose supervision” strategy, the App provider chooses the “non-compliance” strategy, and users choose the “authorizing” strategy, where V3 > 0.
x: The probability of users choosing the “authorizing” strategy, where 0 ≤ x ≤ 1.
y: The probability of the App provider choosing the “compliance” strategy, where 0 ≤ y ≤ 1.
z: The probability of the government choosing the “strict supervision” strategy, where 0 ≤ z ≤ 1.
Table 2. Payoff matrix (payoffs listed as Users; App providers; Government).

(Authorizing, Compliance, Strict supervision): R1 − C1 + U1; R2; U3 − C3.
(Authorizing, Compliance, Loose supervision): R1 − C1 − W1; R2; −W3.
(Authorizing, Non-compliance, Strict supervision): R1 − C1 − ΔC1 + U1 + ΔU1; R2 + ΔR2 − C2 − ΔC2 − W2; U3 + ΔU3 − C3.
(Authorizing, Non-compliance, Loose supervision): R1 − C1 − ΔC1 − W1 − ΔW1; R2 + ΔR2 − C2 − ΔC2; −W3 − ΔW3 − V3.
(Not authorizing, Compliance, Strict supervision): 0; 0; U3 − C3.
(Not authorizing, Compliance, Loose supervision): 0; 0; −W3.
(Not authorizing, Non-compliance, Strict supervision): 0; −C2 − W2; U3 + ΔU3 − C3.
(Not authorizing, Non-compliance, Loose supervision): 0; −C2; −W3 − ΔW3.
Table 3. Stability analysis of equilibrium points.

E1(0,0,0): λ1 = R1 − C1 − ΔC1 − W1 − ΔW1; λ2 = −C2; λ3 = U3 + ΔU3 + W3 + ΔW3 − C3; Non-stable.
E2(0,0,1): λ1 = R1 + U1 + ΔU1 − C1 − ΔC1; λ2 = −C2 + W2; λ3 = C3 − U3 − W3 − ΔU3 − ΔW3; Non-stable.
E3(0,1,0): λ1 = R1 − C1 − W1 − ΔW1; λ2 = C2; λ3 = U3 + W3 − C3; Uncertain.
E4(0,1,1): λ1 = R1 + U1 − C1; λ2 = −C2 − W2; λ3 = C3 − U3 − W3; Uncertain.
E5(1,0,0): λ1 = C1 + ΔC1 + W1 + ΔW1 − R1; λ2 = C2 + ΔC2 − ΔR2; λ3 = V3 + U3 + W3 + ΔU3 + ΔW3 − C3; Uncertain.
E6(1,0,1): λ1 = C1 + ΔC1 − R1 − U1 − ΔU1; λ2 = C2 + W2 + ΔC2 − ΔR2; λ3 = C3 − V3 − U3 − W3 − ΔU3 − ΔW3; Uncertain.
E7(1,1,0): λ1 = C1 + W1 + ΔW1 − R1; λ2 = ΔR2 − C2 − ΔC2; λ3 = U3 + W3 − C3; Uncertain.
E8(1,1,1): λ1 = C1 − R1 − U1; λ2 = ΔR2 − W2 − C2 − ΔC2; λ3 = C3 − U3 − W3; Uncertain.
Table 4. Parameter assignment for the impact of x changes on tripartite evolution strategies.

R1 = 1.2, U1 = 0.2, C1 = 0.6, W1 = 0.3, ΔC1 = 0.3, ΔU1 = 0.1, ΔW1 = 0.2, R2 = 1.2, C2 = 0.6, W2 = 0.6, ΔR2 = 1.2, ΔC2 = 0.3, C3 = 0.9, U3 = 0.6, W3 = 0.6, V3 = 0.1, ΔU3 = 0.3, ΔW3 = 0.3.
Table 5. Parameter assignment for the impact of y changes on tripartite evolution strategies.

R1 = 1.2, U1 = 0.1, C1 = 0.6, W1 = 0.2, ΔC1 = 0.3, ΔU1 = 0.05, ΔW1 = 0.1, R2 = 1.2, C2 = 0.9, W2 = 0.6, ΔR2 = 1.2, ΔC2 = 0.3, C3 = 10, U3 = 4, W3 = 5, V3 = 0.1, ΔU3 = 1.2, ΔW3 = 1.5.
Table 6. Parameter assignment for the impact of z changes on tripartite evolution strategies.

R1 = 1.2, U1 = 0.2, C1 = 0.6, W1 = 0.3, ΔC1 = 0.3, ΔU1 = 0.1, ΔW1 = 0.2, R2 = 1.2, C2 = 0.6, W2 = 0.6, ΔR2 = 1.2, ΔC2 = 0.3, C3 = 0.9, U3 = 0.6, W3 = 0.6, V3 = 0.1, ΔU3 = 0.3, ΔW3 = 0.3.
Table 7. Parameter assignment for the investigation of the five core elements' evolution process.

R1 = 1.2, U1 = 0.3, C1 = 1.0, W1 = 0.8, ΔC1 = 0.1, ΔU1 = 0.1, ΔW1 = 0.1, R2 = 1.2, C2 = 0.3, W2 = 0.3, ΔR2 = 1.2, ΔC2 = 0.1, C3 = 0.6, U3 = 0.3, W3 = 0.8, V3 = 0.1, ΔU3 = 0.1, ΔW3 = 0.1.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
