Commoditization of Internet Privacy: A Research Proposal
Chapter One
Introduction and Overview
The adoption of digital technology has increased organizations’ access to information and to local and global markets. As a result, the task of administering users’ data protection is closely interrelated with expectations of access to information and the overall sense of cyber security. Additionally, the Internet of Things (IoT) provides organizations with large volumes of data about customers that were previously inaccessible to them, including users’ biometric, demographic, and health-related data, such as fingerprint patterns, purchase behaviors, and income levels (Tikk, 2017). The emergence of big data analytical tools characterizes the next generation of the internet and is projected to usher in a new economic age, with transformations that rival those of the industrial revolution. Nonetheless, the radical digitalization of the physical business environment raises major concerns for end-users. According to Elvy (2018), among the most alarming scenarios of the continued adoption of information technologies is the financial harm caused by firms selling their data assets to third parties. This problem is worsened by the unwitting sale of such information to parties that would use the customer data for perverse ends (Parisi & Dixon-Román, 2020). The practice of gathering big data and selling it to other entities also increases organizations’ vulnerability to cyber-attacks. A further ethical concern is the ubiquitous use of wearable tools that gather and monitor real-time biometric, financial, and health-related data from users, including their emotional states and hormonal levels, and the widespread use of such information to control and manipulate consumer behavior (Aho & Duffield, 2020).
The end-users of information technology resources are already raising privacy concerns as a result of the widespread sale of their data without their consent. In 2016, for instance, Dyn, an organization that manages critical components of internet infrastructure, reported a severe attack on its systems that disrupted access to various websites, including Twitter, Netflix, and the New York Times (Elvy, 2018). In the same vein, Equifax reported a breach of its information systems, which revealed classified information such as customers’ social security numbers, dates of birth, locations, and income levels. Consumer data produced from the purchase and use of goods and services is a widely prized commodity. In this regard, organizations’ use of their databases has become an important strategic tool for gaining competitive advantage. The brisk expansion of the IoT is anticipated to increase the volume of data contained in existing customer databases phenomenally. Thus, the vulnerabilities of IoT devices and the pace at which firms can gather, analyze, interpret, and distribute customer data in IoT contexts worsen the concerns regarding privacy and security. Customers’ use of IoT devices generates information that includes credit card numbers, names, dates of birth, and physical and email addresses. The devices also capture a wealth of new information about customers’ behaviors and routine tastes and preferences. Big data and IoT devices can also collect private information such as facial scans, heart rates, fitness levels, temperature, and blood sugar levels. As the practice of commodification of private data continues, the administration of personal data protection will likely become intrinsically connected to societal expectations regarding access to information and cyber security.
The commodification of data is a practice that is increasingly strengthening its footing in the corporate world. Also referred to as data capitalism, commodification targets people’s social world, which is intertwined with the internet in the digital era. According to Lutz et al. (2020), digital citizenship has increased companies’ access to people’s personal information, since it influences the ability to participate in society online, which is a prerequisite to social inclusion. This concept closely resembles the notions of digital inequality and the digital divide, which highlight the exclusionary implications of lacking access to digital media and failing to take part actively in online affairs (Tikk, 2017). The proponents of digital citizenship, however, often fail to highlight the security implications of actively using online resources to gather information and purchase products and services. For many users, participation in society online exposes them to risks, which include access to private data by unauthorized parties, online harassment, spam, hacking, and identity theft (Andrew & Baker, 2021). Consumers have also complained about the privacy problems to which the use of online services exposes them. On the one hand, threats to the integrity of customers’ personal information can affect their social, economic, and mental wellbeing. On the other hand, digital participation is almost impossible without sharing personal details.
In the emerging digital era, big data has become of special interest to ethical and legal researchers. Following the Snowden revelations, the great potential and power that big data and the Internet of Things hold in shaping relations between public and private entities have been widely recognized. Indeed, Karanasiou and Douilhet (2016) observe that many organizations control a vast amount of raw data. In this respect, the industry players that have sufficient resources to mine that data and create new information enjoy a significant competitive advantage over their rivals in the big data market. Additionally, the use of predictive analytics in the gathering and processing of information that is traced across different platforms and devices is pivotal in analyzing trends in consumer behavior. This practice further adds value to big data, thereby making it a more important tool for any commercial organization. The radical pace at which personal data is being commoditized has prompted users to call for new forms of legal protection. This pattern marks a shift from traditional privacy protection regimes to broader protection under property law for both corporations and individuals. Moreover, many consumer protection activists call for appropriate policy and legal responses to the problem of monetization of personal data, which was initially viewed only through the lens of big data.
In many social systems, people often pursue privacy. Nonetheless, confidentiality and privacy are not ends in themselves but rather describe the conditions under which fundamental rights such as autonomy, emotional release, self-growth, and self-evaluation can be accomplished (Masur, 2020). As such, the value of privacy is widely recognized, either explicitly or implicitly, on the basis of the fundamental rights it seeks to secure. While privacy is deeply grounded in different schools of thought, modern conversations on online privacy are almost exclusively rooted in liberal theories. In researchers’ attempts to understand potential threats to privacy, such as the use of ubiquitous surveillance and large-scale data gathering systems, they have not extensively covered the ethical implications of data commodification. More specifically, there is a major dearth of knowledge about the blurring of public and private spaces in networked surroundings. One of the problems that this emerging pattern exposes users to is the subsequent malleability of the individual at the hands of powerful economic and commercial players. Additionally, a group of privacy scholars asserts that many people perceive privacy as a form of protection against potential social, economic, or institutional interference. This dimension resembles the concept of negative freedom. Variants of this negative conceptualization of privacy often manifest in non-intrusion models, seclusion frameworks, and control or limitation theories of privacy. Considering the above, there is a need to examine the potential ethical implications of the commodification of private data online.
Problem Statement
There is a scarcity of studies that examine the potential ethical and legal implications of the commodification of private data. This knowledge gap is partly attributable to existing scholars focusing their attention primarily on privacy as it relates to information in general. However, the advent of information technologies and big data analytics tools shifts the focus of privacy from traditional models of keeping records with pen and paper to digital ones. Thus, privacy as an ethical and moral dilemma should be examined through the lens of contemporary digital environments (Sarikakis & Winter, 2017). As a consequence, this study attempts to fill the knowledge gap by exploring the effects of, and ethical issues raised by, commoditizing personal private data online. Most scholarly works on the commodification of data emphasize the benefits that such practices have for organizational success. This study, however, departs from this norm by highlighting the potential ethical concerns that the harvesting of private data raises for the customers involved.
Privacy and security in the era of big data capitalism constitute an important area of study that requires deeper investigation. Within the context of commodification, privacy refers to the privilege and right to have some control over how personal data are gathered and utilized. Information privacy is the ability of an individual or group to prevent data about themselves from becoming known to people other than those to whom they grant consent (Jain et al., 2016). One serious user privacy problem is the identification of personal information during the gathering and sale of such data to third parties (Moura & Serrão, 2015). This problem exposes both organizations and individuals to security issues. As a consequence, there is a need to fill the existing research gap by exploring the problem of commodification and determining viable solutions to the security threats that it poses.
Purpose Statement
The purpose of the study is to explore the issues and problems associated with the commodification of users’ data online. Additionally, the research will assess the impacts of accessing, gathering, and selling such classified data on customers’ behavior. To attain this objective, the study will employ a correlational design, which will examine the relationship between data commodification and customers’ perceptions and responses. In addition to the correlational design, a qualitative methodology will be employed to investigate online users’ attitudes, perceptions, and opinions about their data being gathered and sold to unknown third-party entities for marketing purposes.
Research Questions
What are the ethical and legal issues that are associated with the commoditization of private data online?
What are the effects of commoditization of users’ data on consumer behavior?
What are the users’ attitudes, perceptions, and opinions regarding the gathering, analysis, and sale of their private data online to third-party entities?
Significance of the Study
This study is significant in directing researchers’ attention to the emerging ethical and legal problems that data commodification poses. When the internet was first opened to public use, few researchers anticipated its widespread adoption in business and social life. As digital flows change people’s daily interactions and redefine their perceptions of different products, services, and organizations, marketing institutions and older bureaucracies are becoming malleable (Miranda, 2016). As a result, they are increasingly modulating and controlling user behavior through advanced surveillance tools that have expanded in range and become multifaceted for unique purposes. The study thus draws researchers’ attention to the way the use of digital technologies and statistical tools to socially sort all types of populations has become a basic mode of organization in virtually all enterprises, whether public or private. More importantly, the study generates inquiries into the extent to which people should think of digital flows as a possible space for the articulation of accountability within deep democratic action (Hagner, 2018). Additionally, it aims at understanding how customers are responding to data commodification beyond the market and institutional spheres to shape third parties’ use of their information.
The study is also important in shedding light on the security problems posed by the commodification of personal data on the internet. This is particularly important for understanding how IoT devices are becoming increasingly pervasive. Indeed, the tremendous development of big data devices and their capacity to provide different forms of services have made them among the fastest-evolving technologies. As a consequence, they exert a large impact on social life and the business environment (Abomhara & Køien, 2015). Additionally, the IoT has progressively permeated virtually all aspects of modern life, including education, health care, and business, which involve storing sensitive information on individuals and organizations, gathering financial transaction data, and undertaking product development and marketing (Fuchs, 2017). However, the intensive diffusion of interconnected devices and applications in the IoT sphere creates enormous demand for robust security in response to the growing scale of millions or perhaps billions of connected devices and services. As the number of threats increases every day, the tools and resources available to potential attackers also become more complex, efficient, and effective. As such, for the IoT to reach its full potential, it requires protection against threats and vulnerabilities. Given the above, the study is pivotal in enlightening researchers on the steps that they can take to avoid data breaches emanating from the commodification of private data online. To this end, security measures can be described as the process of protecting objects and private data against damage, unauthorized access, theft, or loss by maintaining high confidentiality.
This study also renews people’s interests in the importance of data privacy in light of the emergence of data brokers and data capitalism. In the United States, for instance, the prevailing public policy approach to deterring the adverse impacts of internet surveillance and espionage is founded on the liberal democratic value of transparency. Crain (2016), however, observes that data transparency is an issue that runs up against insurmountable structural and systemic constraints within the political economy of commercial surveillance. In that connection, the commoditization of personal information is at the center of the power imbalances that transparency-based approaches of consumer protection and empowerment aim at rectifying. Despite such problems, this study is important in exploring how privacy policies should be more centrally informed by a critical political economy of commercial surveillance. The desire for transparency is often a widely observed refrain in talks relating to the overreach of private and state surveillance. When it comes to data brokers, transparency proponents often assert that consumers need access to information regarding monitoring practices to make rational decisions about their data. This can be attained if firms that take part in data gathering merely accord clients a seat at the table of discussions. Although the recommendations of transparency proponents are commendable, promoting the open sharing of private data exposes users to many security challenges. In this regard, data brokers may not cede their control over marketing information to customers themselves without a significant reorientation of their respective industries.
Finally, this study will provide scholarly direction on the extent to which surveillance capitalism affects people’s privacy and security. According to Zuboff (2015), surveillance capitalism will have major implications for information civilization in the next few years. Major tech firms such as Google, Facebook, and Microsoft, for example, embrace institutionalized practices and operational assumptions built on computer-mediated transactions, which include data extraction and analysis, and new contractual strategies enabled by improved monitoring, personalization, and customization. An assessment of the nature and effects of these activities can provide insights into the implied logic of data monitoring and surveillance capitalism, including the global architecture of computer mediation on which it relies. In the last few years, big data has often been depicted as the inevitable outcome of a technological juggernaut with a life of its own, entirely outside the social world. Moreover, most studies on big data merely define it without considering its implications for users’ privacy, and many researchers take an inadequate view of the technology because they overlook its social and business origins. Although the technology may be turned to other uses, this does not erase its roots in an extractive project based on formal indifference to the populations that constitute both its data sources and its ultimate targets. Finally, it is important to note that, in the commercial sphere, electronic texts are already structured by the logic of accumulation in which they are embedded and by the conflicts inherent to that logic. As such, the logic of data accumulation organizes perception and influences the expression of technological affordances at their foundations.
While the study will be pivotal in assessing the potential ethical and legal problems sparked by the commodification of data, it has various limitations. The research, for example, may not be generalizable to the entire population. Generalizability refers to the extent to which the outcomes of a study can be applied to a broader group of individuals or circumstances. While the problem associated with the use of personal data online is a universal one, part of this study employs a qualitative research design, which limits the number of participants. As a consequence, it is not easy to develop a mechanism through which the findings can be applied to the entire population. Furthermore, the study is context-specific in the sense that it only evaluates participants of a specified location, demographic group, and set of identities. As such, the lack of inclusion of other equally important members of the social system limits the capacity to apply the findings to a broader study group. Another potential limitation is that the study employs a quantitative model to assess the association between the dependent and independent variables, which implies that it would be an uphill task to derive similar outcomes using a different research design.
Definition of Terms
Privacy – the capacity of an individual to seclude themselves and hide personal information from the public.
Internet – the global network of interlinked computer networks, including the World Wide Web, that provides the wider population with valuable data and understanding.
Confidentiality – the ability of a system to keep personal data private and ensure that they are not accessed by unauthorized third parties.
Organization of the Remaining Chapters
The remaining chapters comprise the Literature Review and the Methodology. The former will analyze and compare previous studies conducted in this area of research and will conclude by identifying the potential gaps in each study, which call for a new scholarly investigation. The latter will describe the research design and methods that will be used to perform the investigation. In particular, it will identify the data collection methods, sampling procedures, sampling populations, analysis techniques, and ethical considerations.
References
Abomhara, M., & Køien, G. M. (2015). Cyber security and the internet of things: Vulnerabilities, threats, intruders, and attacks. Journal of Cyber Security and Mobility, 65-88.
Aho, B., & Duffield, R. (2020). Beyond surveillance capitalism: Privacy, regulation and big data
in Europe and China. Economy and Society, 49(2), 187-212.
Andrew, J., & Baker, M. (2021). The general data protection regulation in the age of surveillance
capitalism. Journal of Business Ethics, 168(3), 565-578.
Crain, M. (2018). The limits of transparency: Data brokers and commodification. New Media & Society, 20(1), 88-104.
Demchenko, Y., Los, W., & de Laat, C. (2018). Data as economic goods: Definitions, properties,
challenges, enabling technologies for future data markets. ITU Journal: ICT
Discoveries, 2(23).
Elvy, S. A. (2018). Commodifying consumer data in the era of the internet of things. Boston College Law Review, 59, 423.
Fuchs, C. (2017). Günther Anders’ Undiscovered Critical Theory of Technology in the Age of
Big Data Capitalism. tripleC: Communication, Capitalism & Critique. Open Access
Journal for a Global Sustainable Information Society, 15(2), 582-611.
Hagner, M. (2018). Open access, data capitalism, and academic publishing. Swiss Medical
Weekly, 148, w14600.
Jain, P., Gyanchandani, M., & Khare, N. (2016). Big data privacy: a technological perspective
and review. Journal of Big Data, 3(1), 1-25.
Karanasiou, A. P., & Douilhet, E. (2016, April). Never mind the data: The legal quest over control of information & the networked self. In 2016 IEEE International Conference on Cloud Engineering Workshop (IC2EW) (pp. 100-105). IEEE.
Lutz, C., Hoffmann, C. P., & Ranzini, G. (2020). Data capitalism and the user: An exploration of privacy cynicism in Germany. New Media & Society, 22(7), 1168-1187.
Masur, P. K. (2020). How online privacy literacy supports self-data protection and self-
determination in the age of information. Media and Communication, 8(2), 258-269.
Miranda, J. R. Y. (2016). Beyond the commodification of privacy: Personal data management as
a strategy for accountability in a digital world.
Moura, J., & Serrão, C. (2015). Security and privacy issues of big data. In Handbook of research
on trends and future directions in big data and web intelligence (pp. 20-52). IGI Global.
Parisi, L., & Dixon-Román, E. (2020). Data capitalism, sociogenic prediction, and recursive
indeterminacies. In Data Publics (pp. 48-62). Routledge.
Sarikakis, K., & Winter, L. (2017). Social media users’ legal consciousness about privacy. Social Media + Society, 3(1), 2056305117695325.
Sevignani, S. (2013). The commodification of privacy on the Internet. Science and Public
Policy, 40(6), 733-739.
Tikk, E. (2017). Privacy online: up, close and personal. Health and Technology, 7(4), 489-499.
West, S. M. (2019). Data capitalism: Redefining the logics of surveillance and privacy. Business
& Society, 58(1), 20-41.
Zuboff, S. (2015). Big other: Surveillance capitalism and the prospects of an information civilization. Journal of Information Technology, 30(1), 75-89.