Some companies and researchers argue it’s not enough for the government to simply protect personal data; consumers need to own their information and be paid when it’s used. Some social networks have experimented with rewarding users with cryptocurrency when they share content or spend time on their platforms. Other companies have tried paying users directly in exchange for their data. But allowing users to take back ownership likely wouldn’t solve most privacy issues posed by personal data collection. It might also be the wrong way to tackle the issue: perhaps less collection should be permitted in the first place, forcing companies to move away from the targeted-advertising business model altogether.
When such large amounts of money are at stake, and when access to the internet so dramatically improves people’s quality of life, it is important to consider the ethics behind it all. Digital privacy seems almost free: letting a company know one’s birthday or number of siblings feels trivial. However, there is much more going on under the hood than it seems, and the ethics surrounding this is what we will discuss in this essay.

Customers are clearly incentivized to share their data. Using a site like Google rather than an alternative such as DuckDuckGo (which values user privacy) has several additional benefits. First, by collecting personal data through cookies and similar methods, Google can offer better search results and a faster browsing experience. Second, using Google builds a kind of community, since most other people are likely to use it too; from Gmail to Google Docs, much can be done through a single company.
Since the growth of digital activity, firms have placed heavier security burdens on customers, and the overall user experience has suffered as a result. According to research (“New survey highlights startling erosion of online trust,” Forbes, May 15, 2016), customers have to remember more than 14 passwords on average and increasingly complain about the inefficiency and complexity of the authentication experience. Yet consumer views of security have worsened anyway: digital security and privacy concerns are reducing consumer trust online. Past scandals such as Cambridge Analytica, in which a large amount of user data was leaked and used for political campaigns, make it clear to users that their data is not safe, yet alternatives are usually not available.
It is one thing for a company to use your data to serve ads, but such a large amount of data in the hands of one entity is also a very attractive target for hackers and cyberattacks. There have been multiple leaks and attacks on company data, leading to massive losses of privacy; iCloud photos, including people’s personal pictures, have been leaked. Clearly, there is more to the ethics of privacy than a simple trade of “money versus taking something I do not own.” By collecting and using data from users, companies increase the risk to their consumers. There have also been instances of financial loss, since people are likely to reuse the same passwords across their online accounts. Cyberattacks are a concern even for companies that keep user data under their own roof, and by providing external parties with that data, tech companies make it easier to access, increasing the chance that a cyberattack will be more harmful.
This is clearly a matter of ethics that needs to be considered when we decide just how accountable companies are for taking such a risk. The magnitude of the risk may be minimal in most instances; however, most users are unaware of the potential dangers associated with sharing their data.
To draw a clear distinction between a company’s accountability during a data breach and the moral responsibility of users who permit the company to use their data, we will use two major theories in ethics: Moral Luck by Thomas Nagel and Moral Responsibility by Galen Strawson. Let us consider the problem from two sides, one user-centric and the other company-centric. Suppose two users share their images with a company, and the company publicizes the second user’s images for some purpose, breaching that person’s privacy. In this case, both users shared their data with the company and agreed to the terms and conditions, so they were equally exposed to a leak of personal data; the first user was simply morally lucky and his data remained safe. Both users therefore have an equal right to blame the company and hold it accountable for the breach of privacy, regardless of whom the breach actually affected, because both were equally exposed to it. Thus accountability should depend not only on the user who was harmed but also on those users who could have been harmed. This is in line with Nagel’s conclusion on moral luck that praise or blame should be given in accordance with expected future behaviour.
So both people can equally blame the company for the action, in order to discourage a repetition of such behaviour. But according to Moral Responsibility theory, for a person to be truly responsible for an act, he must be responsible for the way he truly is. Improving digital security carries costs, and the ethical concern arises when companies must choose between securing user data and their primary responsibility to shareholders, which is to maximise profits; this economic pressure is what makes them behave the way they do. So if we blame a company by pointing to its moral obligations towards the data users entrust to it, the economic structure should also be blamed. Factors like this create a clear trade-off between improving the user experience and making money, and they drive companies to use unfair means to increase their net worth. If, however, we consider an organization’s moral obligations and accountability from a long-term perspective, in terms of how we want its nature and functioning to be, then the user can hold the company responsible for the breach of his privacy.