U.S. Senators Mark R. Warner and Josh Hawley have proposed a law that aims to protect privacy by forcing technology companies to disclose to users the real value of their data.
Specifically, companies with more than 100 million users would have to assess the financial value of each user's data and disclose the revenue that results from "data collection, gathering, processing, selling, using, or sharing."
In addition, the DASHBOARD Act would empower users to delete their data from company databases.
As a researcher who studies the ethical and political implications of digital platforms and big data, I sympathize with the bill's ambition to increase transparency and empower users.
However, estimating the value of user data is not easy, and I do not think doing so would solve the underlying privacy problems.
The data collected by technology companies is not limited to traditional identifying information like name, age and gender.
Instead, as Harvard historian Rebecca Lemov has noted, it includes "Tweets, Facebook likes, Twitches, Google searches, online comments, one-click purchases" and even the images in your feed that you look at but leave unliked.
In other words, big data captures the intimate moments of everyday life.
Think of the conversations with friends and family that Facebook captures, your late-night Google searches, and the commands Alexa overhears in your living room.
However, calculating the value of user data is not so easy. Estimates of what a single user's data is worth vary wildly, ranging from less than $1 per person to more than $100 per Facebook user.
One user sold his own data on Kickstarter for $2,733. To reach that figure, he had to share his keystrokes, mouse movements and regular screenshots.
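One reason the estimates diverge so widely is that there is no agreed method behind them. A minimal sketch of the simplest revenue-based approach, average revenue per user, shows how sensitive the result is to the inputs chosen; the revenue and user figures below are entirely hypothetical, not actual company data:

```python
# Illustrative only: the dollar amounts and user counts below are
# made-up placeholders, not figures from any real company.

def revenue_per_user(annual_revenue_usd: float, users: float) -> float:
    """One crude valuation method: average annual revenue per user."""
    return annual_revenue_usd / users

# A hypothetical platform earning $50 billion a year from 2 billion users:
arpu = revenue_per_user(50e9, 2e9)
print(f"Implied value per user per year: ${arpu:.2f}")  # $25.00

# The same method, with a smaller hypothetical platform, gives a very
# different answer -- the "value of data" depends on the denominator:
print(f"${revenue_per_user(1e9, 150e6):.2f}")  # $6.67
```

A market-price approach (what someone will actually pay, as in the Kickstarter example above) can land orders of magnitude away from a revenue-based one, which is part of why the bill's mandate is harder than it looks.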
Unfortunately, the DASHBOARD Act does not specify how the value of a user's data should be estimated.
I think the Commission would quickly realize that estimating the value of user data is a difficult task.
More than personal data
The purpose of the proposed law is to give users more transparency. However, privacy is no longer just a matter of personal data.
The data you share can provide insights into the lives of many other people.
For example, a user's Facebook likes can be used to predict their sexual orientation with high accuracy.
Target, for instance, used purchasing data to predict which customers were pregnant. In one widely reported case, the retailer knew a teenage girl was pregnant before her father did.
The power of these predictions means that user data contains more than just personal information.
Based on statistical links across many users' data, companies can infer information about you that you never provided.
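To make that concrete, here is a toy sketch of the idea: a person shares only their likes, yet an undisclosed attribute can be guessed from overlap with other users who did disclose it. All pages, labels and records below are invented for illustration, not drawn from any real dataset:

```python
from collections import Counter

# Fully synthetic records: (set of page likes, disclosed attribute).
others = [
    ({"page_a", "page_b"}, "group1"),
    ({"page_a"}, "group1"),
    ({"page_c"}, "group2"),
    ({"page_b", "page_c"}, "group2"),
    ({"page_a", "page_b"}, "group1"),
]

def infer_attribute(likes: set[str]) -> str:
    """Guess an undisclosed attribute by overlap with other users' likes."""
    scores = Counter()
    for other_likes, attribute in others:
        # Users with similar likes vote for their own attribute.
        scores[attribute] += len(likes & other_likes)
    return scores.most_common(1)[0][0]

# A new user shared only likes -- never the attribute itself:
print(infer_attribute({"page_a", "page_b"}))  # overlaps most with "group1"
```

Real systems use far more sophisticated models, but the structural point is the same: the inference runs on *other people's* data, so no individual's disclosure decision controls it.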
How could such collective inferences be broken down into individual dollar values?
Moreover, this ability to use statistical analysis to infer the traits of people associated with a group can have far-reaching effects on privacy.
If service providers can use predictive analytics to estimate a person's sexual orientation, race, gender or religious beliefs, what prevents them from discriminating on that basis?
And once built, predictive models continue to work even if users delete the data that helped build them.
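A minimal, synthetic illustration of that last point: once a parameter has been learned from users' data, deleting the raw data leaves the model's behavior untouched. Every name and number here is invented:

```python
# Synthetic sketch: the "model" is just one learned average.
training_data = [2.0, 4.0, 6.0, 8.0]

# "Training": derive a parameter from users' contributed data.
learned_mean = sum(training_data) / len(training_data)  # 5.0

def predict(x: float) -> float:
    """Score how far a value sits from the learned population average."""
    return abs(x - learned_mean)

# Users exercise their right to delete their contributions...
training_data.clear()

# ...but the learned parameter, and every prediction built on it, survives.
print(predict(9.0))  # still 4.0: deletion did not touch the model
```

This is why a deletion right, on its own, does not unwind the predictive power a company has already extracted.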
Control through data
The sensitivity of data depends not only on what it contains, but also on how governments and companies can use it to influence people.
This is evident in my current research on China's planned social credit system. To regulate the behavior of Chinese citizens, the Chinese government plans to use national databases and trustworthiness ratings.
Google, Amazon and Facebook, as the author Shoshana Zuboff has argued, use predictive data to "adjust our behavior toward the most profitable results."
In 2014, it emerged that Facebook had manipulated users' news feeds to influence their emotional states, prompting a public outcry.
Yet that episode merely demonstrated what digital platforms do routinely: use data to shape user engagement, generating still more data in the process.
The privacy problem with data is largely about the power of Big Tech to shape your personal life based on what it knows about you.
The truth is that, for all its privacy implications, datafication does not affect everyone equally.
Big data and the subtler forms of networked discrimination it enables reproduce existing disparities of gender, race and class.