
For our integrity and our society: rethinking the user agreement approach

We live in an uncertain age, and markets are constantly changing. It is important to make room for collaborative activities where a variety of people and organizations can come together, pool their knowledge and insights, and create value. At the same time, it is important to apply technology to security in order to protect that collaboration and support open innovation. Openness and security are two sides of the same coin, and achieving that security requires some degree of surveillance.

In that sense, surveillance is sometimes an essential part of democracy, not merely a tool of dictatorships. In a democratic economy, surveillance technology is also used as advertising technology, for example in the form of tracking and user segmentation. People used to believe that this usage simply gave them economically useful information. Recently, however, we have seen that advertising technology can intervene in our democratic discussions without people noticing.

Many internet businesses rely on advertising that presents information to users in a particular way, based on large amounts of data, in order to turn them into shoppers. Businesses keep displaying “useful” information to users so that users keep watching their content and expose more personal information back to the businesses, which in turn improves the accuracy and efficiency of the advertising. That is one defining characteristic of internet business.

This mechanism tends to exploit whatever users find attractive. It polarizes them at an accelerating pace, reducing the room for consensus and blocking the dialogue that sustains democracy. This is how security for innovation eventually becomes a threat to our democracy. It is not a problem with the technology itself, but with the essence of the advertising business model. Tracking can drive the evolution of advertising, but it can also circle back to attack democracy and innovation.

Internet advertising is built on top of user data, and businesses collect that data under agreements with users. In daily life, people routinely accept “the user agreement” with a single click. What exactly are they agreeing to? What, in the first place, would you actually verify in a user agreement? We need to review this process in depth.

People have to ignore some information; otherwise they cannot survive the flood of it. We intentionally skip reading the sentences of the agreement and pay very little attention to important details, even though we know they have something to do with our basic rights. And then we quietly tell a lie by clicking the button "I agree."

That makes us collective liars. We treat lying to these kinds of agreements as ordinary behavior that everybody engages in. Newer generations, digital natives in particular, come to accept this lying as something they are supposed to do. That is a social disease of our era: our technology, business, and culture have been eroding our integrity.

Of course, people on the business side know very well that users skip the reading and lie a little when they click. They understand that they are contributing to the growth of this lying society. Businesses should therefore abandon this procedure, and society should invent another system to replace it.

We are reaching the limit of handling privacy issues with this kind of service agreement. Instead, we may want to define proper and improper uses of data and require businesses to make clear whether they comply with the proper uses. End users would still need to make decisions about the usage of their data, but the scope of the agreement should be limited to new areas, new contexts, or new vulnerabilities of the data. The scope must be narrowed; otherwise the agreement will keep damaging our sincerity in a serious, society-wide way.

Recently there has been a trend toward leaving decisions about data usage to the user, but it is not working well. The way of agreeing has not improved at all, and that is the wrong strategy. User agreements have not built a good relationship between users and businesses around privacy, because users cannot understand how their data is actually used and cannot spend the time required to decide on each use. Even if we devised a good way for users to check every item of data usage one by one and express their preferences, it would just end up wasting their time. As I mentioned, people are already accustomed to pressing the agree button for things they have not checked. Their will has been contaminated by the technology and its operation, and the integrity of their own behavior has been destroyed.

I am skeptical that we can expect each individual to treat their data as an asset and handle it properly.

This kind of approach appears to give free will to everybody, but in reality it ends up representing the will of nobody, because people do not have enough information to make an informed choice. When people are busy with the chores of daily life and are not in a position to be informed, the choice they click through is not the choice they would truly make, and we should not treat a choice made under those conditions as an authorized, effective one. We should carefully and humbly identify the scope within which people understand things well enough to choose. Where they have enough information and are able to choose, we respect their choice. In other cases, we had better take as much of the burden as possible off their shoulders and be more prudent in making decisions about data usage.

It would not be good for each user to have to decide everything about data usage. Rather than technically improving the agreement approach, I think it would be better to take a different one: clarify which data uses should be approved automatically, which should be banned automatically, and which should be left to the user. The scope of data usage that calls for a personal decision should be narrowed as much as possible, as the sketch below illustrates.
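As a rough illustration only (the category names and policy lists below are assumptions for the sketch, not taken from any existing law or product), a consent layer built on this idea might sort each requested data use into three buckets before ever asking the user:

```python
from enum import Enum


class Decision(Enum):
    AUTO_APPROVE = "auto_approve"  # broadly accepted, low-risk uses
    AUTO_DENY = "auto_deny"        # uses society has ruled out entirely
    ASK_USER = "ask_user"          # genuinely new context: ask the person


# Hypothetical policy tables; in reality these would come from regulation
# and public deliberation, not from a single business.
ALWAYS_ALLOWED = {"service_delivery", "security_monitoring"}
ALWAYS_BANNED = {"political_microtargeting", "resale_to_data_brokers"}


def classify_data_use(purpose: str) -> Decision:
    """Decide how a requested data use is handled, so that only
    genuinely novel cases ever reach the individual user."""
    if purpose in ALWAYS_ALLOWED:
        return Decision.AUTO_APPROVE
    if purpose in ALWAYS_BANNED:
        return Decision.AUTO_DENY
    return Decision.ASK_USER


if __name__ == "__main__":
    for purpose in ["security_monitoring",
                    "political_microtargeting",
                    "emotion_inference_ads"]:
        print(purpose, "->", classify_data_use(purpose).value)
```

The point of the sketch is simply the shape of the decision: most purposes are settled in advance by shared rules, and only the residual, genuinely new ones generate a dialog for the user.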

We need far more measures for determining which data uses are appropriate and which are not. We should not keep expecting users to choose the right data usage by themselves. Aggregating all the choices people make individually will not produce the answer people want as a whole or as a society; it is much like the fallacy of composition in economics.

As this suggests, it is time to get the government involved and build a better legal structure for data privacy.

Not only that: this movement is also spreading to the problem of fake news.


We may need to step back from the current agreement-centric approach to data management. It would be better to look closely at the purpose of data collection, minimize what is collected, anonymize stored data as much as possible, and regulate its usage through political measures.
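To make “minimize and anonymize” slightly more concrete (the field names and the salted-hash approach below are illustrative assumptions, not a prescription), a collection pipeline could keep only the fields a stated purpose needs and mask identifiers before anything is stored:

```python
import hashlib
import os

# Hypothetical allow-list: only the fields the stated purpose actually needs.
FIELDS_NEEDED_FOR_BILLING = {"user_id", "plan", "amount"}

# Secret salt so pseudonyms cannot be reversed by hashing guesses.
# In practice this would live in a key management system, not an env var.
SALT = os.environ.get("PSEUDONYM_SALT", "dev-only-salt")


def pseudonymize(value: str) -> str:
    """Replace a direct identifier with a salted hash."""
    return hashlib.sha256((SALT + value).encode("utf-8")).hexdigest()[:16]


def minimize_and_pseudonymize(record: dict) -> dict:
    """Keep only the fields needed for the purpose and mask the identifier."""
    kept = {k: v for k, v in record.items() if k in FIELDS_NEEDED_FOR_BILLING}
    if "user_id" in kept:
        kept["user_id"] = pseudonymize(str(kept["user_id"]))
    return kept


if __name__ == "__main__":
    raw = {"user_id": "alice@example.com", "plan": "pro",
           "amount": 1200, "browsing_history": ["..."]}
    print(minimize_and_pseudonymize(raw))
```

A salted hash like this is only pseudonymization, not true anonymization; the point is that dropping unneeded fields and masking identifiers can be a default step in the pipeline rather than something each user has to opt into.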

 Of course, it’s not that easy. People are concerned about how the government functions, and about the balance of tracking technology and trusted data utilization to build a bright future. It’s like a chicken-and-egg problem. Even if people don’t trust the current government so much for reasons like political corruptions, we need to change it for the better in order to have the trusted government regulate it appropriately at some point in the near future.

To move in that direction, we want to explore the potential of a new approach. I hope experts, especially academic experts, will advance the study of regulation of data usage. At the same time, we want to transfer their technical knowledge to the government so that it can discuss technological points with better understanding, and we may want the government to build the capability to review those points precisely. It will take time, but it is essential.
