Culture and privacy in a ratings-based society

Main article: Privacy, identity, and fraud in the ratings system

Judging from our own society, we can infer that most members of a ratings-based community will not understand the encryption scheme well enough to assure themselves that it is, in fact, private. They may, however, take the word of cryptologists who assess it for them. This is another reason why organizations will be important: crypto-organizations can serve as trusted parties for vetting advanced cryptographic methods.

How much privacy, and how much trust in that privacy, will be necessary to accomplish the goals of our society? Again we can offer a variety of algorithms, but it would be good to know which types of information call for which level of privacy. We’d expect data on personal health, for instance, to be protected by rigorous methods such as homomorphic encryption (HE). Political opinions, however, may not need such rigor. In fact, we may not want them protected so strictly, since presumably everyone benefits from an open marketplace of ideas and debate. Simulation and experimentation should lead to optimal privacy settings on a society-wide scale for a variety of applications. Of course, not everyone will adopt these recommendations, but having them will serve as an important educational tool.
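To make this concrete, here is a minimal sketch of how such society-wide recommendations might be encoded as a table mapping information categories to privacy levels. The category names, the level names, and the choice of Python are all illustrative assumptions, not part of the system's design:

```python
from enum import Enum

class PrivacyLevel(Enum):
    OPEN = 1        # shared in plaintext with peers
    AGGREGATED = 2  # revealed only in aggregate (e.g. private aggregation)
    ENCRYPTED = 3   # processed only under homomorphic encryption

# Hypothetical society-wide defaults; real values would come out of the
# simulation and experimentation described above.
DEFAULT_PRIVACY = {
    "personal_health":   PrivacyLevel.ENCRYPTED,
    "financial_records": PrivacyLevel.ENCRYPTED,
    "purchase_history":  PrivacyLevel.AGGREGATED,
    "political_opinion": PrivacyLevel.OPEN,  # an open marketplace of ideas
}

def privacy_for(category: str) -> PrivacyLevel:
    """Look up the recommended privacy level, falling back to the most
    conservative setting for unknown categories."""
    return DEFAULT_PRIVACY.get(category, PrivacyLevel.ENCRYPTED)
```

Defaulting unknown categories to the strictest level is one plausible design choice; erring toward openness would be another, and simulation could adjudicate between them.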

It is hard to know in advance what a culture will produce as its standard of openness and privacy. We have assumed that the society we are aiming for will be more open and less private than our current one. After all, we will be sharing much more information with our peers, whether aggregated privately or not, and we are depending on our peers to provide constant feedback on how we are doing. The idea is that when we go astray, our peer ratings will help guide us back. This process suggests a level of comfort with honest exchange considerably greater than we have today.

Simulation and optimization notwithstanding, we might ask right now whether greater or lesser privacy leads to a better society. On the one hand, if our views are open to all, they can be checked much more quickly by others. We could, hopefully, prevent bubbles of misinformation from developing by ensuring that the people who disseminate them are rated accordingly. Open information is, furthermore, a foundation for collaboration on all manner of projects. On the other hand, privacy is a necessary part of thought. We want to be able to think things through privately, hold opinions that may not fit the mainstream, try them out, and discard them ourselves before being subject to the community’s disapproval. Thinking freely often means thinking privately. Our system will therefore need clearly delineated privacy settings with reasonable defaults for each situation, and all settings will ultimately be modifiable by the user.
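One hypothetical way defaults and user overrides might be layered, reusing the PrivacyLevel enum and DEFAULT_PRIVACY table from the sketch above (again, all names are invented for illustration):

```python
from dataclasses import dataclass, field

@dataclass
class PrivacySettings:
    """Per-user settings that start from situational defaults but remain
    ultimately modifiable by the user."""
    defaults: dict[str, PrivacyLevel]
    overrides: dict[str, PrivacyLevel] = field(default_factory=dict)

    def level(self, category: str) -> PrivacyLevel:
        # A user override always wins over the community default.
        return self.overrides.get(
            category,
            self.defaults.get(category, PrivacyLevel.ENCRYPTED),
        )

    def set_override(self, category: str, level: PrivacyLevel) -> None:
        self.overrides[category] = level

# Usage: a user who wants more privacy for political opinions than the
# community default recommends.
settings = PrivacySettings(defaults=DEFAULT_PRIVACY)
settings.set_override("political_opinion", PrivacyLevel.AGGREGATED)
assert settings.level("political_opinion") is PrivacyLevel.AGGREGATED
```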

We go further and speculate that a ratings-based society will have less need for privacy because trust, overall, will be greater. A virtuous cycle could then develop: greater trust lets people share information more freely (less privacy), which in turn builds even greater trust. We might hope for this outcome, but there is probably some limit. At some point we do not want to know everything about everyone, and people do not want to tell us everything. We suspect there is always a privacy barrier that is better left uncrossed. Our interest, in any event, is mostly in public information and behavior; optimizing the privacy settings in that domain should be enough.
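The shape of this cycle, and the limit we suspect exists, can be illustrated with a toy simulation. Every parameter and update rule below is invented for illustration; the only point is that sharing saturates at a privacy floor rather than growing without bound:

```python
def trust_privacy_cycle(steps: int = 20,
                        trust: float = 0.5,
                        sharing: float = 0.5,
                        privacy_floor: float = 0.2) -> list[tuple[float, float]]:
    """Toy feedback loop: more trust -> more sharing -> more trust,
    capped by a privacy barrier people prefer not to cross."""
    history = []
    for _ in range(steps):
        # Sharing rises with trust but never exceeds 1 - privacy_floor.
        sharing = min(1.0 - privacy_floor,
                      sharing + 0.1 * trust * (1.0 - sharing))
        # Trust rises with sharing, saturating at 1.
        trust = min(1.0, trust + 0.1 * sharing * (1.0 - trust))
        history.append((round(trust, 3), round(sharing, 3)))
    return history

if __name__ == "__main__":
    for t, s in trust_privacy_cycle():
        print(f"trust={t:.3f}  sharing={s:.3f}")
```

Under these invented dynamics, trust approaches its maximum while sharing levels off at the floor, which matches the intuition that the virtuous cycle helps up to a point but does not eliminate privacy altogether.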