>>109011021
>If something makes them feel icky, they work backwards from that feeling and justify through flawed reasoning structures rather than using logical reason structures to inform their feelings.
I've read literal books on the subject of data privacy. There are many different models for describing the harm that comes from violations of privacy. But they boil down to two things:
>the quantifiable, economic harm from breaches of privacy/confidentiality (i.e. stealing money)
>the subjective, emotional and psychological harm from breaches of privacy and norms
Just because "emotional and psychological harm" is subjective and grounded in irrationality does not mean it can be dismissed as "without worth". Humans are not creatures of pure reason; they experience pain and distress over all manner of things that don't have material impacts on their lives.
But at the same time, we can use models to predict with pretty decent accuracy various things that are likely to cause them pain or distress. "Loss aversion" is a very common irrational bias humans experience, and yet it's tremendously predictable and easy to design around (e.g. don't "give" someone anything until it's 100% a done deal, otherwise they will get very mad when you take it away).
>>109011598
The "populist" argument does matter in this case, because we're talking about "what kinds of activities cause social harms?" If you are a utilitarian, then you need to gauge the impact of an activity on the common welfare. If you are a deontologist, you need some sense of the consequences of others' actions, and this is commonly done by observing from the perspective of some median/mean person. And if you are some manner of sophist, then an angry mob is bad news.
>>109011634
Before sharing some piece of data or information in public, there is some mental calculus used to gauge the potential consequences of sharing it. I'm sure you agree, because everyone stating "you should know the internet is a public space" is essentially arguing from that premise.
Clearly, not everyone uses the "best" or "most accurate" mental calculus in that decision. Again, everyone arguing from the other side would agree with this. But also, no one has perfect foresight into future conditions. e.g. We don't know when today's "unbreakable encryption" may some day no longer be quite so unbreakable.
When confronted with the reality that the mental calculus behind that original decision was incorrect, people feel distress. Both because they don't like the outcome, and because their decision was "incorrect" and they regret making it.
The "average person" has some set of beliefs about how the internet works, how people interact with the information they share, and what behaviors are "normal" or "outside the bounds of normal."
Was the decision to share their data or information in public one that was "reasonable for the average person to make?" If the answer is "yes", but your behavior causes them distress, then I'd suggest you are at fault for violating a social norm, rather than them being at fault for not using the most "technically accurate" mental calculus.
While it may be more technically correct to assume that every tiny point of data is capable of being harvested, aggregated, and synthesized to produce larger or more significant data points, and then published or used to someone's detriment, that is not the way the average, rational person sees or interacts with the internet. And thus they feel distress when it happens, and it is a violation of social norms.
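To make the aggregation point concrete, here's a minimal sketch of how individually harmless public data points can be combined into a more sensitive inference. All the data and the heuristic here are invented for illustration; a real adversary would use far more signals.

```python
# Hypothetical sketch: aggregating innocuous public posts into a
# sensitive inference. All data below is made up for illustration.
from collections import Counter

# Each tuple: (post_hour_utc, coarse_geotag), scraped from public posts.
# No single post reveals much on its own.
posts = [
    (7, "Springfield"), (8, "Springfield"), (22, "Springfield"),
    (12, "Downtown"), (13, "Downtown"), (23, "Springfield"),
]

# Crude heuristic: posts made early in the morning or late at night
# tend to come from home, so the most common "off-hours" geotag is a
# decent guess at where someone lives.
home_hours = [tag for hour, tag in posts if hour <= 8 or hour >= 21]
likely_home, count = Counter(home_hours).most_common(1)[0]
print(likely_home)  # the most frequent off-hours location
```

None of the individual posts shares a home address, yet the aggregate does real inferential work. That gap between "each post is harmless" and "the collection is not" is exactly what the average person's mental calculus fails to account for.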
Take the internet angle out of this. Would it be possible? Yes, it would probably be even more feasible than using the internet because they can't hide certain things. Would it be "normal"? Fuck no. Is the fact that it's "possible" a socially valid justification for doing it?