Basically, people don't understand privacy, and don't see what is going on, so they don't care about it. Additionally, most privacy intrusions are carefully combined with some reward or convenience, and that becomes the status quo.
This leads to the people who stand up to this being ridiculed as tinfoil-hat types, or ignored as nonconformists.
Everything after that is just a matter of time.
I showed her all this, and joked about how I'd make a "Wife status tile" in Home Assistant.
All of a sudden she understood privacy.
To your point, there is something in us that does not consider what information could do.
I am 100% not serious and do not believe either statement above. Sadly, I am in the same boat as you: I had a black sheep of a brother who committed some sort of crime and, as a condition, had his DNA taken, so by default I am in the system as well.
I never could understand why people would willingly offer their DNA to companies. Even if those companies are not selling the data, sooner or later it could leak, and the consequences could mean the difference between being able to afford life and medical insurance or not.
- leave your phone at home most of the time.
- don't buy a smart TV or other smart devices
- don't use social media
- etc.
None of these measures is bullet-proof, but they are relatively low-cost and don't require much expertise. They are also much, much more likely to be things consumers complain about than things consumers are actually ready to do something about. I think it's clear that consumers ALSO do not understand privacy, but I'd suggest that they don't care very much either. If they cared, there would be more of a market for privacy.
There are no non-smart TVs. You cannot buy one. Your only alternatives are to not buy a TV period, or go to great lengths to firewall your new smart TV.
Yes, non-smart displays exist, but they are not sold to consumers, nor at a price consumers can afford.
This is one of those prime examples of how the idea of "vote with your wallet" is a fantasy. Consumers are not in any way in control of the market.
There is no possible way to protest smart TVs when the only options for a new TV include spyware. Your only possible move is to not participate in the market, which then summarily ignores you.
Similarly, existing without a smartphone in today's society is largely not possible. You can't even park in many cities without an app.
I think it's clear that you don't understand the problems being discussed and are just blithely assuming that deflecting blame onto individuals is a reasonable position. It isn't. It's moronic and unconsidered.
...ie people are making a conscious choice based on what they value?
The problem isn't about the big corporations themselves but about the fact that the network itself is always listening and the systems the big corporations build tend to incentivize making as many metadata-leaking connections as possible, either in the name of advertising to you or in the name of Keeping You Safe™: https://en.wikipedia.org/wiki/Five_Eyes
Transparent WWW caching is one example of a pro-privacy setup that used to be possible and is no longer feasible due to pervasive TLS. I used to have this kind of setup in the late 2000s when I had a restrictive Comcast data cap. I had a FreeBSD gateway machine and had PF tied in to Squid so every HTTP request got cached on my edge and didn't hit the WAN at all if I reloaded the page or sent the link to a roommate. It's still technically possible if one can trust their own CA on every machine on their network, but in the age of unlimited data who would bother?
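A minimal sketch of the setup described above, assuming Squid's `intercept` mode and PF's `rdr` redirection syntax (interface names, paths, and cache sizes are illustrative, not the poster's actual config):

```conf
# /etc/pf.conf on the FreeBSD gateway: redirect all outbound port-80
# traffic from the LAN into the local Squid instance.
#   rdr pass on $int_if proto tcp from any to any port 80 -> 127.0.0.1 port 3128

# squid.conf: accept the redirected (transparent) traffic and cache it locally
http_port 3128 intercept
cache_dir ufs /var/squid/cache 10000 16 256
maximum_object_size 512 MB
```

The point of the design: clients need no proxy configuration at all, because the gateway silently diverts plaintext HTTP into the cache. With pervasive TLS, the gateway can no longer see (or cache) the responses, which is exactly the trade-off being discussed.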
Other example: the Mac I'm typing this on phones home every app I open in the name of “““protecting””” me from malware. Everyone found this out the hard way in November 2020 and the only result was to encrypt the OCSP check in later versions. Later versions also exempt Apple-signed binaries from filters like Little Snitch so it's now even harder to block. Sending those requests at all effectively gives interested parties the ability to run a “Hey Siri, make a list of every American who has used Tor Browser” type of analysis if they wanted to: https://lapcatsoftware.com/articles/ocsp-privacy.html
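To make the concern concrete, here is a toy Python sketch of the kind of join a passive observer of those unencrypted "which developer signed this app?" checks could run. The log format, IPs, and developer IDs are all made up for illustration; real OCSP traffic is messier, but the inference is the same.

```python
# Toy illustration: given passively captured app-launch check traffic
# (client IP, developer certificate being verified), list everyone who
# has launched an app from a particular developer. All values are fake.

def users_of(log, developer_id):
    """Return the set of client IPs that launched an app signed by developer_id."""
    return {entry["client_ip"] for entry in log if entry["developer_id"] == developer_id}

# Hypothetical captured traffic log
log = [
    {"client_ip": "198.51.100.7", "developer_id": "TOR_PROJECT"},
    {"client_ip": "203.0.113.42", "developer_id": "SOME_GAME_STUDIO"},
    {"client_ip": "198.51.100.7", "developer_id": "SOME_GAME_STUDIO"},
    {"client_ip": "192.0.2.99",   "developer_id": "TOR_PROJECT"},
]

print(sorted(users_of(log, "TOR_PROJECT")))
# → ['192.0.2.99', '198.51.100.7']
# No message content needed: metadata alone answers
# "who has launched Tor Browser?"
```

That is the whole objection: encrypting the check later (as Apple did) protects it from third parties on the wire, but the party receiving the requests still holds this dataset.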
If every morning I got in my car and left for work and my neighbor followed me, writing down every place I went, what time I got there, how long I stayed, and the name of everyone I called, it would be incredibly intrusive surveillance data, and I'd probably be somewhat freaked out.
If that neighbor were my cell phone provider, it would be Monday.
What we allow companies and governments to do (and not do) with this data isn't something we can solve in the technical realm. We have to decide how we want our data handled, and then make laws respecting that.
And with that, thanks to you, today I am a bit smarter than yesterday.
Thank you very much for that phrase, the rest of your post is a very good example for the layman, but that phrase should be the subtitle of a best selling privacy book.
It's not "surveillance data," you are in a public place and have no expectation of privacy. It's only through such neighbourhood watch and open-source intelligence initiatives that our communities can be kept safe from criminals and terrorists.
Why are you so protective of your goings-on and the names of everyone you call? Are you calling terrorists or engaging in illicit activity at the places you visit? What is it that you have to hide?
I would actually take the premise of (national) security even further and extend collection to not only metadata, but data as well. Further, these capabilities should be open-sourced and made available to all private citizens. Our current law enforcement systems are not powerful enough, nor do they move quickly enough to catch criminals - by the time sufficient information has been gathered on a suspect, it may already be too late.
In your mind, SSL won't leak anything, and non-SSL leaks everything.

Make a list of everything you can infer without the cert just by watching an SSL connection. Then add on top of that all the things people with the cert, or with control over CAs, can see, and list those too.

When you're done, you'll notice SSL is not as perfect as you think, and the extra requests and lack of caching compound all of that.
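As one concrete item for that list: even with TLS, a passive observer sees destination IPs, DNS lookups, the SNI hostname, and the sizes and timing of encrypted records. Sizes alone can identify which page of a known site was fetched. A toy sketch of that fingerprinting idea (the page names and byte counts are invented for illustration):

```python
# Toy traffic-analysis sketch: TLS hides content but not response sizes.
# An observer who has crawled a site can often match an encrypted
# response to a specific page by size alone. Sizes below are made up.

PAGE_SIZES = {  # bytes on the wire, learned by crawling the target site
    "/index.html": 14_200,
    "/pricing.html": 22_900,
    "/hiv-testing-locations.html": 31_450,
}

def guess_page(observed_bytes, tolerance=200):
    """Guess which page an encrypted response was, by its observed size."""
    for page, size in PAGE_SIZES.items():
        if abs(observed_bytes - size) <= tolerance:
            return page
    return None

print(guess_page(31_390))
# → /hiv-testing-locations.html
```

Real-world defenses (padding, HTTP/2 multiplexing) blur this somewhat, but the general technique is well documented in the traffic-analysis literature.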
> Transparent WWW caching is one example of a pro-privacy setup that used to be possible and is no longer feasible due to pervasive TLS.
What? You're kidding. If we didn't have pervasive TLS we'd have neither privacy nor security. Sure, a caching proxy would add a measure of privacy, but not relative to the proxy's operator, and the proxy's operator would be the ISP, and the ISP has access to all sorts of metadata about you. Therefore pervasive TLS did not hurt privacy, and it did improve security.
You're making the same mistake as Meredith Whittaker. It's a category mistake.
> Other example: the Mac I'm typing this on phones home every app I open in the name of “““protecting””” me from malware.
What does this have to do with secure cryptography? That's what TFA is about. You are conflating security as in cryptography with security as in operating-system security. More category errors. These are serious errors, because if we accept this nonsense then we accept weak cryptography -- that's DJB's point.
Some of the requirements I see here seem crazy. I want carte blanche access to the global network of other people's computers, and I want perfect privacy, and I want perfect encryption...
Yeah, no
https://www.justsecurity.org/10318/video-clip-director-nsa-c...
But letting everyone have the message content so that metadata doesn't leak isn't helpful. Maybe in the context it was deployed, where pervasive deep packet inspection was only something China wasted their CPU cycles on, your proxy made sense. But it doesn't make sense today.
[0] X is holographic to Y when the contents of X can be used to completely reconstruct Y.
However, I do want to call out his "Amazon was doing good business before 1999 and the end of the crypto wars", and "companies allocate just a small fraction of their security spend to cryptography":
* Prior to the end of export controls, Amazon was still doing SOTA cryptography
* Export controls themselves boiled down to clicking a link affirming you were an American, and then getting the strong-cryptography version of whatever it was you wanted; there were no teeth to them (at least not in software products)
* Prior to the widespread deployment of cryptography and, especially, of SSH, we had backbone-scale sniffing/harvesting attacks; at one point, someone managed to get solsniff.c running on some pinch point in Sprint and collected tens of thousands of logins. Lack of cryptographic protection was meaningful then in a way it isn't now because everything is encrypted.
It's probably in the talk's last sentences:
> We want not only the right to deploy e2ee and privacy-preserving tech, but the power to make determinations about how, and for whom, our computational infrastructures work. This is the path to privacy, and to actual tech accountability. And we should accept nothing less.
But who are "we" and "whom", and what "computational infrastructure" is she referring to?
If you look at the regulatory trends developing around tech at the moment, there are a lot of pushes to slap obligations on hosts to essentially toe the societal line of their geopolity. You will spy on your users. You will report this and that. You will not allow this group or that group.
This tightening acts in part to encourage centralization, which is regulable by the state, and to discourage decentralization, which is, at best, only notionally regulable.
The power of technologically facilitated networking was, before the Internet, largely a luxury of the State, or of entities granted legitimacy by the State. With everyone having the potential to take their networks dark enough that state-level actors must revert to physically compromising the infrastructure instead of just snooping the line, the currently extant edifice of power is under threat of a bottom-up inversion.
No longer would the big boys in the current ivory tower be able to sit on high and know where the threats are purely by sitting on SIGINT, data processing, and storage alone. The primitive of true communications and signalling sovereignty would be in the hands of every individual. Which, the establishment would like to cordially remind you, includes those dirty terrorists, pedophiles, communists, <group you are mandated to treat as an outgroup>. So therefore, everyone must give up this power and conduct their affairs in a monitorable way, to make those other people stand out. Because you're all "good" people. And "good" people have nothing to fear.
You can't deplatform persona non grata from infra they've already largely built for themselves, which is a terrifying prospect to the current power structure.
It's all about control.
That's great and all, but how does that help with mass surveillance by big tech? How would "true communications and signalling sovereignty" shield me from Google, Facebook, Whatsapp, Twitter, etc.?
The whole talk felt like it was gearing up to making a point but then it ended. It turned out that the point was to blame our current situation on the "sins of the 90s". To be fair, it was in the title all along so I'm not sure why I was expecting otherwise.
Without cryptography, all wifi is public, and in dense areas, you would be able to steal so many cookies without having to actually get suspiciously close to anything.
I'm guessing without crypto, we would only access financial systems using hard lines, and wifi wouldn't be nearly as popular. Mobile data probably wouldn't have taken off since it wouldn't have been useful for commerce.
The GP specified "without cryptography", in reference to a counterfactual world where we weren't allowed to encrypt things.
Early domestic WiFi used WEP ("Wired Equivalent Privacy"), which was very vulnerable: https://library.fiveable.me/key-terms/network-security-and-f...
https://github.com/codebutler/firesheep https://en.wikipedia.org/wiki/Firesheep
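What Firesheep automated can be sketched in a few lines: on an open network, unencrypted HTTP requests carry session cookies in the clear, and anyone nearby can lift them. A toy parser over a captured plaintext request (the request and cookie values are made up):

```python
# Toy Firesheep-style sketch: pull session cookies out of a plaintext
# HTTP request as it would appear on an open, unencrypted network.
# The captured request and cookie values are hypothetical.

captured = (
    "GET /feed HTTP/1.1\r\n"
    "Host: socialsite.example\r\n"
    "Cookie: session_id=8f3a9c1d; theme=dark\r\n"
    "\r\n"
)

def sniff_cookies(raw_request):
    """Extract cookies from a plaintext HTTP request, as a dict."""
    for line in raw_request.split("\r\n"):
        if line.lower().startswith("cookie:"):
            pairs = line.split(":", 1)[1].split(";")
            return dict(p.strip().split("=", 1) for p in pairs)
    return {}

print(sniff_cookies(captured))
# With session_id in hand, an attacker can replay the victim's logged-in
# session -- which is exactly what Firesheep demonstrated in one click.
```

HTTPS-everywhere closed this hole: once the cookie only ever travels inside TLS (and is marked `Secure`), passive sniffing on open WiFi yields nothing usable.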
Now, would CERTAIN industries exist without strong cryptography? Maybe not, but commerce doesn't really care about privacy in most cases, it cares about money changing hands.
Imagine if you could sue a company for disclosing your unique email address to spammers and scammers. (They claim it's the fault of their unscrupulous business partner? Then they can sue for damages in turn, not my problem.)
There are some practical issues to overcome with that vision... but I find it rather cathartic.
And yet countless thefts have happened and had the proceeds exfiltrated via Bitcoin, and the culprits never caught.
If that's not effective/practical anonymity, I don't know what is?
The connection is interesting, but the key word that I find important is the word policy. Mass surveillance is generally not a technology problem; it is a policy problem. If the government wants to surveil every citizen's movements, it can put a camera on every street, mandate that every car has a GPS and network connection reporting its movements, run face recognition on every train and bus, and require government ID, recorded in a government database, to buy a ticket. When the price of mass surveillance went down, the question of using it became a policy question.
The lame claim that DJB is tearing to shreds in TFA is quite shocking coming from a senior manager at an institution that works on strong crypto. Really shocking. Is she just clueless?
If there were no international business, crypto of any strength could and would have been used.
Efforts to design foundational cryptographic protocols were completely hamstrung by the spectre of ITAR and the real possibility that designs would have to be US-only. Right around the time the US gave up, the commercial community was taking off, and it wasn't at all interested in further standardization except as a way of creating moats for its business -- which is why we're still stuck in the 90s as far as the network layer goes.
Tinfoil hat: the restrictions were dropped to prevent case law establishing that exporting code is protected by the First Amendment.
We still have this blind spot today: Google and Apple talk about security and privacy, but what they mean by those terms is making it so only they get your data.
The article debunks this, demonstrating that privacy was a primary concern (e.g. Cypherpunk's Manifesto) decades ago. Also that mass surveillance was already happening even further back.
I think it's fair to say that security has made significantly more progress over the decades than privacy has, but I don't think there is evidence of a causal link. Rather, privacy rights are held back because of other separate factors.
Over time, because security and cryptography were beneficial to business and government, cryptography got steadily increasing technical investment and attention.
On the other hand, since privacy as a social value does not serve business or government needs, it has been steadily de-emphasized and undermined.
Technical people have coped with the progressive erosion of privacy by pointing to cryptography as a way for individuals to uphold their privacy even in the absence of state-protected rights or a civil society which cares. This is the tradeoff being described.
How does that debunk it? If they were so concerned, why didn't they do anything about it?
One plausible answer: they were mollified by cryptography. Remember when it was revealed that the NSA was sniffing cleartext traffic between Google data centers[0]? In response, rather than campaigning for changes to legislation (requiring warrants for data collection, etc.), the big tech firms just started encrypting their internal traffic. If you're Google and your adversaries are nation state actors and other giant tech firms, that makes a lot of sense.
But as far as user privacy goes, it's pointless: Google is the adversary.
[0] https://theweek.com/articles/457590/why-google-isnt-happy-ab...
We didn't go wrong in limiting export encryption strength to the evil 7, and we didn't go wrong in loosening encryption export restrictions. We entirely missed the boat on what matters by failing to define and protect the privacy rights of individuals until nearly all that mattered was publicly available to bad actors through negligence. This is part of the human propensity to prioritize today over tomorrow.
That's a very hot take. Citation needed.
I remember when the US forced COP(P?)A into being. I helped run a site aimed at kids back in those days. Suddenly we had to tell half of those kids to fuck off because of a weird and arbitrary age limit. Those kids were part of a great community, had a sense of belonging which they often didn't have in their meatspace lives, they had a safe space to explore ideas and engage with people from all over the world.
But I'm sure that was all to the detriment of our society :eyeroll:.
Ad peddling, stealing and selling personal information, that has been detrimental. Having kids engage with other kids on the interwebs? I doubt it.
COPA [0] is a different law which never took effect. COPPA [1] is what you're referring to.
> Ad peddling, stealing and selling personal information, that has been detrimental.
I agree and what's good for the gander is good for the goose. Why did we only recognize the need for privacy for people under an arbitrary age? We all deserve it!
0 - https://en.wikipedia.org/wiki/Child_Online_Protection_Act
1 - https://en.wikipedia.org/wiki/Children%27s_Online_Privacy_Pr...
So we agree on this part.
> What did happen is that regulation was passed to allow 13 year olds to participate online much to the detriment of our society.
My claim is that if "we" hadn't allowed 13-year-olds to sign away liabilities when registering on a website, there would be fewer minors using social media in environments mixed with adults. More specifically, guardians of minors would be required to decide whether their kids should have access. In doing so, they would provide the correct market feedback: sites of great value to minors (education resources being top of mind for me) would receive more market demand, and social platforms would have less impact on children, since fewer kids would participate in anti-nurturing environments.
Unless those kids aren't interacting with kids at all, but instead with pedos masquerading as kids for nefarious reasons. Which, yes, has been VERY detrimental to our society.