But we've utterly normalised digital ignorance and built what Edward Snowden very rightly calls an "Insecurity Industry".
I'd go further: we've turned a celebration of ignorance about cybersecurity, and dismissive attitudes toward it, into virtuous slogans.
"Don't make me think" - Krug
"Move fast and break things" - Mark Zuckerberg
"If you've nothing to hide you've nothing to fear" - J Random Idiot
And those who are charged with advising and protecting us are deeply conflicted, because they want backdoor access, or at least insecure products.

What it boils down to is that there is presently more money and power in insecurity than in security. Our industry has multiple principal-agent, Shirky Principle, and Pournelle's Law problems; see [0].
We allow ransomware and stalkerware companies, and outfits like NSO (which I only mention because they are the most widely recognised), to operate as legitimate businesses.
We flood markets with defective IoT crap and reduce consumers' expectations to the level of accepting vendor malware and backdoors installed out of the box.
And then we turn around and complain that "stuff ain't secure".
This whole ship is DUI.
> "Don't make me think" - Krug
That quote has nothing to do with cybersecurity; it's the title of a book by Steve Krug about web usability.
I am unfortunately old enough to have read that book when it first came out, and it's exclusively about how to design front-end UIs on websites to reduce user complexity. There is no mention of infrastructure or security at all.
You're turning a quote about how we should make websites more usable and understandable to users - so they can use them without thinking - into something it isn't.
It has everything to do with it.
I know exactly what the book is and I read it. It's actually an excellent book on UX and I expect Steve Krug picked the title because it sounds cool.
No disrespect to the author intended, but the title (maybe unwittingly) expresses a sentiment that has grave implications for the place of technology in human affairs. To understand why, please look deeper into what we used to call Human Computer Interaction (HCI) or "Cognitive Ergonomics".
I think I recently mentioned it in this online chat [0].
Explicit cognition is the "thinking slow" part of our brains that uses so-called left-brain linear reasoning and logic. It sits high in the cognitive stack. But as people use devices today, in what McLuhan [4] or Innis [5] would call an "acoustic" way (nothing much to do with actual sound), we drop down a cognitive level to a faster, visual-haptic loop that bypasses explicit reasoning.
Designing applications that bypass this has major effects on security. The work of B J Fogg will show you more about this [1].
Tristan Harris also has lots on it [2,3].
One of the disastrous effects of this "distracted" level of HCI is that people rely more on emotional cues, rote, colour, word association, implicit trust and other models that make them easy prey for phishing and other kinds of magic and trickery.
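To make the phishing point concrete: when someone is pattern-matching on visual cues rather than reading, a look-alike domain passes a quick glance. Here's a minimal Python sketch of the idea behind confusable-character detection (the tiny hand-picked substitution map is my own illustration, not the full Unicode confusables table that real tools use):

```python
# Illustration only: visually similar characters let "paypa1.com" or
# "pаypal.com" (Cyrillic 'а') fool a reader operating on fast visual
# pattern-matching. This map is a tiny hand-picked sample.
CONFUSABLES = {
    "1": "l",   # digit one vs lowercase L
    "0": "o",   # zero vs lowercase O
    "а": "a",   # U+0430 CYRILLIC SMALL LETTER A
    "е": "e",   # U+0435 CYRILLIC SMALL LETTER IE
    "rn": "m",  # two chars that render like one
}

def skeleton(domain: str) -> str:
    """Collapse known look-alike characters to a canonical form."""
    s = domain.lower()
    for fake, real in CONFUSABLES.items():
        s = s.replace(fake, real)
    return s

def looks_like(candidate: str, trusted: str) -> bool:
    """True if candidate visually impersonates the trusted domain."""
    return candidate != trusted and skeleton(candidate) == skeleton(trusted)

print(looks_like("paypa1.com", "paypal.com"))  # True
print(looks_like("pаypal.com", "paypal.com"))  # True (Cyrillic а)
print(looks_like("paypal.com", "paypal.com"))  # False (exact match)
```

The point isn't the code, it's that a slow, explicit reader catches these and a fast, visual-haptic one doesn't - which is exactly the cognitive level "don't make me think" design pushes users toward.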
If you're interested in a much broader understanding of cybersecurity I give you a sincere invitation to check us out here [6].
[0] https://www.youtube.com/watch?v=hYnOf4PWGpA
[1] https://behaviordesign.stanford.edu/people/bj-fogg
[2] https://www.youtube.com/watch?v=LUNErhONqCY
[3] https://www.wired.com/story/our-minds-have-been-hijacked-by-...
[4] https://en.wikipedia.org/wiki/Marshall_McLuhan