This realization is also deeply depressing, because it means you're doomed to repeat yourself over and over if you want to persuade people.
Let's take Noam Chomsky, for instance. He has given more than a hundred talks a year for the past 50 years. He has written dozens (if not a hundred) books and given thousands of interviews. His message is always the same, because after you've figured out what your best and most persuasive arguments are, the only thing left to do is repeat yourself over and over. Every day is Groundhog Day.
Startups also have to learn the value of repetition. Long-form sales text works because of repetition. Long-form video demos work because of repetition. Drip email campaigns work because of repetition. It's often better to give customers one good reason to use your product, repeated three times, than to give three distinct reasons why they should purchase. Counter-intuitive, perhaps, unless you've heard this argument before.
https://m.youtube.com/watch?v=XvwNKpDUkiE
It's a somewhat tedious story but it's all about the freedom to control your own hardware.
Yes. Embrace this.
Patience is a virtue; so is not getting angry when the person you've stated something to doesn't get it even the third time you've stated it. I actually find it fun to try to come up with different ways to state things so that people might better understand them.
The biggest component of success in communication comes down to saying things enough times that your message can actually be listened to and digested at least once. You can vary the way you say things each time, but literal repetition works nearly as well, because the problem is almost never "I don't know what those words in that order mean" but rather "I didn't hear half that sentence" or "I was thinking about lunch" or "that might have been important but it just sort of passed by and I forgot."
That would probably be the distinction I would draw between persuasion in general and indoctrination / brainwashing. Of course, generally indoctrination and brainwashing seem to have a much higher rate of success in changing minds, which is indeed depressing.
Phrasing the challenge as "sound enough logic" puts the burden of proof in the wrong place. It implies that whenever you're not persuaded it's the fault of the other person for not being persuasive enough. That's the opposite of open-mindedness. It is exactly because of the presumption that your current beliefs are true that you won't change your mind as easily as you might think. Even Scientologists say they'll leave the church if somebody could just provide them with evidence it's all baloney, but it's a standard of proof nobody can meet.
When I say all persuasion is repetition, it's really not an overstatement. Maybe you're closer to believing me this time ;-)
As that old Lincoln saying goes, you can fool some people all the time, all people some of the time, but you can't fool all people all the time.
They all work by repetition, but the last one hurts people's ability to think rationally.
But I think most people's reaction is just to say "Yeah! That's totally what those other people do that our side never does, because we're right."
to guard against this, as soon as i am aware of a catchphrase or common talking point, i mentally deconstruct it and find the truth. frighteningly, the most commonly repeated phrases are, shall we say, misleading.
but lean a little closer, stranger, and i'll whisper the truth into your ear: the "two sides" are unequal in their abilities for evaluative thought, self righteous zealotry, and dogmatism. an honest attempt at critical evaluation will go nowhere if your evaluating apparatus is garbage.
The very fact that both sides are convinced there are two sides suggests this is a false statement. The "sides", really the false divisions and classifications, are some of the biggest lies ever told. In any case, any inequality between them is not the problem. By way of analogy, if two glasses of water have unequal amounts of fecal matter with neither being even close to zero, wouldn't it be better not to drink either of them rather than suggest one is better than the other?
The good news is that you don't even have to worry about the "other side"; just worry about the things that you believe. That was the point.
Also, unlike in hard sciences and logic, the premise that there are only two sides and that one must belong to one of them is itself a thing to evaluate.
So you're right. First there is a need for a meta-evaluation: is my evaluation apparatus even working, and how would I test whether it is?
Of course everyone will think the two sides are unequal, with the side that isn't theirs getting the worst of it.
I have an ego, I don't like being wrong, and I think I'm right a lot. Ok, so far I get it. But to constantly ignore or avoid objective evidence? How do people not become ill at the thought?
1) Clinical narcissism or sociopathy? It's all an intentional means to an end.
2) Simple lack of practice in critical thinking? They are acting in good faith but just not seeing the con.
3) Their moral code does not exclude Machiavellian tactics and they just want to win.
Maybe the population has all three types collaborating both knowingly and unknowingly across different roles.
Honestly, it's pretty great how many people are willing to spend time and effort that doesn't directly benefit them (on a base material level) to care about this stuff.
It's also harder than one may think to do this well. You have to be fairly skeptical 24/7, even of yourself and your own thoughts. I think that doing it halfway likely leads to lots of seemingly well-rationalized ideologies.
I like to think I'm getting better at being generally skeptical and slowly layering together a coherent, mutually supportive set of usefully accurate mental models of how the world works, but I find it very tough to know to what extent I'm fooling myself. One simple maxim I've found useful is to try to prevent any idea from becoming "sacred" and above questioning. While not practical for daily life, I think it's a great fundamental background orientation for our thoughts and perception of the world.
You may be interested in the Meaningness project: http://meaningness.org
To add to this, I think an additional problem is academia.
So much of success and even prestige in academia is garnered through skills that do not involve critical thinking.
Who hasn't had a conversation with someone at the top of their field who has strong, factually unsupported, convictions about another?
Entire fields are subject to self-interested ideologies rather than facts (i.e., my undergrad degree in economic science was more propaganda than science): http://www.thecrimson.com/article/2013/12/13/economics-scien...
Just to clear up any notion that I'm being bitter, I graduated at the top of my class.
There needs to be entire coursework in critical thinking, starting in primary school. Unfortunately, most of my primary school education outside of maths and language was just rote memorization of facts.
One of the critical aspects is social pressure. If they have ever made fun of the thing they are supposed to be convinced of, or convinced others against it (especially publicly), they will throw every cognitive, emotional, and other trick against ever believing it.
That is why public ministering and proselytization are a necessary step in participating in many religions -- it is not just to bring others into the faith, but to inoculate those who do it against ever disavowing it. I posit that social media and everyone messaging each other falsehoods is part of this public display of belief. Going against that later is very hard, because there is solid evidence of them making fun of it just a week before. Nobody wants to be seen flip-flopping or being a hypocrite.
> 2) Simple lack of practice in critical thinking? They are acting in good faith but just not seeing the con.
Interestingly, critical thinking is often orthogonal to other proxies for what society thinks "intelligence" is. Often it goes the opposite way -- the smarter the person thinks they are (maybe the more diplomas they have hung on their wall), the less likely they are to ever change their positions, because they will:
a) Have to confront the fact that they have chosen or supported an invalid position before. And with 3 diplomas on the wall, that is surely not something they would do.
b) They have a greater capability at rationalization. When the CEO has a bad day because they had a fight at home and goes to work and shuts down a project or fires someone publicly, they will rationalize it to themselves in many other ways except "I really was upset for another reason, and made a stupid mistake, I just wasn't thinking straight". They'll use their intelligence to make something up that sounds reasonable.
Would it affect you right away? No, but in a generation our children contract more disease, fewer of them go into STEM fields because of the distrust and dissonance that has been spread, and the air your children breathe becomes more polluted.
Information bubbles. Too many people only consume information that aligns with their beliefs. Many don't feel that they are ignoring or avoiding objective evidence, because they easily dismiss it as lies without any evidence.
"Just the place for a Snark! I have said it twice/ That alone should encourage the crew/ Just the place for a Snark! I have said it thrice/ What I tell you three times is true."
Also, I highly suggest everyone making the Hitler comparison actually read the definitive work "The Rise and Fall of the Third Reich: A History of Nazi Germany," so we can perhaps have some more intelligent comparisons than: "You know who else used repetition? Hitler."
Every article I read seems to have become a fun little exercise in "How can I put both Trump and Hitler into this seemingly innocuous article about cognitive fusion?"
It's obnoxious and old. If you want to actually do a Trump-Hitler comparison, read a book on the topic and write an actual paper about it instead of just flinging it out willy-nilly so you can fear-monger your readership into believing that actual dystopian eugeno-fascism is just on the horizon.
So here is another human psyche detail - when someone removes the actual details (such as what these 3 exec orders are and the specifics of the mentioned crimes), they are doing so to protect the narrative...
I'll give you an example: "Ban of majority Muslim countries" vs. "Iraq, Syria, Iran, Sudan, Libya, Somalia and Yemen".
The reason the media removes the list of countries and replaces it with the quote is that when people see the names of these countries, they realize that they are an open death-sentence destination for Americans, and have a large public that often chants death threats against America.
And that is a problem for the narrative the media is using (to exploit the situation). So the details have to be removed.
This is the first step of the manipulation of public opinion. The second step is the repetition.
As an aside, one strong component of a couple's ability to coexist is just this: how do they interpret relative truths, and how do they gain input for truth moments?
You tell me Mount Everest is 5742 meters high, seems a reasonable size for the largest mountain on earth. A year later you tell me Mount Everest is 6488 meters high. Seems reasonable and I have long forgotten that you claimed a different height last year. For whatever reason you keep telling me Mount Everest is 6488 meters high every Sunday afternoon, week after week.
I have no reason to doubt that what you are telling me is true and after a couple of weeks I will start to know and remember that Mount Everest is 6488 meters high. But then a couple of years later someone else tells me that Mount Everest is actually 8893 meters high. I object. To settle the issue we decide to look it up on Wikipedia and lo and behold the official height of Mount Everest is indeed 8893 meters.
This may or may not make me remember that Mount Everest is 8893 meters high but it is very likely that I will remember that 6488 meters is not the correct height and it might make me trust you less with regard to mountain related facts. Even without any repetition.
If you want me to accept a statement, the statement must be believable based on what I already know and believe to be true. And I have to have some trust that you are telling me a true statement. Repetition is only secondary, only required if you or I want that I remember the statement in the long term.
And if something is surprising or exciting or whatever, then one might remember something easily without a lot of repetition. The height of Mount Everest was never really interesting to me and learning the wrong height took some repetition. But then learning that Mount Everest is actually 8893 meters high and that I remembered the wrong height for years, that came as a surprise and may not take much if any repetition to remember.
If you create art, even if it's primitive or ugly, if you can repeat something within one artwork or across multiple works, the works suddenly gain some merit, feel more like art, less like random doodle, just due to repetition.
To be clear, I'm not claiming that's the only way to invoke perception of truth or perception of beauty. It's just something I observed while trying to appreciate some contemporary art.
Repetition, in itself, does not persuade anyone of anything. Repetition, as others have noted, simply makes the thing being repeated easier to remember. The true "persuasion" -- i.e., the misinterpreting of the false fact as true -- occurs because we forget the /source/ of a statement quicker than we forget the /content/.
So, for example, if you happen to hear from your friend Joe (whom you know to be a compulsive liar) that "Priuses are actually less environmentally friendly than Hummers, because manufacturing the batteries for Priuses actually releases more greenhouse gases than a Hummer releases over its average lifespan," you'll likely remember that statement for far longer than you remember that it came from Joe, the compulsive liar. And if you find yourself in an argument with a pretentious Prius driver two years down the road and you search your memory banks for relevant facts to throw in his face, you may well pull out the "Prius battery" statement, without ever remembering that it is almost certainly bunk. You have, in essence, adopted a false belief due to having an imperfect/poorly configured memory.
To take it a step further: If you then make the "Prius battery" statement to the Prius driver, presenting it as fact, you have repeated it (thus making it more firmly entrenched in your mind) and you have replaced the (previously empty) "Spoken By" metadata field with one that now reads "Me [trust score: 100%]." Speaking the false fact is not necessary to make the false-belief-adoption effect appear, but if you do happen to speak the false fact, it only serves to strengthen the effect and further entrench the false fact.
This effect, of course, only works with facts that are not absurd or plainly wrong on their face. If you hear 2+2=5, you don't need to remember the source to know that's wrong. But there's a whole class of facts out there that exist in a gray area -- where they are not verifiable on their face and would require some serious digging to validate/disprove -- where this effect can lead to serious confusion. To the extent the repetitive blasting of falsehoods works, it works because of this and this alone. Wired here is doing us (and the fools who paid for the "HeadOn" advertising campaign) a disservice by implying otherwise.
I think one needs to look at older writings to get a take that isn't designed to reinforce the constructed media narrative du jour.
Even then it's hard to get decent info on this topic because it's always been so morally charged. Nothing obscures reality worse than moral concerns.
Edit: I think you can also apply that description to several parties in the modern media environment.
https://www.wired.com/2016/10/wireds-totally-legit-guide-rig... - from before the election, claiming it was basically impossible and would require a massive conspiracy
https://www.wired.com/2016/11/hacked-not-audit-election-rest... - from after the election, arguing the safeguards against hacking are ineffective and casting doubt on the security of the election
Both articles are backed up by a convincing-sounding set of facts and expert opinions, yet despite the available evidence not actually changing they come to completely opposite conclusions. All that changed was that before the election "hacking voting machines is impossible" was the better anti-Trump narrative, and after he won there was suddenly a reason to cast doubt on the results. It's all about the narrative. (Which is one reason you should question the endlessly-repeated claim that there's "no such thing as alternative facts". Careful selection of which facts to include and exclude is a great way to create a narrative.)
Apparently a specific comparison formed in your mind, though, for some mysterious reason.
Probably just "constructed media narrative du jour," right? Certainly not because the person might actually be a quintessential example that stands out from everyone else so well that you already knew who people were likely talking about without anyone being named (yet, strangely, apparently want to deny that person should be considered for such a comparison at all).
> Nothing obscures reality worse than moral concerns.
It's not really clear that it's possible to separate human concerns from moral concerns, and I can only imagine someone arriving at the conclusion that "nothing obscures reality worse" by searching a pretty narrow set of reality-obscuring hazards.
https://news.ycombinator.com/item?id=12829781
http://www.bbc.com/future/story/20161026-how-liars-create-th...
-- early false statements are seen by many
-- later corrections down-thread are seen by few
-- popular promoted, true but unpopular hidden
https://www.reddit.com/r/rust/comments/5queq5/how_high_perfo...
Is it really a matter of repetition or a matter of trust? Does it matter that fact-checkers point out the errors in Trump's tweets? He has gained the trust of his followers, while the fact-checkers have a bad reputation and shady relationships with the establishment. Even if they get the facts right, people don't trust them to make decisions right. It's more about what you plan to do with the facts than who has the most facts, and policy decisions are not deterministic, no matter how many facts you throw at them.
Same goes for establishing scientific facts, indeed. Peer review is based on a few reputable reviewers rather than on a crowd of anonymous but trustless fact-checkers.
The article is clearly trying to put Trump in a bad light and other comments here are applying it to other politicians and corporations. I think the important point is that repetition doesn't discriminate truth but can be dangerous because it can seem like it does. On important issues there is no substitute for research and accurate methods to interpret new data.
Here induction refers to inductive reasoning rather than mathematical induction. For example, if you see a billion white swans you might conclude that "swans are white". It's not "true", but it's not wrong either from a Bayesian point of view. We really don't have any other way of doing experimental science.
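The swan example can be sketched as a Beta-Bernoulli update (a minimal illustration; the prior, counts, and function name are my own, not from the thread): each observed white swan pushes the posterior probability that the next swan is white closer to 1, but never all the way there, which is the Bayesian sense in which "swans are white" is reasonable without being "true".

```python
def posterior_mean(white, nonwhite, a=1.0, b=1.0):
    """Posterior mean of P(next swan is white) under a Beta(a, b) prior,
    after observing `white` white swans and `nonwhite` non-white swans."""
    return (a + white) / (a + b + white + nonwhite)

# More white-swan observations -> higher confidence, but never exactly 1.
for n in (10, 1_000, 1_000_000_000):
    print(n, posterior_mean(white=n, nonwhite=0))
```

A single black swan (`nonwhite=1`) pulls the estimate back down, which is the counter-example-of-significant-weight point made in the reply below.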
Basically, we learn patterns real well. Doesn't matter the nature of the pattern, if we have no counter-example of significant or equivalent weight.
Humans are the only beings who have language and hence reason to understand how everything works, unless they engage in producing dogmas and chimeras out of words and abstract concepts, which is what they usually do.
it doesn't improve your visual acuity, but it can have a positive effect on your overall eye health [1]. there's vastly more to eyesight than sharpness/clarity. my wife is an optometrist.
[1] http://yoursightmatters.com/carrots-really-improve-eyesight/
Hoffman's character repeatedly asks questions to Phoenix's character, inducing a hypnotic state. Repetition can also be used as a form of mind control.