OP: I dislike the claim because I reject the claim.
Your version of OP: I reject the claim because I dislike the claim.
So what OP actually seems to be rejecting is an extreme version of the claim, along the lines that humans aren't rational at all, a version that isn't in the article.
Consider: "a stranger makes a sophisticated argument, which seems convincing, whose conclusion is that you're wrong about something important". If you always respond to this by doing whatever the stranger advocates, you're likely to end up getting scammed or getting eaten up by a political movement or otherwise doing something you later regret.
Even if the argument is true as far as it goes, sometimes true facts are presented in a misleading context. One favorite tactic to discredit a group seems to be to find some of its worst members and do truthful reporting on them; one can also try to suggest "policy X is working/not working" by choosing the statistical measures that paint it in the best/worst light, and failing to mention the other measures that might portray it more accurately.
A general strategy of "remember new information, but don't let it affect your actions until you've had time to reflect / consult with those wiser than yourself / do further research" is useful in a wide range of situations. (And if you don't bother to do further research for years, it follows that either the new information sits in abeyance for years, or you take the risk of acting on it without having validated it.)
That's amazing; it's an excellent summary of the theory the article proposes for why we instinctively discount evidence against our preconceived opinions. Basically, it's to stop us getting scammed, but in the hunter-gatherer context in which we evolved, not in a modern society with robust systems for validating evidence.
The preconceived opinions the studies test aren't always even things the test subjects actually care about, though. They can be opinions about things subjects were only just exposed to and wouldn't be expected to have any personal investment in, such as opinions about fictional characters that exist only in the test. We're not talking about proving to conservatives that liberalism is right, or vice versa; some of the tests literally concern beliefs about issues that exist only within the test. It doesn't matter: as soon as an opinion is formed, it's incredibly hard to change, no matter how strong the counter-evidence, even when it's shown conclusively that the initial opinion was based on false data.
We are all doing this shit all the time. Denying it just gives it more power. Admitting you have biases that color every interaction at least gives you a fighting chance to examine them.
For example: "I don't like eating brisket. It always gives me heartburn." In that case, the heartburn causes the dislike, even though the cause is stated after the effect (the non-enjoyment).
In this case, im3w1l described an emotional state (the effect), followed by his reasons (the cause). You're free to disagree with his reasons, but it's important to understand the argument; otherwise you're responding to a straw man.