
Humans aren't rational actors who get tricked into embracing propaganda by subtle logical fallacies. This will be of no more help than fact checking.

It's a neat project on its own, tbc, I just have very low expectations of broader impact.



I disagree with your first point. People are far more rational than you are making them out to be, it's just that they are rational within their own value system, not yours.

Also today's propaganda is capable of adapting itself to each audience member's value system to make it more palatable, and then gradually nudge the audience towards the desired narrative/beliefs/values. The systems that distribute the propaganda are already analyzing people's values and using that information to manipulate people. I think that information asymmetry is part of the problem. I could be wrong, but I think flipping that dynamic around so the public can see the true values of the subjects of propaganda may help neutralize a lot of propaganda.

As far as what impact this specific project will have, I have no idea. You may be right. I'm curious about its limitations and how it can be applied.


I thought so too, but recently so many people have dropped or adapted their core beliefs in order to keep supporting and defending people in power whom they really love that it changed my mind. Now I think that value systems are malleable and are formed by whatever makes us feel good. The logical consistency on top is very optional.


Or maybe these people you speak of assume that the people in power have values aligned with their own, and if there were an unbiased system that highlighted value discrepancies using formal logic, they might not "love" those people as much anymore.

What I assume you might be missing is that you are looking at the world through a different lens than these other people. Both you and they are consuming propaganda and can't detect it as propaganda because it aligns with your values. However it subtly nudges your values in a direction over time.

I agree that people's values and core beliefs are malleable, but in the same way a tree trunk is. It may seem like these people have changed a lot and you haven't, but I think it's more likely that you've changed too, and that they've changed less than you think.

No one is immune to propaganda, which is why anything that can help disarm it interests me.


You touched on many points, but one thing to consider: people seek out information that confirms their worldview and actively protect themselves against anything that can harm their feelings for their idols.

John Doe isn’t trained in logic and can adjust any of his premises if it means he can continue to admire his favorite celebrity. It’s a combination of flawed reasoning and premise flexibility.

Not to mention, any fact can be endlessly challenged and questioned even if it’s agreed upon and largely incontestable.


Humans are not fully rational, but they're more rational than many assume. For instance, many thought the illusory truth effect showed that people are biased towards believing things they hear many times over, which is great for propagandists, but it turns out this is only true when they are in a "high quality information" environment. This is quite rational! They should update towards believing repeated statements when the environment they're in has shown itself to be reliable. When the environment they're in has shown itself to be unreliable, the illusory truth effect basically disappears.

[1] https://x.com/ROrchinik/status/1885820697160859951


How does that explain conservatives doubling down on whatever they hear even if it's obviously false? I guess because they wrongly consider some "low quality information" environments "high quality information" environments?


Not everything can be reduced to this one cognitive phenomenon. The behaviour you describe stems from confirmation bias and the backfire effect/identity-protective cognition. Also, this isn't exclusive to conservatives:

Science Denial Across the Political Divide: Liberals and Conservatives Are Similarly Motivated to Deny Attitude-Inconsistent Science, https://journals.sagepub.com/doi/abs/10.1177/194855061773150...


> Also this isn't exclusive to conservatives:

Your link aside, I think the obvious evidence is that that behavior is significantly more common in conservatives. Literally the most basic of facts get denied in bulk. I don't understand how you could make any argument that any other major political affiliation engages in the same behavior to a comparable extent.


> Your link aside, I think the obvious evidence is that that behavior is significantly more common in conservatives. Literally the most basic of facts get denied in bulk.

This is modulated by who is currently in power. Conservatives were worse when they lost and Biden was in power. Democrats are ramping up the crazy now that they're the underdogs.

> I don't understand how you could make any argument that any other major political affiliation engages in the same behavior to a comparable extent.

Go check out X and Bluesky and how many people are denying Trump was legitimately elected, and how they are convinced Musk tampered with the voting machines.

As for denying basic facts, there's a whole host of basic scientific facts that people who lean left deny wholesale, eg. heritability of behaviours, personality and other characteristics, differences between groups, denying certain features of sex and the sexes, etc.

I won't claim that the problem is equal on both sides, for many reasons I won't belabour here, but it's not nearly as wide a margin as you're implying. Part of the reason it seems so one-sided to you is my-side bias + the biased coverage the other side gets.


> This is modulated by who is currently in power. Conservatives were worse when they lost and Biden was in power. Democrats are ramping up the crazy now that they're the underdogs.

That isn't remotely true. Conservatives have been consistently in the lead, and there are studies showing how much more prone to believing misinformation they are.

> Go check out X and Bluesky and how many people are denying Trump was legitimately elected, and how they are convinced Musk tampered with the voting machines.

There are at least reasoned arguments for that. That isn't the same thing as rejecting the use of masks during a pandemic.

> it's not nearly as wide a margin as you're implying.

It really is, but we clearly disagree.

> Part of the reason it seems so one-sided to you is my-side bias + the biased coverage the other side gets.

You shouldn't make assumptions about how or where I get my news. I don't think coverage bias applies at all in influencing my conclusion based on how I get my news.


> Conservatives have been consistently in the lead, and there are studies showing how much more prone to believing misinformation they are.

No, that's misleading. Conservatives have also been consistently in the lead on "authoritarianism" to the point that it was considered a purely conservative phenomenon, until someone actually thought to ask questions like "what would left wing authoritarianism look like?" and suddenly they found it everywhere.

You seem not to realize how unreliable the data is on these questions. Not only is the replication rate of psychology and sociology ~35%, but the demographics of those fields yields a clear bias on exactly these questions. You simply cannot draw such sweeping conclusions from the unreliable data we have.

When conspiracy and biased thinking are tested directly, as with the study I linked, there is no difference in how the biases impact their thinking. Both sides are extra harsh on their enemies, are overly forgiving of their allies, etc. Confirmation bias and motivated reasoning all around.

> There are at least reasoned arguments for that.

Do you think that there were reasoned arguments for Trump having won in 2020?

> That isn't the same thing as rejecting the use of masks during a pandemic.

They could cite reasons for that too, you just don't believe they are valid reasons. It's the same confirmation bias in all cases though.


I think you're envisioning this in a pessimistic way.

I totally agree that the end conclusion "this statement is fallacious" is pretty useless. But I assume that a working process would also yield the chain of judgements (A is right, B is right, C is wrong, etc). I think that would be VERY useful.

People who become captured by propaganda and lies generally are not sold on 100% of the propaganda. There are certain elements they care more about and others they can ignore. A way to deprogram people through conversation is to just ask them to explain things about their views and ask them to reconcile them with reality. The reconciliation is painful for them and that pain keeps people "in" irrational beliefs - but it's also how people find their way out. Once they no longer associate themselves with the conspiracy, they can discard beliefs associated with it...provided they can think through them.

I think being able to automatically decompose a fact check into the elements of what "is true" and "is false" in a statement would be HUGE. An essential tool in helping people escape from information swamps.
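To make the idea concrete, here's a minimal sketch (in Python, with an entirely hypothetical example statement and verdicts) of what such a decomposed fact check might look like: instead of a single "fallacious"/"not fallacious" label, each sub-claim gets its own judgement.

```python
from dataclasses import dataclass

@dataclass
class SubClaim:
    text: str      # one atomic claim extracted from the statement
    verdict: bool  # True if supported by evidence
    note: str      # short justification for the verdict

def decompose_check(claims):
    """Split a compound statement's fact check into the parts
    that hold up and the parts that don't."""
    true_parts = [c for c in claims if c.verdict]
    false_parts = [c for c in claims if not c.verdict]
    return true_parts, false_parts

# Hypothetical compound statement: "X rose 10% last year because of policy Y"
claims = [
    SubClaim("X rose last year", True, "matches published figures"),
    SubClaim("the rise was 10%", False, "hypothetical: actual figure was lower"),
    SubClaim("policy Y caused the rise", False, "no causal evidence cited"),
]

true_parts, false_parts = decompose_check(claims)
```

The point of the structure is exactly what the parent describes: the reader sees which elements of a belief survive scrutiny and which don't, rather than being handed a blanket verdict to accept or reject wholesale.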


I vaguely remember a post I read on reddit [1] around the beginning of COVID by a nurse who dealt with an anti-vax patient. It went something along the lines of: "Big pharma wants to poison me." "Maybe you're being played, and Chinese propaganda wants you to believe that to hurt the US." Apparently it induced quite a lot of dissonance.

Fighting fire with fire.

[1] Impossible to find, of course. And given all the LARPing that goes on there, take this with two grains of salt. With all the crazy stuff going on in the US, though, I find it totally believable.




