I have come across a lot of people like this, like 99% of them. Sometimes it makes me second-guess whether what I'm saying is wrong. What's wrong with them? Is it so hard to swallow your pride and acknowledge that the other person is speaking facts? When they find out they're wrong, they proceed to insult or make fun of others to save face. Just why?

  • theherk@lemmy.world · +29 / −1 · 1 year ago

    Best thing my daddy taught me: no matter how confident you are, you could always be wrong. Brains are just unreliable sometimes. Sky is blue? Could be wrong. You're N years old? Probably… but you could be wrong.

    Accepting this allows one to improve. Best we can do is recognize this, and try our best to minimize how often we’re wrong.

    This has allowed me to withhold confidence in many situations. Not in deference, but in thoughtful acceptance that I truly might be wrong.

    Best dad ever.

    • Metacortechs@lemmy.world · +6 · 1 year ago

      That really warms my heart to hear. I’m trying to be one of the good dads.

      Just today my 9-year-old and I had a conversation about how I'm always the first to step up and admit when I make a mistake, and to communicate what I did, or will do, to fix it, whereas I have colleagues who will try to hide their mistakes and act like they never make them, going so far as to lie to clients, bosses, and coworkers along the way.

    • Socsa@sh.itjust.works · +3 / −1 · edited · 1 year ago

      The problem with this is the quiet nihilism baked into it, which is the same reason so many people believe that widely supported science could be wrong.
      In the absolute sense, that's true. Something like "the sky is blue" is more a question of linguistics, and for a layperson it's fairly inconsequential either way. But while there is a small possibility that scientific consensus could be wrong, there is an orders-of-magnitude greater chance that unwarranted skepticism will do harm. Reality does exist, regardless of how much epistemology you choose to wave away.

      • theherk@lemmy.world · +2 · 1 year ago

        I don't think so, and he and I have discussed this in epistemological terms several times over the years. The "sky is blue" example was probably a bad one, as "the earth is round" would have been. The point isn't that anything can be wrong, though strictly speaking, I suppose it can. What we mean is precisely that our minds have the ability to mislead us, and powerfully so. But part of the drive to minimize that is understanding the value of consensus, in both scientific communities and wider ones.

        To have the best ratio of things about which we're correct versus incorrect, being confident in things like the outcomes of refereed science is helpful.

    • zxqwas@lemmy.world · +2 · 1 year ago

      When I feel like I am getting dragged into an argument on the internet I try to remember that when two people argue at least 50% of them are wrong.

      • KevonLooney@lemm.ee · +3 · 1 year ago

        Not necessarily. Both people can be correct, but arguing just to “win”. Both people can also be wrong.

  • 520@kbin.social · +25 / −1 · edited · 1 year ago

    Because people tie their egos to their opinions and beliefs. They see an attack on those as an attack on their person.

    We are all like this to some extent. For example, I am a firm believer in the right of individuals to make their own choices, and I believe that attempts to remove a person's right to choose are morally abhorrent.

    • bionicjoey@lemmy.ca · +6 · edited · 1 year ago

      It’s a basic psychological need to be able to trust that your brain can be right. A lot of psychological problems can result if you don’t trust your own brain to be able to solve problems and cope with new information.

      Some people don’t learn the value of accepting mistakes/failure as part of learning. As a result, they will associate being wrong with weakening the trust they have in their brain. They don’t want to believe that their brain may not come up with answers for the problems and changes they face in life. So they will deny that their brain is incorrect.

      It’s an ugly insecurity, but it’s totally understandable from an evolutionary perspective. We need to be able to trust our brains to navigate life’s challenges. People need to be taught that it’s okay to make mistakes, and that admitting when you are wrong is an opportunity for personal growth.

      • ArumiOrnaught@kbin.social · +2 · 1 year ago

        Can you elaborate on what you mean by psychological problems when you don't trust your brain?

        I have memory issues and so when I’m doing something more complex than muscle memory I write it down. So I don’t trust myself on specific memory tasks, like phone numbers.

        • bionicjoey@lemmy.ca · +2 · 1 year ago

          I’m not a psychologist, so I can’t explain the exact theory, but basically my understanding is that self-confidence and self-esteem are linked to the concept of self-efficacy, which means the trust that your mind will be able to cope with the challenges presented to it. Low self-esteem and self-confidence are linked to all kinds of insecurities and challenges functioning.

          Disclaimer: I recently began working on my own low self-confidence with my therapist and she pointed me to a book, the contents of which I’m mostly regurgitating here. The book is “The Six Pillars of Self-esteem” by Nathaniel Branden.

  • mydude@lemmy.world · +16 · 1 year ago

    When trying to persuade someone of something, you need to talk to the elephant, not the rider (emotion, not logic). That’s why propaganda works so well and facts don’t.

  • Dr. Coomer@lemmy.world · +15 · 1 year ago

    The answer is 1: they're stubborn, but more importantly, 2: the human brain is wired to hate conflicting ideas. Quite literally, when a belief or idea of yours is countered, your brain tells you "you're feeling pain and in danger," and this applies to every person, though some people feel it more strongly than others.

  • Etterra@lemmy.world · +13 · 1 year ago

    Cognitive dissonance. Lots of people never learn and are never taught how to separate their ego from their knowledge. It doesn’t help that education still relies on punishing mistakes and failure.

    • phorq@lemmy.ml · +9 · 1 year ago

      Yup, and when most people are a certain way, odds are you are too. I try to keep an open mind as much as possible because it’s very hard to identify your own biases and it would be naive to believe that I am the exception to human nature.

      • Swedneck · +5 · 1 year ago

        i actually find it reasonably easy to identify my biases, it’s just basically impossible to directly act on that knowledge.

        if i realize i'm actually probably in the wrong i tend to just sorta… slide into the shadows and disengage, which is at least better than continuing to insist that i'm correct and digging in deeper… then in the future my brain tends to have let go of it, people have forgotten what was said, and i can upgrade to a more correct take.

        • redballooon@lemm.ee · +1 · 1 year ago

          And this is the normal way it works, just not online, because what's said there doesn't fade away. It just stays standing there, regardless of whether the author has since changed their mind.

    • redballooon@lemm.ee · +1 / −1 · 1 year ago

      That doesn't stand up to scrutiny. I counted, and only 3 out of 13 are like that, and only two of those would post online.

  • intensely_human@lemm.ee · +10 / −2 · 1 year ago

    Because most people, when they’re showing someone else that they’re wrong, choose to twist the knife about it. Onlookers add in jeers and snark, making the experience of admitting one was wrong into an unnecessarily-painful shaming event.

    People don’t want to admit they’re wrong, because our culture punishes people who admit they were wrong.

    In the cases when a person speaks to me as if I am someone capable of admitting I’m wrong, when they treat it like it’s no big deal I just happen to be wrong, I have no trouble admitting it.

    For me, what works is being shown without much emotion, like pointing out to someone that they've got a leaf in their hair. If someone comes at me with proof that I'm wrong, in the manner of a helpful friend pointing out something I can't see from my vantage point, it really doesn't hurt.

    But when people are calling me evil, stupid, toxic, etc, I just want to dig in my heels. I might see that I’m wrong, and at that point stop arguing my point, but I won’t actually come out and acknowledge it.

  • kalkulat@lemmy.world · +7 · 1 year ago

    What makes you think it’s most people? Who you hangin out with?

    Remember, Confucius say: if you are the smartest person in the room, then you are in the wrong room.

  • sara@lemmy.today · +6 · 1 year ago

    Cognitive dissonance. People see themselves as rational and intelligent and anything that counters that is very difficult to accept, so they double down.

  • cacheson@kbin.social · +6 · 1 year ago

    I read somewhere a while back that it’s supposedly an evolutionary thing. In a social competition for resource allocation, confidently arguing your position regardless of its correctness is more beneficial than admitting you may be wrong.

    It's probably exacerbated by the internet, where the relative anonymity and psychological disconnection further reduce any benefit of admitting an error.

    • SatanicNotMessianic@lemmy.ml · +7 · 1 year ago

      Evolutionary biologist here.

      This is actually a tricky one. Lying (and I’m going to fold the projection of false confidence in with that one because I’m talking about deception, intentional or otherwise, not a moral concept) is only effective if others believe you.

      Humans, as the most highly social of the primates and ranking among the most highly social animals on earth, have adapted to believe each other, because this helps with trust, coordination, shared identity, learning, and so on. However, it also creates a vulnerability to manipulation by dishonest actors. Again, I'm not talking about a moral dimension here. There are species in which mating is initiated with the gift of a nuptial present (e.g., a dead bug) from the male to the female. Sometimes the male will give a fake present (an already-desiccated insect, say) to trick the female, and sometimes it works. Deception and detection are an arms race, and many believe it was one of the drivers that led to the development of human intelligence, where our information-processing capacity developed alongside our increasing social complexity.

      The problem is that when lying becomes the default, then the beneficial effects of communication cease. It’s like when you stop playing games with a kid that just cheats every time, or stop buying from a store that just rips people off. It’s a strategy that only works if few enough people play it. There’s tons of caveats and additional variables, but that’s the baseline. So why do we still see so much of it?

      The first component of course is confirmation bias. If 90% of our interactions are trustworthy, the ones that stick out will be the deceptions, and the biggest deceptions will get the most notice. The second is that the deceptions as a whole have not been impactful enough, over time, to overcome the advantages of trust, either in biological time or in social evolutionary time. You will notice that more trust is given to in-group rather than out-group members, and a number of researchers think that has to do with larger social adaptations, such as collective punishment of deceivers - sending someone to jail for writing bad checks, say, is easier if they’re part of your community as opposed to a tourist from another country. We can also see cultural differences in levels of trust accorded in-group and out-group persons, but that’s getting into a lot of detail.

      The third major operator is the concept of the self. This is a subject where we are just starting to make scientific headway - understanding where the concept of a self comes from in terms of neurobiology and evolutionary dynamics - but this is still very much a new science layered on top of ancient philosophy. In the concept of the self there is a component of what I'm going to call the physical integrity of the structure. People find being wrong painful - there are social situations that activate the same parts of the physical brain as physical pain and distress do. This is especially true if those ideas are seen as being held by other group members, because you now have the group's structural integrity on top of the one in idea-space. That's where you get people willing to literally die on the hill of Trump winning in 2020. For the evolutionary construction and nature of the self I'd recommend Sapolsky and Metzinger - it's too new and too complex to get into here. If you want to just summarize it in your mind, call this component ego defense.

      I think that’s most of what’s going on, at least as we understand it so far.

      • cacheson@kbin.social · +3 · 1 year ago

        Interesting. I was thinking more of gray area stuff than outright lying, like playing up the importance of facts that support one’s position and downplaying those that don’t.