Get all sides of every story at groundnews.com/Ve and read the news with a data-driven approach. Subscribe through my link for 50% off unlimited access, their biggest discount of the year.
You can try to prove it as much as you want with curves and graphs and stats... it won't stop this squirrel drama from making Trump win the presidential election 🐿
I'm pretty sure a lot of the participants also answered what they believed rather than mathing it out, so they wouldn't be edited out of context on camera. Like, they don't want to end up in a "Look at all these X voters who actually disagree with X policy" compilation video
As a math major, I'm disappointed. I was so ready to get it wrong, and then it was just… proportions. And then it was about politics and I wanted to cry lol
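For anyone curious, it really is just two fractions. If I'm remembering the board right (223 improved / 75 worsened with the cream, 107 / 21 without), then 223/298 ≈ 75% of the cream group improved versus 107/128 ≈ 84% of the no-cream group, so the cream group actually did worse, even though 223 is the biggest number up there.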
Julius Caesar: “Men willingly believe what they wish.” Seneca: “People believe that which they wish to be true.” Francis Bacon: “Man prefers to believe what he prefers to be true.” George Orwell: “People can only see what they are prepared to see.”
@@richyfoster7694 I don't see any progression. It seems to me more like an uncomfortable fact about human psychology that has been true since ancient times.
Rationality is not a character trait, it's a process. If you fool yourself into believing that you're rational by default, you open yourself up to the most irrational thinking.
And, on the flip side, if you assume the "scientist" is smarter than you, you are also more likely to fall for the "Appeal to expert" fallacy and go, "But so and so in a white lab coat said it so it must be true!"
@@commentinglife6175 That's not a flip side, it's the same thing. Reasoning like that is exactly why peer review and replicability are core to the scientific method; most scientists acknowledge that scientists are fallible as well.
@@plwadodveeefdv If I have learned anything, it's that scientists are as likely, if not more likely, to make really terrible assertions based on bias. They also often let bias affect their ability to see the big picture when looking at statistics. Gun crime, for example: almost all of it is gang violence or s**cide, but lab coats will continuously preach that removing guns fixes the problem while ignoring the fact that the tools of the trade just shift.
Very subtle and clever wording in the title. I filled in the gap myself with "Smarter people do worse than dumber people", when in reality it was "Smarter people do worse than they are expected to"
Well, if you write a statement like "Group A does worse", is that not objectively equivalent to "Group A does worse than Group {Not A}"? Certainly "Group A does worse than it is expected to" is not the most plausible interpretation.
It's not "clever", it's just deliberately misleading. Same as the rest of the video, where the questions shown on screen don't match the ones they showed to people (which should be an immediate red flag about the honesty of this "study"), and where they count "the number of cities" instead of the number of _crimes._
I think a big part of why people don't change their views when presented with evidence is that evidence needs a level of trust, most things we disagree about aren't things we can easily see ourselves, we have to trust whoever is collecting the data. And we easily distrust anyone that says something that goes against our beliefs.
I'd also add that it is easier than ever to mistrust the data, simply because the capabilities of the researcher are so expanded today. An economics professor I listen to makes this point very clearly when he talks about how easy it is to slice and dice the data and run regressions over and over. In a way, a more limited tool set provided a sense of "comfort": because re-running the calculations was so time consuming, we could assume someone was reporting their initial results rather than their fifth or sixth run, the one that finally matched what they wanted to report.
For the gun control example, I had a discussion with a European buddy who was talking about how Europe has low gun crime rates. Like, what a surprise. It doesn't matter that acid attacks, stabbings, assault, etc. are on the rise; they don't have gun crime! So when you see a statistic like that, it's important to look at the trends from before the change happened and at what is actually being measured. If you say gun control reduced gun crime because there are zero guns, you're not actually measuring total crime, which could still be rising.
"most things we disagree about aren't things we can easily see ourselves" I don't think so. Take climate change for example. Within my own life span, I myself could clearly see the massive shift in weather pattern, and I am a city guy. Those like farmers whose work heavily depends on weather, they'd be even more likely to notice the shift. Or like with g^n control, I don't believe anyone would actually have a hard time seeing the ridiculousness in things like "without g^ns, a criminal would just be as dea_dly with a knaife." More likely, it starts from what gains or loses people have with such conclusion. A guy who's fond of g^ns would more likely not want to believe in the benefits of g^n control, and then he'd just fuel that narrative with "reasons" that fit, to the point it becomes a "belief".
I think this is definitely an aspect of it. But also changing your position in a political matter means changing your identity. And most aren’t willing to make that trade.
Ground News hit the frickin jackpot over here. This is literally 15 minutes on why we are inherently biased by nature and have to actively try not to be, and thus how critically relevant it is to view data and news from both sides. I imagine you calling them up and, midway through your first sentence, them shouting: Yes, we would LOVE to sponsor you!
What struck me the most about listening to people defend their answer to the gun control version of the question was that they were not engaging with the numbers at all. All of the arguments they made were about whether it would 'make sense' for gun control to work/not work. Numeracy doesn't matter if you're not looking at the numbers. This is actually the biggest problem I have with my students. Not in terms of political bias, but the way they don't engage with the facts presented in a problem and instead reason based on what they would like to be true. It's incredibly frustrating to deal with.
That's very true actually: in light of hard evidence, we tend to rationalize against it if it doesn't align with our political views, or readily accept it if it does! The video gives a good reason why this happens, but I'd add that these political beliefs become part of our identity, and questioning them is like working against yourself, and most people don't like doing that!
The problem in this example is that it presents data that goes against any reasonable view one could take. Gun control _works._ That is clearly evident just by looking at all other culturally western democracies, where _none_ have the same issues as the US, and in _none_ are guns as easy to access. This specific example therefore imo doesn't work: if you present a proponent of gun control with context-free data in which gun control has produced a ridiculous result, like gun crime going up, they will rightly find issue with it. Especially if you frame it generally ("gun control") but only mean one specific measure ("concealed carry"). The hypothetical should be somewhat believable if you want people to intellectually engage with it.
Your experience seems to be represented in this video pretty accurately. The woman who saw the apolitical version and asked what kind of skin cream it was also wasn’t engaging with the numbers
That's the thing: these kinds of students get so focused on their hypothesis that when they conduct a study and collect the data, they tend to skew the results to support the hypothesis.
There is a problem with every study. If it is shown that smart people are selective in their choice of facts, then scientists are also selective in their choice of facts. And this matters, because data analysis is not only numerical analysis: not ALL variables are observed, scientists choose which variables to observe, and they sometimes even select which data to use based on their beliefs. As long as the topic is "apolitical" this usually works fine, although bias is still a huge issue. But when researchers are paid based on the results, or the topic is political, the research is affected, sometimes to the level of fraud. Sabine Hossenfelder recently had some nice videos about the current state of science. All this causes people to distrust ALL research, as new myths are created on a daily basis, or people are served ideas based on a tiny fraction of the truth. Like: if the EU limits emissions of CO2 then the world will be saved, even though the rest of the world emits so much more.
@@OldSkullSoldier But I just worry about the systemic problems with financing, really. Otherwise, it is well known that any single study might be biased. At least in the natural sciences that is not a big problem: even if a model of little use is popular, eventually a more useful model (AKA "the truth") will win out.
I don't think I have ever heard a better advertisement for Ground News, I hope they are paying you well. I've avoided checking them out till now because I can't believe anyone is unbiased, but your example was rather compelling.
Don't know why, but I noticed that simply switching from a medication test to gun control made the question significantly harder to parse for me. Like, the question just suddenly felt much more difficult. I wonder if it's because, when learning about probability, you constantly work with medication tests, control groups, light vs heavy symptoms, etc. So it becomes second nature to think of all the edge cases and possibilities when looking at a medication trial. But no textbook ever uses gun control as an example, so our brain can't go into math mode.
Same here. I think it was partly because the numbers on the first board represented individual people, while the second board represented things happening in 8 cities. I had to forget about the 8 cities, and pretend each number was representing an individual city, then it felt more like the first board.
Hah, ever wonder how Trump got re-elected? This is how. People can't read data and just make stupid decisions based on what "feels right" in the moment. "Durr hurr something something something economy, me vote red hat..."
For me it became harder to parse partially because I automatically distrust any statistics found in a political setting. There're so many ways to manipulate the data to make your side of the argument look good. What if gun control decreased crime because implementation of gun control laws tended to be paired with a strengthening of the police force in the area? Correlation is not causation, but determining what's actually causing the problem is a multifaceted issue that doesn't make for good political campaigns.
The timing of this video is not lost on me. Thoroughly enjoyed it. I am definitely guilty of shunning ideas because they're not typically associated with my belief system, despite being a deeply curious person. I have been working on it, and will keep doing so in the future.
I used to do a dumb statistics joke back in 2006 based on official UK Department of Transport car accident figures from 2000. It went something like this: 20% of car accidents involved excessive speed, which means that 80% of car accidents happened at appropriate speed. 20% of car accidents involved somebody who was over the blood alcohol limit; therefore, 80% of car accidents were caused by sober people. My conclusion? Driving at 100mph while drunk is safest.
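The missing piece, of course, is exposure: you'd need to know what share of total driving is done drunk or at speed. With made-up numbers, if only 2% of miles are driven drunk but drunk drivers account for 20% of accidents, the per-mile risk ratio is (0.20/0.02) / (0.80/0.98) ≈ 12, i.e. drunk driving is roughly 12x more dangerous per mile, and my "conclusion" is exactly backwards. Same denominator trap as the skin cream table.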
That is something I am missing here. How many people/cities were involved in the fictitious rash/gun study? The neutral participants for whom nothing changed are also important for the conclusion. In your analysis it would be the ratio of sober/drunk people who drove without an accident.
@@l4nd3r The thing is, if you're approaching random people in a political context, presumably a rally, and asking them these questions, they aren't going to interpret that as you asking them a math question; they're rationally going to interpret it as you asking about their beliefs on the issue.
@@joedwyer3297 Agreed. I'm confused seeing others somehow get this effect when the video doesn't really allow for it. By doing the cream example first and then explaining the main point through the gun example, it's hard to tell how I would have originally felt.
When you change it to political, the distrust of the data skyrockets. All sides are used to being lied to. What this says about "science communication" is significant.
This was my intuition as well. If I was shown a chart with 4 numbers about gun control and crime, I would be like: I am not drawing any conclusions from these 4 numbers, please leave.
@@TheMustyrusty Unfortunately, for many people there is no such thing as a trustworthy source if it contradicts their current understanding. 4 or 400 numbers, the results wouldn't be too different.
@@agafaba Which is why we have millions of people who firmly believe that Greenland is facing a threat from unprecedented high temperatures not seen in 10000 years.
I work as a strategy analyst for a major retailer, and on a weekly basis I have a suspicion or belief that something may save money or be a better solution for the business. As I research and analyze the data, I am no longer surprised at how often I prove myself wrong, and I have quickly learned that data for, data against, or no correlation either way are all equally valid answers. It's not about getting the results I want but rather getting results, period. I've applied this same practice to other aspects of my life and it has been very freeing (and humbling). Thanks for the video.
I agree. Most often we want our ideas to be right, but verifying them against real data is necessary. I would rather disappoint people than lead them on with a fabrication.
One of the most important traits, and the one that's most difficult to learn as a data analyst, is being impartial. Not putting your own beliefs into an investigation, dashboard, or report is essential to learning what's actually behind the data, rather than supporting your own idea of it or manipulating the results to skew toward the outcome you want.
Yeah, fits in with my experience. This isn't a question of smart and dumb, it's a question of intellectual honesty. Everybody is always smart enough to fool themselves.
There is a YouTube video called "Why smart people believe dumb things" that is exactly about that. Smart people are better at fooling themselves. It's the old "some ideas are so stupid only intellectuals believe them". (I have to warn you, the video is "anti woke", but if you ignore the political implications it is still a good video and says the exact same thing you said in your comment.)
No, the whole point of the study was that people better at complex thinking were better at deluding themselves contrary to data. It's harder to be intellectually dishonest when you don't have as many tricks at your disposal.
You could as well ask a group of dermatologists if daily washing is good for your skin; the priors formed by a multitude of studies will not just be overturned by a fictitious one.
"I have no data yet. It is a capital mistake to theorize before one has data. Insensibly one begins to twist facts to suit theories, instead of theories to suit facts." *-Sherlock Holmes, A Scandal in Bohemia*
"How often have I said to you that when you have eliminated the impossible, whatever remains, however improbable, must be the truth?" - Sherlock Holmes engaging in an obvious logical fallacy
For those wondering, the paper is called "Motivated numeracy and enlightened self-government". A co-author on this paper was Paul Slovic. My mom worked for him at Decision Research back in the 90's when he was doing his book The Perception of Risk. She's mentioned by name in the foreword of the book. He's a really awesome researcher and person. When I saw this paper, I thought it was amazing he was still in the business and still doing great research.
If Veritasium did not completely butcher the content of the paper in this video, then I'm sorry, I gotta break this to you, but the guy seems to be a fartsniffing moron. He doesn't seem to have bothered considering much about what he actually measures. Smart people know that overly simplified funny numbers can't tell you much about highly complex issues. That's why they are not convinced by them at all, and instead have follow-up questions and reasoning, which could be seen nicely in some of the interview clips. Just like a sudden measurement of neutrinos going faster than light: only an idiot would immediately take that as gospel without checking all the simpler options first.
If this video properly characterizes the study, then it's terrible. What it's capturing is how likely the participant is to accept your premise when it's an issue they care about. At no point did the asker say anything like "if you accepted this data as true, then what would your answer be?". Instead, they presented fake data, we watched as certain participants rejected the data and answered the question anyway, and then those were marked as getting the math incorrect rather than as merely refusing to participate.
You have done more than just “raise awareness”, you have put a “starting point” like you said out there, and it’s the best starting point I have ever seen for this issue which means everything for functional democracy.
I think the problem is people are not relying on the data in the fictional scenario; they are relying on facts they believe they already know. If you showed them a study claiming that gravity was driven by air pressure, you would have a hard time getting people to believe it, even though the paper said so.
I think this points toward a deeper question that I'm not sure was answered in the study: how rational is it to accept new information and abandon prior information? If the quality and quantity of previous evidence is high, asking a person to suspend that understanding for a guy with a cardboard sign so you can play with numbers is a factor that's hard to ignore.
One problem with the study could be that people are EXPECTING the skin cream to be a math test (why else would an interviewer want to talk about skin cream data?) But when asked about gun laws, they assume the interviewer wants to hear their political views, so they focus on that.
@@gephyrro I mean, I'd be skeptical of the data too if some random guy with a poorly worded table on a big cardboard sign asked me to interpret the data from a (presumably) unnamed study. Well, to be honest, I'd probably be arguing with how poor the wording was first since "crime improved" and "crime worsened" with the seemingly opposite arrows are really bad. There's also the problem that a lot of other people have pointed out: you likely have background knowledge related to that. If you could somehow prove that someone is believing something that's false to begin with, then sure, I think the video has a point. But if you listened to and internalized conclusions based on quality data, then it's more irrational to change your belief than not. The RAND Corporation has a huge metastudy of gun control split into how well-supported different policies are in affecting different outcomes in well-conducted studies. You can read their published reports as well as easily navigate a quick table. The evidence isn't overwhelming on most issues, but there's good evidence that shall-issue concealed carry and stand your ground laws are bad. They increase homicide rates. Like, you take a question like that and put it up to pro gun control people, is it reasonable for them to just quietly accept the new data from Sign Man? Personal opinion, but no. They should not.
@aek-f3z Also a good point, but they were still having their answers recorded. The camera might not be a problem (if you know that for sure), but the question was still phrased politically, and they might assume it's about politics.
The conclusion at 9:13 is wrong. People shouldn't change their mind based on a single study, that's not how science is supposed to work. If they have a personal belief that's based on many thoughts/studies/experiences, then a single piece of evidence should at best pique their interest in the topic, but certainly shouldn't make them instantly change their mind
I think you misunderstood what he meant. The study wasn't trying to change their minds. He told them that the numbers didn't actually correspond directly to the real world, it was just getting them to do some math. Those who had lower reasoning skills and contradictory pre-existing beliefs on either side got the math wrong more often because they're so entrenched in those beliefs. Also, a lot of these people didn't form their beliefs with much actual reasoning, many of them just believe what they're told. What the woman at 9:06 says is a perfect example of that. It's people who are more open minded and consider all sides who perform better.
And we are really used to putting politics before logic because we see clips of people being politically "owned" more often than someone correcting another's math... and political tension is a way more emotionally charged situation compared to math
@@michaelgoldsmith6615 I do fundamentally agree, the correct answer in my opinion would be something like "this study shows that crimes have increased when the guns got banned, that's interesting, I might have to read into that". But Derek draws the far too wide conclusion of "people hold their partisan belief regardless of presented evidence", and I think that's a far reach from the results of the study. Also the woman at 9:06 is the perfect example of scientific thinking, she sees enough evidence, so she changes her stance, that's how it's supposed to be.
@karola.7908 I agree that his wording should have been more specific and much less generalising, but I assumed he meant it in the confines of this specific test, and not about their overall views; maybe I'm the one who misunderstood. I also think his use of the terms 'evidence' and 'deluding' is quite misplaced. Embarrassingly, I also forgot what the woman actually meant; I thought she was just talking about her mentality in general. Rewinding to the question she's responding to, I've realised that she's not as suggestible as I thought. Still, people shouldn't simply believe things just because they come from a place of authority; you should always check the sources. The media is full of things that claim to be factual evidence while being objectively incorrect. Sorry about all the misunderstanding. I should've rewatched the whole segment first.
Very interesting video and study... on the last bit about how we are not so different, I would comment by slightly changing a famous quote: "divide and conquer" ----> "keep divided and control"
The problem with this experiment is that when you ask people to consider a study on a nameless skin-care product they have no prior knowledge of, you're testing their ability to come to the correct answer based on the numbers. But when you show people a study on a matter they probably have a lot of prior familiarity with, like climate change or gun control, you're not testing their ability to find the empirically right answer; you're asking them to rationalize the study itself against all of the knowledge they have on the subject. The question shouldn't be "does X lower gun violence" or "does Y affect climate change" the same way you would ask "does Z reduce the rash". You would have to ask them whether THIS specific study shows a reduction in gun violence, climate change, etc.
I was going to comment about this. On the question about gun control, there are no two sides here; the data is there on an international scale. Presenting hypothetical data and asking people to ignore all of their previous experience is unreasonable.
That format also calls for people to take a public position on camera, not knowing if they're being tricked by an activist's video. Plus, while gun control supporters consider it a question of lower vs higher crime rates, many/most opponents of gun control consider it a civil liberties question. That is, even if crime goes up, they'd still support it because it's a civil right, rather than a purely functionalist policy question. That said, confirmation bias is absolutely a thing and this was a good video.
@@crafty1098 Even in the case of a pure numbers question that graph only states total crimes being worsened/improved. It doesn't state whether the thing which was just made illegal was included in those 'crimes'. Imagine if you made cars illegal, there'd be tens of millions of criminals overnight, crime is skyrocketing! But it doesn't state what 'type' of crime is occurring/not occurring, only that it is/isn't. Further it doesn't explain the *other* side of the equation, and that is the numbers of people saved via guns, which is as far as I am aware, like... inordinately disproportional, like 10x more people protected with them than harmed by them because there are just far more law abiding people with guns than criminals with guns, by default. It's one of the reasons that data like that is inherently biased. Even if the data showed that gun crime went down due to a complete and total lack of guns, if you don't also look into if other forms of crime worsened or improved because of that decision you're being disingenuous at best, outright manipulating people at worst.
This is a critical point. It's kind of like asking someone what 2+2 is and then asking them whether, if the data showed that 2+2=5, that would change their previous belief. If people have reasons for believing that 2+2=4, they are not going to immediately submit to new data without vetting it (that would be completely irrational). In my experience, most people are just not rigorous in vetting data, which is why emotional arguments are so common in politics (they are often more effective than trying to have productive debates).
Your point is valid, but I think the video is still fair. I would also predict that the rewording to "did THIS study show a reduction in Y?" wouldn't change much. Here's why: in psychology there is a term called "flexible thinking". It's brought up in decision-making research. A person who can think flexibly will reduce or block out the influence of their prior knowledge when making a decision from the information provided. However, most of the time, when a decision is difficult or information is hard to process, we get cognitively lazy. So instead, we rely on a belief we already have to make our decision. What the video showed me is that people with higher numeracy are less likely to think flexibly and only consider the evidence if it doesn't contradict their beliefs. That's a pretty interesting result
The great problem with data in politics is that it's SO EASY to manipulate studies. Each side has a block of 'Institutes' that conduct studies where the study design virtually guarantees your outcome. Minimum wage, for example. I saw a meta paper examining the state of economic studies on the minimum wage. Sure enough, they divided ALMOST perfectly into halves. The ones that chose one methodology for trying to establish controls showed no effect of minimum wage on employment. Those using a different method to control for other factors showed a substantial negative impact on hours worked. I say ALMOST because there were two outliers that used the methodology that did not show effects, but those studies DID show small negative effects. But generally, which methodology you chose ABSOLUTELY DICTATED your outcome. Then, politicians from each side go on talking head shows and talk like the only studies that exist are the ones that agree with them. And we've seen how this can happen even in hard science. How many studies showed that [insert anything here] causes cancer? And then you read the particulars and find out that they stuffed the mice full of 10x their body weight of that substance every day for months. And that's before you get into p-hacking or outright fraud. The entire anti-vax movement came out of a single study that was a dude just straight up lying about his data because he wanted to be able to sue the drug companies for big money. He's ADMITTED in court that he lied about his data. But the antivax movement remains strong. So the phrase 'Studies show...' is the most overused starter phrase in politics. And the most worthless. And people know it, so they're RATIONALLY skeptical of the 'evidence'. What's needed more than anything is better agreement BEFORE a 'study' is done on what are and are not valid methodologies. There needs to be tighter peer review. And any paper that doesn't go through peer review shouldn't be reported on. No pre-print reporting. Don't waste our time telling us what the National Institute of Pre-Determined Biased Outcomes concluded about the effect of abortion on crime or whatever. But that doesn't sell ad impressions, does it?
Which is why people have to actually be trained in how to read studies (including, unfortunately, a number of people who do meta-studies). And observational studies are even more prone to this problem: sloppy selection of subjects or situations, poor controls, etc. All too often meta-studies just throw a bunch of things together without any deep analysis of whether the underlying studies are even good basic science.
And this is why I bristle when liberals (they do this more often) get mad and call names because they have studies done by think tanks and conservatives won't believe them. It's because conservatives are distrustful, having seen so many biased stories. It's not that they're too stupid to read (not always, anyway), just cautious about trusting everything. Meanwhile, liberals treat the press as if everything it says should be listened to.
A Rice professor and the former mayor of Houston got caught red handed. The professor was hired to do the study. It didn’t give the desired answer. A new check was cut, a new study designed, and voila, it supported the mayor’s position. 🤣😂🤣
You bring up good points. But the autism thing is interesting. If we're talking about a proper scientific RCT experiment, there is literally no proof they work, because those trials were never done. And people who then claim the polio vaccine cured polio are proven wrong by history: cases had already gone way down before it was introduced. I too thought these people were nuts 4 years ago. But then again, we saw what happened in 2020-2022. So when circumstances changed, I reexamined the claims, and now I have changed my belief. But it's definitely difficult to tell if it's one way or another for most vaccines.
I feel that a lot of comments are missing that a meaningful point of the video is that the title is missing the words "...than usual" or "...than expected." The "high numeracy" people still typically do better than the rest, just not as well as they are EXPECTED to do. When the video says "worse", it's in comparison to that demographic's EXPECTED outcome, not worse than the "lower numeracy" groups. The surprise is that by following intuition, "higher numeracy" respondents perform comparably to "lower numeracy" respondents in certain cases where their beliefs disagree with the presented information.
the title is kinda clickbait. High numeracy people do better because it is literally a numeracy question. It should say, "political beliefs can overshadow numeracy when polling people outside a political rally."
Actually, I don't think that's the case. As you can see in the graph, the high-numeracy people had the same level of accuracy as the lowest-numeracy group, which indicates they performed exactly the same, regardless of how they were expected to do.
Yes! The data basically shows that people willingly turn off their own numeracy and choose to go with intuitive thinking when the numeracy would lead them to conclusions they dislike.
I very much feel that I am affected by confirmation bias. "But I know I'm correct so all I need to see is evidence that can prove to others that I'm correct!" is basically how I feel confirmation bias works. A sort of circular reasoning.
I feel that I am impacted by confirmation bias, yet I easily got this correct. Experience, effort, and a sincere longing for truth are irrational I guess.
@@GrassFudge7 Correct. Doubt is very valuable: if you are not sure whether your belief is true, you are more likely to look for evidence on both sides of the story.
Since your "fits my rule" video (on confirmation bias, can't remember the title) I try to take an evening, every now and then, to prove myself wrong on something I feel strongly about. It has helped a lot! I either get a better understanding of why I am right (if I failed) or I learn something new, and change my mind, which is just an awesome "aha!" moment 😁 I keep a list of things I changed my mind on (and sometimes even changed back) to remind myself. I hope it helps. Thanks for teaching me that!
@@itstruce. Here's one: I thought I completely understood why and how the Keto diet worked. Even advocated for it among friends. Turns out its main hypothesis is disproven, and (in short) it doesn't work in any special way unless you have epilepsy, but may be easier to follow for some. Edit: It was an example of what some call "Mechanism/Mechanistic bias". When the entire explanation seems logical, but isn't actually backed by good science. Usually means: "It's more complicated" or plain wrong.
@@Wrackey The one that did it for me is "market efficiency". In theory, products will get cheaper while getting better. But when markets make the measure, profits, into a target, it ceases to be a good measure. This perfectly explains the disconnect between quality, price, and volume production.
3:02 .... Love it. The self-confidence. The slightly raised voice. The "I don't fall for your trick question" attitude. Funny and frustrating at the same time.
This is similar to a video I've seen by After Skool, on "Why smart people believe dumb things." Apparently, stronger reasoning abilities can help you justify your views, instead of rationalizing your way to the correct conclusion. You start with a belief, and work on justifying it, instead of starting by looking at the evidence, and working towards the conclusion. Also something called the Nobel Prize Effect, I think? Where Nobel Laureates believe in pseudoscience and all sorts of mystical nonsense, because they are outside of their area of expertise. The solution that comes to mind is constantly questioning your beliefs and assumptions, I guess, which is easier said than done.
I think your conclusion, "that the solution could be constantly questioning your beliefs and assumptions", is what it means to be "interested in science", since "trying to falsify your hypothesis is the key of science"... People who refuse to analyze whether they could be wrong, and only take into account what supports their view, have "confirmation bias", which is, in my opinion, the key to pseudoscience... The more knowledge one has, the more resources one has to justify or defend one's bias... Knowledge (comprehending how things work) does not equal intelligence (how knowledge is used)... So it's possible to have "stupid knowledgeable" people and "uneducated smart" people as well. :) Now I would have loved to learn which variables were considered as evidence of people "being interested in science" versus people "having knowledge in science". :)
Hearing the people talk about how they got to their wrong answer reminds me so much of listening to people add caveats to their answers to simple philosophical questions (such as the trolley problem) to try to escape the point of the question. Adding in "well, I'd have to look into the studies and the sources they cite" just misses the point of what is being asked. It is a completely fictional result that they are being asked to assess.
Well, I work in science. And if there is something people don't realize, it's that intelligence comes in a lot of different flavors... I know many scientists who are brilliant but totally stubborn and dumb in the face of truth. Having an open mind, a critical mind, and a curious mind does not necessarily come with intelligence. Many people with high IQ have the most blown-up ego... Meanwhile you can find the most humble farmer in the most remote region, who can't read and can't do math, but has the most progressive views on human rights and the environment... Great men aren't all smart men, and a lot of smart men are just a bunch of cunts...
Something I think hinders the reasoning process is that people think logic is the highest form of reasoning. It's really not; it's just the building blocks of reasoning: the fact that you can stick 1,000 bricks together doesn't mean you've built a sound house. The true path to sound reasoning is objective rationality: continuously trying to prove your own conclusions wrong while searching for valid points within wrong arguments.
Things like this are why I strongly dislike it when people say the solution to the problem is education or critical thinking. Here is data showing that social and emotional factors will cause those very skills to be applied in service of confirmation bias.
Either low numeracy skills, because you don't understand it on any level, or high numeracy skills, because you understand it's underspecified. The sample sizes of the control and test groups weren't specified, and it's not necessarily reasonable to assume the control and test samples were different sizes, which is required to reach the 'correct' answer.
Man, I wish he would have paused the video a bit right after he asked the initial question with the skin cream. I was immediately told the answer by the first guy and had no time to come up with my own. Now I'll never know if me dumb or smarts
He quickly calculated that death by firearm is 11x more likely if you're black, skewing the crime rate affectable by policy for him to be near zero, and thus dismissing those heavily diluted numbers.
This phrase is certainly more than 12 years old. Why not quote the one who came up with it instead? You're also crediting the actor for something his character said, like he's some kind of genius for being able to read the script he was given.
It seems like tribalism explains a lot in the current political moment, but I think it's one step removed from the primary cause. The problem is lack of trust. In that context, my "tribe" is just the people I trust. That said, tribalism isn't the only engine for trust - it's just the easiest. The way out of this is to restore people's trust in more than just their political cohort. Rabid individualism and self-serving institutions have done much to create the current crisis of trust, and building reliable institutions populated by people willing to speak & act in the best interest of those institutions is the best way out of this mess. Unfortunately, I don't think we'll reverse this political/intellectual culture of self-harm before we hit rock bottom.
For the real world, I agree with you that tribalism is the strongest factor - BUT some of that is that most people simply don't have the time or trained analysis skills to examine data and figure it out. So, in that sense, some part of tribalism is just an optimization of time usage (in a strange sense - sort of like thinking slow / thinking fast situations).
Check out social identity theory and the minimal group paradigm. It isn't really a lack of trust. It's tribalism, and that tribalism causes us to lack trust in people who aren't part of our group.
Tribalism is good, actually, and the people who argue otherwise are inevitably the ones who want to erode their competitor's tribe to the advantage of their own. No one who complains about tribalism will ever go out to find common ground with their political enemies, or seek to find understanding of why they believe what they believe.
@@SomeGuy-ty7kr People tend to recognize tribalism in other people but not in themselves. We humans are very good at rationalizing our behavior in terms that make us feel virtuous and smart, so we're often quicker to think "that team bad" than we are to figure out why we believe what we believe, and why people who believe something different, or even incompatible, need not be stupid, wrong, or evil. It's easy to blame their tribe. That said, I'm not sure tribalism is either good or bad. Many people think religion is bad tribalism, and point to the millions killed in religious wars as proof. But Catholic Charities and the Salvation Army are also religious tribes. Tribalism is inevitable. Politics has been, and will always be, tribal. That's why it's important to recognize that our problem isn't tribalism, per se, but that we've lost so much trust in each other and in civic institutions that trust in our tribe is all we have left. That's not good.
@@livemusicfannc It's always been true that most people lack the time and/or background to verify most facts. That's why trust is important. When we can't decide for ourselves, who are we going to trust to decide for us? If your answer is, "That radio talk show host on the MAGA right." or "That social justice warrior on the woke left." you're probably going to be wrong about many things that have nothing to do with populism or social justice.
3:08 This lady has "what do the numbers mean, Mason?" energy. "I don't think they mean anything to me. I need to see the product on the rash myself." I actually tried to see it from her perspective for a second, to give her the benefit of the doubt, and I couldn't.
This way of thinking is why people fall for homeopathic remedies. If you put an onion in your sock and the flu goes away, that must mean the onion made the flu go away.
This lady is just an example of a very common type of person who makes almost all of their day-to-day decisions based on social influences. They often describe themselves as "very sensitive" and "intuitive", and whether or not they realize it, they are extremely suggestible. A person like this would be presented with a sample of a skin care cream by a salesperson and decide based on how they felt about the person selling it. Or they would have a friend with them who would voice an opinion about it (or mentioned it in a past conversation), and they would be influenced that way. I personally find it extremely frustrating to be around people like this, but on the other hand, as Derek brings up, it can be thought of as rational behaviour within the context of human evolution and psychology. By taking your cues from people whose advice has worked out well, you keep a source of good decisions, while also endearing yourself to a person who is likely to be in a socially dominant position.
@@waylandsmith On a very side note, this is what frustrates me when people (usually psychologists) talk about high-functioning autistic people. This lady is the opposite of autistic, and yet she is exactly what people claim autistic people to be: lacking abstract thinking. This lady cannot abstract at all; she just *feels*. Almost every high-functioning autistic person I know is very good with abstractions and couldn't care less whether a seller is warm towards them. The whole video got me thinking a lot about divergent people in general. We tend to see things in a wildly different way, to be more curious, and to care less about fitting in than about the raw data [well, mostly].
Just a reminder: we know very little about that woman. So be open to her view and her personal approach to life. Not trusting abstract numbers may seem like resistance, but it is probably also the result of life experiences.
Why did you label it 'crime improved' and 'crime worsened'? That seems so much more ambiguous than 'increased' and 'decreased'.
Bro, my intuition went straight to comparing the proportions. Picking the number just because it's the biggest one isn't intuitive at all. It's called guessing without thinking, cuz bigger is generally better
The "higher positives" winning the argument (ie Poker maths) is kinda cultural thing I think that if they did this exact same experiment somewhere else like in China or Japan for example - I'd expect a different result
Humankind is very alike, but humans are not. What bums me the most is that we put individual humans at the forefront as reasons to separate humankind, instead of respecting each other's differences and learning from each other. We take information from social media that we align with the most, without thinking about the consequences or about what others are thinking, and act upon it. There was a time in politics when, although we had our human political differences, we would work together to get through things because we're all part of humankind. People were seen as decent human beings, members of society, and respected.
I think it's very important how you phrase the question. "Based on these numbers did gun control increase crime or decrease crime?" is very different from just showing the numbers and then asking if gun control is effective. You can agree that the data leads you to a certain conclusion but still disagree with the methods with which the data was gathered or presented.
"Increased crime" is indeed ambiguous in these sorts of things because, when you make something illegal, there will of course be more crime, because something was just made illegal
Yes, watching the video, it did not seem that he asked in a very standardized way. And one person did not even really seem to consider his data sheet. But then again, I doubt they really wanted to reproduce the result as they said (they did not show their own statistics). They just needed some people to say weird stuff to fill the gaps in the video and make it more entertaining. I am sure they were more careful in the paper.
And if you make gun ownership illegal, you are going to get people committing the crime of owning a gun, because it is now illegal. In the UK for example, there are about 6000-7000 recorded offences every year related to gun ownership which would be eliminated if gun ownership were made legal. But the 29 gun homicides per year would likely increase, to somewhere nearer 10,000 if it ends up being anything like the USA.
This. When I realized that the study was about politics, it became far more apparent that the wording was aimed at leading rather than observing. At that point it had nothing to do with "intelligent people do worse than unintelligent people" and everything to do with "political bias taints how we interpret data", which includes the way the data was obtained for this study. It's worthless when you lead people towards or against their biases instead of asking them to make a rational observation based on the data and numbers in front of them, irrespective of their political biases. I'm pretty sure there is a rule about not doing this in your data collection methods when it comes to peer-reviewed studies.
I'd say it's implied in this hypothetical situation that gun control is the only factor in the increase/decrease of crimes. The fact that you need to talk about phrasing and justifying the answer for the "gun control" case IS the point of the video, because if they asked you about the "skin cream" case, you would just accept that it's a simplified problem and directly answer whether it helped or not. And a perfectly rational person would answer both questions the same way.
Maybe I misunderstood, but the study showed more that if you present the data in a bad way, people don't look into it much unless it conflicts with their worldview.
Yes, I doubt there would be any worldview difference if you gave the question on a math test. In this case most people just assumed the data being shown wasn't worth much. Context matters: on the street, people naturally assume you are asking about the topic, and usually that you want to make a point.
Aren't people with higher numeracy more likely to have opinions formed from previously investigating data on the most hot-button topics? How do you control for that in this experiment?
They should still correctly interpret the numbers from the single study, as opposed to relying on previous information to assume what the numbers state. Now, should one study change a person's opinion on a topic they've previously researched? Probably not, but numbers are numbers.
They explicitly told people the data was made up, and to interpret the numbers. Any prior data should have been irrelevant because all they needed to know was in front of them.
@@michajozwiak7650 What do you mean? It made people of all levels more biased based on their politics; their actual intelligence wasn't related to the trends when it was a partisan issue.
No. They are saying the smarter you are the better you are at manipulating the numbers to see what you believe. It’s a surprising conclusion that I’ve heard before- smarter people are more ideological because they are better at manipulating data to “prove” their ideology.
10:40 is a somewhat dangerous suggestion, as it is the classical "let's deal with the symptom first and the cause later" (in most cases this means never; with the example of climate change, it is irreversible)... Imagine a doctor just giving you painkillers repeatedly to deal with your knee pain, instead of actually fixing it (if possible) while giving you some painkillers to "survive" the operation.
Yes, I immediately thought of that. You are not solving the problem by doing this. It's like mopping up the water from a leak instead of fixing the leak itself.
I think they realized no one cares about solving the root cause, so at least they work on mitigation. Better than nothing. Same with health care: people don't want to stop smoking or start exercising.
Whenever I see poll data of any kind, I mostly wonder about the questions asked, how they're asked, etc. I see so many studies where I can predict "unexpected" outcomes based on poorly worded questions, multiple interpretations of choices, whether it was multiple choice or open-ended, etc. There's one other thing I want to know specifically for the political version of the question. As the skin cream is made up, no one could have read a previous study on that exact skin cream. But some people could have read existing studies on this exact political question, and I don't mean "read a random article online": I want to know whether having read a peer-reviewed paper on this topic makes any difference to the outcome.
A big one is where they compare countries. I saw one where they measured agreement with the statement "I trust most people, in general" and the results were completely messed up by the translation, because other languages express it using words that are stronger/weaker
But previous data wouldn't have changed the data presented here, so it's just another source of bias. The participants were presented numbers in a vacuum and couldn't keep them in a vacuum before answering; that's the point.
@@GreenOnionBrother at least for the video that's not really the point, since its second half implies this matters because it is representative of the way people answer things outside of a vacuum. Also, just because you say "please consider in a vacuum", doesn't mean people can simply shut down their beliefs. And who knows, one might also argue that the difference between cream and politics is actually not down to "bias" so much as it is about having or not having context.
@@user-sl6gn1ss8p I don't know what to tell you. They were presented numbers that only allow one conclusion, but due to bias (and this includes previous studies and statistics they've read, regardless of their validity) they failed to answer a question they would otherwise have less of a problem with. That is the point: how bias affects our rational thinking.
@@GreenOnionBrother my point is just that it was not clear to me, from the video, how well this could disentangle "bias" from "rational thinking", as in, how much can this actually show that people might make worse decisions in this sense due to their bias - the "tangle" being the fact that this bias may include, as you said, perfectly valid information. To be clear, I'm not disputing that the effect exists - I'm just not sold that the study shows that much, going by the video.
The data used in this imaginary experiment was deliberately chosen to trigger spontaneous answers. Firstly, very different sample sizes were chosen for the experimental and control groups, making a direct comparison difficult. Secondly, a very strong difference between the two results (improved/deteriorated) was chosen in both groups, and prime numbers (223 and 107) were even used on the left-hand side to discourage participants from accurately calculating the ratios in their heads. They also only give two possible answers (whether the experiment showed that the skin condition of people treated with the cream was more likely to "get better" or "get worse"), which may lead participants to think that the table must show a clear difference between the two groups. I also found no information on how much time the participants had to answer the question... However, a look at the statistics shows that it is debatable whether there is a significant difference between the groups at all, as a normal chi-square test gives a p-value of p = 0.0472, and with the Yates correction, p > 0.05. The entire experimental setup is also vague, controls are missing, etc. So if I had to answer this question, I would say that the experiment and data are not good enough to make a statement about the cream's effect on skin condition (even looking at the results without a statistical test). As a side note, please never use the word "significant" when you don't show the results of statistical tests, like at 1:42. I'd like to see what the trial results look like when you use more "normal/scientific" data.
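If anyone wants to check those p-values, here is a quick Python sketch with scipy (a back-of-the-envelope check, assuming the 223/75 and 107/21 counts from the table in the video):

from scipy.stats import chi2_contingency

# 2x2 table: rows = cream / no cream, columns = improved / worsened
table = [[223, 75],
         [107, 21]]

# Standard chi-square test of independence, no continuity correction
chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(round(p, 4))  # ~0.0472, just under the usual 0.05 cutoff

# Same test with the Yates continuity correction
chi2_c, p_c, _, _ = chi2_contingency(table, correction=True)
print(round(p_c, 4))  # ~0.063, no longer below 0.05

Either way, the "significant" difference is borderline at best, which is exactly my point.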
@@meneldal Although it is not good scientific practice, it does occur quite frequently. However, the use of suitable statistical tests and careful interpretation are particularly important in cases like that.
My first thought was about the p-value as well (before I realized it was American propaganda). I would have answered "I don't know, the numbers don't mean anything", and I think actual scientists who work with data would give the same answer. I don't know who these "high numeracy" people are, but the correct answer is idk.
You just sent me down an hour of refresher on stats, and I've come out confused as to what you're complaining about. A p-value of 0.0472 would generally be considered significant, since 0.0472 < 0.05. Also, from what I've refreshed on, the Yates correction is a solution for when you have very small expected frequencies. For one, that doesn't apply to this example. For two, people generally recommend not using it anyway, as it introduces different types of errors into the data. This really reads like you didn't like what the video presented, went and did the stats, then ignored the data in favor of justifying your preconceived bias. Which is amazingly on topic.
I love that this took an anthropological spin. Acknowledging your bias is a very difficult task, but important to keep at the forefront of your mind when analyzing a new information. Experiments like this demonstrate it beautifully. It never hurts to ask yourself, how does my culture/society shape my worldview?
Since I have a background in clinical trials and experimental design, I got fixated on why the groups weren't equal size (since the people or cities were supposed to be assigned randomly, why would you end up with two different sized groups?). I would have just ended up questioning why the research problem was trying to fool me. Does that make me more or less rational?
Yeah, also one of these is a controlled randomized experiment and one of these is a study. People are well within their rights to question the methodology and impose their own interpretation of the correlation on the study group as the correlation doesn't necessarily imply causation.
With cities it would be natural to have different sample sizes - the number of cities matching criteria is limited and there's a lot of "noise". You're not conducting an experiment here - you're trying to gather observations and you grab whatever you can. With cream, however, you're definitely right - the obvious thing to do would be to get equal sized groups. You COULD end up with some differences due to people dropping out of the experiment for some reason (it did last some time and stuff happens) but it shouldn't make one group several times larger than the other.
I also noticed that the two groups were not the same size, and I was wondering if there was some third option (not shown in the data). In any case, it was automatic to think about proportions to "normalize" the data. I am also a scientist, and I was also questioning the experimental design. Since one group is getting a treatment and the other is getting nothing, there is no consideration for the placebo effect. Furthermore, the study is not properly blinded (you know that if you are receiving the cream, you are in the treatment group).
I think you were overanalyzing the problem due to expecting to be wrong. I did the same for a short while, before just giving up and going with the conclusion that the skin cream seemed to make rashes worse on average _and_ that this was wrong somehow. In other words, if the problem was presented in another way (e.g. you were just shown the statistical results without any prelude) you would perhaps reach a different conclusion pretty quickly. Although, questioning the study might put you in the "science curious" category. Or perhaps a more dangerous "science skeptic" category :I
A great lesson in humanity is that at our core we are extraordinarily similar. Getting to know the "opposite side" is usually enough to at least start a process of introspection.
There’s a few different meanings of the word “rational”. One is using logic and facts to come to a consistent conclusion based on truth. This is the usual definition we think of. This could be called philosophical rationality. The other comes from economic literature. A rational human in the economic sense is one who pursues their self interest and doesn’t sabotage themself. I think there are many more people who are rational economically than are rational philosophically. I think it can be important to remember that even if someone is supporting something that isn’t 100% true, they’re just trying to act in their best self interest.
And it's worth noting that both types of rational thinking are vulnerable to Confirmation Bias, because the human brain is very, very good at discarding information that goes against our pre-established world views.
There's a problem with how you formulated rationality in the economic sense: under that formulation everyone will be considered rational, because everyone pursues what they believe is their self-interest. At that point, the term loses usefulness. Being rational in the economic sense should instead mean pursuing what is objectively in one's self-interest, rather than subjectively, because that's where most people fall short. That's the hard part, because it requires perfectly aligning oneself with actual objective reality and actual truth, which in practice is impossible to do perfectly: the world is too chaotic, and applying precise Bayesian reasoning to every little thing is computationally explosive. I find it useful to split the meaning of the word rationality the way Yudkowsky does: 1. Epistemic rationality: systematically improving the accuracy of your beliefs (truth). 2. Instrumental rationality: systematically achieving your values (winning). And I love the way Vervaeke puts it: in practice, rationality is knowing when to use logic.
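To make the "Bayesian reasoning" above concrete, here's a minimal single-update sketch. All numbers are invented for illustration; the point is that a well-supported belief should shift, not flip, when one contrary study shows up.

```python
# One Bayesian update: posterior odds = prior odds * likelihood ratio.
# All numbers are invented for illustration.
prior = 0.90  # assumed confidence in a belief before seeing the study
lr = 0.25     # assumed likelihood ratio: this result is 4x more likely
              # in a world where the belief is false

prior_odds = prior / (1 - prior)            # 9.0
posterior_odds = prior_odds * lr            # 2.25
posterior = posterior_odds / (1 + posterior_odds)

print(f"belief: {prior:.0%} -> {posterior:.0%}")  # 90% -> 69%: shifted, not flipped
```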
Title should be something more like, "People with the capacity to analytically consider data may take shortcuts to their preferred conclusion." But... I'm guessing that would suck for SEO purposes.
@@wax2562 I mean... is it? Are people with higher numeracy scores "smarter" than everyone else? And when the title says "these questions," it pretty clearly suggests specific questions which disproportionately confound "smart people," rather than abstractions designed to elicit an emotional/tribal response, where "smart people" answer the question with accuracy similar to that of not-"smart people."
This is the exact point where an "isolated study" becomes something that is NOT taken as an "isolated study". You can agree that the study says something, and disagree that the study is correct or was done correctly, which is what I think happened here. Especially with the method of the study changing from lab experiments to crime statistics: the former is a highly controlled environment with low margins of error, while the latter is marred by statistical error margins that often eclipse the sample size by a sizable factor. With that, people are often biased to draw from intuition and prior experience instead of the currently provided information, as it is deemed "useless", which is likely what happened here. (The actual information provided to the people hearing about the study was useless, as analysis of crime statistics requires a LOT more than a "got better" and "got worse" graph.)
The two being treated as equivalent is such SOP in the soft sciences. The overall conclusion that politics (or any strongly held belief) can cause people to irrationally reject data was fine. Talk to someone about their favorite sports team/star. And, yes, today politics is a team sport with equally stupid fans.
@@barongerhardt It's not a fine conclusion at all, because people with real data about a real thing are being presented with fake data in the most unreliable way possible, and the study is assuming that will not have an effect on the way people answer. People don't know any data about skin cremes and those are presented as controlled laboratory studies. People DO know things about political topics, and they're being presented data as uncontrolled, unvetted statistics. It's not difficult to see how that could completely skew the data that they are attempting to treat as a fixed variable. Completely different types of data, completely different contexts, and then a causation is being provided based on the faulty correlation. All huge red flags of a bad study with bad conclusions.
@barongerhardt but is this study actually showing those results? I'd argue that this study shows no real data because the hand cream example is not a proper control group for this experiment due to the factors I mentioned in my original comment.
I would say there are two scenarios to consider: 1. The cream does indeed make most people better, yet it contains irritating substances which might trigger immune responses in the skin of certain people, worsening their rashes. 2. It worsens the rashes of all people, hence the ratio of people who got better to those whose conditions worsened is lower than that of people who didn't treat their rashes with the cream. Conclusion: no conclusion can be made, as insufficient information is given.
You can see a negative correlation between the use of the skin cream and the condition of the rashes, however. So I suppose, out of those two answers, "worse" would be the better option.
That one YT commenter, happy to see another person failing (fun fact: she's not), therefore thinking she's dumber than himself... I have barely any doubt: such a comment shows a low degree of empathy and a high degree of narcissism. Therefore: pathetic narcissistic reee spotted.
@@JJzerro The study was really about being able to read data and extrapolate a meaning, and she did her best to ignore the data. She was the type of person who spreads articles that agree with her narrative and politely declines any other data. They are told specifically that the numbers are fake, not real, so the numbers are all you have. Knowing that it is not real, what would the implications be if it were? And she clearly could not make that determination, because she questions the depth of everything too much. Some people are just not numbers people.
How exactly do you need additional info for a math question? The whole point is that it's about the numbers and the context (skin cream or gun things) just adds flavor.
The questions are not equal. The problem is whether the questions exist in a vacuum or not. We have no prior knowledge of the skin cream and do not expect any bias. The gun control question has already been researched by both sides with their particular biases. So this just becomes one more study in a sea where bias is rampant. If it contradicts what you already 'know', then you are going to question the study, the sources, the funding, etc. You expect the study to be biased.
Which does not make it impossible to analyze data objectively. After that, you can either check for mistakes in the experiment that gave you the numbers, or you just learned something new. Science works, but only if you can stomach being wrong.
@@erumaaro6060 Sometimes it's just hard to know you are wrong. Like the video shows, we all have subconscious biases that are hard to keep in check. I always try to do this: when I am looking into data that's connected to something I am emotionally attached to, I try to look at it twice or three times, but in a different context, as if I were not me.
They were presented similarly to the people being interviewed; the video demonstration for us just didn't go in depth in the second case, to avoid exact repetition.
Who do you believe more: the "Acshually 🤓" guy with fake numbers, or yourself? Smart people know that if your gut feeling goes against the data, something must be wrong with the data, because it was made up by much stupider people. Misleading data is everywhere. In this particular case they were right yet again, because they spotted that the data was fake, which is true.
Literally why "trust but verify" should be a mantra of anyone's life. But we can't have that, otherwise people will start questioning things they're 'not supposed to'...
@@Sauvva_ One step is all we need, and "it" is the main thing you are not allowed to question; it is literally illegal to express doubt about it in my country. (Not really enforced now, but still on the books.)
It's so interesting that they have successfully divided us almost perfectly 50-50 and convinced each group that if the other party wins, our country is over. We are all stuck in this frame of mind and all closed off to outside ideas. I once experienced bias in myself. It was related to equipment in a niche field of extreme sports. It took me 2 years to open myself to the idea that I was wrong about a product and the manufacturer, which I had worked with for a while. When I finally overcame that mental block it honestly rocked my world and made me question what other areas I had these same biases.
From a game theory perspective it does make a lot of sense that it would be so 50-50 and polarizing, ideas that we disagree on are more likely to inflame into bigger issues, and each side reworks their beliefs to be more palatable to the masses only when they're behind in the polls. We're unfortunately stuck in a system that basically guarantees that there will be polarization between two roughly equally powerful parties. The issues may change, but the polarization itself is eternal
Considering one party has already made one attempt to violently overthrow the American government, it seems like this time it's actually true that if one side gets elected, it will destroy American democracy. Not everything is rhetoric.
Who is "they". There is only one rational side left, there are still irrational people on it but the side itself is much more rational compared to the other side. And its not the one that has 75% of respondents denying human involvement in causing climate change.
I think another issue that isn't often addressed is that data is often gathered with the conclusion already decided, instead of drawing conclusions based on the data. That's why, whenever people hear something that doesn't reinforce their beliefs, they dismiss it or try to "logic" it into their view. If the studies or surveys (whether by governments or private interests) didn't cherry-pick or manipulate data, and the conclusion was, to the best of their ability, an unbiased interpretation of reality and not what they want it to be, I think some of this would be different. But currently, nobody trusts anything and will always try to make reality fit their beliefs.
As social creatures, it is easy to become misled by those around us for sure. Logical thinking is so hard to come by now, but these videos definitely help in spreading the word.
1:06 My answer is NO, it was worse but within error. 33% vs 19%, with roughly a 10% margin for error; the total sample size is not labeled; a third group (no change) is not labeled. Not enough data is presented to conclude a study. The answer at best is a red-font "Inconclusive" or a red-font "no change"; that is the correct answer. After skimming the gist of the rest of the video, I gotta accuse you of using a clickbait title, because nothing here is unexpected.
You're supposed to assume that those are the total samples and they all either improved or worsened. Those are fictitious numbers after all. Knowing this, I don't know how to calculate if p
I think a big difference between the cream example and the gun control example is how abstract the question is framed. For the cream it is just a maths problem, but for gun control it is a question where there are preconceived notions that prevent the person from interpreting the data, or even really looking at it, expecting the data to reflect their personal beliefs.
I was looking for this. This seems like another poorly designed study. Participants answering the skin cream question interpreted it as a math question, and were evaluated as if it was a math question. Participants answering the gun control question interpreted it as a social policy question, and were evaluated as if it was a math question. I read through the study. There is minimal effort to inform participants that they're evaluating just the data presented, and the choices they're given are phrased as "cities that enacted a ban on carrying concealed handguns were more likely to have a decrease in crime" rather than "this data supports..." The cream "control" question is also a very poor control. Rather than being simply apolitical, it's presented as entirely hypothetical, in expected math test fashion, rather than being a real world question like the gun control question.
"Personal beliefs" here also can just be "correctly being aware of scientific consensus". It doesn't matter how many fake studies you show me data from to explain that climate change isn't real. My answer will be that climate change is real, because that's just a fact.
The very first thing I did when seeing the skin cream chart was pause the video before he gave any more information away and then I looked at the ratios and determined that the skin cream wasn't working. When you're asked a theoretical question about some studies, I don't understand why the topic affects how you approach the data! 😂 Yet here we are! Haha
The most interesting thing to me was the graphs from the other study he showed, about how numeracy etc. affects views: conservative Republicans were typically much closer to a flat line in their viewpoint on a subject regardless of their numeracy score, while the increase in division seemed to come from high-numeracy liberal Democrats thinking much differently than low-numeracy liberal Democrats and conservative Republicans.
@@GarrettBShaw I think the issue is this: imagine I give you data on the temperature in a specific place over the years, during roughly the same time of year, and the data shows that the temperature has not risen at all over the last 10 years (I assume you can find such a place), and then I ask you "Is climate change real?" instead of "Does this graph support the claim of climate change?" The second question is an obvious no, the first an obvious yes. I proved absolutely nothing about your literacy in interpreting graphs by asking you the first question. Did politics (or rather scientific literacy) affect my choice for question 1? Obviously; it's a stupid question to give me random cherry-picked data from somewhere and try to disprove a huge chunk of research. Maybe there is a random district in a random city where no gun control reduced crime. I obviously concede it did in that case, but it's not going to take one study I have no clue about to change my opinion. The phrasing was absolute BS.
TL;DW: People are tribalistic regarding political issues, and expected levels of rationality become compromised when something related to current partisan talking points is mentioned.
It doesn't help that quite often the data is manipulated for political purposes. The smarter people are justified in not taking it at face value. Their views are formed using a wider frame, not just data from a source they don't trust.
Sure, but that's not the point. They are asking what the data shows, not whether the data is valid, whether you believe what it shows, or what your personal opinion is. None of those matter.
@jaro6985 It shouldn't matter if it were a question on an exam. But this isn't one; in normal life you would use information other than what's immediately in front of you.
@11:18 The problem is that that is purely reactive policy/action, which is the most expensive kind of action. To be proactive on something, you have to have agreement on causes and work to fix those. We've forgotten the old sayings "a stitch in time saves nine" and "an ounce of prevention is worth a pound of cure". We see the same thing these days with the Y2K issues we had: since a bunch of companies spent massive amounts of time and money *proactively* fixing things so that "nothing happened", today we have a bunch of people thinking Y2K was all just a big hoax. And that kind of thinking has continued to everything else. We won't be proactive on anything; we'll just reactively deal with the symptoms rather than proactively try to address the causes.
The other side of this issue is that to come up with the solution to the cause, i.e., the thing you do to fix the problem, you have to be sure of the cause itself. Part of that is understanding that if A causes B, then we can use the existence of A to predict B. Unfortunately, when it comes to some issues (like climate change), the predictions are faulty - illustrating that we actually do not know the cause. Why would I spend money to fix this problem you identified here if you can't show me it will actually solve the issue you claim it is solving? To put this in a non-political example, if I say my car won't start, telling me to put air in the tires won't do me a lot of good - even if the tires are flat!
@@commentinglife6175 I suggest you look at climate change prediction more. Even the predictions that Exxon made (and hid) back in the 1970s are reasonably accurate.
We need a combination of short term and long term problem solving. The issue is that long term work requires trust (which we're lacking in the U.S.). But successful cooperation on short term problems is a good way to foster the trust needed to tackle the long term stuff.
That said, I do think that confirmation bias is a real thing, and if people see data that is consistent with their ideology, they are less likely to question it.
you can try to prove it as much as you want with curves and graphs and stats... it won't stop this squirrel drama from making Trump to win the presidential election 🐿
it depends if its gonna help me with general knowledge for my life
I have an unfounded belief that asking people a question on the street with a camera instantly deducts 30 IQ points
That's also worth a street study, I'd say!
I'd have said I need to quietly ponder the problem before making any judgment. (Because I want to give an accurate judgment.)
Yes. Also, when you use words like "improved" or "worsened" you create supposition by default.
I'm pretty sure a lot of the participants also answered what they believed over mathing it out so they wouldn't be edited out of context on camera. Like, they don't want to become a participant to a "Look at all these X voters who actually disagree with X policy compilation" video
I have a strong feeling that the lady at 3:03 was naturally living with a constant 30 IQ points deficit.
As a math major, I’m disappointed. I was so ready to get it wrong then it was just… proportions. And then it was about politics and I wanted to cry lol
Politics is not even supposed to matter; it's simple math.
Not to mention the fact that the sample sizes were drastically different. They used more than twice as many for people who used real skin cream, dumb
@@milesonyoutube8222 exactlyyy
@@milesonyoutube8222 I thought so too, until I realised the rest of the people probably had no change.
Yeah totally retarded video. They probably shouldn't speak about smart vs less smart ppl when they're at this lvl themselves
"You are *not* immune to propaganda"
- garfield
But that one guy from twitter said I am!
@@TheOneWhoSometimesSaysOk its called X, mkaay?
no u
Garfield is the reason why I hate Mondays and love lasagna
@@mariocuric6690 Nobody calls it X, lmfao.
Julius Caesar: “Men willingly believe what they wish.”
Seneca: “People believe that which they wish to be true.”
Francis Bacon: “Man prefers to believe what he prefers to be true.”
George Orwell: “People can only see what they are prepared to see.”
Disturbing progression, but apparently true. It's getting scary.
@@richyfoster7694 Not so much of a progression as just different ways of looking at the same unfortunate truth.
@@richyfoster7694I don't see any progression. It seems to me more like an uncomfortable fact on human psychology that has been true since ancient times.
Paul Simon:
"All lies and jest, still a man hears what he wants to hear, And disregards the rest"
[lyric, The Boxer]
Rationality is not a character trait, it's a process. If you fool yourself into believing that you're rational by default, you open yourself up to the most irrational thinking.
And, on the flip side, if you assume the "scientist" is smarter than you, you are also more likely to fall for the "Appeal to expert" fallacy and go, "But so and so in a white lab coat said it so it must be true!"
@@commentinglife6175that's not a flip side, it's the same thing. scientific reasoning like that is why peer review and replicability are core to the scientific method. most scientists acknowledge that scientists are fallible as well
This is only the case if you're wrong about being rational by default.
@@plwadodveeefdv It all comes back to confirmation bias yeah
@@plwadodveeefdv If I have learned anything, Scientist are as likely, if not more likely to make really terrible assertions based on bias. They also often let bias affect their ability to see the big picture when looking at statistics. Gun crime for example: Almost all of it is from gang violence or s**cide but lab coats will continuously preach that removing guns fixes the problem but ignoring the fact the tools of the trade just shift.
Very subtle and clever wording in the title. I filled in the gap myself with "Smarter people do worse than dumber people", when in reality it was "Smarter people do worse than they are expected to"
Crazy, I did too; probably an ego-driven reaction.
Well, if you write a statement like "Group A does worse", is that not objectively equivalent with "Group A does worse than Group {Not A}"? Certainly "Group A does worse than it is expected to" is not the most plausible interpretation.
Yes, very clever, like a pop science magazine (or channel) wanting to get more clicks
It's not "clever", it's just deliberately misleading. Same as the rest of the video, where the questions shown on screen don't match the ones they showed to people (which should be an immediate red flag about the honesty of this "study"), and where they count "the number of cities" instead of the number of _crimes._
Same, it's misleading
I think a big part of why people don't change their views when presented with evidence is that evidence needs a level of trust, most things we disagree about aren't things we can easily see ourselves, we have to trust whoever is collecting the data.
And we easily distrust anyone that says something that goes against our beliefs.
I'd also add that it is easier than ever to mistrust the data simply because the capabilities of the researcher are so expanded today. An economics professor I listen to makes this point very clearly when he talks about how easy it is to slice and dice the data and run multiple regressions over and over. In a way, having a more limited tool set provided a sense of "comfort" simply because it was so time consuming to re-run the calculations, we could assume someone was reporting their initial results rather than their fifth or sixth time now that the results matched what they wanted to report.
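A rough sketch of why that matters: if each test has a 5% false-positive rate and the re-run analyses are treated as independent (a simplifying assumption, but it shows the shape of the effect), the chance of at least one spurious "hit" climbs quickly with the number of attempts.

```python
# Chance of at least one false positive across k independent tests at alpha = 0.05.
alpha = 0.05
for k in (1, 5, 20, 100):
    p_any = 1 - (1 - alpha) ** k
    print(f"{k:>3} regressions -> {p_any:.0%} chance of a spurious 'finding'")
# 1 -> 5%, 5 -> 23%, 20 -> 64%, 100 -> 99%
```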
For the gun control example, I had a discussion with a European buddy who was talking about how Europe has low gun crime rates.
Like, what a surprise. It doesn't matter that acid attacks, stabbings, assault, etc. are on the rise, they don't have gun crime!
So if you see a statistic like that, it's important to look at the trends from before the change happened and at what is actually being measured. If you're saying gun control reduced gun crime because there are zero guns, you're not actually measuring total crime trends, which could otherwise be rising.
"most things we disagree about aren't things we can easily see ourselves"
I don't think so. Take climate change for example. Within my own life span, I myself could clearly see the massive shift in weather patterns, and I am a city guy. Those like farmers, whose work heavily depends on weather, would be even more likely to notice the shift. Or, like with gun control, I don't believe anyone would actually have a hard time seeing the ridiculousness in things like "without guns, a criminal would just be as deadly with a knife."
More likely, it starts from what people gain or lose with such a conclusion. A guy who's fond of guns would more likely not want to believe in the benefits of gun control, and then he'd just fuel that narrative with "reasons" that fit, to the point it becomes a "belief".
I think this is definitely an aspect of it. But also changing your position in a political matter means changing your identity. And most aren’t willing to make that trade.
Everything that's wrong with the world right now in one paragraph
Ground News hit the frickin' jackpot over here. This is literally 15 minutes on why we are inherently biased by nature and have to actively try not to be biased, and thus how critically relevant viewing data and news from both sides is. I imagine you calling them up and, midway through your first sentence, them shouting: Yes, we would LOVE to sponsor you!
What struck me the most about listening to people defend their answer to the gun control version of the question was that they were not engaging with the numbers at all. All of the arguments they made were about whether it would 'make sense' for gun control to work/not work. Numeracy doesn't matter if you're not looking at the numbers.
This is actually the biggest problem I have with my students. Not in terms of political bias, but the way they don't engage with the facts presented in a problem and instead reason based on what they would like to be true. It's incredibly frustrating to deal with.
Or the guy defending gun control with "you can't make it illegal and then take it away.... like drugs" 🤣🤣🤣
That's very true, actually: in light of hard evidence, we tend to rationalize against it if it doesn't align with our political views, or readily accept it if it does!
A good reason is given about why this happens in the video but I'd like to add that these political beliefs become a part of our identity and questioning it is like working against yourself and most people don't like doing that!
The problem in this example results from presenting data that goes against any reasonable view one could take. Gun control _works._ That is clearly evident by just looking at all other culturally western democracies, where _none_ have the same issues as the US, and in _none_ are guns as easy to access. This specific example therefore imo doesn't work: if you present a proponent of gun control with data that lacks any context as to why a ridiculous outcome has resulted from gun control, as in gun crime going up, then they will rightly find issue with this. Especially if you frame it in a general manner ("gun control") but specifically mean only one measure ("concealed carry").
The hypothetical should be somewhat believable if you want people to intellectually engage with it.
Your experience seems to be represented in this video pretty accurately. The woman who saw the apolitical version and asked what kind of skin cream it was also wasn’t engaging with the numbers
That's the thing: these kinds of students get so focused on their hypothesis that, when they conduct a study/paper and then see the data, they tend to skew the results to support their hypothesis.
Veritasium: A study on skin cream is “apolitical”
USA: hold my beer.
There is a problem with every study. IF it is shown that smart people are selective in their choice of facts, then it means that scientists are also selective in their choice of facts. And this follows, because data analysis is not only numerical analysis: not ALL variables are observed; scientists choose which variables to observe, and sometimes even select which data to use, based on these beliefs.
Now, as long as the topic is "apolitical" this usually works fine, although bias is still a huge issue. But when researchers are paid based on the results, or the topic is political, the research is affected, even to the level of fraud. Sabine Hossenfelder recently had some nice videos about the current state of science.
All this causes people to distrust ALL research, as new myths are created on a daily basis, or people are served ideas based on a tiny fraction of the truth. Like: if the EU limits emissions of CO2 then the world will be saved, even though the rest of the world emits so much more.
@@OldSkullSoldier I don't think you get the joke buddy, he means that America is political and the quotes say 'apolitical'
@@OldSkullSoldier I mostly worry about the systemic problems with financing, really. Otherwise, it is well known that each single study might be biased. At least in the natural sciences that is not a big problem: even if a model of little use is popular, eventually a more useful model (AKA "the truth") will win out.
"hold ny gun"
A study on skin cream is sooooo racist OMG! How can you not see it? You are not woke enough. Go take the green pill.
Smart people doing worse? Finally, my moment to shine!
Political people are not smart
Smart comment!
Made me laugh, thanks friend.
😂
I can say you are a boy just by your name and pic
I don't think I have ever heard a better advertisement for Ground News, I hope they are paying you well.
I've avoided checking them out till now because I can't believe anyone is unbiased, but your example was rather compelling.
Don't know why, but I noticed that simply switching from a medication test to gun control made the question significantly harder to parse for me. Like, the question just suddenly felt much more difficult.
I wonder if it’s because when learning about probability, you constantly learn about medication tests, control groups, light vs heavy symptoms, etc. So it becomes second nature to think of all the edge cases and possibilities when looking at a medication trial test. But for gun control, no textbooks ever use that as an example, so our brain can’t go into math mode.
Same here. I think it was partly because the numbers on the first board represented individual people, while the second board represented things happening in 8 cities. I had to forget about the 8 cities, and pretend each number was representing an individual city, then it felt more like the first board.
Hah ever wonder how Trump got re-elected, this is how. People can't read data and just make stupid decisions based on what "feels right" in the moment. "Durr hurr something something something economy, me vote red hat..."
For me it became harder to parse partially because I automatically distrust any statistics found in a political setting. There're so many ways to manipulate the data to make your side of the argument look good. What if gun control decreased crime because implementation of gun control laws tended to be paired with a strengthening of the police force in the area?
Correlation is not causation, but determining what's actually causing the problem is a multifaceted issue that doesn't make for good political campaigns.
That's just y'all fr, I could do it just fine
I think if it was vaccines, it would be easier to get right.
6:05 - "crime improved" and "crime worsened" is a strange wording
Improved means decreased
@@LooperEpicwhat if I improved my criminal methods
@@LooperEpic yes, that's how it's usually worded, to avoid confusion.
@@LooperEpic it also means "people are getting better at doing crime"
"Turn up the air conditioner" / "Turn down the air conditioner"
6:04 Why did you label it "crime improved" and "crime worsened"? That seems so much more ambiguous than "increased" and "decreased"
To make it work for this silly video
Because study participants are famously so dumb that they don't know whether crime increasing is worse or better.
Only if you're a dork enough to think that more crime is an improvement
exactly what i was feeling
I no longer understand the table with those labels. You’re saying almost 300 cities outlawed guns? I don’t think there are even that many cities
The timing of this video is not lost on me. Thoroughly enjoyed it. I am definitely guilty of shunning ideas because they're not typically associated with my belief system, despite being a deeply curious person. I have been working on it, and will keep doing so in the future.
I used to do a dumb statistics joke back in 2006 based on official UK Department of Transport car accident figures from 2000. It went something like this.
20% of car accidents involved excessive speed. Which means that 80% of car accidents had appropriate speed.
20% of car accidents involved somebody who was over the blood alcohol limit. Therefore, 80% of car accidents were caused by sober people.
My conclusion? Driving at 100 mph while drunk is safest.
That is something I am missing here: how many people/cities were involved in the fictitious rash/gun study? Because the neutral participants for whom nothing changed are also important for the conclusion. In your analysis, it would be the ratio of sober/drunk people who drove without an accident.
Let's be real, this is a correlation vs. causation type of scenario. Looking at ratios in cases like these is useless.
We should test it, for science.
Nice joke😂!
michael bayes has entered the chat
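For what it's worth, the joke above works because the quoted percentages condition on accidents rather than on drivers; judging risk also requires the base rate of each behavior, which the punchline quietly drops. A minimal sketch with an invented exposure figure (the DfT numbers in the joke don't include one):

```python
# Relative accident risk per unit of driving, with an INVENTED exposure rate;
# the joke's figures only give the share of accidents, not the share of driving.
share_driving_drunk = 0.02     # assumed: 2% of driving is done drunk
share_accidents_drunk = 0.20   # from the joke: 20% of accidents involve drink

relative_risk = (share_accidents_drunk / share_driving_drunk) / (
    (1 - share_accidents_drunk) / (1 - share_driving_drunk)
)
print(f"drunk driving: ~{relative_risk:.0f}x the accident risk per mile")  # ~12x here
```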
8:30 "The numbers are actually not real whatsoever" is so hilarious 🤣🤣
So it’s even
Lol, I didn't understand that conversation between them.
@@prajwalshivgan2847 that person missed the point of the question completely because the presented 'results' were against his personal beliefs.
@@l4nd3r The thing is, if you're approaching random people in a political context, presumably a rally, and asking them these questions, they aren't going to interpret that as you asking them a math question; they're rationally going to interpret it as you asking about their beliefs on the issue.
@@hadhamalnam the difference being? Lol.
I wish you had presented the gun control question first, so the audience had a chance to fall for their political bias and self-reflect afterwards.
That could have been interesting actually
@@joedwyer3297 Agreed. I'm confused seeing others somehow get this effect when the video doesn't really allow for it. By doing the cream example first and then explaining the main point through the gun example, it's hard to tell how I would have originally felt.
I suspect they did it to avoid viewers turning it off in disgust because he went too "political".
Probably the team ran a pilot test to see whether putting that part first was safe or not, and concluded no.
Haha... self-reflection... If that existed en masse we wouldn't be in the situation we're in.
That's a video everyone should watch. Thank you for such good work!
If I was put on the spot I would’ve said, “I’m not one to make rash decisions”
😅😅😅
It's no skin off my nose.
When you change it to political, the distrust of the data skyrockets. All sides are used to being lied to. What this says about "science communication" is significant.
This was my intuition as well. If I was showed a chart with 4 numbers about gun control and crime I would be like I am not drawing any conclusions from these 4 numbers, please leave.
@@TheMustyrusty Unfortunately, for many people there is no such thing as a trustworthy source if it contradicts their current understanding. 4 or 400 numbers, the results wouldn't be too different.
Science communicators lie, too. PIs on studies with big funding wildly misrepresent their studies all the time.
@@agafabaWhich is why we have millions of people who firmly believe that Greenland is facing a threat from unprecedented high temperatures not seen in 10000 years.
@@toomanymarys7355 This comment would be hilarious if it wasn't so sad...
I work as a strategy analyst for a major retailer and on a weekly occurrence I have a suspicion or belief that something may save money, or be a better solution for the business. As I research and analyze data I am no longer surprised at how often I prove myself wrong and have quickly learned that data supporting for, against, or even no correlation for either are all equally valid answers. It’s not about getting the results I want but rather getting results period. I’ve applied this same practice to other aspects of my life and it has been very freeing (and humbling). Thanks for the video.
The ability to be humble and see our misconceptions when they are presented to us is really rare and really what we need more of right now
What was the most surprising thing that saved money or cost money?
I agree. Most often we want our ideas to be right, but verifying it against real data is necessary.
I would rather not disappoint people than lead them on with a fabrication.
One of the most important traits, and the most difficult one to learn as a Data Analyst, is being impartial. Not putting one's beliefs into an investigation, dashboard, or report is very important; learn what's actually behind the data, don't go in supporting your own idea of it, and don't manipulate the results to skew toward the outcome you want.
Derek, that solution you were talking about not having is right here.
You have an amazing way of explaining things and using examples in all of your videos to make it all make sense! Good stuff!
Yeah, fits in with my experience. This isn't a question of smart and dumb, it's a question of intellectual honesty. Everybody is always smart enough to fool themselves.
Absolutely on point
There is a YouTube video called "why smart people believe dumb things" that is exactly about that. Smart people are better at fooling themselves.
It’s the old “some ideas are so stupid only intellectuals believe them”.
(I have to warn you the video is “anti woke” but if you ignore the political implications it is still a good video and says the exact same thing you said in your comment).
No, the whole point of the study was that people better at complex thinking were better at deluding themselves contrary to data. It's harder to be intellectually dishonest when you don't have as many tricks at your disposal.
Could as well ask a group of dermatologists if daily washing is good for your skin; the priors formed by a multitude of studies will not just be overturned by a fictitious one.
Same. I kept waiting for some trickier math because it seemed too simple. But that wasn't the point at all it turns out.
“I have no data yet. It is a capital mistake to theorize before one has data. Insensibly one begins to twist facts to suit theories, instead of theories to suit facts."
*- Sherlock Holmes, A Scandal in Bohemia*
Insightful quote from the original Sherlock Holmes. I have to interject that this is near antithetical to the Benedict Cumberbatch version, though.
I recall a similar quote about the very powerful and the very stupid having that in common. Tom Baker’s Doctor, I recall…
@@matthewmoulton1 How is that antithetical to the Benedict Cumberbatch version?
@@matthewmoulton1I disagree, but I’d love for you to give reasons for what you’ve said, in case I’m wrong.
"How often have I said to you that when you have eliminated the impossible, whatever remains, however improbable, must be the truth?" - Sherlock Holmes engaging in an obvious logical fallacy
For those wondering, the paper is called "Motivated numeracy and enlightened self-government". A co-author on this paper was Paul Slovic. My mom worked for him at Decision Research back in the 90's when he was doing his book The Perception of Risk. She's mentioned by name in the foreword of the book. He's a really awesome researcher and person. When I saw this paper, I thought it was amazing he was still in the business and still doing great research.
If Veritasium did not completely butcher the content of the paper in this video, then I'm sorry, I gotta break this to you, but the guy seems to be a fart-sniffing moron. He doesn't seem to have bothered considering much about what he actually measures. Smart people know that overly simplified funny numbers can't tell you much about highly complex issues. That's why they are not convinced by them at all, and instead have follow-up questions and reasoning, which could nicely be seen in some of the interview clips.
Just like a sudden measurement of neutrinos going faster than light, only an idiot would immediately take that as gospel without checking all other simpler options.
If this video properly characterizes the study, then it's terrible.
What it's capturing is the increased likelihood that the participant will accept your premise when it's an issue they care about.
At no point did the interviewers say anything like "if you accepted this data as true, then what would your answer be?". Instead, they presented fake data, we watched as certain participants rejected the data and answered the question, and then those were marked as getting the math incorrect rather than as merely refusing to participate.
@@giantermite7039 Such as the woman who said about the skin cream, "The numbers mean nothing to me," you're assuming she was included in the N-value.
@@nexus1g I am assuming that. The video certainly suggests as much.
@@giantermite7039 How do you figure the video suggests that?
You have done more than just "raise awareness"; you have put a "starting point", like you said, out there, and it's the best starting point I have ever seen for this issue, which means everything for a functional democracy.
I think the problem is that people are not relying on the data in the fictional scenario, and are relying instead on facts they believe they know. If you showed them a paper concluding that gravity is driven by air pressure, you would have a hard time getting people to believe that, even though the paper showed it.
AI is the only hope for correcting human tribal bias.
@@alexoolau More like the hope for tribal people to propagate their tribal bias under the guise of unbiased facts...
@@alexoolauabsolutely not lol
@@alexoolau if you feed AI your biases, it will regurgitate your biases back
I think this points toward a deeper question that I'm not sure was answered in the study: how rational is it to accept new information and abandon prior information? If the quality and quantity of previous evidence is high, asking a person to suspend that understanding for a guy with a cardboard sign so you can play with numbers is a factor that's hard to ignore.
One problem with the study could be that people are EXPECTING the skin cream to be a math test (why else would an interviewer want to talk about skin cream data?)
But when asked about gun laws, they assume the interviewer wants to hear their political views, so they focus on that.
But that just proves the point even more that they're ignoring the numbers
@@gephyrro I mean, I'd be skeptical of the data too if some random guy with a poorly worded table on a big cardboard sign asked me to interpret the data from a (presumably) unnamed study. Well, to be honest, I'd probably be arguing with how poor the wording was first since "crime improved" and "crime worsened" with the seemingly opposite arrows are really bad.
There's also the problem that a lot of other people have pointed out: you likely have background knowledge related to that. If you could somehow prove that someone is believing something that's false to begin with, then sure, I think the video has a point. But if you listened to and internalized conclusions based on quality data, then it's more irrational to change your belief than not. The RAND Corporation has a huge metastudy of gun control split into how well-supported different policies are in affecting different outcomes in well-conducted studies. You can read their published reports as well as easily navigate a quick table. The evidence isn't overwhelming on most issues, but there's good evidence that shall-issue concealed carry and stand your ground laws are bad. They increase homicide rates. Like, you take a question like that and put it up to pro gun control people, is it reasonable for them to just quietly accept the new data from Sign Man? Personal opinion, but no. They should not.
@gephyrro Good point, and @hansle170 that's a good view. The reason they mess it up is that the question changes, not just the topic.
The study wasn't done in interview format, though; this video is separate from the original study.
@aek-f3z Also a good point, but they were still having their answers recorded. The camera might not be a problem (if you know for sure), but the question was still phrased that way, and they might assume it's about politics.
The conclusion at 9:13 is wrong. People shouldn't change their mind based on a single study, that's not how science is supposed to work. If they have a personal belief that's based on many thoughts/studies/experiences, then a single piece of evidence should at best pique their interest in the topic, but certainly shouldn't make them instantly change their mind
I think you misunderstood what he meant. The study wasn't trying to change their minds. He told them that the numbers didn't actually correspond directly to the real world, it was just getting them to do some math. Those who had lower reasoning skills and contradictory pre-existing beliefs on either side got the math wrong more often because they're so entrenched in those beliefs.
Also, a lot of these people didn't form their beliefs with much actual reasoning, many of them just believe what they're told. What the woman at 9:06 says is a perfect example of that. It's people who are more open minded and consider all sides who perform better.
@michaelgoldsmith6615 This test does not determine "open-mindedness" when there is a common political topic and a camera in your face.
And we are really used to putting politics before logic because we see clips of people being politically "owned" more often than someone correcting another's math... and political tension is a way more emotionally charged situation compared to math
@@michaelgoldsmith6615 I do fundamentally agree, the correct answer in my opinion would be something like "this study shows that crimes have increased when the guns got banned, that's interesting, I might have to read into that". But Derek draws the far too wide conclusion of "people hold their partisan belief regardless of presented evidence", and I think that's a far reach from the results of the study.
Also the woman at 9:06 is the perfect example of scientific thinking, she sees enough evidence, so she changes her stance, that's how it's supposed to be.
@karola.7908 I agree that his wording should have been more specific and much less generalising, but I assumed he meant it within the confines of this specific test, and not about their overall views; maybe I'm the one who misunderstood. I also think his use of the terms 'evidence' and 'deluding' is quite misplaced.
Embarrassingly, I also forgot what the woman actually meant; I thought she was just talking about her mentality in general. Rewinding to the question she's responding to, I've realised that she's not as suggestible as I thought. Still, people shouldn't simply believe things just because they come from a place of authority; you should always check the sources. The media is full of things that claim to be factual evidence while being objectively incorrect.
Sorry about all the misunderstanding. I should've rewatched the whole segment first.
Very interesting video and study... On the last bit, about how we are not so different, I would comment by slightly changing a famous quote: "divide and conquer" ----> "keep divided and control".
The problem with this experiment is when you ask people to consider a study on a nameless skin-care product that they have no prior knowledge of, you're testing their ability to come to the correct answer based on the numbers. But when you show people a study on a matter they probably have a lot of prior familiarity with, like climate change or gun control, you're not testing their ability to find the empirically right answer, you're asking them to rationalize the study itself based on all of the knowledge they have on the subject matter.
The questions shouldn't be does "X lower gun violence", or "does Y affect climate change", the same way you would ask "does Z reduce the rash". You would have to ask them does THIS specific study show a reduction in gun violence, climate change, etc.
I was going to comment about that. On the question about gun control, there are no two sides here; the data is there on an international scale. Having hypothetical data and asking people to ignore all of their previous experience is unreasonable.
That format also calls for people to take a public position on camera, not knowing if they're being tricked by an activist's video.
Plus, while gun control supporters consider it a question of lower vs higher crime rates, many/most opponents of gun control consider it a civil liberties question. That is, even if crime goes up, they'd still support it because it's a civil right, rather than a purely functionalist policy question.
That said, confirmation bias is absolutely a thing and this was a good video.
@@crafty1098 Even in the case of a pure numbers question that graph only states total crimes being worsened/improved. It doesn't state whether the thing which was just made illegal was included in those 'crimes'. Imagine if you made cars illegal, there'd be tens of millions of criminals overnight, crime is skyrocketing! But it doesn't state what 'type' of crime is occurring/not occurring, only that it is/isn't. Further it doesn't explain the *other* side of the equation, and that is the numbers of people saved via guns, which is as far as I am aware, like... inordinately disproportional, like 10x more people protected with them than harmed by them because there are just far more law abiding people with guns than criminals with guns, by default.
It's one of the reasons that data like that is inherently biased. Even if the data showed that gun crime went down due to a complete and total lack of guns, if you don't also look into if other forms of crime worsened or improved because of that decision you're being disingenuous at best, outright manipulating people at worst.
This is a critical point. It's kind of like like asking someone what's 2+2 and then asking them if the data showed that 2+2=5 would that change their previous belief. If people have reasons for believing that 2+2=4 they are not going to immediately submit to new data without vetting it (that would be completely irrational). In my experience, most people are just not rigorous in vetting data, which is why emotional arguments are so common in politics (they are often more effective than trying to have productive debates).
Your point is valid, but i think the video is still fair. I would also predict that the rewording of "did THIS study show a reduction in Y?" wouldn't change much.
Here's why: in Psychology there is a term called "flexible thinking". It's brought up in decision making research. A person who can think flexibly will reduce or block out the influence of their prior knowledge when making a decision from information provided.
However, most of the time, when a decision is difficult or information is hard to process, we get cognitively lazy. So instead, we rely on a belief we already have to make our decision.
What the video showed me is that people with higher numeracy are less likely to think flexibly and only consider the evidence if it doesn't contradict their beliefs. That's a pretty interesting result
The great problem with data in politics is that it's SO EASY to manipulate studies. Each side has a block of 'Institutes' that conduct studies where the study design virtually guarantees your outcome. Minimum wage for example. I saw a meta paper examining the state of economic studies on the minimum wage. Sure enough they ALMOST divided perfectly into halves. The ones that chose one methodology for trying to establish controls showed no effect of minimum wage on employment. Those using a different method to control for other factors showed a substantial negative impact on hours worked. I say ALMOST because there were two outliers that used the methodology that did not show effects, but those studies DID show small negative effects. But generally, which methodology you chose ABSOLUTELY DICTATED your outcome. Then, politicians from each side go on talking head shows and talk like the only studies that exist are the ones that agree with them.
And we've seen how this can happen even in hard science. How many studies showed that [insert anything here] causes cancer? And then you read the particulars and find out that they stuffed the mice full of 10x their body weight of that substance every day for months. And that's before you get into p-hacking or outright fraud. The entire anti-vax movement came out of a single study that was a dude just straight up lying about his data because he wanted to be able to sue the drug companies for big money. He's ADMITTED in court that he lied about his data. But the antivax movement remains strong.
So the term 'Studies show...' is the most overused starter phrase in politics. And the most worthless. And people know it, so they're RATIONALLY skeptical of the 'evidence'. What's needed more than anything is better agreement BEFORE a 'study' is done on what are and are not valid methodologies. There needs to be tighter peer review. And any paper that doesn't go through peer review shouldn't be reported on. No pre-print reporting. Don't waste our time telling us what the National Institute of Pre-Determined Biased Outcomes concluded about the effect of abortion on crime or whatever. But that doesn't sell ad impressions, does it?
Which is why people have to actually be trained on how to read studies (including unfortunately a number of people that do meta-studies). And observational studies are even more prone to this problem. Sloppy selection of subjects or situations, poor controls, etc.. All too often meta-studies just throw a bunch of stuff together without any deep analysis of which studies are even good basic science
And this is why I bristle when liberals (they do this more often) get mad and call names because they have studies done by think tanks but conservatives won't believe them. It's because conservatives are distrustful, since they have seen so many biased stories. It's not that they are (always) too stupid to read; they're just cautious about trusting everything. This is also why, for liberals, press = listen to everything they say.
A Rice professor and the former mayor of Houston got caught red handed. The professor was hired to do the study. It didn’t give the desired answer. A new check was cut, a new study designed, and voila, it supported the mayor’s position. 🤣😂🤣
You bring up good points. But the autism thing is interesting. If we're talking about a proper scientific RCT experiment there is literally no proof they work, because they were never done. And people who then claim the polio vaccine cured polio are proven wrong by history, it already went way down before it got introduced. I too thought these people were nuts 4 years ago. But then again, we saw what happened in 2020-2022. So when circumstances changed I reexamined the claims and now I changed my belief. But it's definitely difficult to tell if it's one way or another for most vaccines.
@@nunyabidness3075 These days given the bias/manipulation/fraud it's about a 50/50 whether a study is right or wrong. You might as well flip a coin.
I feel that a lot of comments are missing that a meaningful point of the video is that the title is missing the words "...than usual" or "...than expected."
The "high numeracy" people still typically do better than the rest, just not as well as they are EXPECTED to do.
When the video says "worse" it's in comparison to that demographic's EXPECTED outcome, not worse than the "lower numeracy" groups. The surprise is that by following intuition, "higher numeracy" respondents tend to perform comparably to the "lower numeracy" respondents, in certain cases where their beliefs disagree with the presented information.
the title is kinda clickbait. High numeracy people do better because it is literally a numeracy question. It should say, "political beliefs can overshadow numeracy when polling people outside a political rally."
actually I don't think that's the case: as you can see in the graph, the high-numeracy people had the same level of accuracy as the 0-numeracy group, which indicates they are exactly the same, regardless of how they are expected to do.
Yes, and in the end it's basically the same conclusion that Conscientiousness and IQ aren't correlated.
AI is the only hope for correcting human tribal bias.
Yes! The data basically shows that people willingly turn off their own numeracy and choose to go with intuitive thinking when the numeracy would lead them to conclusions they dislike.
Is this a re-upload? I could swear I've seen this video before
I very much feel that I am affected by confirmation bias.
"But I know I'm correct so all I need to see is evidence that can prove to others that I'm correct!" is basically how I feel confirmation bias works. A sort of circular reasoning.
Does that mean my excessive self-doubt might make me less affected?
@@GrassFudge7 you are not immune to propaganda and intellectual failures.
@@cewla3348 less
I feel that I am impacted by confirmation bias, yet I easily got this correct.
Experience, effort, and a sincere longing for truth are irrational I guess.
@@GrassFudge7 correct, doubt is very valuable; if you are not sure whether your belief is true, you are more likely to try to find evidence for both sides of the story
1:59 "math doesn't help". Goes on to explain how to solve the problem with math.
Ngl I barely used math, I just knew based on the ratio of how many didn't work to how many did
@@C.S.Argudo🐒
Cause the mathematician is the problem 😂
@@C.S.Argudo yea that's just really basic math, or recognition from the numeracy skills you've built through experience/IQ
I used proportions, it was easy as hell. The problem is most people are ignorant of statistics.
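For anyone following along, the proportion check really is just two divisions. Here's a minimal sketch in Python, assuming the on-screen counts were 223 improved / 75 worsened with the cream and 107 improved / 21 worsened without (my reading of the video's made-up table):

```python
# Quick proportion check for the skin-cream question.
# Counts are an assumption based on the on-screen table in the video:
# improved/worsened, with and without the cream (hypothetical data).
used_improved, used_worsened = 223, 75
none_improved, none_worsened = 107, 21

rate_used = used_improved / (used_improved + used_worsened)
rate_none = none_improved / (none_improved + none_worsened)

print(f"improved with cream:    {rate_used:.1%}")   # ~74.8%
print(f"improved without cream: {rate_none:.1%}")   # ~83.6%
# A larger share improved WITHOUT the cream, so the (made-up) data
# point toward the cream making rashes worse, not better.
```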
Since your "fits my rule" video (on confirmation bias, can't remember the title) I try to take an evening, every now and then, to prove myself wrong on something I feel strongly about. It has helped a lot! I either get a better understanding of why I am right (if I failed) or I learn something new, and change my mind, which is just an awesome "aha!" moment 😁 I keep a list of things I changed my mind on (and sometimes even changed back) to remind myself. I hope it helps. Thanks for teaching me that!
Can you share examples from your list?
How do you attempt to do that?
@@itstruce. Here's one: I thought I completely understood why and how the Keto diet worked. Even advocated for it among friends. Turns out its main hypothesis is disproven, and (in short) it doesn't work in any special way unless you have epilepsy, but may be easier to follow for some. Edit: It was an example of what some call "Mechanism/Mechanistic bias". When the entire explanation seems logical, but isn't actually backed by good science. Usually means: "It's more complicated" or plain wrong.
@@Wrackey thanks
@@Wrackey The one that did it for me is "market efficiency". In theory, products will get cheaper while getting better. But when markets turn the measure (profits) into a target, it ceases to be a good measure.
This perfectly explains the disconnect of quality, price and volume production.
3:02 .... Love it. The self-confidence. The slightly raised voice. The "I don't fall for your trick question" attitude. Funny and frustrating at the same time.
12:12 me?! I would never
😂😂😂
This is similar to a video I've seen by After Skool, on "Why smart people believe dumb things." Apparently, stronger reasoning abilities can help you justify your views, instead of rationalizing your way to the correct conclusion. You start with a belief, and work on justifying it, instead of starting by looking at the evidence, and working towards the conclusion.
Also something called the Nobel Prize Effect, I think? Where Nobel Laureates believe in pseudoscience and all sorts of mystical nonsense, because they are outside of their area of expertise.
The solution that comes to mind is constantly questioning your beliefs and assumptions, I guess, which is easier said than done.
I think your conclusion "that the solution could be constantly questioning your beliefs and assumptions" is what it means to be "interested in science", since falsifying your hypothesis is the key to science... people who refuse to analyze whether they can be wrong, and only take into account what supports their view, have "confirmation bias"... which is, in my opinion, the key to pseudoscience... The more knowledge one has, the more resources one has to justify or defend one's bias... Knowledge (comprehending how things work) does not equal intelligence (how knowledge is used)... So it's possible to have "stupid knowledgeable" people and "uneducated smart" people as well. :)
Now I would have loved to learn which variables were considered as evidence of people "being interested in science" versus people "having knowledge in science". :)
Hearing the people talk about how they got to their wrong answer reminds me so much of listening to people add in all the caveats in their answers to simple philosophical questions (such as the trolley problem) to try to escape the point of the question. Adding in "well I'd have to look into the studies and the sources that they cite" is just missing the point of what is being asked. It is a completely fictional result that they are being asked to assess.
Well, I work in science. And if there is something that people don't realize, it's that intelligence comes in a lot of different flavors... I know many scientists that are brilliant but totally stubborn and dumb in the face of truth. Having an open mind, a critical mind, and a curious mind does not necessarily come with intelligence. Many people with high IQ have the most blown up ego... Meanwhile you can find the most humble farmer in the most remote region, who can't read and can't do math, but has the most progressive views on human rights and the environment... Great men aren't all smart men, and a lot of smart men are just a bunch of cunts...
Something I think hinders the reasoning process it that people think logic is the highest form of reasoning. It's really not, it's just the building blocks of reasoning: the fact that you can stick 1,000 bricks together doesn't mean you've built a sound house. The true path to sound reasoning is objective rationality: where you continuously try to prove your own conclusions wrong, while searching for valid points within wrong arguments.
Things like this are why I strongly dislike people who say things like the solution to the problem is education or critical thinking. Here is data that shows social and emotional factors will cause those skills to be applied to things like creating confirmation bias.
I don’t even understand the question. Where does that put me
Einstein level. 😂
Either low numeracy skills, because you don't understand it on any level, or high numeracy skills, because you understand it's underspecified. The sample sizes of the control and test groups weren't specified, and it's not necessarily reasonable to assume the control and test samples were different sizes, which is required to reach the 'correct' answer.
man, i wish he would have paused the video a bit right after he asked the initial question with the skin cream. I immediately was told the answer by the first guy and had no time to make my own. Now i'll never know if me dumb or smarts
In the hood lil bro 👊 😎
That would be the Democrats xd
Thank you for keeping this video generally unbiased
"See mom? I scored bad on that test cuz im too smart for it."
"See mom, i failed thr test cuz guns were banned!" 😂
see mom, banned thr failed were test cuz i guns
thr
He quickly calculated that death by firearm is 11x more likely if you're black, skewing the crime rate affectable by policy for him to be near zero, and thus dismissing those heavily diluted numbers.
@sugy747 I need the stuff you are taking, looks really effective.🤣🤣🤣
I really like how humbly you try and deal with a topic like this. Well done on this thought provoking video
It went from cream and rashes to politics extremely fast
[starts a rumor that the skin cream has 5G in it]
There's more crossover between those two topics than you might be comfortable knowing about.
Never trust Big Cream
It went from intuition to bias real fast
I don’t get the rash question because if more people are improving, then why isn’t that the correct answer?
❤❤❤ also that tip on how to use ground news more effectively is super helpful. Ty!
“The first step in solving any problem is recognizing there is one.” - Jeff Daniels (The Newsroom)
This phrase is certainly more than 12 years old. Why not quote the one who came up with it instead?
You're also crediting the actor for something his character said, like he's some kind of genius for being able to read the script he was given.
Only Veritasium could make a political video that increases my faith in humanity
"increases my faith in humanity"? Every time I watch a video where he does a survey, I lose faith in humanity. Are people REALLY this dumb?
It's not a political video though
Gaytasium vitamins for harris
One spoonful for you all
to me it decreased.
but it also made me hope that things can change.
It seems like tribalism explains a lot in the current political moment, but I think it's one step removed from the primary cause. The problem is lack of trust. In that context, my "tribe" is just the people I trust. That said, tribalism isn't the only engine for trust - it's just the easiest. The way out of this is to restore people's trust in more than just their political cohort. Rabid individualism and self-serving institutions have done much to create the current crisis of trust, and building reliable institutions populated by people willing to speak & act in the best interest of those institutions is the best way out of this mess. Unfortunately, I don't think we'll reverse this political/intellectual culture of self-harm before we hit rock bottom.
For the real world, I agree with you that tribalism is the strongest factor - BUT some of that is that most people simply don't have the time or trained analysis skills to examine data and figure it out. So, in that sense, some part of tribalism is just an optimization of time usage (in a strange sense - sort of like thinking slow / thinking fast situations).
Check out social identity theory and the minimum group paradigm. It isn’t really a lack of trust. It’s tribalism, and that tribalism causes us to lack trust in people who aren’t part of our group.
Tribalism is good, actually, and the people who argue otherwise are inevitably the ones who want to erode their competitor's tribe to the advantage of their own. No one who complains about tribalism will ever go out to find common ground with their political enemies, or seek to find understanding of why they believe what they believe.
@@SomeGuy-ty7kr People tend to recognize tribalism in other people but not themselves. We humans are very good at rationalizing our behavior in terms that make us feel virtuous and smart, so we're often quicker to think in terms of "that team bad" than we are to figure out why we believe what we believe, and why people who believe something different, or even incompatible, need not be stupid, wrong, or evil. It's easy to blame their tribe.
That said, I'm not sure tribalism is either good or bad. Many people think religion is bad tribalism, and point to the millions killed in religious wars as proof. But Catholic Charities and the Salvation Army are also religious tribes. Tribalism is inevitable. Politics has been, and will always be tribal. That's why it's important to recognize that our problem isn't tribalism, per se, but that we've lost so much trust in each other and civic institutions that trust in our tribe is all we have left. That's not good.
@@livemusicfannc It's always been true that most people lack the time and/or background to verify most facts. That's why trust is important. When we can't decide for ourselves, who are we going to trust to decide for us? If your answer is, "That radio talk show host on the MAGA right." or "That social justice warrior on the woke left." you're probably going to be wrong about many things that have nothing to do with populism or social justice.
Our minds play games with us. Yet we think we are smart and intellectually honest.
3:08 this lady has "what do the numbers mean, Mason?" energy. "I don't think they mean anything to me. I need to see the product on the rash myself." I actually tried to see it from her perspective for a second to give her the benefit of the doubt, and I couldn't
This way of thinking is why people fall for homeopathic remedies. If you put an onion in your sock, and the flu goes away. That must mean the onion made the flu go away.
some people cannot deal with hypotheticals, it's wild
This lady is just an example of a very common type of person that makes almost all of their day-to-day decisions based on social influences. They often describe themselves as "very sensitive" and "intuitive" and whether or not they realize it, they are extremely suggestible. A person like this would be presented with a sample of a skin care cream by a sales person, and decide based on how they felt about the person selling it. Or they would have a friend with them who would voice an opinion about it (or mentioned it in a past conversation), and they would be influenced that way. I personally find it extremely frustrating to be around people like this, but on the other hand, as Derek brings up, it can be thought of as a rational behaviour within the context of human evolution and psychology. By influencing yourself with people whose advice has worked out well, you can continue to have a source of good decisions, while also endearing yourself to that person who is also likely to be in a socially dominant position.
@@waylandsmith on a side note, this is what frustrates me when people (usually psychologists) talk about high-functioning autistic people. This lady is the opposite of autistic, and yet she shows exactly what people claim autistic people have: a lack of abstract thinking. This lady cannot abstract at all, she just *feels*. Almost every high-functioning autistic person I know is very good with abstractions and couldn't care less if a seller is warm towards them.
The whole video got me thinking a lot about divergent people in general. We tend to see stuff in a wildly different way, to be more curious, and to not care about fitting in or not, but to go with the raw data [well, mostly].
Just a reminder: we know very little about that woman. So be open to her view and her personal approach to life. Not trusting abstract numbers may seem like resistance, but it is probably also the result of life experiences.
Why did you label it 'crime improved' and 'crime worsened'? That seems so much more ambiguous than 'increased' and 'decreased'.
I think it is probably an improvement if crime decreases
Because not all crime is bad? Idk i agree that it is intentionally ambiguous.
"That seems so much more ambiguous"
so what? it does not affect the numbers which is what he is asking about
Waait you're right... does crime improved mean that the crime got more sophisticated?🤣
Bro, my intuition went straight to comparing the proportions. Picking the number just because it's the biggest one isn't intuitive at all. It's called guessing without thinking, cuz bigger is generally better
Yeah, literally any smart person thinks that way
I think the "smarter people do worse" is meant to apply to the politically "charged" version of the question, not the skincare one.
Same here. Actually, my first thought was: why didn't the same number get the cream as not?
The "higher positives" winning the argument (ie Poker maths) is kinda cultural thing
I think that if they did this exact same experiment somewhere else like in China or Japan for example - I'd expect a different result
Even "stupid" people will look at how many sample is getting worse, and its much higher
Humankind is very alike, but humans are not. What bums me the most is that we take humans as the forefront and reasoning to separate humankind, instead of respecting each other's differences and learning from each other.
We take the information from social media that we align with most, without thinking about what the consequences are or what others are thinking, and act upon it.
There was a time in politics where, although we had our human political differences, we would work together to get through it because we're all part of humankind. People were seen as decent human beings and members of society, and were respected.
I think it's very important how you phrase the question. "Based on these numbers did gun control increase crime or decrease crime?" is very different from just showing the numbers and then asking if gun control is effective.
You can agree that the data leads you to a certain conclusion but still disagree with the methods with which the data was gathered or presented.
"Increased crime" is indeed ambiguous in these sorts of things because, when you make something illegal, there will of course be more crime, because something was just made illegal
yes, watching the video, it did not seem that he asked in a very standardized way. And the one person did not even seem to really consider his data sheet. But then again, I doubt they really wanted to reproduce the result as they said (they did not show their own statistics). They just needed some people to say weird stuff to fill the gaps in the video and make it more entertaining. I am sure in the paper they were more careful.
And if you make gun ownership illegal, you are going to get people committing the crime of owning a gun, because it is now illegal. In the UK for example, there are about 6000-7000 recorded offences every year related to gun ownership which would be eliminated if gun ownership were made legal. But the 29 gun homicides per year would likely increase, to somewhere nearer 10,000 if it ends up being anything like the USA.
This. When I realized that the study was about politics, it became far more apparent that the wording was aimed at leading rather than observing. At that point it had nothing to do with "intelligent people do worse than unintelligent people" and everything to do with "political bias taints how we interpret data", which includes the way the data was obtained for this study. It's worthless when you lead people towards or against their biases instead of asking them to make a rational observation based on the data and numbers in front of them, irrespective of their political biases. I'm pretty sure there is a rule about not doing this in your data-collection methods when it comes to peer-reviewed studies.
I'd say it's implied in this hypothetical situation that gun control is the only factor in the increase/decrease of crimes.
The fact that you need to talk about phrasing and justifying the answer for the "gun control" case IS the point of the video, because if they asked you about the "skin cream" case, you would just accept that it's a simplified problem and directly answer if it helped or not. And a perfectly rational person would answer both questions the same way.
Maybe I misunderstood, but the study showed more that if you present the data in a bad way, people don't look into it much unless it conflicts with their worldview.
This is exactly it! The study is more about confirmation bias and criticality of research methodology than it is about the getting the "answer" wrong.
This!😂
Yes, I doubt that if you gave the question on a math test there would be any difference based on worldview. In this case most people just assumed that the data you showed them wasn't worth a lot.
Context matters, on the street clearly people assume you are asking about a topic, usually you want to make a point.
@@youruniquehandle2 As soon as he switched to gun control I immediately thought, is the study coming from CNN or Fox News lol.
This
Aren't people with higher numeracy more likely to have opinions formed from previously investigating data on the most hot-button topics? How do you control for that in this experiment?
They should still correctly interpret the numbers from the single study, as opposed to relying on previous information to assume what the numbers state. Now, should one study change a person's opinion on a topic they've previously researched? Probably not, but numbers are numbers.
because people with more education or higher social standing tend to have more confidence in the correctness of their world view.
^ this is the correct answer. People who think they are smart are rigid in their beliefs and are unwilling to change.
If people were "previously investigating data" you wouldn't have such polarized, opposite beliefs. You would be drawing the same conclusions.
They explicitly told people the data was made up, and to interpret the numbers. Any prior data should have been irrelevant because all they needed to know was in front of them.
Imagine how blind we were to this phenomenon throughout history until now. Our personality biases the way we view data and information
7:50 "illegal crime... I think."
That's hilarious
You don't Say...
Right on brother. Illegal crime is the problem
I learned this point that legal crimes r justifications of illegal crimes 🤓
So you’re saying politics makes people biased?! The hell you say!
Point is, it makes more numerate people more biased.
Politics make smarter people even more biased - not something you'd expect
@@michajozwiak7650 What do you mean? It made all people of all levels more biased based on their politics, their actual intelligence wasn't actually related to the trends when it was a partisan issue.
No. They are saying the smarter you are, the better you are at manipulating the numbers to see what you believe. It's a surprising conclusion that I've heard before: smarter people are more ideological because they are better at manipulating data to "prove" their ideology.
@@swparsons true, the smarter you are the better you are at manipulating yourself and others
10:40 is a somewhat dangerous suggestion, as it is the classic "let's deal with the symptom first and the cause later" (in most cases this means never; with the example of climate change, it is irreversible)... imagine a doctor just giving you painkillers repeatedly to deal with your knee pain instead of actually fixing it (if possible), while giving you some painkillers to "survive" the operation.
Yes, I immediately thought of that. You are not solving the problem by doing this. Like cleaning out the water from a leak, instead of fixing the leak itself.
I think they realized no one cares about solving the root cause, so at least they work on mitigation. Better than nothing. Same with health care, people don't want to stop smoking or exercise.
People can sometimes be biased even when they think they're not
Whenever I see poll data of any kind, I am most wondering about the questions asked, how they're asked, etc. I see so many studies where I can predict "unexpected" outcomes based on poorly worded questions, multiple interpretations of choices, whether it was multiple choice or open-ended, etc.
There's one other thing I want to know, specifically for the political version of the question. As the skin cream is made up, no one could have read a previous study on that exact skin cream. Some people could have read existing studies on this exact political question, and I don't mean "read a random article online"; I want to know whether having read a peer-reviewed paper on this topic makes any difference to the outcome.
A big one is where they compare countries. I saw one where they measured agreement with the statement "I trust most people, in general" and the results were completely messed up by the translation, because other languages express it using words that are stronger/weaker
But previous data wouldn't have changed the data presented here, so it's just another cause for bias. The participants were presented numbers in a vacuum and couldn't keep it that way before answering, that's the point.
@@GreenOnionBrother at least for the video that's not really the point, since its second half implies this matters because it is representative of the way people answer things outside of a vacuum. Also, just because you say "please consider in a vacuum", doesn't mean people can simply shut down their beliefs.
And who knows, one might also argue that the difference between cream and politics is actually not down to "bias" so much as it is about having or not having context.
@@user-sl6gn1ss8p I don't know what to tell you. They were presented numbers that only allow one conclusion, but due to bias (and this includes previous studies and statistics they've read, regardless of their validity) failed to answer a question they would otherwise have less of a problem with. That is the point. How bias affects or rational thinking.
@@GreenOnionBrother my point is just that it was not clear to me, from the video, how well this could disentangle "bias" from "rational thinking", as in, how much can this actually show that people might make worse decisions in this sense due to their bias - the "tangle" being the fact that this bias may include, as you said, perfectly valid information.
To be clear, I'm not disputing that the effect exists - I'm just not sold that the study shows that much, going by the video.
This doesn't show that smarter people are worse at numeracy. Rather, it shows that political biases affect rational thinking skills
And that it affects smarter people more
@@CobisTaba Smarter people aren't at political rallies, sorry to tell you.
Dumbest = political party defenders
@@CobisTaba no
@@CobisTaba Correct.
This is an awesome way to present something we all already knew in our guts, but didn't have the words or balls to admit it to ourselves
The data used in this imaginary experiment was deliberately chosen to trigger spontaneous answers.
Firstly, very different sample sizes were chosen for the experimental and control groups, making a direct comparison difficult. Secondly, a very strong difference between the two results (improved/deteriorated) was chosen in both groups, and prime numbers (223 and 107) were even used on the left-hand side to discourage participants from accurately calculating the ratios in their heads.
They also only give two possible answers (whether the experiment showed that the skin condition of people treated with the cream was more likely to "get better" or "get worse"), which may lead participants to think that the table must show a clear difference between the two groups. I also found no information on how much time the participants had to answer this question…
However, a look at the statistics shows that it is debatable whether there is a significant difference between the groups at all, as a normal chi-square test gives a p-value of p = 0.0472, and using the Yates correction, p > 0.05. Also, the entire experimental setup is vague, controls are missing, etc. So if I had to answer this question, I would say that the experiment and data are not good enough to make a statement about the cream's effect on skin condition (even if you look at the results without a statistical test). As a side note, please never use the word "significant" when you don't show the results of statistical tests, as at 1:42.
I'd like to see what the trial results look like when you use more ‘normal/scientific’ data.
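For anyone who wants to verify those figures, here's a minimal sketch, assuming the 2x2 counts were 223/75 (cream) and 107/21 (no cream); that's the table consistent with the p = 0.0472 quoted above:

```python
# Minimal sketch reproducing the chi-square figures quoted above.
# Assumed 2x2 table (improved, worsened), consistent with p = 0.0472:
#   used the cream:        223 improved, 75 worsened
#   did not use the cream: 107 improved, 21 worsened
from scipy.stats import chi2_contingency

table = [[223, 75],
         [107, 21]]

# Plain chi-square, no continuity correction
chi2, p, dof, _ = chi2_contingency(table, correction=False)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")             # ~3.94, ~0.0472

# With Yates' continuity correction the p-value slips above 0.05
chi2_y, p_y, _, _ = chi2_contingency(table, correction=True)
print(f"Yates: chi2 = {chi2_y:.2f}, p = {p_y:.4f}")  # ~3.46, ~0.063
```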
Also the first question I'd be asking is why the 2 groups have such a different size
@@meneldal Although it is not good scientific practice, it does occur quite frequently. However, the use of suitable statistical tests and careful interpretation are particularly important in cases like that.
@@meneldal And where is the data for the outcomes where the rash neither worsened nor improved?
My first thought was about the p-value as well (before I realized it was American propaganda). I would have answered "I don't know, the numbers don't mean anything", and I think actual scientists who work with data would give the same answer. I don't know who these "high numeracy" people are, but the correct answer is idk.
You just sent me down an hour of refresher on stats and I've come out confused as to what you're complaining about.
A p value of 0.0472 would generally be considered significant since 0.0472 < 0.05.
Also, from what I've refreshed on, Yates correction is a solution for when you have very small expected frequencies. For one, that doesn't apply to this example. For two, people generally recommend not using it anyway, as it introduces different types of errors into the data.
This really reads like you didn't like what the video presented, went and did the stats, then ignored the data in favor of justifying your preconceived bias. Which is amazingly on topic.
I love that this took an anthropological spin. Acknowledging your bias is a very difficult task, but important to keep at the forefront of your mind when analyzing new information. Experiments like this demonstrate it beautifully. It never hurts to ask yourself: how does my culture/society shape my worldview?
Also love that he acknowledges that there is a very real and rational reason why we have adapted to do this.
Would have been interesting to have them get the result blind and then show the titles.
Since I have a background in clinical trials and experimental design, I got fixated on why the groups weren't equal size (since the people or cities were supposed to be assigned randomly, why would you end up with two different sized groups?). I would have just ended up questioning why the research problem was trying to fool me. Does that make me more or less rational?
less rational, since group sizes do not matter if they are large enough
Yeah, also one of these is a controlled randomized experiment and one of these is a study. People are well within their rights to question the methodology and impose their own interpretation of the correlation on the study group as the correlation doesn't necessarily imply causation.
With cities it would be natural to have different sample sizes - the number of cities matching criteria is limited and there's a lot of "noise". You're not conducting an experiment here - you're trying to gather observations and you grab whatever you can.
With cream, however, you're definitely right - the obvious thing to do would be to get equal sized groups. You COULD end up with some differences due to people dropping out of the experiment for some reason (it did last some time and stuff happens) but it shouldn't make one group several times larger than the other.
I also noticed that the two groups were not the same size, and I was wondering if there was some third option (not shown in the data). In any case, it was automatic to think about proportions to "normalize" the data.
I am also a scientist and I was also questioning the experimental design. Since one group is getting a treatment and the other is getting nothing, there is no consideration for the placebo effect. Furthermore, the study is not properly blinded (you know that if you are receiving the cream, you are in the treatment group).
I think you were overanalyzing the problem due to expecting to be wrong. I did the same for a short while, before just giving up and going with the conclusion that the skin cream seemed to make rashes worse on average _and_ that this was wrong somehow.
In other words, if the problem was presented in another way (e.g. you were just shown the statistical results without any prelude) you perhaps would make a different conclusion pretty quickly.
Although, questioning the study might put you in the "science curious"-category. Or perhaps a more dangerous "science skeptic"-category :I
A great lesson in humanity is that at our core we are extraordinarily similar. Getting to know the "opposite side" is usually enough to at least start a process of introspection.
There’s a few different meanings of the word “rational”.
One is using logic and facts to come to a consistent conclusion based on truth. This is the usual definition we think of. This could be called philosophical rationality.
The other comes from economic literature. A rational human in the economic sense is one who pursues their self interest and doesn’t sabotage themself.
I think there are many more people who are rational economically than are rational philosophically. I think it can be important to remember that even if someone is supporting something that isn’t 100% true, they’re just trying to act in their best self interest.
And it's worth noting that both types of rational thinking are vulnerable to Confirmation Bias, because the human brain is very, very good at discarding information that goes against our pre-established world views.
There's a problem with how you formulated the rationality in the economic sense, because with this formulation everyone will be considered rational, because everyone pursues what they believe is their self interest. At that point, the term loses usefulness. What it should be about is being rational in the economic sense meaning that one is pursuing that which is objectively in their self interest, rather than subjectively. Because that's where most people fall short. That's the hard part because it requires perfectly aligning oneself with actual objective reality and actual truth, as this in practice is not even possible to perform perfectly as the world is too chaotic and applying precise bayesian reasoning to every little thing is computationally explosive.
I find it useful to split the meaning of the word rationality the way Yudkowsky does:
1. Epistemic rationality: systematically improving the accuracy of your beliefs. (truth)
2. Instrumental rationality: systematically achieving your values. (winning)
And I love the way Vervaeke puts it: In practice, rationality is knowing when to use logic.
@@mariokotlar303 thank you very much for the additions, that makes a lot of sense.
This video is like the study in the video. The title has nothing to do with the numbers that are presented.😂
Correct title should be "On these questions, smart people do not do better" (which still requires ignoring the average joes and Einsteins at 6:33)
Title should be something more like, "People with the capacity to analytically consider data may take shortcuts to their preferred conclusion." But... I'm guessing that would suck for SEO purposes.
@@michaelkruse9818 the title is 100% correct; they did in fact do worse when given the gun study compared to the cream study
I cancelled the video and disliked it
I really dislike clickbaiting titles in general
@@wax2562 I mean... is it? Are people with higher numeracy scores "smarter" than everyone else? And when the title says "these questions," it pretty clearly suggests specific questions which disproportionately confound "smart people," rather than abstractions designed to elicit an emotional/tribal response, where "smart people" answer the question with accuracy similar to that of not-"smart people."
this is the exact point where "isolated study" becomes something that is NOT taken as "isolated study".
you can agree that the study says something, and disagree that the study is correct or was done correctly, which is what I think had happened here.
especially with the method of the study changing from lab experiments to crime statistics, the former being a highly controlled environment with low margins of error while the other is marred with statistical error margins that often eclipse the sample size by a sizable factor.
with that, people are often biased to pull from intuition and prior experience instead of the currently provided information, as it is deemed "useless", which is what likely happened here (the actual information provided to the people hearing about the study was useless, as analysis of crime statistics requires a LOT more than a "got better" and "got worse" graph)
The two being treated as equivalent is such SOP in the soft sciences. The overall conclusion that politics (or any strongly held belief) can cause people to irrationally reject data was fine. Talk to someone about their favorite sports team/star. And, yes, today politics is a team sport with equally stupid fans.
@@barongerhardt It's not a fine conclusion at all, because people with real data about a real thing are being presented with fake data in the most unreliable way possible, and the study is assuming that will not have an effect on the way people answer. People don't know any data about skin creams, and those are presented as controlled laboratory studies. People DO know things about political topics, and they're being presented data as uncontrolled, unvetted statistics. It's not difficult to see how that could completely skew the data that they are attempting to treat as a fixed variable. Completely different types of data, completely different contexts, and then a causation is being provided based on the faulty correlation. All huge red flags of a bad study with bad conclusions.
@barongerhardt but is this study actually showing those results?
I'd argue that this study shows no real data because the hand cream example is not a proper control group for this experiment due to the factors I mentioned in my original comment.
I would say there are two scenarios to consider: 1. The cream does indeed make most people better, yet it contains irritating substances which might trigger immune responses in the skin of certain people, worsening their rashes.
2. It worsens the rashes of all people, hence the ratio of people who got better to those whose conditions got worse is lower than that of people who didn't treat their rashes with the cream.
Conclusion: no conclusion can be made as insufficient information is given.
You can see a negative correlation between the use of the skin cream and the condition of the rashes, however. So I suppose out of those two answers, "worse" would be the better option
3:05 that one lady with 0 numeracy score
actually no, she's asking more questions, she is asking for additional info, I wouldn't say it's dumb
that one yt commenter happy to see another person failing (fun fact: she's not), therefore thinking she's dumber than himself..
I have barely any doubt: such a comment shows a low degree of empathy and a high degree of narcissism, therefore.. pathetic narcissistic ree spotted..
@@JJzerro the study was really about being able to read data and extrapolate a meaning, and she did her best to ignore the data. She was the type of person who spreads articles that agree with her narrative and politely decline any other data.
They are told specifically that the numbers are fake, are not real, so the numbers are all you have. Knowing that it is not real, what would implications be if they were real? And she clearly could not make that determination because she questions the depth of everything too much. Some people are just not numerical people.
She sounds smart to me though. How can you have a placebo where you do nothing? Seems biased and insufficient.
How exactly do you need additional info for a math question?
The whole point is that it's about the numbers and the context (skin cream or gun things) just adds flavor.
The questions are not equal. The problem is whether the questions exist in a vacuum or not. We have no prior knowledge of the skin cream and do not expect any bias. The gun control question has already been researched by both sides with their particular biases. So this just becomes one more study in a sea where bias is rampant. If it contradicts what you already 'know', then you are going to question the study, the sources, the funding, etc. You expect the study to be biased.
in short, PROPAGANDA HITS
which does not make it impossible to analyze data objectively.
after that, you can either check for mistakes in the experiment that gave you the numbers, or you just learned something new.
Science works, but only if you can stomach being wrong.
@@erumaaro6060 sometimes it's just hard to know you are wrong. like the video shows, we all have subconscious biases that are hard to keep in check.
I always try to do this when I am looking at data that's connected to something I am emotionally attached to.
I try to look at it twice or three times again, but in a different context: what if I was not me?
In short, it is hard to be science curious and a skeptic at the same time.
There was SO MUCH MORE discussion about the actual numerical figures in the skin cream version than in the gun control version!
They were presented similarly to the people being interviewed; during the video's demonstration for us, it didn't go in depth in the second case to avoid exact repetition
Who do you believe more: "Acshually 🤓"-guy with fake numbers, or yourself?
Smart people know that if your gut feeling goes against the data, something is wrong with the data, because it was made up by much more stupid people. Misleading data is everywhere.
In this particular case they were right again, because they spotted that the data was fake, which is true.
Uh, yeah, that's the point of the video
you (and so many other commenters) are literally doing exactly what the video is trying to explain... and it's quite amusing
You saw a few people from a large number of people surveyed.
This video made me feel so much better regarding people. Thank you
Me: "I'm surprised Derek isn't making a political science video so close to elec.....ooooooh I see what you did there."
This video went in such a different direction than I expected and I love it. Great video! Very thought-provoking.
2:02 Why did I expect him to say "one thousand one hundred and onety one" 😂
Eleventy eleven eleven
THIS IS LITERALLY THE OFFICE SCENE WHERE KEVIN CAN DO MATH IN PIES BUT NOT SALADS!!!!!
Literally why "trust but verify" should be a mantra of anyone's life.
But can't have that otherwise people will start questioning things they're 'not supposed to'...
exactly, how many ovens? for how long? what throughput?
this is way less then 6*10^6 cookies, the numbers say it is impossible.
Earth is flat and smartphones don't exist.
@@arthurborlet wow that was fast
@@Sauvva_ one step is all we need,
and "it" is the main thing you are not allowed to question, it is literaly illegal to express doubt about it in my country.
(not really applied now but still in the books)
It's so interesting that they have successfully divided us almost perfectly 50-50 and convinced each group that if the other party wins, our country is over. We are all stuck in this frame of mind and all closed off to outside ideas.
I once experienced bias in myself. It was related to equipment in a niche field of extreme sports. It took me 2 years to open myself to the idea that I was wrong about a product and the manufacturer, which I had worked with for a while. When I finally overcame that mental block it honestly rocked my world and made me question what other areas I had these same biases.
From a game theory perspective it does make a lot of sense that it would be so 50-50 and polarizing, ideas that we disagree on are more likely to inflame into bigger issues, and each side reworks their beliefs to be more palatable to the masses only when they're behind in the polls. We're unfortunately stuck in a system that basically guarantees that there will be polarization between two roughly equally powerful parties. The issues may change, but the polarization itself is eternal
Considering one party has already made one attempt to violently overthrow the American government, it seems like this time its actually true that if one side gets elected, that will destroy american democracy.
Not everything is rhetoric.
Who is "they". There is only one rational side left, there are still irrational people on it but the side itself is much more rational compared to the other side. And its not the one that has 75% of respondents denying human involvement in causing climate change.
In a sense, the prediction is the prophecy.
They're not sports teams. Neutrality is not objectivity.
3:03 you know who she's gonna vote for.
The irony …
JAJAJAJA
I think another issue that isn't often addressed is that data is often gathered with the conclusion already decided, instead of drawing conclusions based on the data. That's why whenever people hear something that doesn't reinforce their beliefs, they dismiss it or try to "logic" it into their view. If the studies or surveys (whether from governments or private interests) didn't cherry-pick or manipulate data, and the conclusion was, to the best of their ability, an unbiased interpretation of reality and not what they want it to be, I think some of this would be different. But currently, nobody trusts anything and everyone will always try to make reality fit their beliefs.
As social creatures, it is easy to become misled by those around us for sure. Logical thinking is so hard to come by now, but these videos definitely help in spreading the word.
1:06 my answer is NO, it was worse but within error.
33% vs 19%
~10% margin for error
Total sample size not labeled
3rd group (no change) not labeled
Not enough data presented to conclude a study
Answer at best is a red-font "Inconclusive" or a red-font "no change"; this is the correct answer.
After skimming the gist of the rest of the video, I gotta accuse you of using a clickbait title, because nothing here is unexpected.
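"Within error" is checkable rather than a judgment call, though. A minimal sketch of a two-proportion z-test, again assuming the 223/75 and 107/21 counts discussed elsewhere in the thread; this is equivalent to the uncorrected chi-square mentioned above (z squared equals chi-square), and it lands right at the edge of conventional significance:

```python
# Checking "within error": a two-proportion z-test on the assumed
# counts (223/75 with cream, 107/21 without).
from math import sqrt
from scipy.stats import norm

x1, n1 = 223, 298   # improved, total (used the cream)
x2, n2 = 107, 128   # improved, total (no cream)

p1, p2 = x1 / n1, x2 / n2
pooled = (x1 + x2) / (n1 + n2)
se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
z = (p1 - p2) / se
p_value = 2 * norm.sf(abs(z))

print(f"{p1:.1%} vs {p2:.1%}, z = {z:.2f}, p = {p_value:.4f}")
# ~74.8% vs ~83.6%, z ~ -1.99, p ~ 0.047
```

So "inconclusive" is a defensible reading, but "no change" isn't the obviously correct answer either.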
You're supposed to assume that those are the total samples and they all either improved or worsened. Those are fictitious numbers after all. Knowing this, I don't know how to calculate if p < 0.05.
I think a big difference between the cream example and the gun control example is how abstract the question is framed.
For the cream it is just a maths problem, but for gun control it is a question where there are preconceived notions that prevent the person from interpreting the data, or even really looking at it, expecting the data to reflect their personal beliefs.
Yeah. Umm. That's what the video is about.
I was looking for this. This seems like another poorly designed study.
Participants answering the skin cream question interpreted it as a math question, and were evaluated as if it was a math question.
Participants answering the gun control question interpreted it as a social policy question, and were evaluated as if it was a math question.
I read through the study. There is minimal effort to inform participants that they're evaluating just the data presented, and the choices they're given are phrased as "cities that enacted a ban on carrying concealed handguns were more likely to have a decrease in crime" rather than "this data supports..."
The cream "control" question is also a very poor control. Rather than being simply apolitical, it's presented as entirely hypothetical, in expected math test fashion, rather than being a real world question like the gun control question.
And here again, we see how the debate about "how was this research done" can be as important, if not more important, than the research itself.
sounds like the same backpedaling the people in the video were doing. you're doing great!
"Personal beliefs" here also can just be "correctly being aware of scientific consensus". It doesn't matter how many fake studies you show me data from to explain that climate change isn't real. My answer will be that climate change is real, because that's just a fact.
And here I am, trying to prove my point with math, eventually changing my mind and losing the argument if I am wrong, and getting nothing if I am correct.
The very first thing I did when seeing the skin cream chart was pause the video before he gave any more information away and then I looked at the ratios and determined that the skin cream wasn't working.
When you're asked a theoretical question about some studies, I don't understand why the topic affects how you approach the data! 😂
Yet here we are! Haha
The most interesting thing to me was the graphs from the other study he showed, regarding how numeracy etc. affects views: conservative Republicans were typically much closer to a flatline across their viewpoint of a subject regardless of their numeracy score, while the increase in division seemed to occur with higher-numeracy liberal Democrats thinking much differently than lower-numeracy liberal Democrats and conservative Republicans.
@@GarrettBShaw I think the issue is this - Imagine I give you data on the temperature in a specific place over the years during roughly the same time of the year and the data shows that temperature has not risen at all over the last 10 years (I assume you can find such a place) and then ask you: "Is climate change real?" instead of "Does this graph support the claim of climate change?" The second question is an obvious no, the first an obvious yes. I proved absolutely nothing about your literacy of interpreting graphs by asking you the first question.
Did politics (or rather scientific literacy) affect my choice for question 1? Obviously; it's a stupid question to give me random cherry-picked data from somewhere and try to disprove a huge chunk of research. Maybe there is a random district in a random city where no gun control reduced crime. I obviously concede it did in that case, but it's not going to take one study I have no clue about to change my opinion. The phrasing was absolute BS.
There's 3 reasons dumb like me.
1 - ignorance is bliss
2 - I don't have to do math on a daily basis.
Saw what you did there 😂
Um that's just 2
you should call 2, B instead
Wow, you really have numerical skill 😂
So this passes for humor today?
TL;DW:
People are tribalistic regarding political issues, and expected levels of rationality become compromised when something related to current partisan talking points is mentioned.
bro it’s a 14 minute well put together video
feels pretty rude to tldw someone else’s video like that
gpt head ahh
TL:DW Pissed in my bed.
Or they just understand that a staggering number of people have the attention span and intellect of a gnat. @@makpls
It doesn't help that quite often the data would be manipulated for political purposes . The smarter people are justified in not taking it at face value. Their views are formed using a wider frame, not just data from a source they don't trust.
Sure, but that's not the point. They are asking what the data shows, not whether the data is valid, whether you believe what it shows, or what your personal opinion is. None of those matter.
@jaro6985 It wouldn't matter if it were a question on an exam. But this isn't one; in normal life you would use information other than what's immediately in front of you
LOVING THE QUALITY AND CONSISTENT CONTENT!! I really always look forward to the next!
@11:18 the problem is that it is purely reactive policy/action, which is the most expensive kind of action. To be proactive on something, you have to have agreement on causes and work to fix those. We've forgotten the old sayings of "a stitch in time saves nine" and "an ounce of prevention is worth a pound of cure". We see the same thing these days with the Y2K issues we had. Since a bunch of companies spent massive amounts of time and money *proactively* fixing things so that "nothing happened", today we have a bunch of people thinking Y2K was all just a big hoax. And that kind of thinking has continued to everything else. We won't be proactive on anything; we'll just reactively deal with the symptoms rather than proactively try to address the causes.
The other side of this issue is that to come up with the solution to the cause, i.e., the thing you do to fix the problem, you have to be sure of the cause itself. Part of that is understanding that if A causes B, then we can use the existence of A to predict B. Unfortunately, when it comes to some issues (like climate change), the predictions are faulty - illustrating that we actually do not know the cause. Why would I spend money to fix this problem you identified here if you can't show me it will actually solve the issue you claim it is solving? To put this in a non-political example, if I say my car won't start, telling me to put air in the tires won't do me a lot of good - even if the tires are flat!
@@commentinglife6175 I suggest you look at climate change prediction more. Even the predictions that Exxon made (and hid) back in the 1970s are reasonably accurate.
We need a combination of short term and long term problem solving. The issue is that long term work requires trust (which we're lacking in the U.S.). But successful cooperation on short term problems is a good way to foster the trust needed to tackle the long term stuff.
That said, I do think that confirmation bias is a real thing, and if people see data that is consistent with their ideology, they are less likely to question it.
10:05 cheeky sponsor