The Writer Files is a nonpartisan show that aims to explore all facets of the writing life, but in the last few months you can’t seem to throw a rock without hitting an article about fake news or alternative facts in your social media feeds, especially on Twitter and Facebook.
Fake news isn’t new; some form of it has existed since the beginning of printed news, including leaders of the American Revolution concocting stories to stoke the political engine (Benjamin Franklin and John Adams among them). But it seems to be on everyone’s mind now, especially since November.
Luckily research scientist Michael Grybko — of the Department of Psychology at the University of Washington — has returned to the podcast to help me get some answers.
The Transcript
A Neuroscientist’s Perspective on Fake News, with Michael Grybko
Voiceover: Rainmaker FM.
Kelton Reid: The Writer Files is brought to you by the all-new StudioPress Sites, a turnkey solution that combines the ease of an all-in-one website builder with the flexible power of WordPress. It’s perfect for authors, bloggers, podcasters, and affiliate marketers, as well as those selling physical products, digital downloads, and membership programs. If you’re ready to take your WordPress site to the next level, see for yourself why over 200,000 website owners trust StudioPress. Go to Rainmaker.FM/StudioPress now. That’s Rainmaker.FM/StudioPress. And if you’re a fan of The Writer Files, please click subscribe to automatically see new interviews as soon as they’re published.
The Problem with the Proliferation of Biased Fake News in Our Social Media Feeds
Kelton Reid: We are rolling once again and welcome to another guest segment of The Writer’s Brain. We’re calling this one the Fact versus Fiction edition where I pick the brain of a neuroscientist about fake news in particular: why we fall for it, why it spreads unchecked, and what we can kind of do to combat it, for ourselves at least. You know The Writer Files is a nonpartisan show in its attempt to explore and survey writers about all facets of the writing life. In the last few months, it doesn’t seem like I can throw a rock without hitting an article in my social media news feeds about fake news or alternative facts, especially on Facebook and Twitter.
We’ve seen especially politically oriented stories about, for instance, Obama invalidating election results, a post that apparently got over a quarter of a million Facebook shares. Or Hillary Clinton paying Jay Z and Beyonce $62 million to perform. And of course, a couple of my favorites: Trump sending his personal plane to transport stranded marines, or the Pope endorsing him. But those are just some political examples. I know there’s a lot of fake news out there and it’s not new, right, Michael?
Michael Grybko: Right, right.
Kelton Reid: Fake news is nothing new.
Michael Grybko: I remember that story that Trump also removed, was it Martin Luther King’s statue from the office, or something …
Kelton Reid: Right.
Michael Grybko: Yeah.
Kelton Reid: Okay, but you know, some form of fake news has existed probably since the beginning of printed news, including examples of leaders of the American Revolution concocting stories to kind of stoke the political engine. I’m speaking specifically about Benjamin Franklin writing a whole edition of a fake newspaper, and John Adams doing the very same thing. It seems to be on everybody’s mind since November, more in the news than ever. The election’s past, but we can’t seem to shake it, so luckily, I have asked research scientist Michael Grybko of the Department of Psychology at the University of Washington to return to the podcast to help me get some answers. Thanks for popping on to do this, man.
Michael Grybko: Thanks for having me on and another interesting topic to discuss here.
Kelton Reid: Yeah, and I know we could probably talk about this for, as we noted before we hit record, hours and hours, but in the spirit of brevity, we’ll just launch right into this.
Michael Grybko: Yeah.
Kelton Reid: What started this conversation for us? We were sharing a couple of articles, I think especially Sean Blanda’s article, which another co-worker shared with me, about the reason you can’t stand the news anymore. A couple of facts popped out that I’ll just throw out really quick.
That a lot of Americans believe the fake news problem is sowing confusion. 23% say they’ve shared a made-up news story, either knowingly or not, and of course, online news consumption is at an all-time high, with the younger generation kind of overwhelmingly preferring it to print, for instance. And just to kind of sum up what he said there: the methods used to fund modern journalism are kind of undermining trust in these news outlets.
Michael Grybko: Right.
Kelton Reid: Because of the order of operations it requires: accelerating reach via social media, mostly Facebook; leveraging and selling ad space using programmatic or automated advertising algorithms; and selling native advertising and sponsored content, stuff that, for the most part, we can’t tell apart in our news feeds, right? It’s often hard to tell the difference between something that’s fake …
Michael Grybko: Right, yeah, yeah.
Kelton Reid: Something that’s real, and part of that problem is of course, this great proliferation of biased fake news now, so …
Michael Grybko: Right.
Kelton Reid: If you want to say something about that really quick.
Why People Disregard Evidence that is Contrary to Their Strongly Held Beliefs
Michael Grybko: Yeah. What’s not surprising, I think, is that with fake news we’re usually talking about political issues, and that’s what I was focusing on. The reason we accept fake news, or fall for it, is largely, I think, because political issues generally carry so much emotional weight.
Kelton Reid: Yeah.
Michael Grybko: We talked about the emotional components of our thought process and cognition, and the impact our emotions have on that, on a number of episodes before. But yeah, I found some research that was done about 10 years ago by a couple of people, Nyhan and Reifler, I think their names were. The name of the article is When Corrections Fail: The Persistence of Political Misperceptions.
Kelton Reid: Yeah.
Michael Grybko: This doesn’t talk exactly about fake news, but what they show is basically that when people have a very strong political belief, it’s very hard to change that person’s belief. So even when they’re presented with corrective information, they still hold on to their original belief, and they even found in some instances that corrective information had the opposite effect and actually strengthened these people’s misperceptions.
Kelton Reid: For sure.
Michael Grybko: Yeah, I think a lot of that comes down to these being very emotional issues. A lot of these end up being life-and-death issues, things like war, the death penalty. There are health issues, you know, and just our self-perception. Healthcare is a big issue now.
Kelton Reid: Yeah.
Michael Grybko: These have a lot of emotional weight.
Kelton Reid: For sure, and pointing back to kind of how all this came about, there have been some great studies into how false stories, specifically during the presidential campaign, were spread on Facebook and monetized by Google AdSense.
Michael Grybko: Right.
Kelton Reid: Craig Silverman was on Fresh Air. He’s with BuzzFeed News, and he’s spent years studying media and accuracy. He did a great study with Ipsos and BuzzFeed, and what they were seeing in the run-up to the election was just this great proliferation of fake news articles, specifically aimed at this campaign, and they saw the fake news actually overtaking the real …
Michael Grybko: Right, that’s scary.
Kelton Reid: Yeah, the real news outlets in popularity, and they were trying to figure out how this happened, but there was no question that these stories resonated with people. As you say, these are things that are kind of scary. They stoke that …
Michael Grybko: Right.
Kelton Reid: Again, you and I have done episodes on storytelling, empathy, creativity, and writer’s block, specifically. I’ll link to all those episodes.
Michael Grybko: Yeah.
Kelton Reid: Some great stuff there, but obviously, these well told stories are resonating with people. A lot of the times, it’s just scraped content, but it’s getting this enormous reach through these algorithms.
Michael Grybko: Yeah.
Kelton Reid: Then I’ll point back to a great episode on Rainmaker FM where our CEO, Brian Clark, on Unemployable, chatted with news curator, Next Draft’s Dave Pell, about fake news. He said, specifically, that trust in the media has never been lower, and the new norm of social content and distribution allows fake information or fluff to go viral, which just kind of amplifies the skepticism, right?
Michael Grybko: Right.
Kelton Reid: As you said, it kind of works inversely, almost.
Michael Grybko: Right, right.
Kelton Reid: Dave said that the ability to spread that news and to make that news look more real whether by design or sharing it on social media, makes it more dangerous and prevalent and he thought, he thinks that the bigger problem is that people actually believe it.
Michael Grybko: Right.
Kelton Reid: There’s that kind of confirmation bias and lack of fact-checking, so people are just, what, scanning headlines and then sharing because the posts have more social shares, the algorithms propel them, so on and so forth. So anyway, it’s kind of scary, but that final factoid from BuzzFeed News and the Ipsos poll was that 75% of American adults who were familiar with a fake news headline viewed the story as accurate, even though it had been debunked.
How Your Emotional State can Change the Way You React to Information that Challenges Your Beliefs
Michael Grybko: Right, so I think that gets back to this emotional thing, this emotional link we have to a lot of these issues, and that makes it very easy for us to sort of take the bait, if you will, when we encounter a fake news story. And it comes down to the fact that we actually process information differently when we’re in an emotional state rather than a more coherent, cognitive state. And yeah, there is research that supports this. A lot of research has shown that emotions can change our cognitive behavior, that we sort of switch to impulsive decision making from a more, I don’t know, coherent and analytical state, and emotions flip that switch.
There’s actually another research article that came out from Sam Harris and colleagues, and this is really recent. I sent this link to you and I think you looked at it. It came out in December 2016, so really recent, and it’s also an open access article, so I think everyone should be able to get it and read it.
Kelton Reid: Yeah.
Michael Grybko: The title is Neural correlates of maintaining one’s political beliefs in the face of counterevidence.
Kelton Reid: Right.
Michael Grybko: It’s similar to the article I discussed first, and it was a similar paradigm where individuals had very strong political beliefs and were presented counter-evidence to those beliefs. But then he took it another step: he also took regular, non-political beliefs they had, and presented evidence against those beliefs. And he did this while monitoring their brain activity in an MRI scanner, so I won’t go through all that, as we’ve talked about MRI a number of times.
Kelton Reid: Right.
Michael Grybko: Basically, you’re looking at blood flow to an area of the brain; as the blood flow increases, we infer that neuronal activity increases. And he found, or Sam Harris and his team found, that challenges to political beliefs kind of shifted activity into what he called the default mode network, and this, as it sounds, is more of that impulsive network. It’s kind of a disengagement from the external world toward the internal, so you become less receptive to outside stimuli and information.
Kelton Reid: Okay.
Michael Grybko: With the non-political beliefs, he did not see this change, so the areas of the brain involved in higher analytical cognitive function remained active.
Kelton Reid: Yeah.
Michael Grybko: But they also found something really interesting. Individuals who did tend to change their beliefs had less activity in a couple of areas, the amygdala and the insular cortex. This is important because these two areas have been associated with heightened emotional states, so this kind of makes sense. It’s some good evidence that politically charged issues are in fact eliciting an emotional response in us, and this response may change how we process the information. And this could be an explanation for why we take the bait, so to speak, with fake news: we’re not using our higher reasoning skills, we’re just kind of looking for something that fits with what we already know, falling back on our more impulsive behaviors.
Kelton Reid: Yeah.
Michael Grybko: Yeah, that was a little scary.
Kelton Reid: Right.
Michael Grybko: Yeah, it’s this easy, and then there are more studies, too. There’s also something known as the framing effect, and I’ll try to pronounce this name, Yue-jia Luo, I think it’s a Chinese name, but the name of the article is Neural Basis of Emotional Decision Making in Trait Anxiety. Here he’s not necessarily looking at political issues, just decision making in general, and finds that how the information is framed, the context it’s put in, will influence how we make our decisions. And this is really getting closer to that fake news and misleading news. He based this on another study done in an MRI where individuals had to choose between two different conditions.
For instance, they had to do a small word problem and decide which option to go with, A or B. So for an equal condition, the negative framing would be, “If you choose this, you’ll lose 20% of 100.” And the positive framing would be, “If you choose this, you’ll gain 80% of 100.” They end up the same, just the frame is different, so the absolute amount is the same.
And he played with those variables and found that when people succumbed to the framing effect and avoided the negatively framed option, even if it was the better choice, there was a shift in how they processed information, similar to the shift we saw before. Again, the frame, just how you word the information, can cause this emotional response and a shift in how the brain processes information, and different neural networks are used. So yeah, we can easily be misled, and once our emotions take hold, we can have changes in how we process information.
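For clarity, here is the arithmetic behind that example, using the round numbers from the discussion above: both frames lead to the same final amount, and only the wording differs.

```latex
% The two frames in the example describe the same final amount.
\[
\underbrace{100 - 0.20 \times 100}_{\text{``lose 20\% of 100''}}
\;=\; 80 \;=\;
\underbrace{0.80 \times 100}_{\text{``gain 80\% of 100''}}
\]
```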
Kelton Reid: For sure. Yeah, I mean that’s a lot to unpack, but in layman’s terms, circling back to the storytelling piece, that’s obviously why a lot of this stuff goes viral. Craig Silverman was talking about how it’s presented. So posts that, say, included a meme or a video obviously were far more widely shared than basically just text.
Michael Grybko: Right, in-depth news piece, right?
Kelton Reid: Yeah, and those are easy ways to get into people’s brains, right? Because they’re fast, they’re easy, they’re shareable. That’s kind of a …
Michael Grybko: When you think about it, they’re probably using this more impulsive area of the brain.
Kelton Reid: Yeah, and there’s no way to fact check a meme…
Michael Grybko: That’s different, yeah.
Kelton Reid: I linked to a couple of articles, I sent you a couple of articles, about, you know, politically motivated reasoning, or how some political party supporters are willing to lie to preserve their ideological identities, which is pretty interesting. That Yale law professor explained, in a study for Advances in Political Psychology, that people who score high on this kind of politically motivated reasoning tend to be very partisan. We kind of all are, we’re all in our bubbles, right? But then, when confronted with evidence to the contrary, it often makes people, again, cling more firmly to their beliefs, especially on these controversial topics.
Michael Grybko: Yeah.
Kelton Reid: There was that other article I linked to that shows that using science in an argument makes people more partisan as well. It kind of goes hand in hand: this behavioral economist at Yale, Dan Kahan, spent a decade studying whether the use of reason aggravates or reduces partisan beliefs. His research showed that aggravation easily wins, and that kind of plays right into the research you mentioned.
Michael Grybko: Yeah. We’re slipping into these default mode networks when we process the information, and that makes it very, very difficult to process new information, to accept it or even give it a chance, if you will. Yeah.
Kelton Reid: Yeah, so there’s this process called ‘biased assimilation.’ You know, we’re talking about cognitive biases and confirmation bias: people will selectively credit and discredit information in patterns that reflect their commitment to certain values.
Michael Grybko: Right, right. Yeah, confirmation bias is when people search for information that confirms their view of the world and ignore the information that doesn’t fit with that view and their beliefs. That fits right into all this. When we’re in that emotional state, like the studies are showing, these heightened emotional states and the way the information is presented influence how our brains process and incorporate this information, and this, in turn, can influence the action we choose to take from that.
This is a good example of information being processed using different neural networks. When activity shifts and we see more activity in emotional networks or areas, the amygdala for example, we sort of fall victim to the framing effect and go for those memes and short little snippets of information instead of in-depth articles.
Why Fake News Works and the Fallibility of Our Brains
Kelton Reid: Yeah, right. It kind of makes one’s head spin. Of course, I do say that often when I talk to Michael, but are you saying that the human brain is flawed?
Michael Grybko: Oh yeah, it’s tremendously flawed. It is, yeah it’s a crazy thing.
Kelton Reid: Going back to your point about the emotional resonance of certain content, for instance, I can’t think of a more emotional election.
Michael Grybko: Right, right, so yeah, so it’s been a very emotional election, very polarized, so I think we’re all very susceptible to this. But yeah, our brains are flawed. How we perceive information, it’s not great. Our perception of the world isn’t always accurate, and I mean, a very easy example of this are just optical illusions. That’s just your brain not working correctly.
I remember growing up, I thought it was like the coolest thing … I remember this Porsche came out, and the rims on it made it look like the wheels were spinning backwards when it was going forwards. You know that’s not true, but that’s just your brain not working, and these are really simple things. So when you get into these more complex issues, I think we can fall victim to fake news and our perceptions may be off. I think this is why fake news is so damaging: it really starts to distort our sense of reality, because you get multiple realities depending on your news sources.
That’s my big fear in all this. There are a number of bad implications of fake news, and we’re a democracy. For democracy to work, you need a well-informed public that has an accurate view of the world and is getting good information, but fake news and this distortion of our sense of reality can really wear away at our society. It gets back to empathy and theory of mind, which we talk about a lot in storytelling, and I think fake news can start to erode some of the relationships that are forged from that.
The Sam Harris article I mentioned earlier has a pretty good intro that describes the importance of empathy and shared emotional experiences in forging relationships with each other so we can work together and build societies and build skyscrapers and all these amazing things we do together. As I think about it, our perception of reality is really based on sensory information. It’s highly processed and often incomplete, so these shared emotional experiences, the empathy, help serve as a check on reality. So I’m thinking, if you insert fake news, it will really disrupt this. If I have an emotional response to a stimulus and I can see that you have the same response, then it helps validate my sense of reality and my perception, so I know it’s accurate.
When you insert fake news, it really disrupts that. Like, “Hey, this is my view of events and it makes me sad,” and I look over to you and you have a happy reaction to it. What’s going on? Let me give an example: imagine someone’s walking down the sidewalk and they slip and fall and hurt themselves, or that’s how I see it. I would be upset, but then all of a sudden I look up and see everyone else around me is laughing, so I know my perception is off, or their perception is off, right? Then I realize, I might talk to someone, like, “What’s going on?” Oh, it’s a big gag. These guys are just joking around all day.
Now let’s switch it up a bit. Say I got some information, some fake news. Someone said, “Hey, I’m going to push this person. It’s just a joke. They’re going to fall and spill everything they’re carrying, but it’s just a joke. It’s going to be really funny.” This happens and I start laughing, but it wasn’t a joke. This person’s really getting hurt. Now everyone else around sees me laughing while the person’s really hurt, and I end up looking like a jerk. That’s kind of how fake news works, right?
Kelton Reid: Yeah.
Michael Grybko: We seem to have the wrong emotional response in someone’s eyes; it’s contrary to the emotional response they think we should be having. This fake and misleading news could erode these relationships forged by empathy, because individuals start having very different responses to events that carry a very strong, heavy emotional component. So that really worries me, that fake news can end up putting a huge rift in society by tearing apart these relationships.
Kelton Reid: I mean, it’s interesting and frightening to hear you say that at the same time. Maria Konnikova, she’s a contributing writer at The New Yorker and author most recently of the book The Confidence Game, it’s a great read, and I’ll link to her episode with me on this show. She said recently in an article, I believe it was for The Atlantic, that the distressing reality is our sense of truth is far more fragile than we would like to think it is, as you’ve noted.
Michael Grybko: Yeah, yeah, that’s a good way to put it.
Kelton Reid: Especially in the political arena, and especially when that sense of truth is twisted by political figures. As the 19th-century Scottish philosopher Alexander Bain put it, “The great master fallacy of the human mind is believing too much.” False beliefs, once established, are incredibly hard to correct.
Michael Grybko: Right, right.
Kelton Reid: That’s been proven in studies as well, that once a lie is introduced, you have to actually accept it as truth in order to take that second step to make the conscious effort to reject it.
Michael Grybko: You’re right. And that can be hard to do, right?
Kelton Reid: Well, if you, yeah.
Michael Grybko: That’s important if we want to move on. Like, how do we get out of this rut we’re in as a society?
Kelton Reid: Yeah, yeah.
How to Combat Fake News with Your Own Analytical Curiosity
Michael Grybko: Yeah, another article I think you passed on to me was from The Atlantic, about how curiosity bursts our political bubbles, and there’s one quote in there that I thought was a really good way to approach how to combat fake news: “Curiosity seems to be the pin that bursts our partisan bubbles.”
Kelton Reid: Yeah.
Michael Grybko: Yeah, once … you have to stay active.
Kelton Reid: Vigilance.
Michael Grybko: Yeah, realize we’re in bubbles and start looking for other news sources. We can all name certain news outlets that have biases, I’m sure some jump to mind. Go to a different news outlet than the one you normally go to and try to read it with an open mind, without slipping into that emotional state we were talking about earlier. Try to stay in a more coherent, analytical frame of mind.
Kelton Reid: Yeah.
Michael Grybko: That can be hard to do, but I think we owe it to ourselves as a society if we want to make this work, to take these steps.
Kelton Reid: For sure, for sure. You know, we have some other tips I think I’ll throw in, maybe closer to when we wrap, on how to kind of stay vigilant.
Michael Grybko: Right.
Kelton Reid: Yeah, I mean there’s some scary stuff happening, especially around Facebook, which I think we could just touch on quickly.
Michael Grybko: Yeah. That’s a great, great way to frame information.
Kelton Reid: Yeah, yeah, okay. So let’s talk about Facebook specifically, and psychologist Michal Kosinski, who developed a method to analyze people in minute detail.
Michael Grybko: Yeah.
Kelton Reid: Based on their Facebook activity.
Michael Grybko: Yeah, this was a peer-reviewed paper, it’s scientific research, but I think it was Motherboard that had an article on it. The peer-reviewed paper was Private traits and attributes are predictable from digital records of human behavior, and yeah, it was pretty scary. Based on our online profiles, he was able to construct a pretty accurate personality profile of people, which in turn these advertising agencies and marketing agencies and political marketing agencies can buy.
Kelton Reid: Yeah, well, this.
Michael Grybko: Now this kind of serves as a manual for how to push our buttons and frame information for a specific group of people.
Kelton Reid: Specifically, they showed that on the basis of an average of only 68 Facebook likes by one user …
Michael Grybko: Right.
Kelton Reid: That they could predict all kinds of things. I mean, this is crazy.
Michael Grybko: Yeah.
Kelton Reid: Skin color with 95% accuracy. Sexual orientation with 88% accuracy. Affiliation with the Democratic or Republican party with 85% accuracy. And it didn’t stop there. They covered intelligence, religious affiliation.
Michael Grybko: Yeah, right, right.
Kelton Reid: Drug use. These could all be determined, and basically, with just a simple questionnaire and these likes, they can pinpoint exactly who you are. They almost know more about you than you do, right?
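For readers curious about the mechanics behind numbers like these, here is a minimal, illustrative sketch of the general approach described in that paper: represent users as a binary user-by-like matrix, reduce its dimensionality, and fit a simple classifier for a trait. The data below is synthetic, and the specific pipeline (truncated SVD plus logistic regression in scikit-learn) is an assumption meant to show the shape of the technique, not the authors’ exact method.

```python
# Illustrative sketch: predicting a trait from "likes"-style data.
# Synthetic data; the SVD + logistic regression pipeline is an assumption,
# not a reproduction of the published method.
import numpy as np
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n_users, n_likes = 2000, 500

# Sparse binary user-by-like matrix: 1 means the user "liked" that page.
likes = (rng.random((n_users, n_likes)) < 0.05).astype(float)

# Synthetic binary trait that depends weakly on a subset of likes,
# standing in for an attribute such as party affiliation.
weights = np.zeros(n_likes)
weights[:40] = rng.normal(0, 1.5, 40)
logits = likes @ weights - (likes @ weights).mean()
trait = (rng.random(n_users) < 1 / (1 + np.exp(-logits))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    likes, trait, test_size=0.25, random_state=0
)

# Reduce the high-dimensional likes matrix, then fit a simple classifier.
model = make_pipeline(
    TruncatedSVD(n_components=50, random_state=0),
    LogisticRegression(max_iter=1000),
)
model.fit(X_train, y_train)

auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Held-out AUC for the synthetic trait: {auc:.2f}")
```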
Michael Grybko: Yeah, right, right, and what’s interesting is, if you look up Kosinski, I forget what university he’s with, but you can actually go and take the test. I did it. Yeah, I just gave it a link and it looked at my Facebook profile.
Kelton Reid: I was scared.
Michael Grybko: Yeah, it came up. I thought it was pretty accurate. Yeah, it’s really scary. We’re pretty much open books. We’re not as mysterious as we believe we are, yeah. It’s scary.
Kelton Reid: In the wrong hands, that information can pinpoint exactly what kind of a … you know.
Michael Grybko: Fake news delivered to us.
Kelton Reid: Fake news you’re looking at. They can promote it.
Michael Grybko: Yeah, you can just frame it, yeah.
Kelton Reid: Yes, sponsor it and get it right in front of you and you might not notice that it’s sponsored or …
Michael Grybko: Yeah, right, right.
Kelton Reid: That’s kind of scary.
Michael Grybko: Yeah.
Kelton Reid: Well, you know, with that psychometrics piece and the big data, I guess these platforms have to do something themselves. That’s part of the problem and probably part of the solution, right? How are they going to combat this at, say, a Facebook?
Michael Grybko: Right.
Kelton Reid: Without creating, I guess, further bias, and without suppressing information?
Michael Grybko: Right, yeah.
Kelton Reid: Freedom of speech.
Michael Grybko: It becomes a legal issue.
Kelton Reid: Yeah, and of course, they are no strangers to legal issues.
Michael Grybko: No.
Kelton Reid: They’ve been sued in the past for privacy stuff, so you know, it’s a big can of worms. I mean, it really is: can we get the toothpaste back in the tube? Probably not.
Michael Grybko: Right.
Kelton Reid: Between the algorithms and the human curators there, it really is kind of just going back to this distrust that’s been sown, just a double-edged sword.
Michael Grybko: Right.
Kelton Reid: It’s a breakdown in discourse. The harsh truth about good journalism is that it’s expensive. Fact-based, unbiased journalism is expensive and does not have a great return on investment, sadly.
Michael Grybko: Right, right.
Kelton Reid: We’re not trying to address that fix for the industry.
Michael Grybko: Right, right. Well, I think we need to be a more, I don’t know, information-minded society, if you will, or we need more respect for information, and I think that would help out.
Kelton Reid: Well, if nothing else …
Michael Grybko: Especially in these instances, we just see that the more our emotions get involved in these things, the more we’re using the default mode network, and our higher-order reasoning areas of the brain are kind of getting turned off, and eventually we succumb to this fake news.
Kelton Reid: Hopefully, shining a light on just the fact that it exists will make people more vigilant.
Michael Grybko: I think that’s really important. The solution, I think, has got to come back to bursting that bubble we were talking about earlier. People have to realize that they’re being manipulated through fake news and misleading news, and I think once we turn that corner, then maybe we won’t have to get into all the legal issues as much and have the courts decide for us, tell us what to read and what not to read. Hopefully, society will just figure this out for itself, so, yeah.
Kelton Reid: Well, I like how John Avlon, managing editor of The Daily Beast and a CNN political analyst, reminded us, I think it was on Bill Maher’s show, that there is an objective reality, you know. Lies are always going to be lies, but we do need to stay vigilant, lest we descend into this Orwellian ‘two and two make five’ world. But do you want to cover a few more, maybe some solutions that we can throw out there between the two of us?
Michael Grybko: Yeah.
Kelton Reid: We’re not going to solve the crisis.
Helpful Tips to Stop Yourself from Sharing False Information
Michael Grybko: Yeah, we already mentioned curiosity, and I think that’s really important. And just realizing, when you read the news, you can tell when you start getting emotional, so be aware of that. Start tuning into that and being like, “Okay, I’m not using my analytical brain as much,” and yeah, try to stay away from those emotional states, and then stay away from news reports that play on that.
Try to stay away from the very partisan rhetoric if you feel there’s a lot of it. The news should be fair and balanced, right, and I think I just stole someone’s line there, a certain organization’s line, but you know, it shouldn’t have a bias, and it’s really clear when it does. Recognize that, try to stay away from it, and then do research. Like I said, go and find more than one news source.
Kelton Reid: Yeah.
Michael Grybko: Then don’t repost stuff if you know it’s not right, or if you haven’t checked it. Yeah, that’s a good rule: before you post stuff out on Facebook, get multiple sources.
Kelton Reid: Yeah, well, okay, there were some helpful hints posted right around election time by Nick Robins-Early, I think it was HuffPo, but you know: read past the headline.
Michael Grybko: Right, yeah.
Kelton Reid: Before you just click ‘Share’ because you like the incendiary headline, check who published it, you know.
Michael Grybko: Yeah.
Kelton Reid: Click through, see if it’s a reputable news source. Check the publish date and time, it could be from years before.
Michael Grybko: Right, that’s one good one, yeah.
Kelton Reid: Who’s the author? Is it a real person? Have they written other things that were incendiary? Check the links and the sources they’re using … and these are hard things to do if you’re in a hurry.
Michael Grybko: Yeah.
Kelton Reid: You know, in our short attention span world.
Michael Grybko: Not everyone has that much time, right.
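To tie those tips together, here is a small, purely illustrative Python sketch of a pre-share checklist. The ArticleInfo fields and the warning heuristics are invented for this example; they are not a real fact-checking tool or API.

```python
# A purely illustrative pre-share checklist inspired by the tips above.
# The fields and heuristics are assumptions invented for this sketch.
from dataclasses import dataclass, field
from datetime import date, timedelta
from typing import List

@dataclass
class ArticleInfo:
    headline: str
    publisher: str
    author: str
    published: date
    cited_sources: List[str] = field(default_factory=list)
    read_past_headline: bool = False  # did you actually read the piece?

def pre_share_warnings(article: ArticleInfo, today: date) -> List[str]:
    """Return a list of reasons to pause before sharing."""
    warnings = []
    if not article.read_past_headline:
        warnings.append("You haven't read past the headline.")
    if not article.author:
        warnings.append("No named author; check whether the byline is real.")
    if not article.cited_sources:
        warnings.append("No linked sources to verify.")
    if today - article.published > timedelta(days=365):
        warnings.append("The story is over a year old; it may be recirculating.")
    if article.publisher.lower().endswith(".com.co"):
        warnings.append("Look-alike domain; confirm the publisher is reputable.")
    return warnings

# Example: an old, source-free post where only the headline was read.
post = ArticleInfo(
    headline="You won't believe what happened next",
    publisher="abcnews.com.co",
    author="",
    published=date(2015, 3, 1),
)
for reason in pre_share_warnings(post, today=date(2017, 2, 1)):
    print("-", reason)
```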
Kelton Reid: You know, being wary of confirmation bias is a really good one, and it’s, yeah, it’s so hard.
Michael Grybko: It is. It is. I know we could do a whole other episode on confirmation bias and why it works, but basically, there are reward mechanisms in our brain, so when we find something that fits previously held beliefs, our brain is happy with that, and we get a little reward. We become approach motivated, which we talked about before, and continue on that path. It’s like confirmation. Our brain realizes, “Oh, we did good.”
Kelton Reid: Yeah, yeah.
Michael Grybko: Also, you know, I’d like to say that journalists at the major news organizations do a pretty good job. Most of the journalists I’ve interacted with take their job seriously and try to be transparent and not have a bias in their writing, especially news writing; opinion pieces and editorials are sometimes different. But they need to stay aware of their emotional stance and make sure it doesn’t bleed through in their writing and how they report the news.
I don’t hear from editors much, and I’m not familiar with the process, but I think one of the things journalism has going for it is editorial review. I think that’s a very important part of it, where editors make sure the facts are actually checked. There is a fact-checking process, and science has a similar thing: we have peer review. It takes a lot longer, but we can’t publish anything or get our grants funded unless other scientists look at the work and report on it.
Kelton Reid: Sure.
Michael Grybko: I feel like that’s sort of the role of the editor, and I think it would be good if the news organizations trying to combat this had their editors come out with a statement, so it’s clear to all of us, just explaining their vetting process for news.
Kelton Reid: Sure.
Michael Grybko: Maybe come up with a rating system: “This source we used is iffy. This source we used is really good. We’ve used them many times and these are their credentials.” You know, some way to show how the information is vetted and how confident they are in it, and then sort of a mission statement of what they look for in true news.
Kelton Reid: For sure. Yeah, I mean in this day and age, you know scraped content and bots, and …
Michael Grybko: Right.
Kelton Reid: Just the ability …
Michael Grybko: Automated news.
Kelton Reid: But also the ability to just publish with the click of a button, out there on a server in Macedonia, and send out news that looks just incredible. Yeah, I mean, that’s so scary, so we definitely need to innovate in the area of both human and algorithmic detection of misinformation.
Michael Grybko: Yeah. I think that’s where editors would really come in: having a statement from the editors of these major news organizations, coming out and saying why they’re different, what they do to check their news, and maybe from the journalists as well. Have something readers can easily access on their website, just to look at, you know, why they’re different, what makes their news good.
Why You Need to do Your Homework
Kelton Reid: Definitely. I love it, and I want to talk to journalists as well, so there might be a part two to this, where I actually have some journalists on to discuss your findings and our talk. A couple of things I will point to before we say goodbye. There are some rumor trackers. One is called Emergent, and it’s a real-time rumor tracker, part of a research project at the Tow Center for Digital Journalism at Columbia University that focuses on how unverified information and rumors are reported in the media. It aims to develop best practices for debunking misinformation, so you can go … I’ll point to that, Emergent.
And then there’s Snopes.com, founded in 1995, I believe. David Mikkelson, with 20 years of experience as a professional researcher and writer, created Snopes basically to do the same kind of thing, to track rumors and debunk or confirm them. So those are a couple of resources I’ll link to that are, you know, good in the spirit of curiosity and staying vigilant; you can use them to see if something is true or not.
Michael Grybko: Yeah, do some homework.
Kelton Reid: Do some homework.
Michael Grybko: Well, this is important. This is important work. We’re tearing apart …
Kelton Reid: No one likes homework, Michael.
Michael Grybko: We’re tearing apart the fabric of society, distorting our reality here.
Kelton Reid: With our very flawed, big brains that require a lot of water. So hey, man, thank you so much for coming back and illuminating the topic. I know we’ve only just scratched the surface, but it’s always enlightening to …
Michael Grybko: Yeah, this was interesting.
Kelton Reid: Rap with you my friend, so to listeners, stay vigilant and curious, and thanks for listening if you’ve made it this far.
Thanks so much for joining us for a glimpse into the workings of the writer’s brain. For more episodes of The Writer Files, or to simply leave us a comment or a question, drop by WriterFiles.FM. You can always chat with me on Twitter @KeltonReid. Cheers. Talk to you next week.