
How social media and human nature have spawned hoaxes and hate-mongering

Lyn Snodgrass, Nelson Mandela Metropolitan University

The internet held the promise of an interconnected global village that facilitated cooperation and dialogue through authentic information sharing. But the interaction between our inherent human tendencies and social media platforms has produced an epidemic of misinformation, hoaxes and hate-mongering that threatens this vision.

Social media is increasingly influencing the way we consume news. Research by the Reuters Institute for the Study of Journalism in 26 countries shows that more than half of those sampled use social media as a news source.

This trend comes at a cost as social media is not known for its accuracy, or the advancement of challenging and diverse perspectives. Filter bubbles, created through personalised and algorithmic news feeds, reinforce this.

Unrestricted access to information is a cornerstone of a vibrant democracy. But if this information is inaccurate, biased or falsified, the fundamental freedom of informed choice is denied. In essence government accountability, social justice and equality are severely compromised. Thus social media, as the most effective purveyor of fake news and conspiracies, poses a serious threat to democracy.

Rise of fake news

In the wake of political upheavals – the US presidential race and Brexit are good examples – there has been a surge in fake news, conspiracies and pseudoscience discourses on social media platforms.

In South Africa 2016 was an annus horribilis for the governing African National Congress (ANC) and President Jacob Zuma. A pervasive conspiracist narrative about a sinister “third force” meddling in the nation’s affairs hogged the headlines. Fighting for his political life, Zuma blamed Western intelligence for allegedly stirring up criticism of him.

These and other events have seen “post-truth” emerge as the Oxford Dictionaries international word of the year for 2016. The term refers to the irrationality that prevails when appeals to emotions and personal beliefs, rather than hard evidence, are more powerful in forming political opinions.

There is now a burgeoning “cottage industry” of websites that invent fake stories. Analysis by BuzzFeed of the recent US election pointed to the prevalence of fake and hyper-partisan content on Facebook pages and websites.

The attraction of this fake news isn’t surprising: research suggests that the public is more likely to indulge in conspiracy theorising during periods of insecurity and discontent.

Threat to democracy

Analysts and politicians warn of a “digital virus” of falsehoods spread by conspiracy theorists and trolls that entrench polarised politics. These threaten democracy.

With the rising tide of populism we have seen popular mistrust, and even rejection, of the political establishment and mainstream media. In a climate of “us vs them” researchers find that people, especially conspiracists, are attracted to alternative news sources. They are motivated by the desire to avoid the perceived manipulation by mainstream media and become susceptible to fake news.

In Africa, conspiracy making occurs across the political divide. It is used by regimes to entrench power, or by the opposition to erode it.

Zimbabwe is a prime example. Conspiracy theories have been a feature of 92-year-old Robert Mugabe’s presidency throughout his almost 37 years in power. He has muzzled the country’s media and railed against Western powers for conspiring to unseat him and destroy the economy. Conspiracies about plots to assassinate him abound.

In South Africa conspiracy theories proliferate from the constant crises around embattled Zuma and the ANC. A constant refrain has been that the media conspire with third parties to discredit the ANC and mislead the public.

Conspiracy theories: narratives on steroids

Stories and storytelling are an inextricable part of human consciousness. It is through stories that we interpret the world, imagine other possibilities and adopt other perspectives.

In this way humans are hardwired for conspiracy stories that contradict official accounts of events or factual evidence. As such, conspiracists are not unhinged or paranoid. They “cut across gender, age, race, income, political affiliation, educational level and occupational status”. No individual or group is immune from conspiratorial thinking. And if a group believes one conspiracy, it is likely to believe others.

Conspiracy theories – described as “narrative on steroids” – offer enticing clickbait opportunities for the human brain on social media platforms. The typical fictitious plot describes the sinister machinations of powerful groups or organisations that work in secret against the public good.

The danger of such narratives is exemplified by the bizarre US “Pizzagate” saga. A gunman fired an assault weapon in a pizza shop acting on fabricated social media claims that it was the site of a child sex abuse ring that involved Hillary Clinton.

Research shows that narratives have powerful traction online when they feed into a conspiratorial worldview that affirms a rejection of official explanations. Such stories consciously, or unconsciously, induce emotional contagion – communal emotions of hate, anger and fear – that are further amplified.

Why technology can’t save us from ourselves

Is the post-truth climate and the concomitant surge in falsehoods and conspiracy theories a spasm in history, or does it reflect a seismic political shift? The jury is still out. But social media as a news source – without the fact-checking and the editorial filters of responsible journalism – is a growing trend.

It would seem logical that a “digital virus” of insidious mistruths and half-truths created by the use of technology would, and should, be cured by technology. Some technological correctives have in fact presented themselves. Facebook, for example, has announced it will use fact-checking services to flag fake stories as “disputed”.

But technology cannot be the panacea when the intense and overwhelming social media space presents a perilous mismatch with our innate human capacities and tendencies. This human-digital interface makes social media the most effective and dangerous enabler of human irrationality, distorted perceptions, and conspiratorial thinking ever invented.

Lyn Snodgrass, Associate Professor and Head of Department of Political and Conflict Studies, Nelson Mandela Metropolitan University

This article was originally published on The Conversation. Read the original article.

http://lessonbucket.com/news-and-comment/social-media-human-nature-spawned-hoaxes-hate-mongering/

What Are Some of the Known Issues With Social Media?

Social media isn't all just fun and games with your friends, celebrities you admire, and brands you follow. There are lots of common problems that most major social media platforms haven't totally solved, despite their efforts to do so.

Spam: Social media makes it easy for spammers — both real people and bots — to bombard other people with content. If you have a Twitter account, you've probably experienced a few spambot follows or interactions. Likewise, if you run a WordPress blog, you may have gotten a spam comment or two caught by your spam filter.

 

Cyberbullying/Cyberstalking: Children and teenagers are especially susceptible to cyberbullying because they take more risks when it comes to posting on social media. And now that we all interact on social media via our mobile devices, most major platforms make it possible to share our locations, opening up the doors for cyberstalkers to target us.

 

Self-image manipulation: What a user posts about themselves on social media represents only a small portion of their life. Followers may see someone who appears happy and living it up, and feel boring or inadequate by comparison, but the truth is that users can completely control which parts of their lives they do and don't broadcast, curating their own self-image.

 

Information overload: It's not unusual to have over 200 Facebook friends or follow over 1,000 Twitter accounts. With so many accounts to follow and so many people posting new content, it's almost impossible to keep up.

 

Fake news: Fake news websites promote links to their own totally false stories on social media in order to drive traffic to them. Many users have no idea that the stories are fake in the first place.

 

Privacy/Security: Many social media platforms still get hacked from time to time despite having good security measures in place. Some also don't offer all the privacy options that users need to keep their information as private as they want it to be.

 

What Does the Future Hold for Social Media?

It's difficult to predict anything exactly, but if one thing can be said about the future of social media, it's that it will probably be more personalized and less noisy. Over-sharing will be less of a problem, and filtering out irrelevant information will become a stronger trend.

Snapchat is a platform at the forefront of social media's evolution. Rather than blasting out updates for all our friends and followers to see, we use Snapchat more like we communicate in real life: with specific people, only at specific times.

 

If anything, social media is probably about to move further toward ephemeral content: quicker, more intimate sharing without the stress of blasting something out to hundreds or thousands of followers, where it stays up unless it's manually deleted. Instagram has already made the move toward ephemeral content sharing with its Snapchat-like Stories feature, so more platforms may soon follow.

Updated by: Elise Morea

The positives and negatives of using social networking sites

Facebook Plans to Rewire Your Life. Be Afraid.

FEB 17, 2017

 

By Leonid Bershidsky

 

Facebook founder Mark Zuckerberg's manifesto, penned clearly in response to accusations leveled at the social network in the wake of the bitter U.S. election campaign, is a scary, dystopian document. It shows that Facebook -- launched, in Zuckerberg's own words five years ago, to "extend people’s capacity to build and maintain relationships" -- is turning into something of an extraterritorial state run by a small, unelected government that relies extensively on privately held algorithms for social engineering.

 

In 2012, Zuckerberg addressed future Facebook investors in a letter attached to the company's initial public offering prospectus. Here's how he described the company's purpose:

 

People sharing more — even if just with their close friends or families — creates a more open culture and leads to a better understanding of the lives and perspectives of others. We believe that this creates a greater number of stronger relationships between people, and that it helps people get exposed to a greater number of diverse perspectives. By helping people form these connections, we hope to rewire the way people spread and consume information. We think the world’s information infrastructure should resemble the social graph — a network built from the bottom up or peer-to-peer, rather than the monolithic, top-down structure that has existed to date. We also believe that giving people control over what they share is a fundamental principle of this rewiring.

 

Whatever those beliefs were based on, they have largely failed the test of time. Instead of creating stronger relationships, Facebook has spawned anxieties and addictions that are the subject of academic studies from Portugal to Australia. Some studies have determined that using Facebook detracts from a user's life satisfaction.

A Danish experiment in 2015, involving people weaned from Facebook for a week and a control group that kept using it, showed that people on the social network are 55 percent more likely to feel stressed; one of the sources of that stress is envy of the glossified lives reported by other users. Users' well-being, research has shown, only tends to increase when they have meaningful interactions -- such as long message exchanges -- with those who are already close to them.

 

In his latest manifesto, Zuckerberg uses parenting groups as an example of something his company does right. But recent research shows that some new mothers use Facebook to obtain validation of their self-perception as good parents, and failing to get enough such validation causes depressive symptoms.

As for the "rewired" information infrastructure, it has helped to chase people into ideological silos and feed them content that reinforces confirmation biases. Facebook actively created the silos by fine-tuning the algorithm that lies at its center -- the one that forms a user's news feed. The algorithm prioritizes what it shows a user based, in large measure, on how many times the user has recently interacted with the poster and on the number of "likes" and comments the post has garnered. In other words, it stresses the most emotionally engaging posts from the people to whom you are drawn -- during an election campaign, a recipe for a filter bubble and, what's more, for amplifying emotional rather than rational arguments.
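To make that ranking mechanism concrete, here is a minimal, hypothetical Python sketch of an engagement-weighted feed of the sort the column describes: posts are scored by how often the user has recently interacted with the poster and by the likes and comments each post has garnered. The class names, weights and scoring formula are illustrative assumptions only, not Facebook's actual code.

```python
# Hypothetical sketch of an engagement-weighted feed ranking of the kind
# described above. This is NOT Facebook's actual algorithm: the field names,
# weights and scoring formula are invented for illustration.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    likes: int
    comments: int

def rank_feed(posts, recent_interactions):
    """Order posts so that highly engaging content from people the user
    interacts with most often floats to the top of the feed."""
    def score(post):
        # Affinity: how often the user has recently interacted with the poster.
        affinity = 1 + recent_interactions.get(post.author, 0)
        # Engagement: likes and comments the post has attracted (assumed weights).
        engagement = post.likes + 2 * post.comments
        return affinity * engagement
    return sorted(posts, key=score, reverse=True)

if __name__ == "__main__":
    posts = [
        Post("close_friend", likes=40, comments=15),
        Post("news_outlet", likes=500, comments=5),
        Post("acquaintance", likes=3, comments=0),
    ]
    interactions = {"close_friend": 12, "acquaintance": 1}
    for post in rank_feed(posts, interactions):
        print(post.author)  # close_friend ranks first despite fewer likes
```

Because the score multiplies affinity by engagement, the most emotionally engaging posts from the people a user already gravitates to dominate the feed – the filter-bubble dynamic described above.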

Bragging in his new manifesto, Zuckerberg writes: "In recent campaigns around the world -- from India and Indonesia across Europe to the United States -- we've seen the candidate with the largest and most engaged following on Facebook usually wins." In the Netherlands today, liberal Prime Minister Mark Rutte's page has 17,527 likes; that of fiery nationalist Geert Wilders, 174,188. In France, rationalist Emmanuel Macron has 165,850 likes, while far-right Marine Le Pen boasts 1.2 million. Helping them win is hardly something that would make Zuckerberg, a liberal, proud -- but, with his algorithmic interference in what people can see on his network, he has created a powerful tool for populists.

Zuckerberg doesn't want to correct this mistake and stop messing with what people see on the social network. Instead, the new manifesto talks about Facebook as if it were a country or a supranational bloc rather than just a communication-enabling technology. Zuckerberg describes how Facebook sorts groups into "meaningful" and, presumably, meaningless ones. Instead of facilitating communication among people who are already part of social support groups offline, he wants to project Facebook relationships into the real world: Clearly, that's a more effective way of keeping competitors at bay. 

The Facebook chief executive says his team is working on artificial intelligence that will be able to flag posts containing offensive information -- nudity, violence, hate speech -- and pass them on for final decisions by humans. If past experience is any indication, the overtaxed humans will merely rubber-stamp most decisions made by the technology, which Zuckerberg admits is still highly imperfect. Zuckerberg also suggests enabling every user to apply the filters provided by this technology:

Where is your line on nudity? On violence? On graphic content? On profanity? What you decide will be your personal settings. We will periodically ask you these questions to increase participation and so you don't need to dig around to find them. For those who don't make a decision, the default will be whatever the majority of people in your region selected, like a referendum. Of course you will always be free to update your personal settings anytime.

The real-life effect will be that most users, too lazy to muck around with settings, will accept the "majority" standard, making it even less likely that anything they see would jar them out of their comfort zone. Those who use the filters won't be much better off: They'll have no idea what is being filtered out because Facebook's algorithms are a black box. 
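As a purely illustrative sketch of the "regional majority as default" idea quoted from the manifesto above, the fallback logic might look something like the following; the function name, setting values and data are invented, not Facebook's.

```python
# Hypothetical illustration of the "default to the regional majority" settings
# model described in the quoted passage; names and data are assumptions.
from collections import Counter

def resolve_content_filter(user_choice, region_choices):
    """Return the user's explicit setting if one exists, otherwise fall back
    to the most common choice in the user's region (the 'referendum' default)."""
    if user_choice is not None:
        return user_choice
    majority_choice, _count = Counter(region_choices).most_common(1)[0]
    return majority_choice

region = ["hide_graphic", "hide_graphic", "show_all", "hide_graphic"]
print(resolve_content_filter(None, region))        # -> hide_graphic (majority default)
print(resolve_content_filter("show_all", region))  # -> show_all (explicit choice wins)
```

The column's point is that most users will never set an explicit choice, so the majority default quietly becomes the standard they see the world through.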

Zuckerberg casts Facebook as a global community that needs better policing, governance, nudging toward better social practices. He's willing to allow some democracy and "referendums," but the company will make the ultimate decision on the types of content people should see based on their behavior on Facebook. Ultimately, this kind of social engineering affects people's moods and behaviors. It can drive them toward commercial interactions or stimulate giving to good causes but it can also spill out into the real world in more troubling ways. 

It's absurd to expect humility from Silicon Valley heroes. But Zuckerberg should realize that by trying to shape how people use Facebook, he may be creating a monster. His company's other services -- Messenger and WhatsApp -- merely allow users to communicate without any interference, and that simple function is the source of the least controversial examples in Zuckerberg's manifesto. "In Kenya, whole villages are in WhatsApp groups together, including their representatives," the Facebook CEO writes. Well, so are my kids' school mates, and that's great.

People are grateful for tools that help them work, study, do things together -- but they respond to shepherding in unpredictable ways.  "Virtual identity suicide" is one; the trend doesn't show up in Facebook's reported usage numbers, but that might be because a lot of the "active users" the company reports are actually bots. If you type "how to leave" into the Google search window, "how to leave Facebook" will be the first suggestion. 

This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.


Cracking the Code - Monday 10 April 2017


"What's on your mind?" 

It's the friendly Facebook question which lets you share what you're thinking and what you've been up to. It's also the question that unlocks the details of your life and helps turn your thoughts into Facebook's profits.

"They are the most successful company arguably in human history at just gathering people's time and turning that time into money." Reporter

On Monday night, Four Corners explores the world of Facebook and how your data is being mined to drive the huge success of the social media giant.

"Facebook's very well aware of our sentiment, our mood…it can put all that data together and start to understand who our ex's are, who our friends are, who our old friends are, who our new friends are, and that's how it really works."  Marketing Executive

Reporter Peter Greste examines the Facebook business model and shows why your private life is making them billions.

"Facebook has very cleverly figured out how to wrap itself around our lives. It's the family album. It's your messaging to your friends. It's your daily diary. It's your contact list. It's all these things wrapped around your life." Digital Privacy Expert

The program investigates how Facebook has the ability to track much of your browsing history, even when you're not logged on, and even if you aren't a member of the social network at all.

"Even if you close your account, even if you log out of all of your services, the way that they're set up, with their sharing buttons, they're still going to be able to build a profile for you. It's very difficult to opt out of Facebook's reach."  IT Security Consultant

And it shows how the methods used to deliver targeted advertising also drive what 'news' appears in your Facebook feed, and why you are unlikely to see anything that challenges your world view. This feedback loop is fuelling the rise and power of "fake news".

"We're seeing news that's tailored ever more tightly towards those kinds of things that people will click on, and will share, rather than things that perhaps are necessarily good for them."  Media Analyst

With more than 16 million Australian Facebook accounts, joining more than a billion other users, Four Corners investigates how much we are giving up to be part of the social network.

"If somebody was going to build a dossier on me based on what Facebook knows about me, what would it look like? I should be able to know that so that I can make informed decisions at how I'm going to use the platform." Internet Privacy Advocate

Essay Questions (to complete at the conclusion of reading and viewing)

 

Select one of the following:

Topic One 

Has social networking improved the quality of relationships in society?

 

750-1000 words

Topic Two

Write an argumentative essay discussing how Facebook empowers or disempowers its users.

Guide on writing an argumentative essay
