An international technology expert and academic has claimed that the social drive to spread misinformation online is eroding humanity.
Professor Nolen Gertz, a technology ethicist and assistant professor at the University of Twente in the Netherlands, said people are unwilling to take responsibility for what they post on social media.
Prof Gertz, an author and a member of the Scientific Committee of the Global Centre of the Future, is giving a talk in Dublin about the dangers that social media platforms pose to democracy.
Speaking to PA, he raised concerns over the public’s “knee-jerk” habit of sharing posts on Facebook without checking their source or legitimacy.
“You have this bad faith issue of an unwillingness to take responsibility and that’s compounded by the virtual feeling of ‘I am just clicking on things on the computer, what could that do?'” he said.
“It’s this conception of what it means to be human that’s underneath all this, and that all these things erode our humanity in ways we are not even thinking about.
“People treat the virtual as implying the strict physical non-physical distinction and because we think of responsibility in causal terms, what’s not physical, what is not immediately caused by me isn’t my problem.”
He said that a major problem in preventing people from spreading misinformation online is establishing what drives the public to do it.
Prof Gertz, who will speak in the Trinity Long Room Hub on Thursday, said that Facebook’s algorithm is a huge part of this.
He added: “It’s about understanding the idea that it isn’t necessarily that people don’t care about what they are sharing but more this idea that, if you think about Facebook’s algorithm as driving, that you have to share certain kinds of posts that are likely to drive engagement, and that turns every Facebook user into their own little company.
“You can imagine this sort of thinking about Facebook as filling a void in people’s lives, and when we talk about how we improve Facebook it’s not about how to better understand that void but better understand the life-preserver that we have clung to.
“It’s trying to understand what it is about society, what sort of political issues are the background of these that we feel so disconnected from each other.”
Facebook and other social media platforms have come under huge pressure to crack down on the spread of fake news on their platforms, particularly around election misinformation campaigns.
“The concern is how platforms like Facebook change what we think responsibility even means, and we should think much more in legalistic terms about fines, regulations,” Prof Gertz added.
Earlier this year, Facebook’s Mark Zuckerberg met with a number of TDs in Dublin where they discussed the threat of fake news, child protection and the regulation of social media platforms.
The Facebook founder spoke of how the social media giant has put in place new measures which seek to provide greater transparency in political advertising.
Prof Gertz also said that another issue stems from some governments’ limited understanding of how technology works and their inability to clamp down on fake news.
He added: “The people who want to have political power, the government, don’t actually understand the technology and the people who do understand the technology don’t have an interest in politics.
“So it’s a perfect recipe to have a completely useless democracy.
“This isn’t a media literacy problem but a technology literacy problem.
“It’s not about helping people identify real news from fake news, but identify the role of algorithmic curation in social media.”