Coronavirus, Online Algorithms and the Need for Public Service Broadcasting

Indian journalists stand outside a hospital in India where a student who had been to China was kept in isolation, Jan. 30, 2020. Picture: https://www.militarytimes.com/news/pentagon-congress/2020/01/30/coronavirus-could-become-a-major-security-threat-in-developing-countries-military-commanders-warn/

“The coronavirus pandemic shows us once more that we need public broadcasting services to keep citizens equally informed and to provide us with well-researched, fact-based information,” argues Dr. Clemens Apprich, assistant professor at the department of journalism and media studies at the University of Groningen. Apprich is an expert in digital cultures, online personalization and algorithmic filtering.

“In the past, mass media like newspapers, radio and television were the average citizen’s primary source of news and information. Their job was to keep all citizens equally informed about important issues such as new laws, regulations or, currently, the coronavirus pandemic,” says Dr. Apprich. But things have changed: nowadays, most Europeans and citizens of the global west access their news online. A study conducted by the Pew Research Center reveals that about two thirds of US Americans get their news on social media. This development has profound consequences for the public sphere and for society at large, Dr. Apprich says. Social media networks and online search engines are changing the way we receive and consume the news. They feed us our own preferences, so we are no longer exposed to differing ideas.

According to Dr. Apprich, social media platforms such as Facebook or Google cannot take on the role of public service news providers; in fact, they even deny responsibility for their content. However, these platforms do deliver content based on what is trending, in order to amplify users’ activity. When users inform themselves via social media, the lines between supposedly important issues (the coronavirus pandemic) and unimportant ones (sports events, celebrity gossip, etc.) blur. “Social media users’ newsfeeds might be full of news about their favorite celebrity, or about issues that users seem to be interested in. Important issues, such as the coronavirus pandemic, might not appear in the newsfeed at all. Online content is personalized and tailored to users’ interests, but appears as legitimate, important news,” Apprich says.

Dr. Apprich names examples such as the impeachment process against Donald Trump: internet users whom the algorithms identify as Trump supporters are more likely to be shown news arguing against the impeachment. In one study, people entered the same words into a Google search, yet their results differed, because Google’s algorithm delivers individually tailored media content to each user.

Algorithms filter the internet and deliver content that already matches users’ preferences, in order to keep them attached to the service. People are lured in by the idea of receiving the best content. They check their social media and see news about their favorite sports team, their favorite celebrity and their favorite political party. This creates the illusion of being informed, while the algorithm merely delivers news that is tailored to them.
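To make this filtering logic concrete, here is a deliberately simplified sketch in Python. It is not any platform’s actual code; the item list, topics and scoring function are invented for illustration. It only shows the principle described above: content is scored against a user’s inferred interests, so a story that matches none of them never reaches the top of the feed.

# A hypothetical, minimal sketch of interest-based feed ranking.
# Real recommendation systems are far more complex and proprietary.

def score(item_topics, user_interests):
    # Score an item by how many of its topics overlap with the user's interests.
    return len(set(item_topics) & set(user_interests))

def personalized_feed(items, user_interests, feed_size=3):
    # Keep only the highest-scoring items for this user.
    ranked = sorted(items, key=lambda item: score(item["topics"], user_interests), reverse=True)
    return ranked[:feed_size]

user_interests = ["football", "celebrities", "music"]  # inferred from past clicks
items = [
    {"title": "Champions League highlights", "topics": ["football", "sport"]},
    {"title": "Pop star announces new album", "topics": ["celebrities", "music"]},
    {"title": "Transfer rumours roundup", "topics": ["football", "celebrities"]},
    {"title": "WHO update on the coronavirus pandemic", "topics": ["health", "politics"]},
]

for item in personalized_feed(items, user_interests):
    print(item["title"])
# The pandemic update never appears: it matches none of the user's prior interests.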

Leon Seidel, a media student, agrees with Dr. Apprich: “Most of the news that I encounter on social media seems to be tailored to me. It is difficult to access news that opposes my political orientation or challenges my norms and values. Just last week, I had multiple posts about football matches on my Facebook landing page but not a single post about the coronavirus pandemic. Sometimes I feel like I am in an echo chamber, and I want to stop using social media for news consumption.”

This development stands in contrast to traditional mass media and their one-size-fits-all news reports. Even though a newspaper tends to speak to its particular readership as well, the idea is still to provide universal and public information. Dr. Apprich adds: “Social media users engage merely with content that they are familiar with and which confirms their beliefs, as well as their values and interests. They are kept in so-called filter bubbles.” But most internet users don’t realize this. According to a study conducted by the Pew Research Center, 75% of US Americans feel better informed thanks to the internet.

“The internet is showing us what it thinks we want to see, but not necessarily what we need to see,” writes Eli Pariser in his book The Filter Bubble: What the Internet Is Hiding from You. The filter bubble is the world crafted by the shift from “human gatekeepers,” such as newspaper editors who decide what is newsworthy enough to make the front page, to the algorithmic ones employed by Facebook and Google, which present the content they believe a user is most likely to click on.

Pariser writes that this new digital universe is “a cozy place, populated by our favorite people and things and ideas.” These unique universes are also dangerous, as they “alter the way we’d encounter ideas and information.” They deprive us of the kind of spontaneous encounters with ideas that promote creativity and, perhaps more importantly, steer our attention toward irrelevant matters simply because they suit our interests.

The illustration at https://miro.medium.com/max/3020/1*RiUTOOmGxl2uJLWYHLtYMA.png shows how filter bubbles restrict our exposure to new, differing content online: gatekeepers (algorithms) prevent certain information (here: the colorful circles) from reaching us.

Dr. Apprich confirms this: “People who are only using social media networks to get their news are missing important information. Online, the algorithm decides for you what to see and what not to see.” Before any news item appears in your social media timeline, it has to pass the ‘gatekeeper,’ in this case an algorithm. As a consequence, an internet user who primarily follows her favorite comedian and make-up artists, for example, is disproportionately exposed to comedy and make-up content. Other information, such as EU immigration issues, presidential elections or international conflicts, is filtered out, lowering the chances that such news will ever reach her.
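As a rough illustration of such algorithmic gatekeeping, the hypothetical Python sketch below (again invented for this article, with made-up topics and numbers) admits only items whose predicted engagement, estimated from the user’s past clicks, clears a threshold; everything else is dropped before it ever reaches the timeline.

# A hypothetical sketch of an algorithmic 'gatekeeper' based on predicted engagement.
# Topics, numbers and the threshold are invented; real systems are far more elaborate.

def predicted_engagement(topic, engagement_history):
    # Estimate how likely the user is to engage with a topic from her past behaviour.
    clicks, impressions = engagement_history.get(topic, (0, 1))
    return clicks / impressions

def gatekeeper(candidates, engagement_history, threshold=0.2):
    # Only items whose predicted engagement clears the threshold reach the timeline.
    return [item for item in candidates
            if predicted_engagement(item["topic"], engagement_history) >= threshold]

engagement_history = {"comedy": (40, 50), "make-up": (30, 60), "politics": (1, 80)}  # (clicks, impressions)
candidates = [
    {"title": "New stand-up special announced", "topic": "comedy"},
    {"title": "Make-up tutorial goes viral", "topic": "make-up"},
    {"title": "EU debates new immigration rules", "topic": "politics"},
]

for item in gatekeeper(candidates, engagement_history):
    print(item["title"])
# The immigration story is filtered out before the user ever sees it.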

The main problem with today’s online algorithms is that they follow commercial rather than public interests. Do we perhaps need a public social network, one that exists solely to nurture the public sphere? This might be a good idea, argues Dr. Esteve Del Valle, assistant professor of media and journalism studies at the University of Groningen, who has done extensive research on new media, digital democracy and social networks. “Television is regulated because of the strong influence that its content has on the public. Why don’t we regulate social media as well?” According to Dr. Esteve Del Valle, social media platforms interfere in national affairs, as Facebook did in the 2016 US presidential election by allowing commercials, ads and news content regarding the election on its network. Therefore, he says, social media companies should be subject to state regulation.

Dr. Apprich points out that “traditional mass media are responsible for the content that they publish, while social media websites such as Facebook are not. This lack of responsibility nurtures the spread of fake news, as social media websites claim they are not obliged to delete ‘fake’ content from their platforms.” A traditional newspaper, for example, can even be sued if it publishes “fake” or inaccurate news. But because many social media firms claim not to be responsible for the news they display, they are not subject to such lawsuits. In fact, a study published in the scientific journal Nature Human Behaviour reveals that Facebook spreads fake news faster than any other social media platform.

Dr. Q. Zhu from the University of Groningen is an expert on the social and political implications of new media technologies. She agrees with Dr. Apprich and adds that social media firms should be held responsible for fake news, primarily because of the considerable influence they have on the public sphere. Worst of all, she says, social media and search engine algorithms keep recommending and presenting such content to other users, thereby amplifying the spread of fake news.

A case in point: Fabian Mulder, who consumes his news online, remembers how he came to believe that a majority of Chinese people were infected with the coronavirus. “I read it on Facebook and didn’t check the source. I pressed ‘like’ because the article was interesting. That was stupid. Anyway, for weeks I found many similar articles in my Facebook timeline. Even today, I still find such articles, which are obviously fake.”

This example again illustrates the importance of public service broadcasting. Dr. Apprich adds: “We need public service broadcasting more than ever. All citizens should be informed and have access to verified information that is presented to them with as little influence from commercial interests as possible.” One example is the coverage of the Syrian civil war in Western media. Many atrocities and war crimes are reported by public broadcasting services such as Das Erste (Germany) or the BBC (UK). Even though this reporting has all too often fallen short, it might not take place at all if commercial interests were the only consideration.

Clemens Apprich at Leuphana University Lüneburg, Germany. Photo: https://www.leuphana.de/forschung/aktuell/ansicht/datum/2018/05/08/ohne-eine-kritische-reflexion-der-digitalisierung-laufen-wir-vorwaerts-ohne-wirklich-zu-verstehen.html

Dr. Apprich calls on us, the news consumers, “to take the media more seriously.” He says: “Engage with the content and think about where it is coming from. What are the economic interests of the services that provide me with news content? Above all, we should think more about how information is created. Actors in the news and media business always have to make decisions about whether to include or exclude certain information. Not just as media scholars, but also as media consumers, we should keep these filtering processes in mind.”

Sources:

Dr. Clemens Apprich, Dr. Q. Zhu and Dr. M. Esteve Del Valle from the University of Groningen, the Netherlands

https://www.journalism.org/2018/09/10/news-use-across-social-media-platforms-2018/

https://www.pewresearch.org/internet/2014/12/08/better-informed/

https://www.forbes.com/sites/traversmark/2020/03/21/facebook-spreads-fake-news-faster-than-any-other-social-website-according-to-new-research/#6f7d20a56e1a

https://research.msu.edu/the-effects-of-algorithms-on-internet-behaviors/

https://www.washingtonpost.com/news/the-intersect/wp/2015/03/23/what-you-dont-know-about-internet-algorithms-is-hurting-you-and-you-probably-dont-know-very-much/

https://www.pewresearch.org/internet/2017/02/08/code-dependent-pros-and-cons-of-the-algorithm-age/

https://www.theverge.com/interface/2019/11/12/20959479/eli-pariser-civic-signals-filter-bubble-q-a

https://sites.bu.edu/cmcs/2018/12/06/the-isolating-web-do-filter-bubbles-narrow-down-our-mind/

http://snurb.info/files/2011/Gatekeeping,%20Gatewatching,%20Real-Time%20Feedback.pdf

All online sources retrieved on Wednesday, April 22, 2020 at 14:35h.
