The 2024 U.S. Elections and the strange new information environment

 by John MacBeath Watkins


News media have certain characteristics. They have paid news gatherers and paid editors who are in charge, respectively, of gathering and curating the news presented in the newspaper, internet site, or television or radio broadcast. Editors and journalists are relatively easy to hold accountable for their accuracy and fairness.

In the wake of the 2024 U.S. election, it soon became evident that most of the people who relied on news sources that are held accountable for their accuracy and fairness tended to vote for Harris, while those who relied on social media, YouTube, or search engines tended to vote for Trump. Young people tended to rely more on social media, search engines, or YouTube for their news than older people did.*

Fox News is a bit of an outlier here. It employs editors and journalists, but it isn't really selling news; it's selling its bias to an audience that prefers a right-leaning bias. Its adherence to scrupulous truth-telling was clearly lacking when it paid a record settlement in a defamation suit brought by Dominion, a company that makes voting machines.

It is odd that so many people get their news from social media, because social media are not news media. The commodity they have to sell is attention, and they sell it to advertisers. As more and more news sites are locked behind paywalls, more people are inclined to rely on social media, which do not employ journalists or editors. They display not news but 'content,' which is supplied by people who don't work for the social media company. What people see on the site is curated not by an editor but by an algorithm designed to maximize viewers' engagement.

If you click on a video or some other snippet, the algorithm learns that you like that content and feeds you more of the same, encouraging confirmation bias. If you view the content all the way through, it learns that you can be kept watching, and therefore shown more advertisements. And what algorithms have learned is that the content that gets the most engagement is content that evokes fear and anger. Anger is often sparked by resentment, so the things algorithms select for people tend to be those that provoke anger, fear, and/or resentment.
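A crude sketch of that feedback loop, written in Python purely as an illustration (the topic labels, scoring weights, and function names here are invented for the example; no platform's actual ranking code is this simple):

    import random
    from collections import defaultdict

    # Running engagement score per topic, learned from one viewer's behavior.
    engagement = defaultdict(float)

    def record_view(topic, clicked, watched_to_end):
        # Clicks and complete views both count as engagement, so the
        # algorithm boosts that topic for this particular viewer.
        if clicked:
            engagement[topic] += 1.0
        if watched_to_end:
            engagement[topic] += 2.0  # a full view means more ads were seen

    def pick_next(candidate_topics):
        # Serve whichever topic has earned the highest score so far, with a
        # little randomness so new topics occasionally get a chance.
        return max(candidate_topics, key=lambda t: engagement[t] + 0.1 * random.random())

Nothing in a loop like that checks whether the content is true or fair; it only checks whether it keeps you watching.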

The algorithm is designed to provide different content to different people, and it is not transparent about how it makes its decisions. In any case, its ghostly presence in our social media is not held responsible for the factuality or fairness of the content it serves up. Content providers, unless they are sued for libel, are seldom held accountable for the factuality or fairness of the content they provide, either.

This impoverished information environment is made worse by deliberate bad actors, such as influence campaigns run by foreign governments. Russian influence campaigns have infamously targeted Facebook and Twitter (now awkwardly renamed X) users. While the established social media companies are getting better at detecting misinformation campaigns, they are getting pushback from political actors who benefit from those campaigns and who complain of censorship.

Authoritarian countries, where governments try to tightly control information, have adopted the rhetoric of freedom, condemning as 'fake news' anything they do not want their people to hear, and American politicians inclined to feel a kinship with the Russian right-wing leadership use the same language in the same way.

Authoritarian control of the information space can pose problems even for the rulers who have shaped that space. From the Feb. 5, 2024 issue of Newsweek:


Subject matter experts have long questioned the veracity of Beijing's economic reports in general, including former Chinese premier Li Keqiang, who in 2007 dismissed his country's economic data as "man-made."


One study found that without the corrective influence of a free press, authoritarian regimes report growth 0.5 to 1.5 percentage points higher than it actually is.** This has been so persistent and so prevalent that, for example, China's GDP may well be about 40% of what official statistics say it is, based on metrics such as energy consumption. Local officials are promoted when they report high growth and punished when they don't, and since they are writing their own report cards, they are inclined to be what might charitably be called 'optimistic' in their reports of economic growth.
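To see how a modest annual overstatement compounds into a large gap in the level of GDP, consider a back-of-the-envelope calculation (the 1.5-point rate and the 30-year horizon are arbitrary numbers chosen for illustration, not figures taken from the study):

    # If reported growth runs 1.5 percentage points above real growth every
    # year for 30 years, the reported level of GDP pulls far ahead of the
    # real level, even though each year's exaggeration looks small.
    years = 30
    annual_overstatement = 0.015
    inflation_factor = (1 + annual_overstatement) ** years
    print(f"Reported GDP is about {inflation_factor:.2f}x the real figure")
    # prints roughly 1.56x, i.e. real GDP is only about two-thirds of reported

Larger or longer-running overstatements compound further still, which is how official statistics can drift a long way from what metrics like energy consumption suggest.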

In both social media and authoritarian regimes, we see perverse results from perverse incentives. In both cases, we are seeing market failures in the provision of accurate information to the public.






*https://www.pewresearch.org/journalism/2024/10/10/where-americans-turn-for-election-news/

**Christopher S. P. Magee and John A. Doces, "Reconsidering Regime Type and Growth: Lies, Dictatorships, and Statistics," International Studies Quarterly, Vol. 59, No. 2 (June 2015), pp. 223-237. Published by Oxford University Press.


