Youtube self-radicalization as a bespoke transformative experience
Philosopher Laurie Paul recently published a book about transformative experiences, which she understands as events that change someone's personality or values. If Nina Strohminger is right that one's self is to a large extent identified with one's values, then going through a transformative experience means becoming a different person.
Typical examples of transformative experiences could be classified as Big Honking Deals. Becoming a vampire. Going to war. Having a child. Enduring a severe mental disorder. These are relatively short, time-stamped encounters characterized by trauma, drama, or melodrama. But transformative experiences can also occur more slowly and without attracting attention. You move to a new town and slowly find yourself rooting for its football team, even though you used to despise the whole sport. You lose a friend and eventually realize that you deeply disagree with them about religion, even though you went to the same church. You go to college, major in sociology, and find yourself one day earnestly uttering the word 'différance'.
In this post, I'm interested in another such slow-burning transformative experience: self-radicalization on Youtube. Youtube serves videos to browsers. In some cases, it simply delivers the link someone enters in their URL bar. In other cases, it delivers the Google-determined answer to a query the user enters in Youtube's search bar (Google owns Youtube). While the algorithm that determines that answer is proprietary, we know that it is highly similar to the PageRank algorithm, which in turn resembles a Condorcet voting procedure conducted over a social network. In still other cases, Youtube suggests videos to a user based on their own watch history and on the videos subsequently watched by other users with (mostly) overlapping histories. Such individualized recommendation relies on what's called profiling: building up datasets about individual users that help predict what they think, like, and care about. The algorithms that power these recommendation systems are powerful, relying on techniques such as hidden Markov models and deep neural networks.
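To make the co-watching idea concrete, here is a minimal sketch, in Python, of item-to-item recommendation driven purely by "users who watched X also watched Y" counts. The watch histories and video labels are invented for illustration; Youtube's actual system is proprietary and vastly more sophisticated:

```python
# A sketch of item-to-item recommendation from co-watch counts alone.
from collections import Counter
from itertools import combinations

# Hypothetical watch histories: user id -> set of watched video ids.
histories = {
    "u1": {"cats", "chess", "lectures"},
    "u2": {"cats", "chess", "rants"},
    "u3": {"chess", "rants", "conspiracies"},
    "u4": {"rants", "conspiracies"},
}

# Count how often each ordered pair of videos shares a viewer.
co_watch = Counter()
for videos in histories.values():
    for a, b in combinations(sorted(videos), 2):
        co_watch[(a, b)] += 1
        co_watch[(b, a)] += 1

def recommend(watched, k=2):
    """Score each unwatched video by how often it co-occurs with the
    videos this user has already watched; return the top k."""
    scores = Counter()
    for (a, b), n in co_watch.items():
        if a in watched and b not in watched:
            scores[b] += n
    return [video for video, _ in scores.most_common(k)]

print(recommend({"chess", "rants"}))  # e.g. ['cats', 'conspiracies']
```

Even this crude version exhibits the relevant property: what it recommends depends entirely on what similar viewers happened to watch next, with no variable anywhere representing whether a video is true, safe, or sane.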
These algorithms are built to optimize a variable chosen and operationalized by their coders. In most cases, that variable is engagement: the likelihood that the user will mouse-over, click on, like, comment on, or otherwise interact with an item. Eli Pariser and others have pointed to the ways in which optimizing for engagement (rather than, say, truth, reliability, sensitivity, safety, or some other epistemic value) leads to social and political problems. PageRank and its derivatives can be gamed by propagandists, unduly influencing election outcomes. Even when no nefarious plots are afoot, engagement is at best a loose proxy for epistemic value.
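The point about loose proxies can be made concrete with another toy. In the sketch below every feature and weight is invented, and a real system would learn its weights from logged interactions; the only thing the toy illustrates is that when the objective is predicted engagement, a video's accuracy can contribute almost nothing to its rank:

```python
import math

# Toy logistic model of predicted engagement. Features and weights are
# invented; a real system would learn them from logged interactions.
WEIGHTS = {"outrage": 2.0, "novelty": 1.0, "accuracy": 0.1}

def engagement_score(features):
    """Predicted probability that the user clicks, likes, or comments."""
    z = sum(WEIGHTS[name] * value for name, value in features.items())
    return 1 / (1 + math.exp(-z))

candidates = {
    "sober explainer": {"outrage": 0.1, "novelty": 0.4, "accuracy": 0.9},
    "shocking exposé": {"outrage": 0.9, "novelty": 0.8, "accuracy": 0.2},
}

# Rank by predicted engagement, not by epistemic merit.
ranked = sorted(candidates, key=lambda v: engagement_score(candidates[v]),
                reverse=True)
print(ranked)  # ['shocking exposé', 'sober explainer']
```

Nothing in the objective punishes the outrageous-but-false video; the accurate one loses simply because accuracy is weighted as an afterthought.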
One especially worrisome consequence of optimizing for engagement is the possibility of creating bespoke transformative experiences that radicalize viewers. It's already been argued that conspiracist media such as Fox News has radicalized a large proportion of the Baby Boomer generation. Fed a little hate, they kept watching. The more they watched, the more hate they imbibed and the less connected with truth they became. Over time, Fox ceased to be the contemptible fringe, a position usurped by Breitbart, Newsmax, and Infowars. Now Steve Bannon and Stephen Miller are in the White House advising the Trump administration.
I lay a great deal of the blame for this at the feet of Rush Limbaugh and the Baby Boomers who half-intentionally poisoned their minds with his bluster and bullshit on AM radio throughout the 1990s. (Remember "America under siege"?) But what worries me now is that the general-purpose, mind-poisoning transformation that the Baby Boomers suffered is being individualized and accelerated by the recommendation algorithms employed by Google. Engagement tracks people's emotions, and those emotions can be negative as well as positive. Recent studies suggest that both nascent right-wing white nationalists and nascent Islamist terrorists are increasingly learning to hate by following a string of Youtube recommendations that takes them from incredulity to interest to fascination to zealotry. If this is right, then an additional side-effect of optimizing for engagement is the creation of a small but determined group of extremists bent on revanchist politics and revolutionary violence.
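To see how such a chain can arise without anyone intending it, here is a deliberately simple simulation. Everything in it is a stipulation of the toy model (the one-dimensional "extremity" scale, the assumption that engagement peaks just beyond the viewer's current taste, the drift rate), not a finding about Youtube:

```python
# A crude toy model of the escalation dynamic: assume the most engaging
# video sits slightly beyond the viewer's current taste, and that taste
# drifts toward whatever gets watched. All parameters are invented;
# this illustrates the ratchet, not Youtube's actual behavior.

def most_engaging(taste, catalog, edge=0.1):
    # Model assumption: familiar enough to click, extreme enough to provoke.
    return min(catalog, key=lambda e: abs(e - (taste + edge)))

catalog = [i / 10 for i in range(11)]  # videos at extremity 0.0 .. 1.0
taste = 0.1                            # viewer starts near the mainstream
for step in range(10):
    served = most_engaging(taste, catalog)
    taste += 0.5 * (served - taste)    # taste drifts toward what is watched
    print(f"step {step}: served extremity {served:.1f}, taste now {taste:.2f}")
```

Run long enough, the loop climbs until it saturates at the most extreme content in the catalog: incredulity to interest to fascination to zealotry, in miniature.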
Somebody call Sergey Brin.