Abstract
: A popular belief is that personalization, the process whereby search engines tailor their search results to individual users, leads to filter bubbles: ideologically segregated search results that tend to reinforce the user's prior view. Since filter bubbles are thought to be detrimental to society, there have been calls for further legal regulation of search engines beyond the so-called Right to be Forgotten Act. However, the scientific evidence for the filter bubble hypothesis is surprisingly limited. Previous studies of personalization have focused on the extent to which different users receive different results lists, without taking the content of the webpages into account. Such methods are unsuitable for detecting filter bubbles as such. In this paper, we propose a methodology that takes content differences between webpages into account. In particular, the method studies the extent to which users with strongly opposing views on an issue receive search results that correlate, content-wise, with their personal views. Will users with a strong prior opinion that X is true on average have a larger share of search results in favor of X than users with a strong prior opinion that X is false? We illustrate our methodology at work, but also the non-trivial challenges it faces, through a small-scale study of the extent to which Google Search leads to ideological segregation on the issue of man-made climate change.
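The comparison at the heart of the proposed methodology can be sketched in code. The following is a minimal illustration, not the paper's actual pipeline: each user's results are assumed to have been hand-coded for stance toward the proposition X (+1 in favor, -1 against, 0 neutral), and the function names and toy data are hypothetical.

```python
def share_in_favor(stances):
    """Fraction of a user's search results coded as favoring X (+1)."""
    return sum(1 for s in stances if s == +1) / len(stances)

def segregation_gap(pro_users, con_users):
    """Difference in mean pro-X share between users with a strong prior
    that X is true and users with a strong prior that X is false.
    A clearly positive gap is the pattern the filter bubble
    hypothesis predicts."""
    mean_pro = sum(share_in_favor(u) for u in pro_users) / len(pro_users)
    mean_con = sum(share_in_favor(u) for u in con_users) / len(con_users)
    return mean_pro - mean_con

# Toy example: one stance list per user, one stance code per result.
pro_users = [[+1, +1, 0, -1], [+1, 0, 0, +1]]
con_users = [[-1, -1, 0, +1], [-1, 0, +1, -1]]
print(segregation_gap(pro_users, con_users))  # 0.5 - 0.25 = 0.25
```

In a real study the gap would of course be subjected to a significance test over many users and queries; the sketch only makes the content-aware measure concrete.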