Reverse Engineering: Experimentally Exploring Potential Mechanisms of Social Media Algorithms

Poster presentation at the GAL-Sektionentagung 2022

Contact:

References for the poster:

AlgorithmWatch. (2020, January 1). Undress or fail: Instagram’s algorithm strong-arms users into showing skin. https://algorithmwatch.org/en/story/instagram-algorithm-nudity/

Ananny, M., & Crawford, K. (2018). Seeing without knowing: Limitations of the transparency ideal and its application to algorithmic accountability. New Media & Society, 20(3), 973–989. https://doi.org/10.1177/1461444816676645

Anderson, K. E. (2020). Getting acquainted with social networks and apps: It is time to talk about TikTok. Library Hi Tech News, 37(4), 7–12. https://doi.org/10.1108/LHTN-01-2020-0001

Anstead, N., & O’Loughlin, B. (2015). Social Media Analysis and Public Opinion: The 2010 UK General Election. Journal of Computer-Mediated Communication, 20(2), 204–220. https://doi.org/10.1111/jcc4.12102

Barberá, P., Jost, J. T., Nagler, J., Tucker, J. A., & Bonneau, R. (2015). Tweeting From Left to Right: Is Online Political Communication More Than an Echo Chamber? Psychological Science, 26(10), 1531–1542. https://doi.org/10.1177/0956797615594620

Beam, M. A., Child, J. T., Hutchens, M. J., & Hmielowski, J. D. (2018). Context collapse and privacy management: Diversity in Facebook friends increases online news reading and sharing. New Media & Society, 20(7), 2296–2314. https://doi.org/10.1177/1461444817714790

Beam, M. A., Hutchens, M. J., & Hmielowski, J. D. (2018). Facebook news and (de)polarization: Reinforcing spirals in the 2016 US election. Information, Communication & Society, 21(7), 940–958. https://doi.org/10.1080/1369118X.2018.1444783

Bechmann, A., & Nielbo, K. L. (2018). Are We Exposed to the Same “News” in the News Feed? Digital Journalism, 6(8), 990–1002. https://doi.org/10.1080/21670811.2018.1510741

Bessi, A. (2016). Personality Traits and Echo Chambers on Facebook. Computers in Human Behavior, 65, 319–324.

Bishop, S. (2018). Anxiety, panic and self-optimization: Inequalities and the YouTube algorithm. Convergence, 24(1), 69–84. https://doi.org/10.1177/1354856517736978

Boy, J. D., & Uitermark, J. (2020). Lifestyle Enclaves in the Instagram City? Social Media + Society, 6(3), 2056305120940698. https://doi.org/10.1177/2056305120940698

Bozdag, E. (2013). Bias in algorithmic filtering and personalization. Ethics and Information Technology, 15(3), 209–227. https://doi.org/10.1007/s10676-013-9321-6

Bozdag, E., & van den Hoven, J. (2015). Breaking the filter bubble: Democracy and design. Ethics and Information Technology, 17(4), 249–265. https://doi.org/10.1007/s10676-015-9380-y

Bruns, A. (2019). Are filter bubbles real? Polity.

Bruns, A., Moon, B., Münch, F., & Sadkowsky, T. (2017). The Australian Twittersphere in 2016: Mapping the Follower/Followee Network. Social Media + Society, 3(4), 2056305117748162. https://doi.org/10.1177/2056305117748162

Bryant, L. V. (2020). The YouTube Algorithm and the Alt-Right Filter Bubble. Open Information Science, 4(1), 85–90. https://doi.org/10.1515/opis-2020-0007

Carrington, V. (2018). The Changing Landscape of Literacies: Big Data and Algorithms. Digital Culture & Education, 10(1), 67–76.

Cinelli, M., De Francisci Morales, G., Galeazzi, A., Quattrociocchi, W., & Starnini, M. (2021). The echo chamber effect on social media. Proceedings of the National Academy of Sciences of the United States of America, 118(9), e2023301118. https://doi.org/10.1073/pnas.2023301118

Conners, J. L. (2005). Understanding the Third-Person Effect. Communication Research Trends, 24(2), 3–22.

Davison, W. P. (1983). The Third-Person Effect in Communication. The Public Opinion Quarterly, 47(1), 1–15.

Dubois, E., & Blank, G. (2018). The echo chamber is overstated: The moderating effect of political interest and diverse media. Information, Communication & Society, 21(5), 729–745. https://doi.org/10.1080/1369118X.2018.1428656

Gerlach, D. (2020). Einführung in eine Kritische Fremdsprachendidaktik. In D. Gerlach (Ed.), Kritische Fremdsprachendidaktik: Grundlagen, Ziele, Beispiele (pp. 7–31). Narr Francke Attempto.

Gerlach, D., & Schildhauer, P. (2023). Sprachbewusstheit für digitale Diskurse: Verschwörungstheorien online erkennen, analysieren und entkräften. Der fremdsprachliche Unterricht Englisch, 184.

Guess, A., Lyons, B., Nyhan, B., & Reifler, J. (2018). Avoiding the Echo Chamber about Echo Chambers: Why selective exposure to like-minded political news is less prevalent than you think. Knight Foundation.

Haim, M., Graefe, A., & Brosius, H.-B. (2018). Burst of the Filter Bubble? Digital Journalism, 6(3), 330–343. https://doi.org/10.1080/21670811.2017.1338145

Jones, R. H. (2021). The text is reading you: Teaching language in the age of the algorithm. Linguistics and Education, 62, 100750. https://doi.org/10.1016/j.linged.2019.100750

Kemper, K., & Schildhauer, P. (2022). Beware of Filter Bubbles: Am Beispiel climate change Algorithmen auf Instagram und deren Einfluss auf die Meinungsbildung erforschen. Der fremdsprachliche Unterricht Englisch, 178, 36–44.

Krafft, T. D., Gamer, M., & Zweig, K. (2018, January 1). Wer sieht was? Personalisierung, Regionalisierung und die Frage nach der Filterblase in Googles Suchmaschine. https://algorithmwatch.org/de/wp-content/uploads/2020/03/Bericht-Datenspende-Wer-sieht-was-auf-Google.pdf

Leander, K. M., & Burriss, S. K. (2020). Critical literacy for a posthuman world: When people read, and become, with machines. British Journal of Educational Technology, 51(4), 1262–1276. https://doi.org/10.1111/bjet.12924

Masrour, F., Wilson, T., Yan, H., Tan, P.-N., & Esfahanian, A. (2020). Bursting the Filter Bubble: Fairness-Aware Network Link Prediction. Proceedings of the AAAI Conference on Artificial Intelligence, 34(01), 841–848. https://doi.org/10.1609/aaai.v34i01.5429

Mehlhose, F. M., Petrifke, M., & Lindemann, C. (2021). Evaluation of Graph-based Algorithms for Guessing User Recommendations of the Social Network Instagram. In 2021 IEEE 15th International Conference on Semantic Computing (ICSC) (pp. 409–414). IEEE. https://doi.org/10.1109/ICSC50631.2021.00075

Pariser, E. (2012). The filter bubble: What the internet is hiding from you. Penguin Books.

Parmelee, J. H., & Roman, N. (2020). Insta-echoes: Selective exposure and selective avoidance on Instagram. Telematics and Informatics, 52, 101432. https://doi.org/10.1016/j.tele.2020.101432

Rau, J. P., & Stier, S. (2019). Die Echokammer-Hypothese: Fragmentierung der Öffentlichkeit und politische Polarisierung durch digitale Medien? SocArXiv. https://doi.org/10.31235/osf.io/bu2mt

Schildhauer, P. (in press). When the Algorithm Sets the Stage: Participation, Identity, and the 2020 US Presidential Election on Instagram. In A. Brock, J. Russell, P. Schildhauer, & M. Willenberg (Eds.), Participation and Identity (pp. 143–169). Peter Lang.

Schildhauer, P., & Kemper, K. (in press). Towards a Critical Digital Literacy Framework. Exploring the Impact of Algorithms in the Creation of Filter Bubbles on Instagram. In S. Kersten & C. Ludwig (Eds.), Born-Digital Texts in Language Education. Multilingual Matters.

Simpson, E., Hamann, A., & Semaan, B. (2022). How to Tame “Your” Algorithm: LGBTQ+ Users’ Domestication of TikTok. Proceedings of the ACM on Human-Computer Interaction, 6(GROUP), 22:1–22:27. https://doi.org/10.1145/3492841

Sunstein, C. R. (2018). #Republic: Divided democracy in the age of social media. Princeton University Press.

Törnberg, P. (2018). Echo chambers and viral misinformation: Modeling fake news as complex contagion. PLOS ONE, 13(9), e0203958. https://doi.org/10.1371/journal.pone.0203958

van Eck, C. W., Mulder, B. C., & van der Linden, S. (2021). Echo Chamber Effects in the Climate Change Blogosphere. Environmental Communication, 15(2), 145–152. https://doi.org/10.1080/17524032.2020.1861048

Williamson, B. (2017). Big data in education: The digital future of learning, policy and practice. Sage. https://doi.org/10.4135/9781529714920