Visual Affect in Solidarity Hashtags on Twitter: Toward a New Methodology

Marloes Geboers, Chad Van De Wiele

Research output: Contribution to conference › Paper › Academic


Abstract

Studying images in social media poses specific methodological challenges, which in turn have directed scholarly attention towards the computational interpretation of visual data. When analyzing large numbers of images, both traditional content analysis and cultural analytics have proven valuable. However, these techniques do not take into account the circulation and contextualization of images within a socio-technical environment. As the meaning of social media images is co-created by networked publics, bound through networked practices, these visuals should be analyzed on the level of their networked contextualization. Although machine vision is increasingly adept at recognizing faces and features, its performance in grasping the meaning of social media images is limited. However, combining automated analyses of images - broken down by their compositional elements - with repurposed platform data opens up the possibility of studying images in the context of their resonance within and across online discursive spaces. This paper explores the capacities of platform data - hashtag modularity and retweet counts - to complement the automated assessment of social media images, doing justice to both the visual elements of an image and the contextual elements encoded by the networked publics that co-create meaning.
Original language: English
Number of pages: 17
Publication status: Published - 28 May 2019
Event: International Communication Association: Beyond Boundaries - Hilton, Washington DC, United States
Duration: 24 May 2019 - 28 May 2019
Conference number: 69

Conference

Conference: International Communication Association
Country: United States
City: Washington DC
Period: 24/05/19 - 28/05/19

