Algorithmic data analysis of social media content, especially the process of automatic verification, is a potent new tool for journalism, writes a UK-German research team. The team consists of Neil Thurman, of Ludwig Maximilian University of Munich; Steve Schifferes and Aljosha Karim Schapals, both of City University London; Richard Fletcher and Nic Newman, both of the University of Oxford; and Stephen Hunt, of University College London (names not in original order).
The team discusses the utility of an analytics software suite called SocialSensor, which was partially developed by some of the authors. The software draws on a sample of active social media content distributors, dubbed "newshounds", to discover, for example, trending themes.
The sample selection is problematic, the authors admit, as it emphasizes established news sources and influential (male) individuals. Nevertheless, the selection method is defensible, since much social media content is disseminated through those very nodes – for better or worse.
The article "Giving computers a nose for news" was published by the journal Digital Journalism. It is available online (abstract public).
Picture: Untitled by MAKY-OREL, licence CC0 1.0.