Talks

  • Dr. Thomas Grote: “Ethics and Philosophy Lab” of the Cluster of Excellence “Machine Learning: New Perspectives for Science” at the University of Tübingen. (See Dr. Grote’s homepage for further information: https://sites.google.com/view/thomas-grote/startseite)

    Keynote talk: “Regulating AI Systems – Lessons from Clinical Medicine”

  • Paul C. Bauer: Mannheim Centre for European Social Research

    Talk: “Evaluating survey measures of democratic quality & social cohesion using machine learning tools”

    Abstract: Trust has become one of the foundational concepts of contemporary social theory. Still, empirical research on trust relies on a relatively small set of measures which are increasingly debated. Using data from an online, self-administered questionnaire conducted among a representative US sample (N = 1,500), and relying on a combination of open-ended probing data and supervised machine learning, our study compares the validity of standard measures of generalized social trust with more recent, situation-specific measures of trust. We find that measures that refer to strangers generally better reflect the conceptual idea of measuring trust in unknown others. Moreover, situation-specific measures even further reduce variation in associations, i.e., they produce a more similar frame of reference, which is desirable from a measurement perspective. We also present evidence that individuals’ associations may differ in terms of sentiment, independently of the trustee category. Finally, we end with a discussion of the hard-to-solve challenge of formulating general, but not too general, survey measures.

  • Dr. Simon Schaupp is a sociologist working at the Chair of Social Structure Analysis at the University of Basel, Switzerland.

    Keynote talk: “The technopolitics of algorithmic management”

    Abstract: Algorithms are organizing technologies. Like organizational rules or laws, they define procedures according to which certain actions are to be executed. Yet, while we commonly identify laws or organizational rules as political issues, which are negotiated in the context of conflicting interests, we often fail to see the political nature of algorithms. This lecture aims to develop a political perspective on technologies of algorithmic management in contexts of work. It reconstructs how algorithmic management gained the importance it has in today’s world of work, from wearable devices controlling individual workers to overarching resource planning systems. The lecture will show how technopolitical negotiations at various levels have shaped algorithmic management so that it often resembles the logic of cybernetic management with its emphasis on feedback-based self-organization. Drawing on ethnographic fieldwork in factories and delivery companies, the lecture will emphasize that digitalization is not only shaped by engineers and managers but also by the various appropriation strategies developed by workers in their everyday use of technology. Thus, the lecture argues that developing a political perspective on digitalization is a prerequisite for democratic deliberation on desirable technological futures.

  • Dr. Viktoria Spaiser is an Associate Professor in Sustainability Research and Computational Social Science and a UKRI Future Leaders Fellow at the School of Politics and International Studies (POLIS), University of Leeds.

    Keynote talk: “AI and ML - Tools for tackling great societal challenges”

    Abstract: The talk will start by providing an overview of some of the greatest challenges that societies around the world are facing right now, as well as an overview of AI and ML approaches that have been suggested or used to tackle some of these challenges. The focus will then turn specifically to the climate crisis challenge. Efforts to mitigate climate change will require not only engineering solutions but also social and political solutions that will allow technological solutions to be effective. Hence, the social sciences are key to tackling the climate crisis challenge. The advance of AI, ML and other computational methods has recently transformed the social sciences, creating a new field, computational social science. Various examples will be reviewed during the talk where computational social science approaches have been used to study social change in response to the climate crisis. Also mentioned will be computational social science studies that, for instance, uncovered powerful networks that have been undermining the world’s climate mitigation efforts. Finally, we will discuss where computational social science can contribute to future social science research on climate change, for instance by going beyond describing social change and researching the mechanisms and dynamics needed to steer democratic, climate-positive social change.

  • Prof. Dr. Andreas Jungherr: holds the Chair for the Governance of Complex and Innovative Technological Systems at the Institute for Political Science at the Otto-Friedrich-University Bamberg. (See Prof. Jungherr’s homepage for further information: https://andreasjungherr.net/)

    Keynote talk: “AI and democracy: AI in politics and the politics of AI”

    Abstract: As artificial intelligence technology features in ever more aspects of social and economic life, AI becomes political. The use of AI by politicians and states is associated with both hopes and fears. Will AI increase social inequalities and allow politicians to split and manipulate the public? Or will AI allow politicians and states to tackle important problems with better information and new solutions? These fears and hopes find expression in public debate and political contestation. Scholars have an important role in this debate. They are on the frontlines of AI development and its implementation. At the same time, excessive fears or hopes regarding the impact of AI might make for great movie plots but not so much for good policy advice or ethical guidelines. So, while we need to discuss the potential uses and effects of AI in democracies, we should not be carried away by excessive fears or hopes and thereby accidentally block progress or sow distrust in politics and elections.