
PAN-Metrics Seminars in June and July 2025

Two seminars in the PAN-Metrics series were held recently:

On June 16 Saskia Bartholomäus of GESIS – Leibniz Institute for the Social Sciences (Mannheim, Germany) presented her paper: Mitigating the Risk of Nonresponse and Measurement Errors in Political Science Surveys by Accounting for Respondents’ Political Interest.

Photo of Saskia Bartholomäus

Abstract: Political science surveys are subject to biased estimates of political attitudes and behavior due to measurement and nonresponse errors. These errors are especially pronounced among politically disengaged individuals, who are less likely to participate and more likely to give low-effort answers when they do. Improving respondents’ survey experience by offering a more interesting and varied questionnaire offers a possible solution to this problem. My research breaks ground by investigating whether including non-political survey content improves the survey experience of politically disengaged respondents and enhances data quality with respect to measurement and nonresponse errors in political science surveys. To this end, I conducted two survey experiments in probability- and nonprobability-based panels that varied the content of political science questionnaires. The results provide first evidence that offering a varied questionnaire rather than a purely political one narrows or even closes the gap in measurement and nonresponse errors between politically engaged and disengaged respondents. These findings open new directions for improving political science surveys by rethinking questionnaire design to address common sources of error. I conclude by outlining future research opportunities in this area.

The slides are available here.

On July 1 a seminar was held with researchers from Charles University in Prague. Hana Vonkova, Ondrej Papajoanu and Martin Bosko presented their research entitled Analyzing the Heterogeneity in Students’ Reporting Behavior in Questionnaires Using Response Times Analysis and the Anchoring Vignette Method.

Abstract: Self-report questionnaire data are commonly used in educational research, as well as in social science research in general, to gain information concerning respondents’ attitudes, beliefs, behaviors and other concepts of interest. Such data, however, can be biased due to heterogeneity in reporting behavior (RB) among respondents, who might, for example, systematically differ in the way they use response scales or in the amount of effort they put into filling in the survey. The presence of such biases in the data represents a threat to the accuracy of research findings, which might potentially lead to erroneous conclusions and negatively affect data-based decision-making (e.g., at the level of educational policy). Methodological approaches are being sought that can identify differences in reporting behavior between respondents and adjust their self-reports for these differences. In the presentation, we will focus on two such approaches: response time analysis and the anchoring vignette method. The analysis of response times to questionnaire items has been suggested as a means to identify careless or insufficient-effort responding, with responses too fast to be given much thought termed speeding. Determining the time threshold for what is “too fast”, however, represents one of the methodological challenges, with multiple approaches being used in the current literature. In the presentation, we will present a response-time-based approach that we introduced to cross-national research, built on an analysis of the relationship between a survey item and a related external variable. The anchoring vignette method (AVM) captures the differences in scale usage between respondents by comparing their ratings of short stories depicting hypothetical individuals with varying levels of the trait of interest (e.g., motivation). Using either the nonparametric or the parametric approach of the AVM, respondents’ self-reports of that trait can then be adjusted for the differences in scale usage between respondents. In the presentation, we will explain these approaches and present the findings of our cross-national analyses of AVM data, as well as our novel use of AI in designing anchoring vignettes. Finally, other approaches to RB identification will be briefly introduced (response style identification, the overclaiming technique). The presentation will be concluded by suggestions for further research.
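The presentation itself did not include code, but the nonparametric AVM adjustment described above can be illustrated with a short sketch. It follows the standard recoding in which a respondent's self-report is mapped to one of 2J+1 categories relative to their own ordered ratings of J vignettes (the function name and inputs here are illustrative, not from the presented paper):

```python
def nonparametric_avm(self_report, vignette_ratings):
    """Recode a self-report relative to a respondent's own vignette ratings.

    Implements the standard nonparametric anchoring-vignette recoding:
    with J vignettes, the self-report is placed into one of 2J+1 ordered
    categories (below the lowest vignette, tied with a vignette, between
    two vignettes, or above the highest). Assumes the respondent rated
    the vignettes consistently with their intended ordering.
    """
    c = 1
    for rating in sorted(vignette_ratings):
        if self_report < rating:
            return c          # strictly below this vignette
        if self_report == rating:
            return c + 1      # tied with this vignette
        c += 2                # above it; move past the tie category
    return c                  # above the highest vignette

# Example: three vignettes rated 2, 3 and 4 on the same scale.
# A self-report of 3 ties with the middle vignette -> category 4 of 7.
print(nonparametric_avm(3, [2, 3, 4]))  # → 4
```

Because the recoded category is defined relative to each respondent's own vignette ratings, two respondents who use the response scale differently but stand in the same position relative to the vignettes receive the same adjusted value.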

Photo of Hana Vonkova, Ondrej Papajoanu and Martin Bosko

The slides are available here.