Episodes of liberalization in autocracies: a new approach to quantitatively studying democratization
Political Science Research and Methods, 1-20. Abstract: This paper introduces a new approach to the quantitative study of democratization. Building on the comparative case-study and large-N literature, it …
The status of ethics in Swedish health care management: a qualitative study
BMC Health Services Research 2018 18:608, doi.org/10.1186/s12913-018-3436-8. Abstract: Background: By tradition, the Swedish health care system is based on a representative and parliamentary form of governm…
David Grusky: Should scholars own data? The high cost of neoliberal qualitative scholarship
Welcome to this seminar with David Grusky, Professor of Sociology at Stanford University. The seminar is jointly organized by the Institute for Analytical Sociology and the Institute for Futures Studies.

Thursday, October 6, 13:00-15:00 (CET), at the Institute for Futures Studies (Holländargatan 13, Stockholm), or online.

If qualitative work were to be rebuilt around open science principles of transparency and reproducibility, what types of institutional reforms are needed? It’s not enough to mimic open science movements within the quantitative field by focusing on problems of data archiving and reanalysis. The more fundamental problem is a legal-institutional one: the field has cut off the development of transparent, reproducible, and cumulative qualitative research by betting on a legal-institutional model in which qualitative scholars are incentivized to collect data by being given ownership rights over them. This neoliberal model of privatized qualitative research has cut off the development of public-use data sets of the sort that have long been available for quantitative data. If a public-use form of qualitative research were supported, it would not only make qualitative research more open (i.e., transparent, reproducible, cumulative) but would also expand its reach by supporting new uses. The American Voices Project – the first nationally representative open qualitative data set in the US – is a radical test of this hypothesis. It is currently being used to validate (or challenge!) some of the most famous findings coming out of conventional “closed” qualitative research, to serve as an “early warning system” to detect new crises and developments in the U.S., to build new approaches to taking on poverty, the racial wealth gap, and other inequities, and to monitor public opinion in ways far more revealing than conventional forced-choice surveys. The purpose of this talk is to discuss the promise – and pitfalls – of this new open-science form of qualitative research, as well as opportunities to institutionalize it across the world.
The normality assumption in coordination games with flexible information acquisition
Journal of Economic Theory, vol. 203, 2022. Abstract: Many economic models assume that random variables follow normal (Gaussian) distributions. Yet, real-world variables may be non-normally distributed.
The Complexity of Mental Integer Addition
In: Journal of Numerical Cognition, Volume 6 (1). Abstract: An important paradigm in modeling the complexity of mathematical tasks relies on computational complexity theory, in which complexity is measur…
William MacAskill: Should I donate now, or invest and donate later?
William MacAskill, Associate Professor in Philosophy at Lincoln College, Oxford. Abstract: Suppose you are a philanthropist and want to help others as much as possible with your money. Should you dona…
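The talk weighs a donor's financial returns from investing against the rate at which impact from money given today compounds. The sketch below is not from MacAskill's talk; it is a minimal illustration of that trade-off, and the gift size, horizon, and both rates are hypothetical placeholders chosen only for the example.

```python
# Minimal sketch (not MacAskill's model): donate now vs. invest and donate later.
# All numbers below are made-up placeholders, not figures from the talk.

def future_donation(amount: float, annual_return: float, years: int) -> float:
    """Nominal size of the gift if the money is invested first, then donated."""
    return amount * (1 + annual_return) ** years

def donate_now_equivalent(amount: float, social_return: float, years: int) -> float:
    """Value at the horizon of donating today, if the donated impact itself
    compounds (e.g. health gains or movement growth building on each other)."""
    return amount * (1 + social_return) ** years

amount = 10_000          # hypothetical gift, in dollars
years = 20               # hypothetical horizon
annual_return = 0.05     # hypothetical after-fee investment return
social_return = 0.07     # hypothetical rate at which donated impact compounds

later = future_donation(amount, annual_return, years)
now = donate_now_equivalent(amount, social_return, years)
print(f"Invest, then donate: {later:,.0f}")
print(f"Donate now (impact compounded): {now:,.0f}")
print("Donate now" if now > later else "Invest and donate later")
```

On these made-up numbers, giving now wins exactly when the assumed social return exceeds the investment return; the substantive question in the talk is which of those rates is actually higher.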
Lobbying the Client? The Role of Policy Intermediaries in Corporate Political Activity
Organization Studies. Abstract: Traditionally, CPA scholarship has either assumed away policy intermediaries completely, or depicted them as corporate mouthpieces. Meanwhile, research on policy intermedia…
Successful and failed episodes of democratization: conceptualization, identification, and description
Varieties of Democracy Institute: Working Paper No. 97. Abstract: What explains successful democratization? This paper makes four contributions towards providing more sophisticated answers to this question. … showing that while several established covariates are useful for predicting outcomes, none of them seems to explain the onset of a period of liberalization. Fourth, it illustrates how the identification of episodes makes it possible to study processes quantitatively, using sequencing methods to detail the importance of the order of change for liberalization outcomes.

Should Scholars Own Data? David Grusky on the American Voices Project
If qualitative work were to be rebuilt around open science principles of transparency and reproducibility, what types of institutional reforms are needed? It’s not enough to mimic open science movemen…
Lukas H. Meyer: Fairness is most relevant for country shares of the remaining carbon budget
Lukas H. Meyer, Professor of Philosophy at the University of Graz, Austria, and Speaker of the Field of Excellence Climate Change Graz, the Doctoral Programme Climate Change, and the Working Unit Mora…

In my talk I argue that fairness concerns are decisive for eventual cumulative emission allocations, shown in terms of quantified national shares. I will show that major fairness concerns are quantitatively critical for the allocation of the global carbon budget across countries. The budget is limited by the aim of staying well below 2°C. Minimal fairness requirements include securing basic needs, attributing historical responsibility for past emissions, accounting for benefits from past emissions, and not exceeding countries’ societally feasible emission reduction rates. The argument in favor of taking these fairness concerns into account reflects a critique of both simple equality and staged approaches: the former demands an equal-per-capita distribution from now on, while the latter preserves the inequality of status-quo emission levels for the transformation period. I argue that the overall most plausible approach is a four-fold qualified version of the equal-per-capita view that incorporates the legitimate reasons for grandfathering.
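For reference, the sketch below contrasts the two baseline allocation rules the talk critiques: a pure equal-per-capita split of a remaining carbon budget and a pure grandfathering (status-quo emissions) split. It is not Meyer's four-fold qualified approach, and the budget, country names, populations, and emissions figures are made-up placeholders for illustration only.

```python
# Minimal sketch (not Meyer's method): equal-per-capita vs. grandfathered shares
# of a remaining global carbon budget. All figures are hypothetical placeholders.

budget_gt = 400.0  # hypothetical remaining global budget, in GtCO2

countries = {
    # name: (population in millions, current annual emissions in GtCO2)
    "A": (1400, 11.0),
    "B": (330, 5.0),
    "C": (200, 0.6),
}

total_pop = sum(pop for pop, _ in countries.values())
total_emissions = sum(em for _, em in countries.values())

for name, (pop, em) in countries.items():
    equal_per_capita = budget_gt * pop / total_pop      # share proportional to population
    grandfathered = budget_gt * em / total_emissions    # share proportional to current emissions
    print(f"{name}: equal-per-capita {equal_per_capita:6.1f} Gt | "
          f"grandfathered {grandfathered:6.1f} Gt")
```

Under any such numbers, grandfathering favors high-emitting countries relative to the per-capita rule, which is the status-quo inequality the staged approaches are criticized for preserving.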