Incommensurability: Vagueness, Parity and other Non-Conventional Comparative Relations
The workshop will focus on how one can account for value incommensurability and on its implications for ethical theory and decision theory.
The Democratic Inclusion of Artificial Intelligence? Exploring the Patiency, Agency and Relational Conditions for Demos Membership
Philos. Technol. 35, 24. Abstract Should artificial intelligences ever be included as co-authors of democratic decisions? According to the conventional view in democratic theory, the answer depends on the
Prioritarianism, timeslices, and prudential value
Australasian Journal of Philosophy. Abstract This paper shows that versions of prioritarianism that focus at least partially on well-being levels at certain times conflict with conventional views of prud
Social Exclusion among Peers: The Role of Immigrant Status and Classroom Immigrant Density.
Journal of Youth and Adolescence. Advance online publication. DOI: 10.1007/s10964-016-0564-5. Abstract Increasing immigration and school ethnic segregation have raised concerns about the social integrat
A more plausible collapsing principle
Theoria, Volume 84, Issue 4. doi.org/10.1111/theo.12166 Abstract In 1997 John Broome presented the Collapsing Argument that was meant to establish that non‐conventional comparative relations (e.g., “par

Completed: Firms as Political Activists: The Scope and Nature of Corporate Political Responsibility
This project explores the changing political role of corporations in the 21st century by combining political science, sociology, and business science.
Applying spatial regression to evaluate risk factors for microbiological contamination of urban groundwater sources in Juba, South Sudan
Hydrogeology Journal 25(4), pp. 1077-1091, doi: 10.1007/s10040-016-1504-x. Abstract This study developed a methodology for statistically assessing groundwater contamination mechanisms. It focused on microbiological contamination of urban groundwater sources, using data collected by the humanitarian aid organisation Médecins Sans Frontières in 2010. The factors included hydrogeological settings, land use and socio-economic characteristics. The results showed that the residuals of a conventional probit regression model had a significant positive spatial autocorrelation (Moran’s I = 3.05, I-stat = 9.28); therefore, a spatial model was developed that had a better goodness-of-fit to the observations. The most significant factor in this model (p-value 0.005) was the distance from a water source to the nearest Tukul area, an area with informal settlements that lack sanitation services. It is thus recommended that future remediation and monitoring efforts in the city be concentrated in such low-income regions. The spatial model also differed from the conventional approach in that lowland topography was not significant at the 5% level: the p-value was 0.074 in the spatial model, compared with 0.040 in the traditional model. This study showed that statistical risk-factor assessments of groundwater contamination need to consider spatial interactions when the water sources are located close to each other. Future studies might further investigate the cut-off distance that reflects spatial autocorrelation. In particular, these results can inform research on urban groundwater quality.
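As a rough illustration of the diagnostic step described in the abstract (testing the residuals of a conventional probit regression for spatial autocorrelation with Moran's I before moving to a spatial model), the following Python sketch uses statsmodels, libpysal and esda on synthetic data. The variable names, the 500 m neighbour threshold and all data values are illustrative assumptions, not the study's actual code or data.

```python
# Minimal sketch, not the study's code: test probit residuals for spatial
# autocorrelation with Moran's I, the diagnostic the abstract describes.
# Assumes statsmodels, libpysal and esda are installed; all data are synthetic.
import numpy as np
import statsmodels.api as sm
from libpysal.weights import DistanceBand
from esda.moran import Moran

rng = np.random.default_rng(0)

# Hypothetical inputs: coordinates of water sources (metres), a binary
# contamination indicator, and two illustrative risk factors.
n = 120
coords = rng.uniform(0, 5000, size=(n, 2))
dist_to_settlement = rng.uniform(0, 1000, size=n)   # distance to nearest informal settlement
lowland = rng.integers(0, 2, size=n)                # lowland topography flag
contaminated = rng.integers(0, 2, size=n)

# Conventional (non-spatial) probit regression.
X = sm.add_constant(np.column_stack([dist_to_settlement, lowland]))
probit = sm.Probit(contaminated, X).fit(disp=False)

# Response residuals: observed outcome minus predicted contamination probability.
resid = contaminated - probit.predict(X)

# Spatial weights: sources within 500 m of each other are treated as neighbours.
# The cut-off distance is an assumption; the abstract flags it for further study.
w = DistanceBand.from_array(coords, threshold=500, binary=True, silence_warnings=True)

# Moran's I on the residuals; significant positive autocorrelation would
# motivate refitting with an explicitly spatial model.
mi = Moran(resid, w)
print(f"Moran's I = {mi.I:.3f}, z = {mi.z_norm:.2f}, p = {mi.p_norm:.4f}")
```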
David Grusky: Should scholars own data? The high cost of neoliberal qualitative scholarship
Welcome to this seminar with David Grusky, Professor of Sociology at Stanford University. The seminar is jointly organized by the Institute for Analytical Sociology and the Institute for Futures Studies.
Thursday, October 6, 13:00-15:00 (CET), at the Institute for Futures Studies (Holländargatan 13, Stockholm), or online.
If qualitative work were to be rebuilt around open science principles of transparency and reproducibility, what types of institutional reforms are needed? It’s not enough to mimic open science movements within the quantitative field by focusing on problems of data archiving and reanalysis. The more fundamental problem is a legal-institutional one: the field has cut off the development of transparent, reproducible, and cumulative qualitative research by betting on a legal-institutional model in which qualitative scholars are incentivized to collect data by giving them ownership rights over those data. This neoliberal model of privatized qualitative research has cut off the development of public-use data sets of the sort that have long been available for quantitative data. If a public-use form of qualitative research were supported, it would not only make qualitative research more open (i.e., transparent, reproducible, cumulative) but would also expand its reach by supporting new uses. The American Voices Project – the first nationally representative open qualitative data set in the US – is a radical test of this hypothesis. It is currently being used to validate (or challenge!) some of the most famous findings coming out of conventional “closed” qualitative research, to serve as an “early warning system” for detecting new crises and developments in the U.S., to build new approaches to taking on poverty, the racial wealth gap, and other inequities, and to monitor public opinion in ways far more revealing than conventional forced-choice surveys. The purpose of this talk is to discuss the promise – and pitfalls – of this new open-science form of qualitative research, as well as opportunities to institutionalize it across the world.

Should Scholars Own Data? David Grusky About the American Voices Project
If qualitative work were to be rebuilt around open science principles of transparency and reproducibility, what types of institutional reforms are needed? It’s not enough to mimic open science movements within the quantitative field by focusing on problems of data archiving and reanalysis.