Notes from the 2019 Annual BITSS Conference

[Excerpts taken from the article “The 2019 BITSS Annual Meeting: A barometer for the evolving open science movement” by Aleksandar Bogdanoski and Katie Hoeberling, posted at]
“Each year we look forward to our Annual Meeting as a space for showcasing new meta-research and discussing progress in the movement for research transparency.”
“…we’ve historically focused our efforts on research conduct, rather than on publishing. It’s become abundantly clear in recent years, however, that access is a critical component in the production and evaluation of social science.”
“Acknowledging this, we’ve forged fruitful partnerships with stakeholders on the “other end” of the scholarly communication cycle, including the Journal of Development Economics and the Open Science Framework Preprints platform.”
“Wide access to training resources is similarly critical for normalizing open research practices. Open Science (with a capital O and S) has only recently entered the social science curriculum.”
“Speakers on the meeting’s final panel discussed the challenges they’ve faced in trying to institutionalize open science curriculum and supporting their students in applying open principles outside of the classroom, as well as approaches and resources they’ve found helpful.”
“Instructors and students looking to teach or learn transparent research practices can start at our Resource Library or this growing list of course syllabi on the OSF.” 
“Having organized eight of these annual events, a few other patterns have begun to emerge. One of the most exciting developments we’ve seen is that our meetings have shifted focus from diagnosing problems in research, to testing interventions and assessing adoption and wider change.”
“There is no longer a need, at least in this community, to debate whether or not publication bias exists or that perverse incentives lead researchers to use questionable research practices, for example. How we measure and correct for them, however, remain open questions. Such questions were discussed during the first block of research presentations, which proposed sensitivity analysis in meta-analysis and revised significance thresholds compatible with researcher behavior to correct for publication bias, plus a framework to translate open practices for observational research.”
“Relatedly, the use of pre-registration and pre-analysis plans (PAPs) is becoming more normative than cutting edge.”
“Finally, it’s become clear that the reach and efficacy of many open science tools can benefit from, and often require, the support of diverse stakeholders, as well as rigorous evaluation components integrated in interventions from the beginning. The final block of presentations explored the application of open science principles in novel contexts, including Institutional Review Boards and qualitative research with sensitive data, and offered a general framework for designing and evaluating open science interventions.”
“If you missed the meeting, or want to revisit any of the sessions, you can find slides on this OSF page, watch videos on our YouTube channel, and find open access versions of the papers in the event agenda. The summaries of each session can be found below…”
