Einstein Foundation Institutional Award 2023

The Berkeley Initiative for Transparency in the Social Sciences

The social sciences are driving the open science revolution full steam ahead. This year’s institutional winner of the Einstein Foundation Award, the Berkeley Initiative for Transparency in the Social Sciences, has been a pioneer in the transformation towards more transparency, reproducibility and ethics in social science research, and plans to step up efforts in the future. 

 

The open science movement has gained momentum over the last decade, but much of the attention has focused on the natural sciences rather than the social sciences. Just over ten years ago, several investigations revealed that many experiments in psychology, economics, political science, and sociology were not reproducible, meaning their results could not be independently verified. At the same time, there was growing demand for policy based on evidence rather than on theories and unsupported assumptions. Social science researchers were therefore increasingly keen to close this gap and adopt more open scientific practices to improve their research. 

The Berkeley Initiative for Transparency in the Social Sciences (BITSS), the winner of the Einstein Foundation’s 2023 Institutional Award, is one of the foremost proponents of this transformation towards more transparency and reproducibility in the social sciences. Since its foundation in 2012, it has provided open science education and guidance for researchers, set up a sustainable and inclusive community to support transparent practices, and supported meta-science research into improving scientific credibility. 

It all started when Edward Miguel, a development economist, professor at the University of California, Berkeley, and co-founder of BITSS, was working in Sierra Leone about 20 years ago, right after the civil war had ended. He and his colleagues were researching the best ways to improve local governance in the region. “We decided to write a pre-analysis plan, a detailed document laying out the analysis we intended to do across all these dozens of measures that we would combine in the statistical analysis,” said Miguel, who explained that they were worried their research might be misinterpreted or disregarded in the absence of positive results. 

The use of a pre-analysis plan proved vital for validating the research and helped ensure the results were still published, despite the measures not showing positive effects in the region. When Miguel, who also leads the Center for Effective Global Action (CEGA) — a hub for research on global poverty headquartered at the University of California, Berkeley, within which BITSS is based — presented this example at a political science conference in 2012 there was a lot of interest from other researchers who were eager to explore it for their own studies. 

Miguel and colleagues then set up what would become the first BITSS annual meeting in Berkeley where social science researchers were invited to come and join the discussion. “By the end of that full day, we came up with the name for BITSS and understood that there were specific issues and policies and practices that could benefit social science across multiple fields.” 

Summarizing the findings of that meeting, Miguel and other attendees published a paper in Science in 2014 that outlined the goals they hoped to achieve at the newly founded BITSS: bringing transparency to the fore by improving disclosure of research methods; encouraging the registration of trials in a formal registry before the intervention is carried out (preregistration), together with the submission of pre-analysis plans; and promoting open science practices such as data and material sharing. 
 

“BITSS has been at the forefront of helping to spread good practices and one of the ways they do that is by promoting study replication, which is a good thing for science.” (Alvin Roth) 


“Today, thousands of pre-analysis plans are written a year, not just in development economics, but in other fields,” said Miguel. “That's one of our accomplishments, really advancing that agenda from being the first to do one of these plans in economics, up to the point where it's become standard practice within a decade.” 

Another crucial facet of BITSS's efforts to enhance the quality of social science research is its role in encouraging study replications to assess the validity of earlier study results. “In many social sciences, engaged researchers have been constructing what is sometimes called a credibility revolution. The idea is to make research more likely to be reliable, reproducible, scientific knowledge rather than just an artifact of the way a study was done or how the statistics were conducted,” said jury member Alvin Roth, an economist and professor at Stanford University. “BITSS has been at the forefront of helping to spread good practices and one of the ways they do that is by promoting study replication, which is a good thing for science.” 

The Social Science Reproduction Platform, which was set up as part of the Accelerating Computational Reproducibility in Economics project led by BITSS, is helping to spread the word about study replications. To date, more than 580 users in 45 countries have logged more than 180 reproductions on the platform. 

Step by step, BITSS is building a worldwide community around transparency in the social sciences. Since its initiation in 2012, the annual meeting has continued each year with presentations from hundreds of researchers across various disciplines, covering topics related to open science practices and meta-science research. Training has been a strong focus from the beginning, and to this day, it has equipped thousands of researchers across five continents with transparent research methods. 
 

“Our feeling is that grassroots, ground-up training and change of attitudes and norms is really going to transform the field, so we focus on young scholars and really get them excited about these ideas.” (Edward Miguel) 
 

BITSS encourages social science researchers, faculty and graduate students around the world to become ‘catalysts’ and join a network of scientists championing research transparency, reproducibility and ethics. These individuals lead training sessions on open science at more than 100 institutions around the world – from Cairo to California. 

“There's a whole range of training activities, in person, online, and the training courses done by the catalysts. That’s a key pillar of BITSS, because our feeling is that grassroots, ground-up training and change of attitudes and norms is really going to transform the field, so we focus on young scholars and really get them excited about these ideas,” explained Miguel. 

It can be hard to measure the impact of transparency and open science practices such as those encouraged by BITSS, but a recent article in Nature Communications, which Miguel co-authored, shows a significant rise in the adoption of open science practices such as study preregistration and data or instrument sharing, from 25 percent in 2009 to over 80 percent in 2020. 

Despite significant improvements, there is still more to be done. Study preregistrations are now routinely listed on an economics trial registry, and forecasts of study impacts are regularly filed on the Social Science Prediction Platform run by BITSS, but BITSS is working hard to make sure proposed trials are also completed and published, in order to achieve wider transparency and open science goals. “There are still a lot of preregistrations where several years after the study was meant to take place results have never been made public. They haven't been published or been posted on the registry. There's still a lot of results that are disappearing,” explained Miguel. 

He and his team are now following up on some of the early work on pre-analysis plans. They plan to run a randomized controlled trial testing different types of outreach to authors in order to understand what motivates the publication of results. They hope to find out why registered projects may not have been completed, and to incentivize and encourage people to make public any results that were achieved. 

Looking to the future, the BITSS team plans to enhance its training efforts, foster community building, publish more working papers on MetaArXiv, the preprint server it hosts, and carry out meta-science experiments to assess the best ways to practice open science. To enable more accurate meta-science, a database, the Impact Data and Evidence Aggregation Library, is being built to document study designs, features of the interventions being evaluated, study data, and estimated impacts. 

“Fourteen years ago, when we were starting to do this work, it was a very non-mainstream thing to do, but I think the work has become much more mainstream,” said Miguel. “This kind of recognition makes you appreciate that, and the award funding will help us advance all these objectives.”