Einstein Foundation Early Career Award: 2024 Finalists

From a pool of 109 applicants, five pioneering research projects were nominated to present their work during Berlin Science Week 2024. The Einstein Foundation Award jury chose the project PixelQuality as the winner.
We are delighted to introduce you to the other finalists:

Opening the black box

IMPACT OF RESEARCH BIASES IN ECOLOGY – A Novel, Replication-Based Approach, by Antica Čulina (Ruđer Bošković Institute, Croatia)

Ecology research generates a lot of output, including datasets, analytical codes, and results derived from these analyses. Only a small percentage of these outputs are published, and the published results are likely to be only a biased subset of all results. The data collection and analysis they are based upon may also be biased. “We still have not quantified the true extent of research biases, nor their impact on our ability to reach valid conclusions,” says Antica Čulina, an ecologist at the Ruđer Bošković Institute in Zagreb and a pioneer of open science in ecology. 

One main reason is that the research life cycle is not transparent: in ecology and other fields, potential biases are commonly estimated retrospectively, with the processes that underlie them inferred from published results. Čulina wants to change that by studying the influence of research biases more holistically. “What I propose is to open the black box to observe and study the process directly, from data analysis to result publication,” says Čulina. 

In her project “Impact of research biases in ecology – a novel, replication-based approach,” she will use datasets from the SPI-Birds database on individually marked birds to conduct an innovative publication bias experiment. Up to 50 teams of researchers will test a case-study hypothesis using different datasets, choosing and later sharing their analytical codes and submitting their findings, which will be reviewed by external researchers – thus mimicking a publication process. “If successful, the approach will change the way research and publishing is done and help us test some potential solutions, such as the benefits of pre-registration or data quality checks.” 
 

→ Watch the project presentation here 

Avoiding harm to patients

IPD INTEGRITY TOOL – An Innovative Tool for Detecting Untrustworthy Medical Research, by Kylie Hunter and Anna Lene Seidler (University of Sydney, Australia)

Around two percent of scientists admit to engaging in scientific misconduct, while 34 percent admit to having used questionable research practices. Misconduct, fraud, and low-quality data are especially dangerous in medical studies, since they can expose patients to potential harm. They also fuel an integrity crisis in medical research. As part of the peer review process, usually only the summary data at the publication level are checked, while the important underlying raw data (known as individual participant data, or IPD) are ignored. 

The project “The IPD Integrity Tool – An Innovative Tool for Detecting Untrustworthy Medical Research” aims to improve the standard of quality control in the medical sciences and beyond. “We aim to finalize, validate, disseminate and implement an integrity tool for individual participant data to identify studies with integrity concerns,” says Kylie Hunter, a Research Fellow at the University of Sydney. Hunter joined forces with biostatistician Anna Lene Seidler at Universitätsmedizin Rostock to develop the IPD Integrity Tool. 

The tool, which will be validated with a broad range of data – including fake data sets generated by artificial intelligence – will be able to identify untrustworthy studies and remove them from the evidence base. It can be used by journal editors, meta-analysts, and anyone who wants to assess the trustworthiness of research. “Our overarching goal is to protect the integrity of the evidence base that informs clinical guidelines and practice,” Hunter says, “thereby protecting patients and fostering trust in research.” 


→ Watch the project presentation here

Making cosmic simulations reproducible

VIRTUAL UNIVERSE(S) – An Open, Unified Research Environment for Astrophysics and Beyond, by Dylan Nelson (University of Heidelberg, Germany)

Astronomers and astrophysicists simulate the fundamental forces of our universe. These digital simulations recreate dark matter, gas, stars, and supermassive black holes, producing enormous amounts of data that only large supercomputers can handle – which makes the simulations hard to analyze and reproduce. 

“Most of those immense data sets and the development of the models happen behind closed doors,” says Dylan Nelson, an astrophysicist and Emmy Noether Research Group Leader at Heidelberg University. “This results in a crisis of reproducibility in astrophysics and astronomy, with many key results being ambiguous.” For example, the code for models simulating cosmic events like supernova explosions is often not published together with the findings – making it impossible to confirm them. So far, few international collaborations in the field have managed to transparently document, homogenize, and publish large simulation data sets. One of them is IllustrisTNG. 

Building on the success of IllustrisTNG, the Virtual Universe(s) project will develop a new data platform combining large simulation models, homogenized datasets, analysis tools, and reproducible analysis workflows. “We aim to create an ecosystem that incentivizes researchers to make not only their results available, but also the corresponding methodology and the code,” says Nelson, who also aims to open the platform to other fields such as Earth sciences and computational sociology. “It will be a field-leading demonstration of open science in action across disciplines.” 
 

→ Watch the project presentation here 

Being prepared for the next epidemic

GLOBAL INFECTIOUS DISEASE DATA STANDARD (GIDS) FOR EPIDEMIC SETTINGS – to revolutionize outbreak response and improve global health outcomes, by Emily Ricotta (Uniformed Services University, Bethesda, USA)

When infectious diseases spread, timely policy decisions and practical guidance for healthcare must be based on high-quality data to avoid a high human cost. Well-designed observational population studies can provide a comprehensive picture of disease progression in natural contexts while also identifying high-risk populations for prevention and treatment. “Their speed, affordability, and flexibility make them widely implementable,” says Emily Ricotta, an Assistant Professor of Epidemiology at the Uniformed Services University of the Health Sciences (USUHS) in Bethesda. 

In contrast to randomized controlled trials (RCTs), efforts to standardize data collection in observational studies have been surprisingly neglected, especially in epidemic settings. This can lead to studies that are biased, lack validity, are small and underpowered, and cannot be compared with one another. The project “The Global Infectious Disease Data Standard (GIDS) for Epidemic Settings,” led by Ricotta, aims to establish the first global data standard for enhancing the quality of observational studies on infectious diseases. 

Ricotta has been working for many years on improving the methodological rigor and validity of clinical studies on infectious diseases. In a 2023 comment in Nature, she pointed out methodological shortcomings in observational studies and proposed ways to address them. With GIDS, she now aims to initiate a broad research process, including a scoping literature review and expert consultations, to establish a global standard. “I am confident we can significantly improve the reliability and generalizability of research findings, ultimately translating to better health outcomes for all,” she says.  


→ Watch the project presentation here