Einstein Foundation Early Career Award 2024
PixelQuality
“Research images are the proof of scientific findings, not just visuals. PixelQuality has set new standards for their reproducibility and transparency. This award will help further strengthen these efforts and drive change in the field.”
Helena B. Nader, President of the Brazilian Academy of Sciences and member of the award jury
The project 'PixelQuality – best practices for publishing images' was selected from 109 global applications as the winner of this year's Early Career Award. The volunteer-based initiative brings together almost 150 researchers to enhance the quality of research images in the life sciences. PixelQuality has established guidelines and checklists for publishing clear and reproducible images, and now aims to disseminate these resources and refine them to address AI-assisted image generation and analysis.
The project, led by Christopher Schmied at the Leibniz-Forschungsinstitut für Molekulare Pharmakologie Berlin and Helena Jambor (University of Applied Sciences of the Grisons, Chur, Switzerland), is awarded €100,000.
Closing the gap
Improving the Image of Science – the winners of the Einstein Foundation Early Career Award seek to provide the tools researchers need to publish better images. Scientists learn a great deal during their years of graduate school and lab work, but how to present images and visual data is often overlooked. “It’s just not something that’s covered in the curriculum,” says Helena Jambor, a professor of data visualization at the University of Applied Sciences of the Grisons in Switzerland.
Too often, the result is illustrations and images that don’t live up to their potential: confusing, hard to compare, and difficult for people with common disabilities like color-blindness to access. Even in top journals, the figures that accompany research papers often lack key information or are presented in ways that are not easy to understand. The winners of the Einstein Foundation 2024 Early Career Award would like to change that. As part of a project called “PixelQuality: Best Practices for Publishing Images,” Jambor and data scientist Christopher Schmied are working to create a common set of guidelines researchers in different fields can use to maximize the potential of the images they publish.
For Jambor, the effort can be traced back to her post-doctoral work. In preparing to turn her PhD research into publications, Jambor was faced with a huge task: she had 50 terabytes of images, gene sequences and protein data, and had to condense it all into just a few figures in a paper. Rather than picking out the ones she liked best, she set about tackling the task with scientific rigor, applying lessons learned from art and cartography to sort her images into spreadsheets and find the ones that best illustrated the points she was trying to make.
“We have rules on formatting of text, on how to reference other people’s work, but no rules on how figures are to be represented? That’s a huge gap.” (Helena Jambor)
The work made her more aware of how other researchers were presenting their images. Too often, she says, they seemed like afterthoughts, or were simply confusing. To quantify the problem, she decided to screen hundreds of images from high-impact publications and found that 4 out of 5 failed in some basic way: scale bars were missing, making it hard to tell how big the cell or cell structure in a microscope image was. Images were presented out of context or missing key details. Sometimes color legends were absent, making it hard to understand what an image was showing. And often images were presented in shades of red and green, colors that colorblind people can’t distinguish. “I was appalled to see how many incomprehensible pictures and visualizations are published, and how many images aren’t accessible to people who are colorblind,” Jambor says.
Jambor asked editors at a well-known, high-impact journal if they had a set of minimum standards to hand out to prospective authors – and was surprised to find out they did not. “We have rules on formatting of text, on how to reference other people’s work, but no rules on how figures are to be represented? That’s a huge gap,” she says. “We can do better.”
“In reality, the problem here isn’t that people fake data, it’s that we have a training gap – which we can fill.” (Helena Jambor)
The issue, Jambor says, isn’t one of scientific misconduct, laziness or ignorance. “Ninety-nine percent of scientists are striving to do good science and work very, very hard to do everything right,” she says. “In reality, the problem here isn’t that people fake data, it’s that we have a training gap – which we can fill.”
To address the problem, Jambor teamed up with Schmied, a biologist specializing in microscopy and digital imaging now working at the EU-OPENSCREEN project based in Berlin. The pair’s first step was to collaborate on a set of “cheat-sheets” for time-pressed researchers, published in 2021. The guidelines caught on quickly, passed from researcher to researcher and even posted on lab walls. “We had a lot of feedback,” Schmied says. “People were sharing it widely and it was clear they really wanted these tools.” But Jambor and Schmied – who met as labmates at the Max Planck Institute in Dresden – were conscious of the paper’s limitations. Both of them were biologists, and their expertise shaped their approach to data and presenting it. “Those were just our opinions,” Jambor says. “It was important to talk to people in different communities about how to make this better.”
To make their guidelines as broadly applicable as possible, Jambor and Schmied have gathered dozens of colleagues from a variety of fields into a working group as part of QUAREP-LiMi, the consortium for Quality Assessment and Reproducibility for Instruments and Images in Light Microscopy. Collaborators include over 150 researchers from 27 countries, among them visualization experts and journal editors. Last year, the team co-authored a well-received paper in Nature Methods entitled “Community-developed checklists for publishing images and image analyses.” The work is already having a quantifiable impact: several journals have adopted the guidelines as submission requirements, and since the paper’s publication the Springer Nature group has expressed interest in adopting them as well.
The two researchers came to the problem from different directions. Before going into biology, Jambor grew up surrounded by visual thinkers, with family members who work as artists, architects, cartographers and designers. Schmied, meanwhile, focused on microscopy and image analysis as part of his PhD in biology. When he approached Jambor for help presenting the visual aspects of his PhD dissertation, they found a common interest in communicating science better.
“With the award, we have more freedom and initiative to write, publish and create training material without begging for someone to support us.” (Christopher Schmied)
The Einstein Foundation Early Career Award gives the PixelQuality team key resources to expand their impact. Both Schmied and Jambor have full-time jobs elsewhere, and the project has relied on the goodwill of publishers and unpaid collaborators to move forward. “With the award, we have more freedom and initiative to write, publish and create training material without begging for someone to support us,” Schmied says.
And they hope the award makes people outside the biology and microscopy communities more aware of the importance of properly presenting visual data. Images are important components of research in everything from physics and chemistry to the digital humanities – and researchers across all those fields can benefit. “People want to do it, and want to do it better,” Schmied says. “We just have to give them the right tools.”