Albert Li in Science: It sounded like the right thing to do. I was a first-year Ph.D. student in educational psychology, and my research adviser told me I should consider practicing open science—“being open and above board,” as he put it. He suggested I make my first-year research project a preregistered report. We would publish our planned methods and analysis in advance, an approach meant to minimize questionable research practices such as cherry-picking of results. I found myself at a crossroads. On one hand, the promise of enhancing transparency and reproducibility was compelling. On the other, I was frightened by the potential repercussions.
An ethos of secrecy had colored my academic training up to that point. When I was an undergraduate student in China, a respected mentor cautioned, “Do not rush to publish your data in preprints, as others might scoop your ideas. Do not share your code, as it invites scrutiny and criticism. And try not to share your raw data—it makes us vulnerable.” He insisted that nothing leave the lab—not data sets, code, methodologies, or even the challenges we encountered. In papers we published, I wrote that the data remained confidential or were available only upon reasonable request, knowing that we would often opt not to share. The arrangement made me feel a bit uneasy, but I mostly accepted it as the way things had to be done to protect against intellectual theft.
But when I moved to the United States to pursue my Ph.D., the value of open science became increasingly clear. In my first semester I took a course on quantitative and experimental methods, and the professor—a staunch advocate for transparent research practices—underscored the importance of registered reports and preanalysis plans to combat issues like p-hacking and publication bias. “Replicability,” he emphasized, “is not just an ideal but the bedrock of scientific integrity.”
The professor’s arguments were compelling, but I still felt uncertain. What if being open enabled others to steal aspects of my work? Didn’t I need to protect myself? Why should I sacrifice my own work for the sake of some highfalutin ideal? During the class, I challenged the professor with these questions. He responded that true security lies not in obscurity, but in having your work validated and built upon by the community. This exchange, and many others during which we debated the balance between protection and transparency, gradually helped me understand the broader implications of open science.
More here.