Science Processes

RELATED RESEARCH PROJECTS

National Science Foundation

This project examines the impact of different research funding structures on the training of future scientists, particularly graduate students and postdoctoral fellows, and on their subsequent outcomes. Our proposed research begins by examining the way in which research (and most training) is funded and done. We classify projects by funding size (large or small scale), by whether they involve multiple researchers, and by whether they span multiple institutions. We construct different measures of project teams and capture the subsequent trajectories of the students and postdoctoral fellows during and after their contact with the teams. We make use of a natural experiment and quasi-experimental statistical techniques to separate the effect of funding structures from the other factors contributing to...

Transforming Taxonomic Interfaces
National Science Foundation

The goal of this research is to help researchers develop and use relatively simple tools to describe species in a way that makes those descriptions easier to share with other scientists and easier for computers to process and analyze. The approach is bottom-up and iterative, involving rapid prototyping of tools, combining existing tools, and tailoring applications developed for one purpose so they can be reused for this scientific activity. Innovations from this project are applicable to the long-term development of open source software initiatives serving labs throughout the world. The project provides rich, real-world training for graduate students in library and information sciences, preparing them to be much-needed cross-disciplinary researchers in a field desperate for...

DataONE
National Science Foundation

Data Observation Network for Earth (DataONE) is a collaborative, global project that is laying the groundwork for a new, innovative approach to conducting environmental science research. DataONE is a distributed framework and sustainable infrastructure poised to resolve many of the key challenges that hinder the realization of more global, open, and reproducible science, through four interrelated cyberinfrastructure (CI) activities:

  • significantly expanding the volume and diversity of data available to researchers for large-scale scientific innovation and discovery;
  • incorporating innovative and high-value science-enabling features into the DataONE CI;
  • maintaining and improving core software and...
National Science Foundation

Taxonomists are scientists who describe the world’s biodiversity. These descriptions of millions of species allow scientists to do many different kinds of research, including basic biology, environmental science, climate research, agriculture, and medicine. The problem is that describing any one species is not easy. The language used by taxonomists to describe their data is complex, and typically not easily understandable by computers or even by other scientists. This situation makes it harder to search for patterns across millions of species documented by thousands of researchers over many decades of work worldwide.

The goal of this research is to help researchers develop and use relatively simple tools to describe species in a way that makes those descriptions easier to share...

IN THE NEWS

Dec. 9, 2016

Associate Professor Victoria Stodden will present her research at A University Symposium: Promoting Credibility, Reproducibility and Integrity in Research on December 9 at Columbia University. Hosted by Columbia's Office of the Executive Vice President for Research and other New York City research institutions, the symposium will bring together leading experts, journal editors, funders, and researchers to discuss how issues of reproducibility and research integrity are being handled by institutions, journals, and federal agencies.  

Stodden will participate in the session, "Repeat After Me: Current Issues in Reproducibility," with Jeffrey Drazen, editor-in-chief of The New England Journal of Medicine; Hany Farid, professor and chair of computer science at Dartmouth; Leonard Freeman, president of the Global Biological Standards Institute; and Londa Schiebinger, John L. Hinds Professor of History of...

Dec. 8, 2016

Reporting new research results involves detailed descriptions of methods and materials used in an experiment. But when a study uses computers to analyze data, create models or simulate things that can’t be tested in a lab, how can other researchers see what steps were taken or potentially reproduce results?

A new report by prominent leaders in computational methods and reproducibility lays out recommendations for ways researchers, institutions, agencies and journal publishers can work together to standardize sharing of data sets and software code. The paper "Enhancing reproducibility for computational methods" appears in the journal Science.

"We have a real issue in disclosure and reporting standards for research that involves computation – which is basically all research today," said Victoria Stodden, a University of Illinois professor of information science and the lead author of the paper. "The standards for putting enough...

Nov. 16, 2016

Three iSchool students will participate in the Library and Information Technology Association (LITA) Forum, which will be held November 17-20 in Fort Worth, Texas. The LITA Forum is the annual conference for professionals in archives, libraries, and other information services.

Nicholas Wolf, master's student and research data management librarian at New York University (NYU), will give a talk with Vicky Steeves, NYU librarian for research data management and reproducibility, titled "Using Openness as Foundation for Research Data Management Services." 

Abstract: This talk will describe the building and scaling up of research data management services at NYU solely using open source tools and data for instruction and best practices recommendations. Through demonstrating the applicability of tools such as OpenRefine, the Open Science Framework, ReproZip, and languages such as Python and R in library instruction...

Oct. 7, 2016

Assistant Professor Vetle Torvik has been named the iSchool's Centennial Scholar for 2016-2017. The Centennial Scholar award is endowed by alumni and friends of the School and given in recognition of outstanding accomplishments and/or professional promise in the field of library and information science.

Torvik expressed surprise and gratitude at receiving this honor. "I am in awe of colleagues who received it before me; their caliber is off the charts," he said. "I hope to use the award to open new doors—a stamp of approval from colleagues who know you well goes a long way to establish new collaborations necessary to solve the increasingly complex problems facing science and society today."

Torvik joined the faculty in 2011. His current research addresses problems related to scientific discovery and collaboration using complex models and large-scale bibliographic databases. He is the author of articles in journals such as Proceedings of the National Academy of...

Oct. 3, 2016

Provenance information describes the origin and history of artifacts. Because of the vital role played by data and workflow provenance in support of transparency and reproducibility in computational and data science, creating tools for capturing and using provenance information is an important yet challenging task.

Post-doctoral Research Associate Yang Cao and Professor Bertram Ludäscher recently presented joint work on data provenance at the Data Observation Network for Earth (DataONE) All Hands Meeting in Santa Ana Pueblo, New Mexico. In their poster and system demonstration, jointly authored by a team of University of Illinois students and staff as well as collaborators from the UK, Cao and Ludäscher demonstrated how the YesWorkflow tool is "Revealing the Detailed History of Script Outputs with Hybrid Provenance Queries."
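YesWorkflow works by parsing structured comment annotations (such as `@begin`/`@end` and `@in`/`@out`) that researchers embed in ordinary analysis scripts, letting the tool reconstruct the script's dataflow without running it. A minimal sketch of the annotation style in a toy Python script (the script, step names, and variable names here are hypothetical, not from the poster itself):

```python
# Toy analysis script marked up with YesWorkflow-style annotations.
# The annotations are plain comments; the script runs normally, and
# YesWorkflow separately parses them to recover the workflow structure.

# @begin clean_and_summarize
# @in raw_values @desc Raw sensor readings, possibly with outliers
# @out summary @desc Mean of the readings that pass the filter

# @begin filter_outliers
# @in raw_values
# @out clean_values
raw_values = [3.1, 2.9, 250.0, 3.0, 3.2]          # 250.0 is a spurious reading
clean_values = [v for v in raw_values if v < 100]  # drop obvious outliers
# @end filter_outliers

# @begin compute_mean
# @in clean_values
# @out summary
summary = sum(clean_values) / len(clean_values)
print(f"mean of clean readings: {summary:.2f}")
# @end compute_mean

# @end clean_and_summarize
```

Because the annotations name each step's inputs and outputs, queries over the extracted model can answer questions like "which inputs contributed to this output file?"—the kind of hybrid provenance question the demonstration addressed.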

In an earlier article for the Winter 2015/16 issue of DataONE News, "Your Data has a History,...

Sep. 14, 2016

Policies and practices in data management—including data preservation and sharing—are increasingly important and complicated aspects of research today. Scientific research and data centers as well as universities and academic libraries are leading the way in developing and implementing best practices in data management. But how do they integrate data management strategies and experts into their workflows?

It is at this intersection of people and institutions that doctoral candidate Cheryl Thompson is conducting her research. Specifically, she explores how organizations develop data expertise and services to support science.

“My research focuses on the role of institutions in data use and access in scientific and research environments. By studying organizations and professions, I investigate the conditions that advance or hinder data-intensive research as well as the emerging data profession and its required expertise,” said Thompson.

“As the need for quality...

Sep. 7, 2016

Reproducibility is a hot topic in the scientific community and is considered by many researchers to be an important challenge. But the term reproducibility holds different meanings for different researchers, causing confusion and a lack of shared understanding.

Associate Professor Victoria Stodden, whose research focuses on enabling reproducibility in the computational sciences, spoke to Nature about this issue. She distinguishes three types of reproducibility: empirical, in which enough information is provided for an experiment to be physically repeated and verified; and computational and statistical, which allow findings to be repeated and verified from the underlying data, code, and analyses.

Stodden is a leading figure in the area of reproducibility in computational science, exploring how we can better ensure the reliability and usefulness of scientific results in the...
