As policymakers consider complex scientific issues such as genetic engineering or climate change, they rely on the work of international scientific panels and assessments to inform their decisions. New research from the University of Maryland (UMD) uncovers how the nominations process for one such organization works, and how it may influence who serves on other, similar global scientific bodies.
Dr. Dana R. Fisher from the UMD Department of Sociology and Dr. Philip Leifeld from the University of Glasgow studied membership recruitment for the Millennium Ecosystem Assessment (MA), which synthesized research on ecosystem services between 2001 and 2005, drawing on the knowledge of 1,360 expert members. Fisher and Leifeld found that nominations to the MA were driven largely by pre-existing membership in other international organizations and by personal relationships. Their findings were published September 25 in Nature Climate Change.
“Essentially, we discovered a snowball nomination process that skews participation in favor of scientists who are already engaged in global organizations,” Fisher said. “While there is certainly a risk associated with putting all the agenda- and composition-setting power in the hands of a few transnational elites, we also found this structure is not necessarily harming the quality of work or diversity of scientists represented in the assessment.”
Despite the cyclical nominations process, the researchers found that neither gender nor field of expertise was statistically over- or under-represented in the MA’s membership. They also determined that a few core individuals in leadership roles were particularly influential in shaping the group’s overall composition.
Congress is currently debating whether to continue funding the Intergovernmental Panel on Climate Change (IPCC), which is preparing to issue its sixth assessment report on the latest climate change research. Fisher and Leifeld note that the IPCC’s nominations process is similar to the MA’s, but not identical. Despite these differences, the researchers say their findings offer much to learn about how recruitment to these assessments shapes who eventually serves.
“We need to look more closely at how nominations occur and what criteria are used to ensure that the best possible scientists are brought into these assessments,” said Leifeld. Fisher adds: “Transparency is key to making science better and more effective in shaping policy.”