Environmental Factor, July 2012

Workshop on informed risk assessment ponders new directions

By Cindy Loose

Kim Boekelheide, M.D., Ph.D.

Boekelheide is a pioneer in mixtures assessment, through his work with the Superfund Research Program Center at Brown and through a grant (http://projectreporter.nih.gov/project_info_description.cfm?aid=8375119&icde=12942195&ddparam=&ddvalue=&ddsub=&cr=1&csb=default&cs=ASC)  from NIEHS studying the effects of exposure to toxic substances in mixtures. (Photo courtesy of Steve McCaw)

John Balbus, M.D.

Balbus chairs the Government Liaisons group for the Emerging Science for Environmental Health Decisions series. (http://nas-sites.org/emergingscience/government-liaisons/)  The next workshop in the series, Exploring Human Genomic Plasticity and Environmental Stressors: Emerging Evidence on Telomeres, Copy Number Variation, and Transposons, (http://nas-sites.org/emergingscience/workshops/genomic-plasticity/)  will be held Oct. 4-5 in Washington, D.C. (Photo courtesy of Steve McCaw)

Emerging scientific advances could transform the way scientists analyze the risk that toxic substances pose to humans, making risk assessments both more efficient and more accurate.

“Making that leap, however, is harder than it sounds, and it sounds pretty hard,” said John Balbus, M.D., NIEHS senior advisor for public health. His remarks came during the opening of a two-day workshop (http://nas-sites.org/emergingscience/workshops/omics-informed-risk-assessment/)  held June 14-15 and sponsored by NIEHS. The Workshop on Systems Biology-Informed Risk Assessment was the 11th in the Emerging Science for Environmental Health Decisions (http://nas-sites.org/emergingscience/about/)  series, begun in 2009.

Scientists from around the world gathered for the meeting in a continuing attempt to bridge the gap between traditional toxicology assessment — testing animals to examine one chemical at a time — and new techniques that probe the complex actions of chemicals in the human body using molecular and systems biology, toxicogenomics, computational toxicology, and other emerging sciences.

Confronting the backlog of untested chemicals

“This is the next step in a long, deliberate march,” said Balbus, describing the purpose of the Systems Biology-Informed Risk Assessment workshop held at the National Academy of Sciences in Washington, D.C. “There are a lot of chemicals out there that still need to be tested, and we’re trying to create a better system for public health protection.”

The vast majority of the more than 100,000 chemicals estimated to be in use have never been tested for toxicity and, each year, thousands of new man-made chemical compounds are created.

Keynote speaker Kim Boekelheide, M.D., Ph.D., (http://research.brown.edu/myresearch/Kim_Boekelheide)  a professor at Brown University and a veteran NIEHS grantee, warned that changing risk assessment by using new biology at the molecular level will require breaking the traditional framework that has been in place for 50 years. The current system, he said, is simplistic and linear, with the goal of setting one number as an exposure threshold.

A dynamic, interactive model for toxicology

Systems biology, Boekelheide said, is active and interactive, with lots of moving parts and a resulting complexity. Eventually, however, further scientific advances will allow the process to become simpler. “But we will have the simplicity of knowledge, rather than the simplicity of ignorance,” he said.

Systems biology is an interdisciplinary field that not only identifies biological components and their interactions, but also offers explanations for how these actions take place. Systems biology, therefore, holds the promise of predicting the toxicity of an entire group of chemicals that share similar mode-of-action pathways and toxicity mechanisms.

Speaker Maurice Whelan, Ph.D., of the European Commission Joint Research Centre, referred indirectly to the difference between traditional and emerging risk assessment, by saying, “We spend billions generating data. We need to understand, and not just measure.”

The urgency for finding more efficient toxicity testing is perhaps greatest in Europe, where animal testing will be restricted, and animal testing of cosmetics banned, next year. Speaker Derek Knight, Ph.D., of the European Chemicals Agency, outlined the kinds of guidance being given for using nontraditional data, but noted that the European Union is still trying to get a consensus view of the challenges of using nonstandard data.

Looking at biochemical pathways and effects of mixtures

The issue is not just efficiency, but improvement of toxicity risk assessment. Whelan underscored the intricacies of the process by which substances cause a toxic response through biochemical pathways.

One hope for systems biology is that it will also allow scientists to make greater inroads into understanding the effect of mixtures of chemicals — a more realistic view of the risks to humans living in a chemical soup.

Presentations ranged from a historical view of the U.S. Environmental Protection Agency’s struggle to establish traditional risk assessment, to case studies of new risk assessment approaches.

Many present seemed to agree that the emerging sciences create a greater-than-ever need for scientists from varied disciplines to work together. A second point of consensus was that new risk assessment approaches require a paradigm shift that will be painful and controversial, but that there is no going back.

“Change,” said one attendee, “is going to happen whether we lead it or not.”

(Cindy Loose is a contract writer with the NIEHS office in Bethesda, Md.)




"Confronting the issue of ..." - previous story Previous story Next story next story - "NTP board supports systematic ..."
July 2012 Cover Page
