Abstract: Toxicology is changing its experimental approaches from animal testing to less expensive, more ethical, and more relevant methods. Since the beginning of this century, various regulations and research programs on both sides of the Atlantic have pushed for and contributed to this change. Modern toxicology relies on two main components: in vitro testing and in silico analyses. Toxicology has also entered the world of "big data" production, switching from a low-throughput to a high-throughput mode of screening. Complementary to the assessment of toxicological impact, a large effort has also been devoted to evaluating human exposure to chemicals: new human and field surveys, analytical measurements, computational capacities, and the use of mathematical modeling have opened new possibilities for exposure assessment. Accounting for multiple sources and routes of exposure, estimating combined exposure to mixtures, integrating exposure variability, and simulating long-term exposure are new challenges on their way to being solved. In addition, biomonitoring data, internal exposure biomarkers, and toxicokinetics are all adding to the list of tools and techniques that help link the pieces of the still-incomplete puzzle of high-throughput risk assessment. Yet high-throughput applications in toxicology have been criticized: for their inadequate representation of biological interactions at the organism level, for the experimental noise they suffer from, for the complexity of in vitro to in vivo extrapolation, and for their as-yet-undefined validation protocols. We present here a brief panorama of these developments.
Keywords: toxicology; risk assessment; computational biology; in vitro technique; environmental exposure