Toxicogenomics
Safety is one of the most frequent reasons for drug failure, and attrition often occurs late: approximately 90% of drug candidates that enter clinical trials fail. Early identification of safety liabilities improves efficiency in the drug discovery and development process by decreasing the risk of late-stage failure and reducing the escalating costs associated with developing new drugs.
The translation of certain types of toxicity also remains difficult to predict. In the case of drug-induced liver injury (DILI), for example, only around 50% of human hepatic toxicities are detected by preclinical animal studies. More reliable, human-relevant in vitro models and advanced technologies are therefore required to identify these safety liabilities.
One area of toxicology generating considerable interest within the pharmaceutical and chemical industries is toxicogenomics (TGx). This information-rich technology shows great promise both in defining mechanistic pathways (adverse outcome pathways) and in predicting toxicity.
What Is Toxicogenomics?
Toxicogenomics (TGx) is a branch of toxicology which uses techniques such as DNA sequencing, epigenomics, transcriptomics, proteomics and metabolomics in combination with bioinformatics to understand how pharmaceuticals and chemicals alter gene expression, protein expression and metabolite production, and how these changes could impact human health.
Toxicity & Mechanisms Prediction
Application of Toxicogenomics
- Drive to Reduce and Replace Animal Testing: Ethical issues around animal testing combined with a lack of validated translation to humans is driving science and innovation in human relevant cell-based models and NAMs (new approach methodologies). At the end of 2022, final approval was given to the FDA Modernization Act 2.0 which aims to reduce the reliance on animal testing and provides impetus to these alternative approaches. Due to this new Act, pharmaceutical and chemical companies are investing heavily in evaluating new technologies such as TGx to eventually replace animal testing.
- Mechanistic Information: Toxicogenomics is an information-rich technology which generates vast amounts of informative data. The first step in understanding drug- or chemical-induced toxicity is linking molecular initiating events (MIEs), through key events (KEs), to in vivo organ toxicity, following the adverse outcome pathway (AOP) framework. The mechanisms identified in this way can be used for hazard identification and risk assessment.
- Prediction: Although toxicogenomics is often used to investigate mechanistic pathways, it also plays an important role in identifying toxicological biomarkers. These biomarkers have the potential to be used in early screening of new chemical entities and prediction of organ-specific toxicity. In addition, similarity analysis or read-across profiling can be used to compare data from known toxic compounds with new chemical entities being screened.
- Sensitivity: Because toxicogenomics detects changes at the molecular level, it is highly sensitive at identifying early-stage molecular initiating events (MIEs) which may lead to cellular, tissue and organ-specific toxicity.
- Personalized Medicine: Toxicogenomics may be used to identify individuals who are more susceptible to toxic effects due to genetic polymorphisms and, conversely, highlight non-susceptible individuals who are more likely to benefit from a specific therapy.
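The read-across idea mentioned above can be sketched as a simple similarity calculation: rank reference compounds by how closely their expression signatures match that of a new chemical entity. Everything below is illustrative; the compound names, gene panel size and log2 fold-change values are hypothetical, and real read-across profiling uses much larger signatures and curated reference databases.

```python
from math import sqrt

# Illustrative log2 fold-change signatures over a small shared gene panel.
# Compound names and values are hypothetical, for demonstration only.
reference_signatures = {
    "known_hepatotoxin_A": [2.1, -1.4, 0.3, 1.8, -0.9],
    "known_hepatotoxin_B": [1.9, -1.1, 0.5, 1.5, -0.7],
    "non_toxic_control":   [0.1,  0.2, -0.1, 0.0,  0.1],
}

def cosine_similarity(a, b):
    """Cosine similarity between two expression signatures."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def read_across(new_signature, references):
    """Rank reference compounds by signature similarity, most similar first."""
    scores = {name: cosine_similarity(new_signature, sig)
              for name, sig in references.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

new_compound = [2.0, -1.2, 0.4, 1.6, -0.8]  # hypothetical screening hit
ranking = read_across(new_compound, reference_signatures)
```

In this toy example the new compound's signature resembles the known hepatotoxins far more than the non-toxic control, flagging it for closer mechanistic follow-up.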
Challenges of Toxicogenomics
- Quantity of Data: The techniques for generating and analyzing samples have been scaled up and industrialized so that a more data-rich examination of the molecular phenotype of the cell can be achieved. This generates vast quantities of data which require storage and analysis.
- Interpreting Data: Sophisticated bioinformatics is required to identify clusters and patterns in the data and determine their relevance. Machine learning and artificial intelligence can be used to further interrogate the data and help resolve complex questions. By monitoring the early cellular transcriptional response following exposure to a chemical, a sensitive, in-depth view of early MIEs/KEs relevant to efficacy or toxicity can be obtained. Toxicogenomics data can also be combined with alternative sources such as clinical studies, chemical data and meta-data to provide further mechanistic insight.
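As a toy illustration of finding patterns in such data, the sketch below groups samples by pairwise Pearson correlation of their expression profiles. This is a simplified stand-in for the hierarchical clustering used in real bioinformatics pipelines; the sample names and expression values are hypothetical.

```python
from math import sqrt

# Hypothetical normalized expression values over four genes for
# treated and control replicates (illustrative only).
samples = {
    "treated_rep1": [5.1, 2.0, 7.3, 1.2],
    "treated_rep2": [5.0, 2.2, 7.1, 1.0],
    "control_rep1": [1.1, 6.8, 2.0, 5.9],
    "control_rep2": [1.0, 6.9, 2.2, 6.1],
}

def pearson(a, b):
    """Pearson correlation between two expression profiles."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sqrt(sum((x - ma) ** 2 for x in a))
    vb = sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (va * vb)

def cluster(profiles, threshold=0.9):
    """Greedy single-linkage grouping: a sample joins a cluster if it
    correlates above the threshold with any existing member."""
    clusters = []
    for name, profile in profiles.items():
        for group in clusters:
            if any(pearson(profile, profiles[m]) > threshold for m in group):
                group.append(name)
                break
        else:
            clusters.append([name])
    return clusters

groups = cluster(samples)
```

With these values the replicates fall cleanly into a treated group and a control group, the kind of sample-level structure a bioinformatician would expect to recover before interpreting gene-level changes.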
Why Partner with Evotec?
- Experience in the range of cell culture systems (2D/3D models, co-culture, iPSC-derived, custom models)
- Advanced high-throughput PanOmics platforms
- Leading bioinformatics platforms for interactive multivariate data analysis
- Dedicated bioinformatics experts designing unique data analysis processes
- Sophisticated machine learning capabilities
- Experienced data scientists interpreting large and complex data sets
It’s important to note that most toxicogenomics research to date has focused on transcriptomics, driven by technological advances such as the introduction of RNA-seq, which provides a quantitative measurement of the entire transcriptome of the cell. However, Evotec has a wealth of experience in all aspects of PanOmics, including genomics, transcriptomics, proteomics and metabolomics.
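As a minimal illustration of the quantitative measurement RNA-seq enables, the sketch below computes counts-per-million normalization and log2 fold changes for a few genes. The gene counts and library sizes are hypothetical, and real differential expression analysis relies on dedicated statistical packages such as DESeq2 or edgeR rather than this simplified calculation.

```python
from math import log2

# Hypothetical raw read counts for three xenobiotic/stress-response genes
# in one treated and one control sample (values are illustrative only).
counts = {
    "CYP1A1": {"treated": 5200, "control": 480},
    "GSTA1":  {"treated": 900,  "control": 870},
    "HMOX1":  {"treated": 3100, "control": 350},
}

# Hypothetical total mapped reads per sample (sequencing depth).
library_size = {"treated": 20_000_000, "control": 18_000_000}

def cpm(raw, total):
    """Counts per million: normalize a raw count for sequencing depth."""
    return raw / total * 1_000_000

# log2 fold change of treated vs control after depth normalization.
log2_fc = {
    gene: log2(cpm(c["treated"], library_size["treated"]) /
               cpm(c["control"], library_size["control"]))
    for gene, c in counts.items()
}
```

Here CYP1A1 and HMOX1 come out strongly induced while GSTA1 is essentially unchanged; a real analysis would add replicate-aware statistics and multiple-testing correction before calling any gene differentially expressed.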
Furthermore, Evotec has developed the sophisticated PanHunter™ platform, which streamlines the entire process by storing and managing the sequencing data, performing quality control and differential expression analysis, and evaluating the implications for downstream processes such as pathway regulation or gene network analysis. The platform can combine data from various sources, including not only transcriptomics but also genomics, proteomics, metabolomics and other screens. PanHunter is a unique and powerful system which transforms the workflow process, improving efficiency, reducing cost and ultimately enhancing the quality of the data through robust analysis and interpretation.