[MUSIC] Hello, this is Thomas Hartung from the Johns Hopkins Bloomberg School of Public Health. Welcome to another lecture of our Toxicology for the 21st Century application series. I'm going to talk today about another project which is trying to implement this revolution in regulatory toxicology, revamping toxicological sciences. It is the Human Toxome Project, which I have the privilege to lead. You can find more information on our consortium and on our progress on the website displayed at the bottom of this slide. Let's get started.

In essence, the Human Toxome Project tries to bring mechanistic toxicology into regulatory use, because it is mechanism which forms the basis for the correspondence between different test systems. We can observe the same mechanism in an animal and in the human who is exposed, but mechanisms are also shared with the cell culture system. In the past, toxicology was mainly based on phenomena. We were phenomenologically describing what happened to an animal, we were describing what happened to a human, but we did not really bother about mechanism. The technologies which became available over the last two decades, however, allow us to interrogate what is actually happening in an organism. The molecular biology and the biochemistry behind all of this allow us increasingly to understand and agree on mechanisms of toxicity. And they form an important pillar of what you have already got to know as the vision of toxicology, or toxicity testing, for the 21st century.

I only want to remind you that this was all started by the visionary report from the National Academy of Sciences, the National Research Council, in 2007. It has really created an atmosphere of departure in toxicology, because it suggested finally taking up the technologies of the biotech and bioinformatics revolution. Just to give you a bit of a feeling for how much enthusiasm and how much change there is: this is a quote from Francis Collins, whom you might know as the head of the NIH. In 2008 he first-authored an editorial in the journal Science and wrote: we propose a shift from primarily in vivo animal studies to in vitro assays, in vivo assays with lower organisms, and computational modeling for toxicity assessments. Very similarly, in another editorial in the same journal, Science, Margaret Hamburg, at the time heading the FDA, the Food and Drug Administration, wrote that with an advanced field of regulatory science, new tools, including, as she lists, a couple of technologies we will encounter in this lecture, functional genomics, proteomics, metabolomics, high-throughput screening and systems biology, can replace current toxicology assays with tests that incorporate the mechanistic underpinnings of disease and of underlying toxic side effects. And last but not least, Lisa Jackson, at the time head of the Environmental Protection Agency, already in 2010 adopted the new framework of Tox21 for the safety assessment of chemicals.

I'm going to talk about implementation initiatives, and at the moment there are three major ones to be discussed. The first one is the US EPA and NIH programs ToxCast and Tox21. These are high-throughput screening programs which aim to biologically profile a large number of substances. You have heard in this lecture series about these efforts, which are remarkable both in the number of chemicals and in the number of biological endpoints, the signatures, that are actually measured.
And just to recall, these are programs, for example ToxCast, which have, already in their first two phases, screened more than 2,000 chemicals in about 700 biological assays. This very complex program is one of the pillars of the implementation of toxicity testing for the 21st century. As I said, you have dedicated lectures on this topic.

But this is all linked to the availability of technologies, technologies which are truly technologies of the 21st century, because they became available over the last one or two decades. Many of them are known as Omics technologies. We're talking about genomics, transcriptomics, proteomics, metabolomics. They have in common that we're measuring a large number of, if possible the totality of, all gene expression, a large and comprehensive look into the metabolites, a large and comprehensive assessment of the proteins and their modifications. So Omics technologies are producing a picture which is very rich in information. You could also mention image analysis tools, automated microscopy, as a tool which generates a lot of information. ToxCast and Tox21 use a different technology: they use robots, they use automated testing, where thousands of substances are pipetted into multi-well plates and subjected to relatively simple test systems. So in one case we are talking about high-content methods and in the other about high-throughput methods, which means I either get a lot of information on one substance in one system, or I get a tiny piece of information for very many substances by high-throughput testing.

They have in common that they produce an information-rich situation, a situation where we are suddenly no longer dealing with things we can handle without a computer. We can no longer just count dead cells or dead animals; we have so many dimensions in our data that we actually need computers, bioinformatics and data mining to make sense of these data. And here the accumulating knowledge on pathways, which gene is linked to which gene, which enzyme is metabolizing which substance, is informing our data mining and bioinformatics. And we have a vision of ultimately modeling this, modeling a human being, modeling what is happening in an organism, and the term systems biology for toxicology, or systems toxicology, has been coined for this.

So these technologies of the 21st century have in common that they are creating big data. And big data is the basis for a lot of disruptive technologies at the moment, which really make change happen in our field of safety sciences. As I said, ToxCast and Tox21 are producing such big data from the high-throughput angle.

The other two activities are also part of our lecture series. You will hear from Mel Andersen from the Hamner Institutes about their case study approach, which is very simple. They are asking whether we can use mechanistic information for some test cases, and they want to show that a classical risk assessment can be done with this. You will hear more from him. I am going to talk today about the third of the implementation activities, an NIH Transformative Research grant which I am heading as the PI, the principal investigator, and which is starting to map pathways, the mechanisms of toxicity. The idea is that we can create a catalogue of such mechanisms, which we call the human toxome. Our scheme for this is actually just on the next slide.
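Before we come to that scheme, and just to make the idea of data mining on such information-rich data a bit more concrete, here is a minimal sketch. It is not part of the ToxCast, Tox21 or Human Toxome pipelines; the substance names and numbers are invented. It simply shows how one might cluster a small gene-expression matrix so that substances with similar biological signatures group together.

```python
# Minimal, illustrative sketch: hierarchical clustering of an invented
# gene-expression matrix (substances x genes) so that substances with
# similar biological signatures end up in the same cluster.
# All data here are made up; real Omics matrices have thousands of columns.

import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)

substances = ["chem_A", "chem_B", "chem_C", "chem_D"]

# Pretend chem_A/chem_B perturb one set of genes and chem_C/chem_D another.
expression = np.vstack([
    rng.normal(loc=2.0, scale=0.3, size=50),   # chem_A
    rng.normal(loc=2.0, scale=0.3, size=50),   # chem_B
    rng.normal(loc=-1.5, scale=0.3, size=50),  # chem_C
    rng.normal(loc=-1.5, scale=0.3, size=50),  # chem_D
])

# Correlation-based distance between substance profiles, then average linkage.
distances = pdist(expression, metric="correlation")
tree = linkage(distances, method="average")

# Cut the tree into two clusters and report which substances group together.
labels = fcluster(tree, t=2, criterion="maxclust")
for name, label in zip(substances, labels):
    print(f"{name}: cluster {label}")
```

In a real setting the matrix would come from transcriptomics or high-throughput screening and would have thousands of columns; the point is simply that at these dimensions the grouping has to be done by algorithms rather than by eye.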
We're employing Omics technologies in the Human Toxome Project, so we're trying to use these to deduce what is actually happening in our model systems. And as you can see, this is closely linked with parallel activities on the exposome, which I will address at the end. These are approaches which deal with exposure, but they use very similar types of technologies, Omics technologies, in order to understand what these exposures are actually imprinting on an organism. The big problem is not the generation of big data; both technologies can do this. It is about how to make big sense of this big data, and this is what we're trying to do. And you will see that the way knowledge on pathways is organized in the scientific literature is sometimes actually a big problem.

So how did the Human Toxome Project come about? It started with my inauguration here at the beginning of January 2009, at the Johns Hopkins School of Public Health. At the time, coming out of the experience with the European Commission, their chemical regulations, and the assessment and validation of new methodologies for the safety sciences, I started to think about what we can do from an academic environment to help implement Tox21, the new trend, the new hype of how to revamp toxicity testing. So I wrote an article, A Toxicology for the 21st Century: Mapping the Road Ahead, and as you can see, it was published just around the time of my start here. In this article, and this is an excerpt from it, I suggested for the first time, as an implementation activity, the mapping of pathways of toxicity: that in order to implement this vision of a mechanistic toxicology, we would need a catalog of mechanisms we can agree on, in order to allow mechanistic thinking to enter regulatory implementation. And I put forward a series of challenges, the hurdles to be overcome in order to implement toxicity testing for the 21st century. These pathways of toxicity, or PoT as we will sometimes abbreviate them in this lecture, form the basis for the Human Toxome Project. So the idea is that the human toxome is the theoretical entirety of all of these mechanisms, these pathways of toxicity.

This is another picture from the article, which shows the ten challenges that were put forward. And the important one was that we need some conceptual steering of how these different steps of the ladder towards global acceptance of a mechanism-based toxicology can be approached. This is exactly what we tried to do, and in the course of this lecture I will show you quite a few activities by which we are aiming to implement this vision. And this resonated. A few months later I was invited to write this article in Nature on toxicology for the 21st century, and it has led to a lot of discussion and resonance, conveying the idea that we need to change, that there are actually technologies available which open up a different way of doing regulatory sciences.

Shortly after, a second development took place, in the context of the OECD, the Organisation for Economic Co-operation and Development, which is the place where chemical safety assessments are internationally harmonized. The OECD maintains a repository of test methods which is agreed on between the member states of the OECD. And here, in 2010, as you can see, a very important paper was published which introduced a very similar concept, the concept of adverse outcome pathways.
An adverse outcome pathway is to some extent what I call a pathway of toxicity. However, their concept is larger. It starts with the exposure, the toxicant's chemical properties. It then describes the molecular interactions these substances have with the cells, the cellular responses, the organ responses, the organism responses, and finally even the population responses, which shows you that it is very much stimulated by the ecotoxicological studies which the group that phrased the adverse outcome pathway concept is actually interested in. What I call a pathway of toxicity is very much the cellular interaction with a chemical, possibly the organ responses, so the intracellular things which are happening locally. This is the pathway of toxicity. So it is a concept which is a little bit more restricted in its scope, but it is aiming for a much more detailed description than the adverse outcome pathway concept, as you will see in a second.

This is all about changing the level of resolution, away from just phenomenologically describing what happens to the human or to the animal in the experiment, or a rough idea of the mode of action. We have talked, for example, about the mode of action of genotoxicity. With these toxicity pathways, and then the molecular pathways of toxicity, we are moving towards a molecular description of what is actually happening. Our hope would be that at some point we understand the system, the network which is put up behind it, but we have to say that at the very moment the state of the art is actually at the level of mode of action and phenomenology, and the pathways of toxicity are only the next step which is coming into reach.

So, to profile these two concepts against each other once again: the adverse outcome pathway concept is at the moment very much at the level of textbook knowledge. It is actually the best-organized textbook of toxicology, but it is narrative. It describes, with a low level of detail, what has to happen in order for toxic effects to occur. It is completely based on existing information. So it is organizing our scientific literature by experts, and for this reason it is biased by existing knowledge, because a lot of our literature is simply wrong, it is not reproducible, and it requires a lot of sorting to understand what is of high quality and what is of low quality. Because of this nature, it is also not quantitative, or typically not quantitative. It does not normally discuss the flux of the system, the dynamics, the systems aspects of the toxicology. And there is not yet any agreement on how to quality-assure this, how to validate an adverse outcome pathway.

The pathways of toxicity as proposed in the Human Toxome Project are different: they are molecular. We're describing the gene changes, we're describing the metabolites which are increasing and decreasing. A lot of this is based on Omics technologies, so it is emerging information which is experimentally produced. And it is in many aspects untargeted. It does not come with the hypothesis, I want to measure gene X or metabolite epsilon; it is just describing what is happening in the system. And it is the causality we prove, that a gene change is actually impacting the outcome, which then makes it acceptable knowledge. And we're aiming for quantitative relations, for fluxes of metabolites, for example.
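To illustrate, purely hypothetically, how such an untargeted Omics readout can point towards a candidate pathway of toxicity, the sketch below asks whether the genes of an invented pathway are over-represented among the genes that respond to a substance, using a simple Fisher's exact test. The gene symbols, set sizes and significance cut-off are assumptions for the example, not the Human Toxome Project's actual workflow.

```python
# Hypothetical sketch: is a candidate pathway of toxicity over-represented
# among the genes that respond to a substance? Uses a 2x2 contingency table
# and Fisher's exact test. All gene sets and sizes are invented.

from scipy.stats import fisher_exact

measured_genes = 10_000                       # genes measured in the experiment
deg = {"NRF2", "HMOX1", "NQO1", "GCLC", "TP53", "CDKN1A"}   # differentially expressed
pathway = {"NRF2", "HMOX1", "NQO1", "GCLC", "GCLM", "TXN"}  # candidate PoT gene set

in_both = len(deg & pathway)                  # responsive genes in the pathway
deg_only = len(deg - pathway)                 # responsive genes outside it
pathway_only = len(pathway - deg)             # pathway genes that did not respond
neither = measured_genes - in_both - deg_only - pathway_only

table = [[in_both, deg_only],
         [pathway_only, neither]]

odds_ratio, p_value = fisher_exact(table, alternative="greater")
print(f"overlap = {in_both} genes, odds ratio = {odds_ratio:.1f}, p = {p_value:.2e}")

if p_value < 0.05:
    print("Candidate pathway is enriched; worth testing for causality.")
```

Such an enrichment only suggests a candidate; as discussed next, the pathway still has to be shown to be causal.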
And we have to show that the concept of causality, the interruption of certain pathways, actually allows us to conclude whether a suggested pathway of toxicity is true or not.
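Purely as a sketch of this causality argument, and not as the project's actual protocol, one could imagine comparing a toxicity readout in cells with the suspected pathway intact versus interrupted, for example by a hypothetical knockdown, and asking whether the interruption blunts the effect. The numbers, condition names and the two-sample t-test below are all assumptions for illustration.

```python
# Hypothetical sketch of the causality argument: if interrupting a suspected
# pathway of toxicity (e.g. by knockdown) blunts the toxic effect of the
# substance, that supports the pathway being causal. All data are invented.

import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(1)

# Viability loss (%) after treatment, six replicate wells per condition.
pathway_intact = rng.normal(loc=40.0, scale=5.0, size=6)       # substance alone
pathway_interrupted = rng.normal(loc=12.0, scale=5.0, size=6)  # substance + knockdown

t_stat, p_value = ttest_ind(pathway_intact, pathway_interrupted)

print(f"mean loss, pathway intact: {pathway_intact.mean():.1f}%")
print(f"mean loss, pathway interrupted: {pathway_interrupted.mean():.1f}%")
print(f"t = {t_stat:.2f}, p = {p_value:.3g}")

if p_value < 0.05 and pathway_intact.mean() > pathway_interrupted.mean():
    print("Interrupting the pathway reduced the effect: consistent with causality.")
else:
    print("No clear reduction: the suggested pathway is not supported.")
```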