
Showing papers in "ALTEX-Alternatives to Animal Experimentation in 2018"


Journal ArticleDOI
TL;DR: Suggestions are made on how DNT NAMs may be assembled into an integrated approach to testing and assessment (IATA), and a vision is presented on how further NAM development may be guided by knowledge of signaling pathways necessary for brain development, DNT pathophysiology, and relevant adverse outcome pathways (AOP).
Abstract: Multiple non-animal-based test methods have never been formally validated. In order to use such new approach methods (NAMs) in a regulatory context, criteria to define their readiness are necessary. The field of developmental neurotoxicity (DNT) testing is used to exemplify the application of readiness criteria. The costs and number of untested chemicals are overwhelming for in vivo DNT testing. Thus, there is a need for inexpensive, high-throughput NAMs to obtain initial information on potential hazards and to allow prioritization for further testing. A background on the regulatory and scientific status of DNT testing is provided, showing different types of test readiness levels depending on the intended use of data from NAMs. Readiness criteria, compiled during a stakeholder workshop uniting scientists from academia, industry, and regulatory authorities, are presented. An important step beyond the listing of criteria was the suggestion of a preliminary scoring scheme. On this basis, a (semi-)quantitative analysis of the test readiness of 17 NAMs with respect to various uses (e.g., prioritization/screening, risk assessment) was assembled. The scoring results suggest that several assays are currently at high readiness levels. Therefore, suggestions are made on how DNT NAMs may be assembled into an integrated approach to testing and assessment (IATA). In parallel, the testing state in these assays was compiled for more than 1000 compounds. Finally, a vision is presented on how further NAM development may be guided by knowledge of signaling pathways necessary for brain development, DNT pathophysiology, and relevant adverse outcome pathways (AOP).

131 citations


Journal ArticleDOI
TL;DR: To more broadly address the challenges in toxicology, Tox21 has developed a new strategic and operational plan that expands the focus of its research activities and addresses key challenges to advance toxicology testing.
Abstract: The traditional approaches to toxicity testing have posed multiple challenges for evaluating the safety of commercial chemicals, pesticides, food additives/contaminants, and medical products. The challenges include the number of chemicals that need to be tested, the time- and resource-intensive nature of traditional toxicity tests, and the unexpected adverse effects that occur in pharmaceutical clinical trials despite extensive toxicological testing. Over a decade ago, the U.S. Environmental Protection Agency (EPA), National Toxicology Program (NTP), National Center for Advancing Translational Sciences (NCATS), and the Food and Drug Administration (FDA) formed a federal consortium for "Toxicology in the 21st Century" (Tox21) with a focus on developing and evaluating in vitro high-throughput screening (HTS) methods for hazard identification and providing mechanistic insights. The Tox21 consortium generated data on thousands of pharmaceuticals and data-poor chemicals, developed a better understanding of the limits and applications of in vitro methods, and enabled incorporation of HTS data into regulatory decisions. To more broadly address the challenges in toxicology, Tox21 has developed a new strategic and operational plan that expands the focus of its research activities. The new focus areas include developing an expanded portfolio of alternative test systems, addressing technical limitations of in vitro test systems, curating legacy in vivo toxicity testing data, establishing scientific confidence in the in vitro test systems, and refining alternative methods for characterizing pharmacokinetics and in vitro assay disposition. The new Tox21 strategic and operational plan addresses key challenges to advance toxicology testing and will benefit both the organizations involved and the toxicology community.

130 citations


Journal ArticleDOI
TL;DR: The economic landscape of especially regulatory use of animal testing is reanalyzed and a picture emerges of globally regulated industries that are subject to stark geographic and sectorial differences in regulation, which determine their corresponding animal use.
Abstract: For a long time, the discussion about animal testing vs. its alternatives centered on animal welfare. This was static warfare, or at least a gridlock, where life scientists had to take a position and make their value choices, and hardly anyone changed sides. Technical advances have changed the frontline somewhat, with in vitro and in silico methods gaining more ground. Only more recently has the economic view begun to have an impact: many animal tests are simply too costly, take too long, and give misleading results. As an extension and update to previous articles in this series written a decade ago, we reanalyze the economic landscape of especially regulatory use of animal testing and this time also consider the respective alternative tests. Despite some ambiguity and data gaps, which we have filled with crude estimates, a picture emerges of globally regulated industries that are subject to stark geographic and sectorial differences in regulation, which determine their corresponding animal use. Both animal testing and its alternatives are industries in their own right, offering remarkable business opportunities for biotech and IT companies as well as contract research organizations. In light of recent revelations as to the reproducibility and relevance issues of many animal tests, the economic consequences of incorrect results and the reasons for still maintaining often outdated animal test approaches are discussed.

92 citations


Journal ArticleDOI
TL;DR: Recent key developments in in vitro cell culture are summarized, and the issues resulting for GCCP, e.g., the development of induced pluripotent stem cells (iPSCs) and gene-edited cells, are addressed.
Abstract: A major reason for the current reproducibility crisis in the life sciences is the poor implementation of quality control measures and reporting standards. Improvement is needed, especially regarding increasingly complex in vitro methods. Good Cell Culture Practice (GCCP) was an effort from 1996 to 2005 to develop such minimum quality standards, also applicable in academia. This paper summarizes recent key developments in in vitro cell culture and addresses the issues resulting for GCCP, e.g., the development of induced pluripotent stem cells (iPSCs) and gene-edited cells. It further deals with human stem-cell-derived models and bioengineering of organotypic cell cultures, including organoids, organ-on-chip and human-on-chip approaches. Commercial vendors and cell banks have made human primary cells more widely available over the last decade, increasing their use but also requiring specific guidance as to GCCP. The characterization of cell culture systems, including high-content imaging and high-throughput measurement technologies increasingly combined with more complex cell and tissue cultures, represents a further challenge for GCCP. The increasing use of gene editing techniques to generate and modify in vitro culture models also requires discussion of its impact on GCCP. International (often varying) legislation and market forces originating from the commercialization of cell and tissue products and technologies further impact the need for the use of GCCP. This report summarizes the recommendations of the second of two workshops, held in Germany in December 2015, which aimed to map the challenges and organize the process of developing a revised GCCP 2.0.

85 citations


Journal ArticleDOI
TL;DR: Comparative data suggest that complementary in vitro tests can pick up a broad range of toxicants, and that multiple test results might help to predict organ specificity patterns.
Abstract: The (developmental) neurotoxicity hazard is still unknown for most chemicals. Establishing a test battery covering most of the relevant adverse outcome pathways may close this gap without requiring a huge animal experimentation program. Ideally, each of the assays would cover multiple mechanisms of toxicity. One candidate test is the human LUHMES cell-based NeuriTox test. To evaluate its readiness for larger-scale testing, a proof-of-concept library assembled by the U.S. National Toxicology Program (NTP) was screened. Of the 75 unique compounds, seven were defined as specifically neurotoxic after the hit-confirmation phase, and an additional ten compounds were generally cytotoxic within the concentration range of up to 20 micromolar. As a complementary approach, the library was screened in the PeriTox test, which identifies toxicants affecting the human peripheral nervous system. Of the eight PeriTox hits, five were shared with the NeuriTox hits: rotenone, colchicine, diethylstilbestrol, berberine chloride, and valinomycin. The unique NeuriTox hit, 1-methyl-4-phenylpyridinium (MPP+), is known from in vivo studies to affect only dopaminergic neurons (which LUHMES cells are). Conversely, the known peripheral neurotoxicant acrylamide was picked up in the PeriTox, but not in the NeuriTox assay. All five common hits had also been identified in the published neural crest migration (cMINC) assay, while none of them emerged as a cardiotoxicant in a previous screen using the same library. These comparative data suggest that complementary in vitro tests can pick up a broad range of toxicants and that multiple test results might help to predict organ specificity patterns.

56 citations


Journal ArticleDOI
TL;DR: It is argued that a systematic approach to integrating existing knowledge as exemplified by systematic reviews and other evidence-based approaches is needed to achieve a paradigm change in studying systemic effects.
Abstract: A biological system is more than the sum of its parts - it accomplishes many functions via synergy. Deconstructing the system down to the molecular mechanism level necessitates the complement of reconstructing functions on all levels, i.e., in our conceptualization of biology and its perturbations, our experimental models and computer modelling. Toxicology contains the somewhat arbitrary subclass "systemic toxicities"; however, there is no relevant toxic insult or general disease that is not systemic. At least inflammation and repair are involved that require coordinated signaling mechanisms across the organism. However, the more body components involved, the greater the challenge to recapitulate such toxicities using non-animal models. Here, the shortcomings of current systemic testing and the development of alternative approaches are summarized. We argue that we need a systematic approach to integrating existing knowledge as exemplified by systematic reviews and other evidence-based approaches. Such knowledge can guide us in modelling these systems using bioengineering and virtual computer models, i.e., via systems biology or systems toxicology approaches. Experimental multi-organ-on-chip and microphysiological systems (MPS) provide a more physiological view of the organism, facilitating more comprehensive coverage of systemic toxicities, i.e., the perturbation on organism level, without using substitute organisms (animals). The next challenge is to establish disease models, i.e., micropathophysiological systems (MPPS), to expand their utility to encompass biomedicine. Combining computational and experimental systems approaches and the challenges of validating them are discussed. The suggested 3S approach promises to leverage 21st century technology and systematic thinking to achieve a paradigm change in studying systemic effects.

45 citations


Journal ArticleDOI
TL;DR: Investigative Toxicology describes the de-risking and mechanistic elucidation of toxicities, supporting early safety decisions in the pharmaceutical industry, and identifies key challenges and perspectives.
Abstract: Investigative Toxicology describes the de-risking and mechanistic elucidation of toxicities, supporting early safety decisions in the pharmaceutical industry. Recently, Investigative Toxicology has contributed to a shift in pharmaceutical toxicology, from a descriptive to an evidence-based, mechanistic discipline. This was triggered by high costs and low throughput of Good Laboratory Practice in vivo studies, and increasing demands for adhering to the 3R (Replacement, Reduction and Refinement) principles of animal welfare. Outside the boundaries of regulatory toxicology, Investigative Toxicology has the flexibility to embrace new technologies, enhancing translational steps from in silico, in vitro to in vivo mechanistic understanding to eventually predict human response. One major goal of Investigative Toxicology is improving preclinical decisions, which coincides with the concept of animal-free safety testing. Currently, compounds under preclinical development are being discarded due to the use of inappropriate animal models. Progress in Investigative Toxicology could lead to humanized in vitro test systems and the development of medicines less reliant on animal tests. To advance this field a group of 14 European-based leaders from the pharmaceutical industry founded the Investigative Toxicology Leaders Forum (ITLF), an open, non-exclusive and pre-competitive group that shares knowledge and experience. The ITLF collaborated with the Centre for Alternatives to Animal Testing Europe (CAAT-Europe) to organize an "Investigative Toxicology Think-Tank", which aimed to enhance the interaction with experts from academia and regulatory bodies in the field. Summarizing the topics and discussion of the workshop, this article highlights Investigative Toxicology's position by identifying key challenges and perspectives.

44 citations


Journal ArticleDOI
TL;DR: iPSC-derived cardiomyocytes exhibited reproducible donor-specific differences in baseline function and drug-induced effects, demonstrating the feasibility of using a panel of population-based organotypic cells from healthy donors as an animal replacement experimental model to quantify inter-individual variability in xenobiotic responses.
Abstract: Assessing inter-individual variability in responses to xenobiotics remains a substantial challenge, both in drug development with respect to pharmaceuticals and in public health with respect to environmental chemicals. Although approaches exist to characterize pharmacokinetic variability, there are no methods to routinely address pharmacodynamic variability. In this study, we aimed to demonstrate the feasibility of characterizing inter-individual variability in a human in vitro model. Specifically, we hypothesized that genetic variability across a population of iPSC-derived cardiomyocytes translates into reproducible variability in both baseline phenotypes and drug responses. We measured baseline and drug-related effects in iPSC-derived cardiomyocytes from 27 healthy donors using kinetic Ca2+ flux and high-content live cell imaging. Cells were treated in concentration-response with cardiotoxic drugs: isoproterenol (β-adrenergic receptor agonist/positive inotrope), propranolol (β-adrenergic receptor antagonist/negative inotrope), and cisapride (hERG channel inhibitor/QT prolongation). Cells from four of the 27 donors were further evaluated in terms of baseline and treatment-related gene expression. Reproducibility of phenotypic responses was evaluated across batches and time. iPSC-derived cardiomyocytes exhibited reproducible donor-specific differences in baseline function and drug-induced effects. We demonstrate the feasibility of using a panel of population-based organotypic cells from healthy donors as an animal replacement experimental model. This model can be used to rapidly screen drugs and chemicals for inter-individual variability in cardiotoxicity. This approach demonstrates the feasibility of quantifying inter-individual variability in xenobiotic responses and can be expanded to other cell types for which in vitro populations can be derived from iPSCs.

33 citations
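Inter-individual variability of the kind quantified in the study above is often summarized as a coefficient of variation (CV) across donor lines. A minimal sketch, using illustrative beat-rate numbers rather than data from the paper:

```python
import statistics

# Hypothetical baseline beat rates (beats/min) for iPSC-derived cardiomyocyte
# lines from six donors; the values are illustrative, not from the study.
beat_rate = {
    "donor_01": 42.0, "donor_02": 55.0, "donor_03": 48.0,
    "donor_04": 61.0, "donor_05": 39.0, "donor_06": 52.0,
}

values = list(beat_rate.values())
mean = statistics.mean(values)
# Coefficient of variation: donor-to-donor spread relative to the mean,
# a common scalar summary of inter-individual variability.
cv = statistics.stdev(values) / mean
```

The same summary can be computed per drug and concentration to compare pharmacodynamic variability across treatments.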


Journal ArticleDOI
TL;DR: A set of computational approaches has been developed to model both particular toxic response and the homeostasis of human liver as a whole; these approaches pave a way to enhance the in silico stage of assessment for a potential toxicity.
Abstract: Most common drug development failures originate from either bioavailability problems or unexpected toxic effects. The culprit is often the liver, which is responsible for the biotransformation of a majority of xenobiotics. The liver may be modeled using "liver on a chip" devices, which may include established cell lines, primary human cells, and stem cell-derived hepatocyte-like cells. The choice of biological material, along with its processing and maintenance, greatly influences both the device performance and the resultant toxicity predictions. Impediments to the development of "liver on a chip" technology include problems with the standardization of cells, limitations imposed by culturing, and the necessity to develop more complicated fluidic contours. Fortunately, recent breakthroughs in the development of cell-based reporters, including ones with fluorescent labels, permit monitoring of the behavior of cells embedded into "liver on a chip" devices. Finally, a set of computational approaches has been developed to model both particular toxic responses and the homeostasis of the human liver as a whole; these approaches pave the way to enhance the in silico stage of assessment for potential toxicity.

31 citations


Journal ArticleDOI
TL;DR: In silico analyses take advantage of the huge amount of data already available from human studies to identify and model molecular pathways involved in skin pathophysiology without further animal testing.
Abstract: Although widely used for basic and preclinical studies in dermatology, available animal models only partly recapitulate human skin features, often leading to disappointing outputs when preclinical results are translated to the clinic. Therefore, the need to develop alternative, non-animal models is widely recognized, both to more closely recapitulate human skin pathophysiology and to address the pressing ethical demand of reducing the number of animals used for research purposes, following the globally accepted 3Rs principle (Replacement, Reduction and Refinement). Skin is the outermost organ of the body and, as such, easily accessible. Different skin cell types can be propagated in vitro, and skin can be reconstructed for therapeutic transplantation as well as for in vitro modeling of physiopathological conditions. Bioengineered skin substitutes have been developed and have evolved from elementary to complex systems, more and more closely resembling complete skin architecture and biological responses. In silico analyses take advantage of the huge amount of data already available from human studies to identify and model molecular pathways involved in skin pathophysiology without further animal testing. The present review recapitulates the available non-animal models for dermatological research and sheds light on their prospective technological evolution.

27 citations


Journal ArticleDOI
TL;DR: Although still requiring further validation, the HET-CAM assay seems an ideal prospect for in vitro vaginal irritancy testing.
Abstract: The HET-CAM (Hen’s Egg Test-Chorioallantoic Membrane) assay is an in vitro alternative to the in vivo Draize rabbit eye test. This qualitative method assesses the irritancy potential of chemicals. The chorioallantoic membrane responds to injury with an inflammatory process similar to that in the rabbit eye’s conjunctival tissue. Regarding topical toxicity assessment of medical devices, ISO 10993-10 states that any skin or eye irritant material shall be directly labelled as a potential vaginal irritant without animal testing, suggesting that the irritation potentials for the eye and the vaginal epithelia are similar. The aim of this work was to apply the HET-CAM assay to test the irritancy potential of vaginal formulations. Vaginal semisolid medicines and lubricants currently marketed were tested along with the Universal Placebo formulation that has been shown to be clinically safe. Nonoxynol-9 (N-9), a known vaginal irritant, was enrolled as positive control (concentrations ranging from 0.001 to 100% (v/v)). The assay was conducted according to the ICCVAM-Recommended Test Method (NIH Publication No. 10-7553, 2010). Formulations were then classified according to irritation score (IS), using the analysis methods (A) and (B). The studied vaginal formulations showed low potential for irritation. N-9 was classified as a severe irritant at concentrations above 2%, which is in line with clinical data, envisaging a possible in vitro/in vivo correlation. IS (B) was considered a more detailed classification output. Although still requiring further validation, the HET-CAM assay seems an ideal prospect for in vitro vaginal irritancy testing.

Journal ArticleDOI
TL;DR: Good in vitro practice guidance for the re-normalization procedure is provided so that data of higher fidelity can be generated and presented.
Abstract: Many types of assays in cell biology, pharmacology and toxicology generate data in which a parameter is measured in a reference system (negative control) and then also under conditions of increasing stress or drug exposure. To make such data easily comparable, they are normalized, i.e., the initial value of the system (e.g., viability or transport function) is set to 100%, and all data are indicated relative to this value. Then, curves are fitted through the data points and summary data of the system behavior are determined. For this, a benchmark response (BMR) is given (e.g., a 15% or 50% drop in the curve), and the corresponding benchmark concentration (BMC15 or BMC50) is determined. Especially for low BMRs, this procedure is not very robust and often results in incorrect summary data. It is often neglected that a second normalization (re-normalization) is necessary to make the data suitable for curve fitting. It is also frequently overlooked that this requires knowledge of the system behavior at very low stress conditions. Here, good in vitro practice guidance for the re-normalization procedure is provided so that data of higher fidelity can be generated and presented.
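The workflow described above (re-normalize to the low-stress plateau, fit a curve, invert it at the BMR) can be sketched as follows. The data, the two-parameter log-logistic model, and the use of the three lowest concentrations as the plateau estimate are illustrative assumptions, not the paper's exact procedure:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical viability data (% of negative control) vs. concentration (µM).
conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])
viability = np.array([104.0, 98.0, 101.0, 96.0, 80.0, 45.0, 12.0])

# Re-normalization: anchor the curve at the low-concentration plateau, which
# requires knowing the system's behavior at very low stress conditions.
plateau = viability[:3].mean()
renorm = viability / plateau * 100.0

# Two-parameter log-logistic curve with the top fixed at 100%.
def hill(c, bmc50, h):
    return 100.0 / (1.0 + (c / bmc50) ** h)

(bmc50, h), _ = curve_fit(hill, conc, renorm, p0=[1.0, 1.0])

# Invert the fitted curve at a benchmark response (BMR), e.g. a 15% drop.
def bmc(bmr, bmc50, h):
    return bmc50 * (bmr / (100.0 - bmr)) ** (1.0 / h)

bmc15 = bmc(15.0, bmc50, h)  # lies below the BMC50, as expected
```

Without the re-normalization step, a plateau sitting a few percent above or below 100% shifts the BMC15 far more than the BMC50, which is the robustness problem the guidance addresses.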


Journal ArticleDOI
TL;DR: It is shown that cancer cells derived from leukemia and colon cancer grow very similarly in culture media with FCS or outdated hPL, and hPL is a moderately priced substitute for FCS in various experimental settings.
Abstract: Experiments with cultured mammalian cells represent an in vitro alternative to animal experiments. Fetal calf serum (FCS) is the most commonly used media supplement worldwide. FCS contains a mixture of largely undefined growth factors and cytokines, which support cell proliferation. This undefined nature of FCS is a source of experimental variation, undesired immune responses, and possible contaminations, and, because of its way of production, an ethical concern. Thus, alternative, defined, valid, and reliable media supplements should be characterized in a large number of experiments. Human platelet lysate (hPL) is increasingly appreciated as an alternative to FCS. Since it was unclear whether cells respond differently under these supplements to clinically relevant chemotherapeutics inducing replicative stress and DNA damage (hydroxyurea, irinotecan), to induction of reactive oxygen species (ROS), to the tyrosine kinase inhibitor (TKi) imatinib, and to novel epigenetic modifiers belonging to the group of histone deacetylase inhibitors (HDACi), we investigated these issues. Here we show that cancer cells derived from leukemia and colon cancer grow very similarly in culture media with FCS or outdated hPL. Notably, cells have practically identical proteomes under both culture conditions. Moreover, cells grown with FCS or hPL respond equally to all types of drugs and stress conditions that we have tested. In addition, the transfection of blood cells by electroporation can be achieved under both conditions. Furthermore, we reveal that class I HDACs, but not HDAC6, are required for the expression of the pan-leukemic marker WT1 under various culture conditions. Hence, hPL is a moderately priced substitute for FCS in various experimental settings.

Journal ArticleDOI
TL;DR: A review of the current guidance for testing food additives shows a very rigorous system for higher concern levels, but also many waiving options.
Abstract: The US Food and Drug Administration (FDA) has premarket review authority over food additives, but a food manufacturer may, according to the legislation, intentionally add a substance to human food or animal food without their premarket review or approval if the substance is generally recognized, among qualified experts, to be safe under the conditions of its intended use. Generally recognized as safe (GRAS) implies that the current scientific community agrees on the adequacy of how data is generated. This system has come under public pressure because of doubts as to its efficiency, and the FDA’s recent GRAS rule is part of the response. The FDA guidance for testing food additives, known as the “Redbook”, is about two decades old. Work toward a new “Redbook” is on the way, but the US Grocery Manufacturer Association (GMA) has also initiated the development of an independent standard on how to perform GRAS determinations. This review of the current guidance shows a very rigorous system for higher concern levels, but also many waiving options. Opportunities and challenges for safety evaluations of food additives are discussed. Where scientific progress has allowed existing methods to be improved and new ones to be adapted, these should be adopted to improve product safety and animal welfare. The continuous adaptation of such improved methods is therefore needed. In particular, there are opportunities to embrace developments within the toxicity testing for the 21st century movement and evidence-based toxicology approaches. Also, the growing understanding of the limitations of traditional tests needs to be considered.

Journal ArticleDOI
TL;DR: In line with global efforts to reduce the use of research animals, an in vitro monocyte activation test (MAT) has the potential to replace the RPT and be used to detect substances that activate human monocytes to release cytokines.
Abstract: Pyrogenicity presents a challenge to clinicians, medical device manufacturers, and regulators. A febrile response may be caused by endotoxin contamination, microbial components other than endotoxin, or chemical agents that generate a material-mediated pyrogenic response. While test methods for the assessment of endotoxin contamination and some microbial components other than endotoxin are well-established, material-mediated pyrogens remain elusively undefined. This review presents the findings of literature searches conducted to identify material-mediated pyrogens associated with medical devices. The in vivo rabbit pyrogen test (RPT) is considered to be the “gold standard” for medical device pyrogenicity testing, despite the fact that few medical device-derived material-mediated pyrogens are known. In line with global efforts to reduce the use of research animals, an in vitro monocyte activation test (MAT) has the potential to replace the RPT. The MAT is used to detect substances that activate human monocytes to release cytokines. This review will also describe the potential opportunities and challenges associated with MAT adoption for the detection of material-mediated pyrogens in medical device testing.

Journal ArticleDOI
TL;DR: A semi-automated process for selecting and annotating reference chemicals across many targets in a standardized format allows rapid development of candidate reference chemical lists for a wide variety of targets that can facilitate performance evaluation of in vitro assays as a critical step in imparting confidence in alternative approaches.
Abstract: Instilling confidence in the use of in vitro assays for predictive toxicology requires evaluation of assay performance. Performance is typically assessed using reference chemicals, i.e., compounds with defined activity against the test system target. However, developing reference chemical lists has historically been very resource-intensive. We developed a semi-automated process for selecting and annotating reference chemicals across many targets in a standardized format and demonstrate the workflow here. A series of required fields defines the potential reference chemical: the in vitro molecular target, pathway, or phenotype affected; and the chemical's mode (e.g., agonist, antagonist, inhibitor). Activity information was computationally extracted into a database from multiple public sources, including non-curated scientific literature and curated chemical-biological databases, resulting in the identification of chemical activity in 2995 biological targets. Sample data from literature sources covering 54 molecular targets, ranging from data-poor to data-rich, was manually checked for accuracy. Precision rates were 82.7% from curated data sources and 39.5% from automated literature extraction. We applied the final reference chemical lists to evaluating the performance of EPA's ToxCast program in vitro bioassays. The level of support, i.e., the number of independent reports in the database linking a chemical to a target, was found to strongly correlate with the likelihood of positive results in the ToxCast assays, although individual assay performance showed considerable variation. This overall approach allows rapid development of candidate reference chemical lists for a wide variety of targets, which can facilitate performance evaluation of in vitro assays as a critical step in imparting confidence in alternative approaches.
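The reported precision rates correspond to a simple calculation over the manually checked sample: the fraction of extracted chemical-target annotations from each source type that survive manual review. A minimal sketch with hypothetical records:

```python
# Hypothetical spot-check records: (chemical, source, confirmed_on_review).
# Names and outcomes are illustrative, not from the paper's database.
checked = [
    ("chem_01", "curated",    True),
    ("chem_02", "curated",    True),
    ("chem_03", "curated",    True),
    ("chem_04", "curated",    False),
    ("chem_05", "literature", True),
    ("chem_06", "literature", False),
    ("chem_07", "literature", False),
    ("chem_08", "literature", False),
]

def precision(records, source):
    """Fraction of annotations from `source` confirmed by manual review."""
    hits = [ok for _, src, ok in records if src == source]
    return sum(hits) / len(hits)

curated_precision = precision(checked, "curated")        # 3 of 4 confirmed
literature_precision = precision(checked, "literature")  # 1 of 4 confirmed
```

In the study itself, the analogous figures were 82.7% for curated sources and 39.5% for automated literature extraction.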

Journal ArticleDOI
TL;DR: A completely defined co-culture system allows for the serum-free setup of adipocyte/EC co-cultures and thereby represents a valuable and ethically acceptable tool for the setup of vascularized adipose tissue models.
Abstract: Vascularized adipose tissue models are highly demanded as alternatives to existing animal models to elucidate the mechanisms of widespread diseases, screen for new drugs, or assess corresponding safety levels. The animal-derived sera standardly used therein are associated with ethical concerns, the risk of contaminations, and many uncertainties regarding their composition and impact on cells. Therefore, their use should be completely omitted. In this study, we developed a serum-free, defined co-culture medium and implemented it to set up an adipocyte-endothelial cell (EC) co-culture model. Human adipose-derived stem cells were differentiated under defined conditions (diffASCs) and, like human microvascular ECs (mvECs), cultured in the developed defined co-culture medium in mono-culture, indirect co-culture, or direct co-culture for 14 days. The developed defined co-culture medium was superior to the compared mono-culture media and facilitated the functional maintenance and maturation of diffASCs, including perilipin A expression, lipid accumulation, and glycerol and leptin release. The medium equally allowed mvEC maintenance, confirmed by the expression of CD31 and vWF and by acLDL uptake. Thereby, mvECs showed a strong dependency on EC-specific factors. Additionally, the development of vascular structures by mvECs was facilitated when directly co-cultured with diffASCs. The completely defined co-culture system allows for the serum-free setup of adipocyte/EC co-cultures and thereby represents a valuable and ethically acceptable tool for the setup of vascularized adipose tissue models.

Journal ArticleDOI
TL;DR: A human-derived microphysiological system incorporating primary human proximal tubule epithelial cells is used as a suitable replacement for animal models for quantitative pharmacology and physiology research, providing specific insights related to the importance of renal megalin in vitamin D homeostasis.
Abstract: The role of megalin in the regulation of renal vitamin D homeostasis has previously been evaluated in megalin-knockout mice and rat proximal tubule epithelial cells. We revisited these hypotheses that were previously tested solely in rodent models, this time using a 3-dimensional proximal tubule microphysiological system incorporating primary human proximal tubule epithelial cells. Using this human cell-derived model, we confirmed that 25OHD3 is transported into the human proximal tubule epithelium via megalin-mediated endocytosis while bound to vitamin D binding protein. Building upon these findings, we then evaluated the role of megalin in modulating the cellular uptake and biological activity of 1α,25(OH)2D3. Inhibition of megalin function decreased the 1α,25(OH)2D3-mediated induction of both cytochrome P450 24A1 protein levels and 24-hydroxylation activity following perfusion with vitamin D binding protein and 1α,25(OH)2D3. The potential for reciprocal effects from 1α,25(OH)2D3 on megalin expression were also tested. Contrary to previously published observations from rat proximal tubule epithelial cells, 1α,25(OH)2D3 did not induce megalin gene expression, thus highlighting the potential for meaningful interspecies differences in the homeostatic regulation of megalin in rodents and humans. These findings challenge a recently promoted hypothesis, predicated on the rodent cell data, that attempts to connect 1α,25(OH)2D3-mediated regulation of renal megalin expression and the pathology of chronic kidney disease in humans. In addition to providing specific insights related to the importance of renal megalin in vitamin D homeostasis, these results constitute a proof-of-concept that human-derived microphysiological systems are a suitable replacement for animal models for quantitative pharmacology and physiology research.

Journal ArticleDOI
TL;DR: Two underappreciated aspects of cell culture systems regarding sex are highlighted: how cell culture media alters the sex hormone environment, and how the innate sex of the cell is often not factored into the overall analysis.
Abstract: Cell culture has enhanced our understanding of cellular physiology and constitutes an important tool in advancing mechanistic insight. Researchers should be reminded, however, that there are limitations in extrapolating data derived from cultured cells to questions focusing on the impact of sex. In this Opinion, we highlight two underappreciated aspects of cell culture systems regarding sex: how cell culture media alters the sex hormone environment, and how the innate sex of the cell is often not factored into the overall analysis. By paying careful attention to these areas, researchers can facilitate reproducibility of their cell culture models, which is consistent with the mandate from the National Institutes of Health to improve scientific rigor and reproducibility in research.

Journal ArticleDOI
TL;DR: The results showed that all PS extracts induced concentration-dependent in vitro PDT, as quantified in the ZET, and that this potency is associated with their 3-5 ring PAH content, which confirms the hypothesis that PAHs are the major inducers of PDT by some PS.
Abstract: The present study evaluates the applicability of the zebrafish embryotoxicity test (ZET) for assessing the prenatal developmental toxicity (PDT) potency of the DMSO extracts of 9 petroleum substances (PS) with variable polycyclic aromatic hydrocarbon (PAH) content and of 2 gas-to-liquid (GTL) products, which contain no PAHs but otherwise have properties similar to PS. All PS extracts induced concentration-dependent in vitro PDT, as quantified in the ZET, and this potency is associated with their 3-5 ring PAH content. In contrast, and as expected, the GTL products did not induce any effect at all. The potencies obtained in the ZET correlated with those previously reported for the embryonic stem cell test (EST) (R2 = 0.61), while the correlation with potencies reported in in vivo studies was higher for the EST (R2 = 0.85) than for the ZET (R2 = 0.69). Combining the results of the ZET with those previously reported for the EST (Kamelia et al., 2017), the aryl hydrocarbon receptor (AhR) CALUX assay (Kamelia et al., 2018) and the PAH content ranked and clustered the test compounds in line with their in vivo potencies and chemical characteristics. To conclude, our findings indicate that the ZET does not outperform the EST as a stand-alone assay for testing the PDT of PS, but they confirm the hypothesis that PAHs are the major inducers of PDT by some PS and indicate that the ZET is a useful addition to a battery of in vitro tests able to predict the in vivo PDT of PS.

Journal ArticleDOI
TL;DR: The modified method showed equivalence to the validated reference method (VRM), as all proficiency substances were correctly classified and data generated using the adapted method may be used in European REACH submissions, provided the proficiency data is included.
Abstract: Skin sensitisers are substances that can elicit allergic responses following skin contact, and the process by which this occurs is described as skin sensitisation. Skin sensitisation is defined as a series of key events that form an adverse outcome pathway (AOP). Key event three in the AOP is dendritic cell activation, which can be modelled by the human Cell Line Activation Test (h-CLAT) and is typified by changes in the cell surface markers CD54 and CD86 in dendritic cells. The h-CLAT is accepted at a regulatory level (OECD Test Guideline (TG) 442E) and can be used to assess skin sensitisation potential as part of an integrated approach to testing and assessment (IATA). Stakeholders in the cosmetics and chemical industries have scientific and ethical concerns relating to the use of animal-derived material and have communicated a strong preference for fully human-based in vitro methods. Therefore, we adapted the h-CLAT to animal-product-free conditions and validated the adapted method with the proficiency panel substances in Annex II of TG 442E, using 3 independent batches of pooled human serum. The modified method showed equivalence to the validated reference method (VRM), as all proficiency substances were correctly classified. Comparable values were obtained for the CV75 (concentration yielding 75% cell viability) and for the EC150 and EC200 (concentrations yielding an RFI of ≥150 for CD86 and ≥200 for CD54). Data generated using the adapted method may be used in European REACH submissions, provided the proficiency data are included. We are seeking formal inclusion of the adaptation into TG 442E, enabling compliance with global regulations.

Journal ArticleDOI
TL;DR: An ex vivo, porcine spleen perfusion model was established to study the early events occurring in the spleen prior to the onset of bacterial sepsis, using organs retrieved from animals slaughtered for food production and found to have utility in the replacement of experimental animals in infection research.
Abstract: An ex vivo porcine spleen perfusion model was established to study the early events occurring in the spleen prior to the onset of bacterial sepsis, using organs retrieved from animals slaughtered for food production. Porcine spleens were harvested from adult pigs and connected to a normothermic extracorporeal perfusion circuit. A constant perfusion with heparinized blood was performed for 6 hours. After injection of Streptococcus pneumoniae into the circuit, serial samples of blood and spleen biopsies were collected and analysed. Functionality of the perfused organs was assessed by monitoring blood-gas parameters, flow rate and the filtering capability of the organ. Interestingly, we observed full clearance of bacteria from the blood and an increase in bacterial counts in the spleen. Classical histology and immunohistochemistry on biopsies confirmed that the organ architecture and immune cell distribution remained largely unaltered, apart from the presence of clusters of pneumococci. A time-course study confirmed that each focus of infection derived from the replication of single pneumococcal cells within splenic macrophages. The proposed model, in line with the 3Rs principles, has utility in the replacement of experimental animals in infection research. Murine models are predominantly used to study pneumococcal infections but are often not predictive for humans due to substantial differences between the immune systems of the two species. This model is designed to overcome these limitations, since porcine immunology, and splenic architecture in particular, closely resemble those of humans.

Journal ArticleDOI
TL;DR: A strategy to assess the capacity of UV-filters to counteract UVA/UVB stress in the human keratinocyte HaCaT and the wildtype Fibs E6/E7 fibroblast cell lines can complement and extend existing in vitro testing strategies.
Abstract: Chemical UV-filters are frequently applied as active ingredients in sunscreen to protect from the detrimental effects of UV radiation. Regardless, many of these compounds are not well characterized concerning their capacity to counteract UV-induced reactive oxygen species (ROS). Intracellular ROS release is an early event upon UV exposure and a crucial trigger of reaction cascades that may provoke adverse effects in both the short and long term. We report a strategy to assess the capacity of UV-filters (ecamsule, oxybenzone and menthyl anthranilate) to counteract UVA/UVB stress in the human keratinocyte HaCaT and the wildtype Fibs E6/E7 fibroblast cell lines. The reduction of ROS levels was taken as the primary endpoint. The effect of treatment on the cells' metabolic activity was analyzed as an indicator of post-treatment viability, to investigate potential immediate and late (photo)toxicity. Additionally, the compounds' antioxidative capacity was investigated using an azo-based radical generator. Established antioxidants, quercetin and N-acetylcysteine, were used as controls. The data showed remarkable differences in the mode of action of the chemical UV-filters, ranging from protective to pro-oxidative properties, indicating the need for more detailed mode-of-action-based investigations. Certainly, additional consideration and evaluation will be necessary to further extrapolate these in vitro data to the assessment of in vivo exposure situations. However, the presented approach enables parallel investigation of the photoprotective and phototoxic effects of UV-filters, and thus can complement and extend existing in vitro testing strategies.

Journal ArticleDOI
TL;DR: The functionality of the epidermal-melanin-unit could be shown by the transfer of melanin to the surrounding keratinocytes, and a significantly increased melanin content of models stimulated with either UV-radiation or the melanin precursor dihydroxyphenylalanine.
Abstract: To protect the human skin from extensive solar radiation, melanocytes produce melanin and disperse it via melanosomes to keratinocytes in the basal and suprabasal layers of the human epidermis. Moreover, melanocytes are associated with pathological skin conditions such as vitiligo and psoriasis. Thus, an in vitro skin model that comprises a defined cutaneous pigmentation system is highly relevant in cosmetic, pharmaceutical and medical research. Here, we describe how the epidermal-melanin-unit can be established in vitro. To this end, primary human melanocytes were incorporated into an open-source reconstructed epidermis. Following 14 days at the air-liquid interface, a differentiated epidermis was formed and melanocytes were located in the basal layer. The functionality of the epidermal-melanin-unit was shown by the transfer of melanin to the surrounding keratinocytes, and by a significantly increased melanin content of models stimulated with either UV radiation or the melanin precursor dihydroxyphenylalanine. Additionally, a UV50 assay was developed to test the protective effect of melanin. In analogy to the IC50 value in risk assessment, the UV50 value facilitates a quantitative investigation of the harmful effects of natural UV radiation on the skin in vitro. Employing this test, we could demonstrate that the melanin content correlates with resilience against simulated sunlight comprising 2.5% UVB and 97.5% UVA. Besides demonstrating the protective effect of melanin in vitro, the assay was used to determine the protective effect of a consumer product in a highly standardized setup.
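A half-maximal dose such as the UV50, like an IC50, can be read off a dose-response series by interpolating where the response crosses 50%; a minimal sketch under the assumption of a monotonically decreasing viability curve (illustrative only, not the authors' assay protocol; names are ours):

```python
def dose_at_half_response(doses, viabilities):
    """Linearly interpolate the dose at which viability crosses 50%.
    Assumes viability (in %) decreases monotonically with dose."""
    points = list(zip(doses, viabilities))
    for (d0, v0), (d1, v1) in zip(points, points[1:]):
        if v0 >= 50 >= v1:
            # linear interpolation between the two bracketing points
            return d0 + (v0 - 50) * (d1 - d0) / (v0 - v1)
    raise ValueError("50% viability not bracketed by the data")

# Hypothetical UV dose series (arbitrary units) and measured viability:
uv50 = dose_at_half_response([0, 1, 2, 3], [100, 80, 40, 10])
```

With the hypothetical series above, viability crosses 50% between doses 1 and 2, so the interpolated UV50 is 1.75 dose units.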

Journal ArticleDOI
TL;DR: A seminar and interactive workshop on “In silico Methods – Computational Alternatives to Animal Testing” was held in Berlin, Germany, aimed at experts, researchers and PhD students interested in the use of in silico approaches as alternative methods to promote the 3Rs.
Abstract: A seminar and interactive workshop on “In silico Methods – Computational Alternatives to Animal Testing” was held in Berlin, Germany, organized by Annemarie Lang, Frank Buttgereit and Andrea Volkamer at the Charité-Universitätsmedizin Berlin, on August 17-18, 2017. During the half-day seminar, the variety and applications of in silico methods as alternatives to animal testing were presented, with room for scientific discussions with experts from academia, industry and the German federal ministry (Fig. 1). Talks on computational systems biology were followed by detailed information on predictive toxicology in order to display the diversity of in silico methods and the potential to embrace them in current approaches (Hartung and Hoffmann, 2009; Luechtefeld and Hartung, 2017). The following interactive one-day Design Thinking Workshop was aimed at experts, researchers and PhD students interested in the use of in silico approaches as alternative methods to promote the 3Rs (Fig. 2). Forty participants took part in the seminar, while the workshop was restricted to sixteen participants.

Journal ArticleDOI
TL;DR: The study shows that the probability of infection in cattle tongue is high even when a lower challenge dose is used, which makes the variability between strains less important, and proposes to change the standard dose for cattle challenge to 10^5.4 PFU.
Abstract: Titration of foot-and-mouth disease cattle challenge virus in cattle tongue has been the standard for many years in many countries, although titration in animals has been replaced by in vitro methods for all other applications. The objective of the analysis was the replacement of in vivo titration of cattle challenge virus by in vitro titration. Using data from 32 in vivo titration experiments together with the in vitro titration results of the same samples obtained by plaque count on primary lamb or pig kidney cells, as well as data from the virus isolation control chart used in the laboratory, we show that the reproducibility of the in vitro titration is much higher than that of the in vivo titration. The titer on primary kidney cells was on average 1.4 log10 higher than the titer determined by titration in cattle tongue (PFU/ml compared to bovine ID50/ml), but the difference varied among different strains. The study also shows that the probability of infection in cattle tongue is high even when a lower challenge dose is used, which makes the variability between strains less important. Based on these results, we propose to change the standard dose for cattle challenge from 10^4 bovine ID50 to 10^5.4 PFU, and to replace the in vivo cattle tongue titration method with the in vitro titration method.
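The proposed dose change is plain log-scale arithmetic: the in vitro titer runs on average 1.4 log10 above the in vivo titer, so the traditional 10^4 bovine ID50 challenge dose maps to 10^5.4 PFU. A minimal sketch of the conversion (variable names are ours):

```python
# Standard in vivo challenge dose, expressed on a log10 scale.
in_vivo_log_dose = 4.0    # log10 bovine ID50

# Mean offset between in vitro and in vivo titers of the same samples.
mean_log_offset = 1.4     # log10 (PFU/ml vs bovine ID50/ml)

# Equivalent in vitro challenge dose on the same log10 scale.
in_vitro_log_dose = in_vivo_log_dose + mean_log_offset

print(f"equivalent in vitro dose: 10^{in_vitro_log_dose:.1f} PFU")
```

Because the strain-to-strain variation in the offset matters less at high infection probabilities, a single mean offset suffices for setting the new standard dose.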

Journal ArticleDOI
TL;DR: Suggestions are made on how DNT NAMs may be assembled into an integrated approach to testing and assessment (IATA), and a vision is presented on how further NAM development may be guided by knowledge of signaling pathways necessary for brain development, DNT pathophysiology, and relevant adverse outcome pathways (AOP).
Abstract: Multiple non-animal-based test methods have never been formally validated. In order to use such new approach methods (NAMs) in a regulatory context, criteria to define their readiness are necessary. The field of developmental neurotoxicity (DNT) testing is used to exemplify the application of readiness criteria. The costs and number of untested chemicals are overwhelming for in vivo DNT testing. Thus, there is a need for inexpensive, high-throughput NAMs to obtain initial information on potential hazards, and to allow prioritization for further testing. A background on the regulatory and scientific status of DNT testing is provided showing different types of test readiness levels, depending on the intended use of data from NAMs. Readiness criteria, compiled during a stakeholder workshop that united scientists from academia, industry and regulatory authorities, are presented. An important step beyond the listing of criteria was the suggestion of a preliminary scoring scheme. On this basis a (semi)-quantitative analysis process was assembled on test readiness of 17 NAMs with respect to various uses (e.g., prioritization/screening, risk assessment). The scoring results suggest that several assays are currently at high readiness levels. Therefore, suggestions are made on how DNT NAMs may be assembled into an integrated approach to testing and assessment (IATA). In parallel, the testing state in these assays was compiled for more than 1000 compounds. Finally, a vision is presented on how further NAM development may be guided by knowledge of signaling pathways necessary for brain development, DNT pathophysiology, and relevant adverse outcome pathways (AOP).

Journal ArticleDOI
TL;DR: The current animal-free strategy proved useful for the priority ranking of printed paper and board FCM substances, but it can also be considered to prioritize other substances of emerging concern.
Abstract: Due to the exponentially growing number of substances requiring safety evaluation, efficient prioritisation strategies are needed to identify those of highest concern. To limit unnecessary animal testing, such strategies should respect the 3R principles (Replacement, Reduction, Refinement). In the present study, a strategy based on non-animal approaches was developed to prioritize non-evaluated printed paper and board food contact material (FCM) substances for further in-depth safety evaluation. Within the strategy, focus was put on genotoxicity, a key toxicological endpoint when evaluating safety. By combining in silico predictions with existing in vitro and in vivo genotoxicity data from publicly available literature sources and results from in vitro gene mutation experiments, the 106 study substances could all be assigned to one of the four priority classes (ranging from low to very high concern). Importantly, 19 substances were considered of very high concern due to in vivo genotoxicity. Five of these are furthermore listed as a Substance of Very High Concern (SVHC) by the European Chemicals Agency (ECHA), in addition to demonstrating physicochemical properties linked to a high migration potential as well as oral bioavailability and being used in primary food packaging materials. The current animal-free strategy proved useful for the priority ranking of printed paper and board FCM substances, but it can also be considered to prioritize other substances of emerging concern.
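The four-class ranking can be pictured as a decision cascade in which stronger evidence trumps weaker; a toy sketch with hypothetical rules (the authors' actual strategy additionally weighs SVHC status, migration potential, oral bioavailability and use in primary packaging):

```python
def priority_class(in_vivo_genotoxic, in_vitro_genotoxic, in_silico_alert):
    """Assign one of four priority classes from genotoxicity evidence.
    Illustrative rules only: in vivo findings outrank in vitro results,
    which outrank in silico structural alerts."""
    if in_vivo_genotoxic:
        return "very high"
    if in_vitro_genotoxic:
        return "high"
    if in_silico_alert:
        return "medium"
    return "low"

# Hypothetical substance with in vivo genotoxicity evidence:
print(priority_class(True, False, False))
```

A cascade like this makes the ranking transparent and easy to re-run as new evidence is added for a substance.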

Journal ArticleDOI
TL;DR: The key components of a methods paper are summarized and elements of the overall framing of the method description are highlighted, which include the scientific, technical and, e.g., toxicological rationale for the method, and also the prediction model.
Abstract: Methods papers are important for the progress of biomedical research, as they provide the essential tools to explore new questions and help to better answer old ones. However, it is often not clear how a methods paper differs from a methods protocol. Confusion between these two very different types of publication is widespread. The resultant misunderstanding contributes to a relatively poor reputation of methods research in biology, despite the fact that many Nobel prizes have been awarded specifically for method development. Here, the key components of a methods paper are summarized: (i) methods description, (ii) performance standards, (iii) applicability domains, (iv) evidence for advances compared to the state of the art, (v) exemplification of the method by practical application. In addition, information domains are discussed that are desirable but may be provided on a case-by-case basis or over the course of a series of papers: (vi) method robustness, (vii) accuracy and (viii) precision measures, including various quantifications of method performance, and (ix) measures of uncertainty, including a sensitivity analysis. Finally, elements of the overall framing of the method description are highlighted. These include the scientific, technical and, e.g., toxicological rationale for the method, and also the prediction model, i.e., the procedure used to transform primary data into new information.