Program Header

Podium Abstract Compendium

Podium presentations are organized into 10 educational tracks. Podium abstracts and speaker information are organized first by track and then by session below.

To search for a specific speaker use the 'Find' functionality in your browser (usually Ctrl + F).

To view a complete schedule of podium presentations and schedule of events for SLAS2019 and to view speaker bios and photos, please visit the SLAS2019 Event Scheduler.

Advances in Bioanalytics and Biomarkers

Track Chairs: Shaun McLoughlin, AbbVie and Andreas Luippold, Boehringer Ingelheim (Germany)

Biomarker Discovery in Disease Relevant in vitro and Related in vivo Models

Session Chair: Martin Giera, Leiden University Medical Center

  • Application of acoustic mist ionisation mass spectrometry for metabolic profiling – case study in hepatic toxicology
    Delyan Ivanov, Discovery Sciences

    Acoustic Mist Ionization Mass Spectrometry (AMI-MS) is the hyphenation of a Waters Xevo G2-XS time-of-flight (ToF) mass detector with an acoustic sampling interface. This new technology enables direct injection of samples from a standard 384-well plate into the mass spectrometer at very high throughput. Using acoustics to load samples into the mass spectrometer has several advantages: first, it is non-contact, so there is no carry-over between sampling events; second, the sample volume is very small, typically 15 nanolitres per second; and third, the acoustics can fire very quickly (1,400 Hz). This technology has been broadly applied within AstraZeneca to support biochemical high-throughput screening (HTS), where we routinely process in excess of 100,000 samples per day.

    Having established this technology within HTS, we have recently looked to expand it into cell screening. Cellular applications pose significant challenges for AMI-MS: because it is a direct-infusion MS system, with no chromatography or other separation between the acoustic sampling and the mass detector, ion suppression can be a significant issue. Preliminary experiments were carried out with adherent cells grown in standard 384-well plates. The culture medium is removed and the cells are washed with ammonium formate before being lysed in 50 microlitres of water. Scanning the time-of-flight MS across a ~2,000 Da range, it is possible to generate a “fingerprint” spectrum from the cells. This “fingerprint” contains the most abundant metabolites present in the lysate, including lipids, amino acids, sugars and nucleotides. Typically, fewer than 10,000 cells per well are required to generate a “fingerprint”, and sampling times are typically in the range of 6-10 seconds per well (90-150 nL). Since only very small sample volumes are taken from each well, it is possible to generate significant numbers of technical replicates from each well. In addition, since we are working with adherent cell lines, it is possible to have multiple biological replicates on the same plate.
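The sampling figures quoted above are mutually consistent and can be checked with simple arithmetic. A minimal sketch using the abstract's own numbers (15 nL/s, 6-10 s per well, 50 µL of lysate); the function names are ours, not from the authors' workflow:

```python
# Back-of-envelope check of the AMI-MS sampling figures quoted above.
# The rate, sampling times and lysate volume come from the abstract;
# everything else is illustrative.
SAMPLE_RATE_NL_PER_S = 15.0
LYSATE_VOLUME_NL = 50_000.0  # 50 microlitres of lysate per well

def sampled_volume_nl(seconds: float) -> float:
    """Volume acoustically ejected during one sampling event."""
    return SAMPLE_RATE_NL_PER_S * seconds

def max_technical_replicates(seconds_per_replicate: float) -> int:
    """How many sampling events a single 50 uL lysate could support."""
    return int(LYSATE_VOLUME_NL // sampled_volume_nl(seconds_per_replicate))

# 6-10 s per well reproduces the 90-150 nL quoted in the abstract,
# a tiny fraction of the lysate, leaving room for many replicates.
print(sampled_volume_nl(6), sampled_volume_nl(10))  # 90.0 150.0
print(max_technical_replicates(10))                 # 333
```

Even at the slowest quoted sampling time, a single well could in principle support hundreds of technical replicates, which is why the abstract emphasizes replicate capacity.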

    While it was interesting to demonstrate the ability to generate reproducible “fingerprints” from cell lysates, our ultimate goal was to demonstrate that compound treatment could perturb the “fingerprint” in a biologically relevant manner. We will share “fingerprint” data from our early work using hepatocyte cells treated with known DILI compounds. There are multiple examples where metabolic “fingerprints” change on treatment, and these changes are consistent with the known mechanisms of action of the DILI compounds.

  • SLAS2019 Innovation Award Finalist: Live-cell Gene Imaging Nanotechnology for Cells, Tissue and Pre-clinical Abnormal Scarring Challenges
    David Yeo, Nanyang Technological University

    Live-cell imaging is critical to advancing biomedicine. For example, reporter constructs rely on (viral) integration to enable real-time gene monitoring. However, these result in viral-induced mutations, laborious clonal selection processes and gene reporter re-design. In addition, fluorescent proteins have emission wavelengths similar to tissue auto-fluorescence and hence suffer from poor signal-to-background ratios. On the other hand, contrast-agent cell labelling often lacks molecular specificity, resulting in highly misleading false-positive signals. Our experience suggests that nanotechnology tools readily enable gene expression imaging and increase biomarker detection specificity. We have shown they are easy to use, have great selectivity and are highly versatile (simple biomarker re-configuration and near-infrared imaging).

    Abnormal scars are characterized by excessive fibrosis due to dysfunctional wound healing. Despite occurring in 1 in 12 of the developed world’s population, no satisfactory therapy exists. Furthermore, no reliable method prognosticates their emergence during early wound recovery. In response, we developed nanotechnology biosensors (nanosensors) to facilitate: 1) efficient drug screening; and 2) non-invasive, early scar detection and monitoring.

    1) To date, no drug screening study has identified suitable anti-scarring drugs. We developed a fibroblast activation protein (FAP)-α probe, FNP1, which is specifically and rapidly activated by gelatinases to trigger NIR fluorescence. We demonstrate screening utility with abnormal scar fibroblasts, TGF-β1, anti-fibrotic drugs, and inhibitors and stimulants with undefined properties. Following validation against known anti-fibrotic treatments, compounds ‘R’ and ‘T’ were discovered to possess anti-scarring properties and were further validated with gene expression and immunoassay analysis.

    2) Abnormal scar prognosis prior to full manifestation can only be achieved by skin biopsies, with further processing and analysis. However, biopsies are limited by invasiveness, pain, inconvenience, and the complications of further scarring and infection. In response, we pioneered the concept of topically applied nanoparticles to probe mRNA non-invasively. NanoFlares, highly ordered nucleic acids surrounding a nanoparticle core, were chosen for their skin-penetrative properties. These comprise recognition and reporter elements that alter fluorescence emission properties upon target hybridization. NanoFlares targeting connective tissue growth factor (CTGF) demonstrated specificity in solution, cells, ex vivo (human) engineered tissue and animal models (mice, rabbits). Notably, NanoFlare performance was validated with non-coding (uptake) NanoFlares and gene expression analysis against functional measures of abnormal scarring.

    I will elaborate on the critical role nanotechnology can play in abnormal scar therapy and diagnostic development. Specifically, FNP1 is an easy-to-use nanosensor that rapidly identifies novel anti-scarring drugs or drug combinations. We also demonstrate the first-ever instance of biopsy-free skin diagnosis using topically applied NanoFlares, validated in several abnormal scarring models. Crucially, gene-based molecular imaging with nanotechnology may dramatically alter healthcare paradigms for skin diseases.

  • Screening a secretome library to discover novel biology and targets relevant to drug discovery
    Lovisa Holmberg Schiavone, IMED Biotech Unit, AstraZeneca, Gothenburg, Sweden

    Secreted proteins regulate human physiology by transducing signals from the extracellular environment into cells and regulating different cellular phenotypes. The human secretome represents a small (~2200 proteins) and biologically relevant screening library that can be used in phenotypic assays. Here, we have used a high-throughput mammalian cell factory approach to generate separately purified and quality assured human secreted proteins. A sample storage and handling process has been established to enable screening of the proteins, at known concentrations, in different cell-based assays. Screening 1000 proteins from the human secretome we show that the FGF9 subfamily, FGF9 and FGF16, are strong proliferators of cardiac progenitor cells. Using the library, we demonstrate that the effect of FGF16 is specific to the cardiac progenitor cells, with no observed effect on cardiac fibroblast proliferation. Additional biophysical binding experiments, using cardiac fibroblasts and cardiac progenitor cells immobilized on a biosensor surface, showed that the interaction of FGF16 and FGF9 with cells on the surface was additive. This suggests that the proteins are signaling through different receptors. Altogether, the data demonstrates how a secretome library can be used across a panel of assays to uncover novel functional information and to aid the discovery of novel signaling pathways and targets relevant to drug discovery.

  • Lipidomics and Metabolomics in drug research, a case study for dehydrocholesterol reductase 24
    Martin Giera, Leiden University Medical Center

    Over the last decades, metabolomics and lipidomics technologies have mainly been applied to biomarker-type studies aimed at the discovery of novel diagnostic means. Recently, however, our view of metabolomics technologies, and of the metabolome itself, has been changing. With advanced data science, network analysis and an increasing understanding of metabolism and physiology, more and more studies are focusing on the application of omics techniques in the context of activity metabolomics, target identification and drug-effect evaluation, as well as on the metabolome as a possible drug target.

    In this lecture we will discuss the use of advanced analytical techniques in drug research aimed at altering bioactive endogenous metabolites. Using the example of dehydrocholesterol reductase 24, a membrane-bound enzyme in distal cholesterol biosynthesis, and its role in inflammation, we will discuss target identification and selectivity assessment using gas chromatography-mass spectrometry in a whole-cell screening assay. Subsequently, we will discuss phenotypic observations obtained with an optimized chemical probe arising from this screening effort, which showed promising results in a murine zymosan A-induced peritonitis model. We will discuss the observed drug effects and show how advanced lipidomics and metabolomics techniques were used to decipher the mechanism of action of the chemical probe under investigation. Ultimately, a comprehensive picture of the phenotypic findings will be sketched, based on the combined use of lipidomics, genomics, protein array analysis and a radioactively labeled probe, illustrating how modern bioanalytical techniques go hand in hand to provide a detailed understanding of drug, gene, protein, metabolite and phenotype interactions.

Label Free Bioanalytical Techniques

Session Chair: Daniel Bischoff, Boehringer Ingelheim

  • Chemoproteomics and Thermal Protein Profiling in Target Discovery and Validation
    Friedrich Reinhard, Cellzome GmbH - a GSK Company

    Understanding the mode of action and the selectivity of small molecules in a complex cellular system comprises a major challenge in drug discovery. This knowledge is key to interpreting their pharmacological effects and thus to identifying and validating their molecular target(s). This presentation discusses state-of-the-art methods for the direct identification of small-molecule targets and provides an overview of successful applications in our labs. The common theme of these methods is the use of quantitative mass spectrometry as an unbiased way to investigate the interaction of small molecules with the proteome of either intact cells or cell extracts. We have established a two-pronged strategy using the orthogonal methods of thermal proteome profiling (TPP), based on the cellular thermal shift assay (CETSA), and affinity- or activity-based chemoproteomics with functionalized analogues of active compounds as a capture matrix. The functionalized analogues can subsequently help to validate the target(s) further by conjugation to different functional tags, e.g. to enable target localization (imaging-based) or target degradation in cells. We have used these chemoproteomics technologies to reveal unknown and unexpected binding partners of known drugs, helping to understand their toxicity and side effects. Following phenotypic screens, these approaches have been applied successfully in our labs to identify therapeutically valuable targets. Chemoproteomic studies have thus opened up new avenues for drug discovery.

  • Exploiting the Potential of Ultra High Throughput Mass Spectrometry Approaches to Drug Discovery
    Melanie Leveridge, GlaxoSmithKline, Stevenage UK

    As a direct-analysis, label-free technology, mass spectrometry enables assays that monitor native analytes without the requirement for substrate/product modifications or indirect detection methods. It can have a dramatic impact on the hit-to-lead and lead-optimization stages of drug discovery by eliminating the false positive/negative results typically associated with fluorescent screening technologies.

    Traditionally, however, MS-based techniques have been relatively slow and thus not suited for high throughput applications. Recent advances in mass spectrometry instrumentation, automation, software and low volume dispensing have enhanced its potential to be adapted to higher throughput approaches, under physiologically relevant conditions, and at sample volumes compatible with hit identification/lead generation screening.

    Here we describe the application of a variety of high throughput mass spectrometry approaches to lead discovery and our strategy for deploying them in a complementary way to create a suite of label free assay formats to address questions in discovery. This includes the application of Affinity Selection Mass Spectrometry to prioritise small molecule drug targets entering the discovery pipeline, the development of an automated Matrix-Assisted Laser Desorption / Ionization Time-Of-Flight (MALDI-TOF) MS platform for screening and compound profiling, and the evaluation of Acoustic-Mist MS to study kinetics.

  • High-Throughput ESI-MS Enabled by the Acoustic Droplet Ejection to the Open-Port Probe Sampling Interface
    Chang Liu, Sciex

    Label-free liquid chromatography/mass spectrometry (LC/MS)-based screening technology is routinely used in early drug discovery, especially for high-throughput ADME screening. Although the current analysis speed of <30 seconds per sample is quite promising, it still cannot match the throughput provided by plate-reader-based High Throughput Screening (HTS) platforms. Acoustic droplet ejection (ADE) is a droplet-transfer technology capable of high speed, reproducibility, and absolute accuracy. In this work, we coupled ADE to the standard Electrospray Ionization (ESI) ion source of a mass spectrometer via the open-port probe (OPP) sampling interface. Screening speeds as fast as 0.4 seconds per sample are demonstrated, with high sensitivity, high reproducibility, a wide linear dynamic range, good quantitation capability, no ion suppression from various biological/reaction matrices, and broad compound coverage. The continuous flow of carrier solvent for the OPP maintained ionization stability and actively cleaned the entire flow system, resulting in no observed carry-over. The advantages of this integrated system have been demonstrated with various drug discovery workflows.
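To put the quoted speeds in context, the per-plate arithmetic is straightforward. A minimal sketch comparing the abstract's 0.4 s/sample ADE-OPP rate with the cited <30 s/sample LC/MS baseline; the 1536-well plate format is our illustrative assumption:

```python
# Per-plate throughput comparison for the figures quoted above.
# 0.4 s/sample (ADE-OPP) and ~30 s/sample (LC/MS) are from the abstract;
# the 1536-well plate format is an illustrative assumption.
def plate_time_min(seconds_per_sample: float, wells: int = 1536) -> float:
    """Minutes needed to read one plate at a given per-sample speed."""
    return seconds_per_sample * wells / 60.0

lc_ms = plate_time_min(30.0)  # conventional LC/MS pace
ade = plate_time_min(0.4)     # ADE-OPP-ESI pace
print(round(lc_ms), round(ade, 1))  # 768 10.2
```

At 0.4 s per sample, a full 1536-well plate takes roughly ten minutes instead of the better part of a day, which is what makes the approach competitive with plate-reader HTS.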

  • Ultrahigh-Throughput Screening of Chemical Reactions Using MALDI-TOF MS and Nanomole Synthesis
    Sergei Dikler, Bruker Corporation

    There is an extremely large published body of work in synthetic organic chemistry describing reactions with high yields. However, negative results, where chemical reactions do not generate the desired product or generate it in low yield, are rarely published or presented. Knowledge of the types of starting materials and conditions that do not work for a selected reaction type is very important. This knowledge can be generated quickly by ultrahigh-throughput screening (uHTS) of many starting materials under multiple reaction conditions using MALDI-TOF mass spectrometry. In this work we focused on the Buchwald-Hartwig reaction, a C-N coupling between cyclic secondary amines and N-heterocycle-containing aryl bromides, using four different catalysts.

    Nanomole-scale reactions were run in glass 1536-well plates using a Cu catalyst, a Pd catalyst, an Ir/Ni photoredox catalyst and a Ru/Ni photoredox catalyst as four different conditions for each of the reactions. The reaction mixtures were spotted onto HTS MALDI targets in 1536 format using a 16-channel positive-displacement liquid-handling robot. These targets were analyzed on a new-generation MALDI-TOF instrument equipped with a 10 kHz scanning-beam laser, a significantly faster X,Y stage and a faster target loading/unloading cycle. The readout speed for a MALDI target in 1536 format was in the 8-11 min range, depending on the number of laser shots.

    In the first screening approach we selected a reaction between the simplest cyclic secondary amine and the simplest N-heterocycle-containing aryl bromide and added 383 simple and complex fragment molecules to evaluate catalyst poisoning using the four catalytic methods (1536 experiments). A deuterated form of the product was added for ratiometric quantitation of the MALDI product response. The fragment molecules were classified as catalyst poisons (>50% signal knockdown) or non-poisons (<50% signal knockdown); the correlation with the UPLC-MS data (R2) was 0.85.

    In the second screening approach, the simplest cyclic secondary amine was reacted with 192 aryl bromides of increasing complexity, and the simplest N-heterocycle-containing aryl bromide was reacted with 192 cyclic secondary amines of increasing complexity, using the same four catalytic methods (1536 experiments). Direct correlation with the UPLC-MS data was lower, since the MALDI signals of structurally diverse products were normalized against a single internal standard. Nonetheless, the normalized MALDI signal was successfully used to create a binary reaction success/failure threshold of 20%, and the detected trends were essentially identical to those from the UPLC-MS data. This novel uHTS workflow for synthetic reactions based on MALDI-TOF MS is a first step on the road to predicting chemical reactivity and reaction success, with the potential to decrease the number of unsuccessful experiments for organic chemists.
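The ratiometric scoring described above can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code; the well values and function names are hypothetical, while the 20% threshold comes from the abstract:

```python
# Illustrative sketch of ratiometric MALDI scoring: each reaction's
# product signal is normalized against an internal-standard signal,
# and a 20% threshold converts the ratio into a binary call.
# All ion counts below are hypothetical.
def normalized_signal(product_counts: float, standard_counts: float) -> float:
    """Product response relative to the internal standard."""
    return product_counts / standard_counts

def reaction_succeeded(product_counts: float, standard_counts: float,
                       threshold: float = 0.20) -> bool:
    """Binary success/failure call at the 20% threshold from the abstract."""
    return normalized_signal(product_counts, standard_counts) >= threshold

# Hypothetical wells: (product ion counts, internal-standard ion counts)
wells = [(900.0, 1000.0), (50.0, 1000.0), (210.0, 1000.0)]
calls = [reaction_succeeded(p, s) for p, s in wells]
print(calls)  # [True, False, True]
```

The same ratio, inverted into a knockdown percentage against an un-poisoned control, gives the >50% poison/non-poison classification used in the first screening approach.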

Target and Mechanism Identification After Phenotypic Screens

Session Chair: Jonathan Lee, PDD4Patients LLC

  • Mechanism of Action and Molecular Target follow-up following a Phenotypic Screen
    Jonathan Lee, PDD4Patients LLC

    Empirical or phenotypic drug discovery (PDD) is a target-agnostic strategy that contributes a disproportionate number of first-in-class drugs. The approach also arguably provides a unique means to identify and develop tool compounds to investigate the function of the "unliganded" human proteome (estimated to be >90%). Although development of first-in-class medicines is of high commercial value, difficulties in translational target validation have encouraged a "me too" drug development philosophy, in which pharma efforts tend to focus on a relatively small number of "highly validated" molecular targets. Significantly, broad adoption of PDD is limited by difficulties in obtaining information on the mechanism of action and/or molecular target of phenotypic actives.

    The presentation will provide a context and perspective of these issues by examining (1) the status of the unliganded proteome and its impact on drug discovery, (2) results from a phenotypic angiogenesis screen which identified non-kinase inhibitor leads, (3) identification of novel molecular targets and molecular mechanisms modulating angiogenesis through database mining and functional profiling of phenotypic actives, and (4) presentation/discussion of a literature study that successfully identified a molecular target of a phenotypic active but interestingly failed to reveal detailed mechanism of action information.

  • Hit Dissection and Target Identification from a Cell Viability Chemogenomic Screen
    Shaun McLoughlin, AbbVie

    Presentation information will be posted shortly.

  • SLAS2019 Innovation Award Finalist: Digging into molecular MOAs with high-content imaging and deep-learning
    Sam Cooper, Phenomic AI

    Machine- and deep-learning models demonstrate incredible performance when it comes to extrapolating what we already know, in what are collectively called supervised approaches. For example, we are now able to reduce raw imaging data from large high-content screens, where positive- and negative-control data exist, into accurate readouts of activity in mere minutes, and to accurately predict a compound’s MOA if it has already been seen. However, when presented with an unknown MOA, machine learning approaches will typically scan right over it, either missing it entirely or incorrectly assigning it to an existing MOA. By developing novel ‘unsupervised’ deep-learning models alongside high-content assays tailored for computational analysis, we are able to group compounds by the similarity of their molecular MOA against a disease phenotype, with no prior knowledge, thus improving our ability to select the most exciting and novel hits for follow-on development.
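As a toy illustration of the unsupervised grouping idea (not the talk's actual deep-learning models), a greedy clustering on cosine similarity over hypothetical phenotypic profiles shows how compounds can be grouped with no MOA labels at all:

```python
# Toy stand-in for unsupervised MOA grouping: compound "profiles"
# (feature vectors from a high-content assay) are clustered by cosine
# similarity with no prior labels. The profiles and the 0.9 cut-off
# are hypothetical.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def group_profiles(profiles, cutoff=0.9):
    """Greedily assign each profile to the first cluster whose
    representative (first member) is within the similarity cutoff."""
    clusters = []  # each cluster is a list of indices into `profiles`
    for i, p in enumerate(profiles):
        for cluster in clusters:
            if cosine(p, profiles[cluster[0]]) >= cutoff:
                cluster.append(i)
                break
        else:
            clusters.append([i])
    return clusters

# Two profiles look alike (same hypothetical MOA); one is distinct.
profiles = [(1.0, 0.1, 0.0), (0.9, 0.15, 0.05), (0.0, 1.0, 0.9)]
print(group_profiles(profiles))  # [[0, 1], [2]]
```

The point of the unsupervised setting is exactly what the toy shows: the distinct third profile ends up in its own group rather than being forced into a known class.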

  • Navigating the Small Molecule Target Deconvolution Challenge within Unprecedented Target Space
    Scott Warder, AbbVie

    Phenotypic Drug Discovery (PDD) is an attractive strategy for accessing novel target space, but successful implementation has been significantly hampered by the long cycle times required to translate hits into targets. In the pharma industry, hit progression in the absence of a defined target has been reserved for unique opportunities and has not been widely adopted. With this challenge in mind, the two major themes we have focused on are PDD hit prioritization and technology integration for identifying small-molecule (SM) mechanisms and targets. In addition to assay panels (kinome, ion channels, etc.), we have committed to broad-endpoint biological assays, such as L1000 and BioMAP profiling. The breadth of perturbations captured in these companion reference databases not only enables ruling out common mechanisms of action, but also provides data to support a measure of compound uniqueness. This ensures that SMs with a stronger biological rationale are nominated for target deconvolution campaigns. For target identification, we advocate label-free approaches, even when we have committed to synthesizing linked probes for target enrichment technologies such as Affinity Capture-MS and SM-phage display. Current label-free approaches that have delivered strong value include positive-selection pooled whole-genome CRISPR/CRISPRa screens, which reveal nodes that help focus deep-dive time-course phospho-proteomic profiling. Case studies supporting the progression of hits into unprecedented target space will be presented.

    Disclosures: All authors are employees of AbbVie. The design, study conduct, and financial support for this research were provided by AbbVie. AbbVie participated in the interpretation of data, review, and approval of the publication.

Back to Top

Assay Development and Screening

Track Chairs: Ralph Garippa, MSKCC and Deb Nguyen, Novartis - GNF

Advanced Imaging-based Assays and Phenotypic Profiling

Session Chair: Shannon Mumenthaler, University of Southern California

  • Single cell-based investigations of endocrine disrupting chemicals by high content analysis
    Fabio Stossi, Baylor College of Medicine

    Our lab has a longstanding interest in single-cell analysis-based transcription studies. We have developed novel mechanistic and phenotypic approaches to study transcription within a cellular context, but with sensitive, high-throughput approaches. Our main platform allows us to quantify transcription using high-throughput microscopy and image analytics designed to link, at the single-cell level, mRNA synthesis to DNA binding and promoter occupancy of nuclear receptors (NRs) and coregulators, histone modifications and large-scale chromatin remodeling. A growing list of endocrine-disrupting chemicals (EDCs) from the environment has been shown to target NRs, including the estrogen and androgen receptors (ER, AR), with large-scale efforts to develop and test environmental samples for EDC activities. Our multiplex assays with engineered cells or tumor cell lines endogenously expressing ER/AR are currently being used to assess individual or mixtures of known hormones and EDCs via machine learning approaches. These studies are currently being applied to environmental samples obtained from Galveston Bay and the Houston Ship Channel via new funding from the NIEHS Superfund Research Program.

  • Mechanical Trap Surface-Enhanced Raman Spectroscopy for Live Three-Dimensional Molecular Imaging of Single Cells
    Santosh Paidi, Johns Hopkins University

    Label-free three-dimensional (3D) chemical imaging of live cells is critical for mapping molecular distributions and determining their dynamics in various physiological and pathological transformations of single cells. Yet existing optical tools are limited by their inability to offer the desired combination of 3D structural information and endogenous molecular contrast. To facilitate such non-perturbative monitoring of single live cells (without the loss of information due to averaging in population analyses), we report an approach for trapping arrays of single live cells and profiling intrinsic molecular signatures that collectively enable the monitoring of intracellular events without targeting specific epitopes. Our approach, named mechanical trap surface-enhanced Raman spectroscopy (MT-SERS), employs a combination of nanoparticle-coated, self-folding, microgripper-shaped devices and surface-enhanced Raman spectroscopy (SERS). These microgrippers, which roll up due to tailored residual stresses induced during fabrication, offer a facile, biocompatible platform to precisely trap single live cells for longitudinal analysis without the need for batteries or external power sources. By functionalizing the inner surfaces of these traps with plasmonic nanostars, the MT-SERS method achieves excellent SERS enhancement, which facilitates label-free molecular interrogation without photodamage to the cells. We show that the platform reliably detects intrinsic chemical signatures over trapped microbeads as well as over a single trapped cell, providing a multiplexed volumetric distribution of analytes such as lipids and nucleic acids. Taken together with the demonstrated ability to track compositional changes in dry, fluid and untethered environments, our findings underscore the potential of MT-SERS to furnish biologically interpretable and quantitative molecular maps, and open the door to the elucidation of intercellular variability in normal and diseased cell populations.

  • Combining machine learning and microscopy in drug discovery: from high throughput screening to cell activity prediction
    Lina Nilsson, Recursion Pharmaceuticals

    This talk will share novel techniques for combining imaging assays and machine learning techniques to speed up, de-risk, and reduce costs across the pharmaceutical drug discovery pipeline. We will describe how at the start of the drug discovery process, high throughput screening (HTS) assays and computer vision tools can be developed hand-in-hand to design unbiased, autonomous assay platforms capable of detecting nuanced signals. From this combined data science - phenotypic profiling approach, other powerful novel tools can be developed. As one example, we will show how a small set of initial microscopy assays can be combined with reinforcement learning models for powerful in silico prediction of future drug compound activity. Finally, we will share recent progress on combining machine learning and phenotypic profiling to predict drug compound behavior along the rest of the drug discovery pipeline, such as warnings against likely toxicology.

  • Screening patient-derived colorectal cancer models to interrogate tumor-stromal interactions and drug response
    Shannon Mumenthaler, University of Southern California

    There are scores of effective treatments available for cancer; however, thousands of patients lose the fight each year because these therapies eventually fail. Resistance often arises from intrinsic genomic instability and the inherent ability of tumors to evolve clonally. However, such evolution is also affected by the tumor microenvironment (TME), which promotes transient survival of tumor cells, giving them time to develop into a permanently resistant state. In vivo, cancer cells are in physical and biochemical contact with many different stromal cell types native to the host environment. Cancer-associated fibroblasts (CAFs) are the dominant stromal cell type within the TME and have been linked with increased tumor cell survival and protection against drug-induced apoptosis.

    In colorectal cancer (CRC), one of the world’s deadliest malignancies, scientists have a strong understanding of the molecular contribution of tumor cells to disease progression. However, the overall TME involvement remains less clear. Preclinical treatment studies largely focus on drug-induced changes to tumor cells with little investigation into the impact on surrounding stromal cells. In order to improve the success of drug discovery and drug screening studies, we argue the need for biomimetic preclinical models that incorporate tumor-stromal cross-talk, which often drives drug resistance in patients.

    We have been studying patient-derived CAFs isolated from CRC tumors and have shown their presence to result in increased tumor cell growth rates and reduced sensitivity to anti-EGFR therapy (cetuximab) in vitro. We co-cultured tumor and CAF cells in the presence of drugs and developed a high-content screening (HCS) assay to study therapeutic effects in 2D. Using this assay, we discovered a novel mechanism of resistance whereby CAFs treated with cetuximab render CRC tumor cells resistant through increased epidermal growth factor secretion. Currently we are scaling this work to include patient-derived tumor organoids, a more physiologically relevant in vitro 3D model system amenable to patient-specific screening. We highlight several 3D imaging-based workflows to phenotypically profile tumor organoids across scales from single cells to multicellular objects. Image analyses, including supervised machine learning, were implemented to accurately classify cell types and cell behaviors in 3D. Our imaging-based approaches have advantages over traditional drug screening methods (e.g., ATP measurements, phototoxic dyes) by capturing the dynamics and heterogeneity of patient-specific drug responses. We are implementing these workflows to better understand the interactions between cancer cells and their microenvironment in the context of drug response.
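As a simplified stand-in for the supervised cell-type classification step described above (the actual work uses trained machine-learning models on 3D image features), a nearest-centroid rule over hypothetical image-derived features illustrates the idea:

```python
# Toy nearest-centroid classifier standing in for the supervised
# machine-learning cell classification described above. All feature
# values (e.g. cell area, elongation) are hypothetical.
import math
from collections import defaultdict

def centroids(training):
    """training: list of (label, feature_vector) pairs -> label centroids."""
    sums = defaultdict(lambda: None)
    counts = defaultdict(int)
    for label, vec in training:
        if sums[label] is None:
            sums[label] = list(vec)
        else:
            sums[label] = [a + b for a, b in zip(sums[label], vec)]
        counts[label] += 1
    return {lbl: [v / counts[lbl] for v in s] for lbl, s in sums.items()}

def classify(vec, cents):
    """Assign the label whose centroid is nearest in feature space."""
    return min(cents, key=lambda lbl: math.dist(vec, cents[lbl]))

# Hypothetical (area, elongation) features: CAFs larger and elongated.
train = [("tumor", (1.0, 0.2)), ("tumor", (1.2, 0.3)),
         ("CAF", (3.0, 0.9)), ("CAF", (2.8, 1.1))]
cents = centroids(train)
print(classify((1.1, 0.25), cents), classify((2.9, 1.0), cents))  # tumor CAF
```

In the real co-culture assay the feature vectors come from image analysis of each segmented cell, and the trained model plays the role of this centroid rule.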

Biochemical, Biophysical, and Label-Free Technologies

Session Chair: Bruce Koch, Stanford University School of Medicine

  • On-chip Membrane Protein Cell-free Expression Enables Direct Binding Assays
    Amit Vaish, Discovery Research, Amgen

    Though integral membrane proteins (IMPs) play a pivotal role in the drug discovery process, developing a direct binding assay for monitoring their interactions with therapeutic candidates, particularly small ligands, has been extremely challenging. IMPs are commonly expressed in a cell-based system and, after undergoing a cumbersome multistep process involving extraction, purification, and in vitro stabilization in a soluble format, can be interfaced with a standard biophysical technique such as surface plasmon resonance (SPR) for binding analysis. To circumvent this traditional limitation, we envisaged combining cell-free technology with an SPR biosensor to monitor on-chip protein expression in real time and subsequently detect binding. SPR-functionalized surfaces were used as a template to capture cell-free expressed IMPs. This bottom-up methodology was employed to express a G-protein coupled receptor, the β2 adrenergic receptor, and a chimeric ion channel, KcsA-Kv1.3, for binding characterization. Here, we investigated two different in situ strategies: (1) a solid-supported lipid bilayer for incorporating nascent protein directly into a native membrane-mimicking environment, and (2) an anti-green fluorescent protein (GFP)-functionalized surface for capturing in situ expressed GFP-fused protein. Phospholipid-bilayer-functionalized surfaces did not provide any control over the IMP's orientation in the bilayer and demonstrated minimal or no binding to the binding partners. In contrast, the anti-GFP-functionalized capture surfaces, with a defined IMP surface orientation, showed binding to both large and small molecules.

  • Residence time and affinity of tightly bound drug-target complexes by label-free biosensing
    Melinda Mulvihill, Genentech

    During early drug discovery, inhibitors are generally optimized from low-affinity target hits towards high-affinity leads, where binding and inhibition are defined by the affinity constant (KD) and inhibition constant (Ki), respectively. Beyond affinity, the kinetic properties of drug-target complex formation are increasingly recognized as important in themselves [1], but also as a means to measure the affinity of so-called “tight-binding” (i.e., long residence time) drugs. This matters because tight binding is difficult to measure by steady-state methods: progress towards equilibrium is proportional to residence time, necessitating prohibitively long pre-incubations. Fortunately, kinetic assays provide an alternative approach by allowing KD (or Ki) to be determined from the fundamental kinetic rate constants without pre-incubation. Real-time, label-free biophysical techniques, such as surface plasmon resonance (SPR), have become the gold standard [2] in kinetic analyses because they are quantitative, apply to almost all target classes, and can measure binding and unbinding in real time without reporter labels. State-of-the-art systems address an increasingly wide kinetic/affinity range and have proven well suited to small molecule applications such as fragment screening, kinetic characterization, mechanistic assays, and lead optimization. However, kinetic analysis of tight-binding inhibitors by conventional SPR has remained challenging in terms of throughput and kinetic range. Here we present an overview of our recent assay development work [3] in this area supporting pipeline projects, and we share perspectives on its potential impact throughout the drug discovery pipeline.

    [1] Copeland, R. A. The Drug–Target Residence Time Model: A 10-Year Retrospective. Nat. Rev. Drug Discov. 15:87–95 (2016).
    [2] Poda, S. B.; Kobayashi, M.; Nachane, R.; Menon, V.; Gandhi, A. S.; Budac, D. P.; Li, G.; Campbell, B. M.; Tagmose, L. Development of a Surface Plasmon Resonance Assay for the Characterization of Small-Molecule Binding Kinetics and Mechanism of Binding to Kynurenine 3-Monooxygenase. Assay Drug Dev. Technol. 13:466–475 (2015).
    [3] Quinn, J. G.; Pitts, K. E.; Steffek, M.; Mulvihill, M. M. Determination of Affinity and Residence Time of Potent Drug–Target Complexes by Label-free Biosensing. J. Med. Chem. 61:5154–5161 (2018).
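    The tight-binding problem above has a simple quantitative core: for a 1:1 binding model, KD = koff/kon and the mean residence time is τ = 1/koff. A minimal sketch, with illustrative rate constants rather than values from the paper:

```python
# Relating kinetic rate constants to affinity and residence time
# for a simple 1:1 drug-target binding model (illustrative values only).
def affinity_and_residence(kon, koff):
    """kon in M^-1 s^-1, koff in s^-1; returns (KD in M, residence time in s)."""
    kd = koff / kon   # equilibrium dissociation constant
    tau = 1.0 / koff  # mean residence time of the drug-target complex
    return kd, tau

# A hypothetical "tight binder": kon = 1e6 M^-1 s^-1, koff = 1e-5 s^-1
kd, tau = affinity_and_residence(1e6, 1e-5)
print(kd)   # ~1e-11 M (10 pM)
print(tau)  # ~1e5 s (roughly 28 h) -- why steady-state pre-incubation is impractical
```

    A kinetic assay that measures kon and koff directly therefore recovers KD without waiting many hours for equilibrium.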

  • The evolution of a multidimensional approach combining existing and next wave technologies in delivering binding kinetics in early drug discovery
    Daniel Thomas, GSK

    Since the drug-target residence time concept was first proposed more than 10 years ago, it has received much attention across drug discovery research. The central principle is that the rates of drug–target complex formation (kon) and breakdown (koff) describe target engagement in open systems, and that target occupancy in the dynamic in vivo setting is better predicted by kinetic (kon, koff) than by thermodynamic (IC50, Kd) parameters. Consequently, incorporating binding kinetics into PK-PD models is proposed to improve prediction of drug efficacy. A large number of marketed drugs are characterized by long residence times, often equating to extreme potency of the binding interaction. This can confound accurate equilibrium potency measurements, both through the length of time needed to reach equilibrium and through tight-binding limitations. Kinetic measurements are not constrained by these limitations and can therefore offer a more practical and accurate measure of intrinsic potency. However, historical methods for determining binding kinetics are of low to medium throughput and limited to certain target classes. Recent advances in plate reader technology and the emergence of new methodologies have enabled the development and prosecution of higher throughput screens that measure on-rates, off-rates, and affinity in real time.

    Multidimensional approaches to determining mechanism of action at different points across early drug discovery, from hit validation to candidate selection, will be presented. These case studies will focus on data from complementary biochemical and cellular systems, generated using a variety of well-established technologies and emerging next-wave methods. The opportunity to impact drug discovery through better-informed decision-making, and thereby to reduce attrition at each stage, will be critical in defining future successes as an industry. Improving our understanding of both on- and off-target kinetics that underpin potency will be an important component in achieving this goal.

  • Utilizing a modified RapidFire 365 system improves throughput and enables simultaneous mechanistic evaluation of multiple Histone Acetyltransferase enzymes and their small molecule inhibitors
    Patrick Bingham, Pfizer

    Utilizing a modified RapidFire 365 HTMS (High Throughput Mass Spectrometry) system from Agilent, coupled to a triple quadrupole mass spectrometer, we have built a platform for the rapid biochemical characterization of the lysine acetyltransferase (KAT) enzyme family. Lysine acetyltransferases are enzymes that catalyze the transfer of an acetyl group from acetyl-CoA to the ε-amino group of lysine residues on a variety of nuclear and cytoplasmic proteins. Acetylation of lysine residues on histone tails facilitates transcriptional access to DNA and thus contributes to the regulation of gene expression. Utilizing varying substrates and small molecule inhibitors, we can extensively evaluate the acetyltransferase activity of multiple specific KAT enzymes and tailor our medicinal chemistry compound design efforts to target the KAT activity of interest. Our mass spectrometry approach, interrogating native peptides, proteins, and nucleosomes for KAT activity, allows rapid evaluation of multiple substrates as well as individual quantitation of multiple acetylated marks in a single analysis. This enables us to directly compare enzyme activity across a series of different substrates with varying acetylation or methylation marks and allows us to tailor the substrate sequences to target specific activities of interest. The platform is enzyme and substrate independent and allows swift characterization of a variety of different enzymes utilizing many different substrates, from peptides to proteins to nucleosomes. Speed of analysis is enabled through the implementation of a modified RapidFire 365 HTMS system, which greatly reduces the cycle time to 3 seconds/sample compared to 15 seconds/sample using traditional RapidFire HTMS. Because of the high sample throughput achieved with the modified system, we are able to perform detailed mechanistic studies as well as compound screening campaigns under timelines not afforded by traditional RapidFire HTMS.
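    The cycle-time improvement translates directly into daily throughput. A back-of-the-envelope sketch (assuming continuous, uninterrupted sampling with no plate-handling overhead):

```python
# Idealized daily throughput at a fixed per-sample cycle time.
SECONDS_PER_DAY = 24 * 60 * 60

def samples_per_day(cycle_time_s):
    return SECONDS_PER_DAY // cycle_time_s

traditional = samples_per_day(15)  # traditional RapidFire HTMS
modified = samples_per_day(3)      # modified RapidFire 365 HTMS
print(traditional, modified)       # 5760 vs 28800 samples/day, a 5x gain
```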

Higher Dimension 4D, 3D, and Complex Multicellular Cell-to-Cell Formats

Session Chair: Jason Ekert, GlaxoSmithKline

  • Pre-clinical drug development progressing towards implementation of multi-dimensional cellular models with increased throughput and enhanced translational relevance
    Jason Ekert, GlaxoSmithKline

    The pharmaceutical industry continues to face high R&D costs and low overall success rates for clinical compounds during drug development. There is a surge in demand for the development and validation of disease-relevant, physiological human cellular models that can be implemented in early-stage discovery, thereby shifting attrition of future failures to a point in discovery where the costs are significantly lower. The current drug discovery paradigm involves lengthy and costly lead discovery and optimization campaigns, often using simple cellular models with weak translational relevance to human disease or safety. This reflects an inability to effectively and efficiently reproduce human disease-relevant states at an early stage to steer target and compound selection. A fundamental question, therefore, is how to recapitulate the biological complexity of a human disease state in robust, translational in vitro assays in order to increase the success rate of late-stage drug discovery programs. The majority of new complex in vitro technologies that promise to be influential and more predictive in the next few years still need to be qualified and repurposed for drug discovery efforts. I will provide examples of where we have utilized time-dependent, multiplexed, and multi-dimensional 3D cell culture approaches in both safety and efficacy settings to better characterize targets and progress molecules earlier in discovery. Furthermore, I will discuss how we have automated and miniaturized 3D oncology models that still mimic key characteristics of the tumor microenvironment. I will also elaborate on the importance of and methodology for qualifying these 3D human in vitro models, and on their translatability to the clinic.

  • SLAS2019 Innovation Award Finalist: Integrated 3D in vitro models of human bone marrow and cancer
    Steven George, University of California, Davis

    For decades the primary model used to understand and treat cancer has been the mouse. While we have become proficient at curing cancer in the mouse, survival rates for cancer in humans have improved very slowly, and have been essentially stagnant in some cancers such as pancreatic ductal adenocarcinoma (PDAC). The reason for this discrepancy lies in the differences between the biology of cancer in the mouse and in the human (e.g., immune response, scale). Cancer begins locally but becomes a systemic disease during progression; our understanding of this process in humans will advance once we have created 3D in vitro models that mimic how human cancer manipulates systemic organs such as the bone marrow. Our lab is working to design, build, and validate an integrated “organ-on-a-chip” platform of human primary tumor and human bone marrow. The foundation of this effort is built on two premises: 1) all tumors engage the immune system, and thus the bone marrow (the primary source of immune cells); and 2) a wide range of tumors coopt the bone marrow to the advantage of the tumor during progression (e.g., PDAC) and latency (e.g., breast cancer). The “bone marrow-on-a-chip” model is characterized by a perfused vascular network, endosteal and perivascular niches, and hematopoietic stem cells (HSC, CD34+). We have demonstrated the controlled differentiation of HSC to myeloid, erythroid, and lymphoid lineages, as well as maintenance of the HSC in the stem cell phenotype for up to 10 days. The “cancer-on-a-chip” also includes a perfused vascular network, and encompasses models of pancreatic, colorectal, and breast cancer using both primary human cancer cells and cell lines. Here we have demonstrated a robust angiogenic response from the tumor, intravasation of tumor cells, tumor cell migration and proliferation, and an anti-cancer drug response (paclitaxel) following delivery of the drug through the vascular network.
These model systems could impact our fundamental understanding of how cancer manipulates the bone marrow at a new level of spatiotemporal control, and also be used to assess efficacy and toxicity of existing and new anti-cancer drug candidates.

  • Rapid measurement and comparison of multiple cytokines and immune checkpoint molecules from 2D and 3D cell cultures of immune and breast cancer cell populations grown in a complex co-culture model
    Jeanine Hinterneder, PerkinElmer

    Breast cancer tumors can adapt to immune cell infiltration by upregulating immune checkpoint proteins such as programmed death-ligand 1 (PD-L1) in response to increased local concentrations of cytokines and inflammatory markers secreted by invading T lymphocytes. This allows a tumor to evade immune targeting and reduce the immune response. We developed a culture system to examine the effects of interactions between immune and cancer cell populations on the expression and secretion of a variety of immune checkpoint proteins and cytokines. Further, to determine the influence of cytoarchitecture on the tumor response, we co-cultured human peripheral blood mononuclear cells (PBMCs) with HCC38 cells, a triple-negative breast cancer line, in both traditional monolayer format and as 3D spheroids. PBMCs were stimulated with Dynabeads® coated with CD3 and CD28 antibodies to mimic the effects of antigen presenting cells and to induce the differentiation and expansion of T lymphocytes. Once activated, T cells upregulate a variety of immune checkpoint molecules and secrete cytokines into the tumor microenvironment, which can have varying effects on target cell populations. To investigate the mechanisms involved in the induction of PD-L1 in these cultures, we measured select cytokines and immune checkpoint markers using AlphaLISA no-wash homogeneous assays. Alpha technology is a useful tool for the rapid screening of multiple biomarkers from the same well of a culture plate, requiring very low sample volume (5 µL or less) and being highly amenable to automation. Using this assay, we were able to discriminate between effects on the HCC38 cells caused by secreted factors as opposed to direct cellular contact by including treatment with PBMC-conditioned media. For the most part, different treatments produced similar effects on biomarker expression and cytokine secretion in 2D and 3D cultures, with some notable exceptions which will be discussed.

    The addition of 3D cell culture models into discovery workflows can reduce downstream costs such as secondary testing, but adoption in regular screening programs has been hindered by uneven culture growth, high variability, and a lack of methods for high-throughput analysis. The 3D cultures shown here were grown in CellCarrier Spheroid Ultra-Low Attachment (ULA) microplates, high quality clear plates that enable the formation of consistently round spheroids from various cell types. Cellular health and proliferation were assessed with ATPlite 1step luminescence assays and through cellular imaging using the Opera Phenix high-content imaging system. These data illustrate and address some of the benefits as well as the challenges of developing a biologically relevant culture system for investigating the complex mechanisms involved in tumor evasion of the innate immune response. We show here how AlphaLISA assays can easily be used to screen for compounds that modulate interactions between immune and cancer cell populations.

  • Fully Automated Large Scale uHTS using 3D Cancer Organoid Models for Phenotypic Drug Discovery Applications
    Virneliz Fernandez Vega, Scripps Research

    Pancreatic cancer remains a leading cause of cancer-associated death, with a median survival of ~6 months and a 5-year survival rate of less than 8%. The tumor microenvironment promotes tumor initiation and progression and is associated with cancer metastasis and drug resistance. Traditional high throughput screening (HTS) assays for drug discovery have been adapted to 2D monolayer cancer cell models, which inadequately recapitulate the physiologic context of cancer. Primary-cell 3D culture models have recently received renewed recognition for their ability to better mimic the complexity of in vivo tumors and to serve as a potential bridge between traditional 2D culture and in vivo studies. 3D cell cultures are now cost effective and efficient, enabled by combining a cell-repellent surface with a magnetic force-based bioprinting technology. We have previously validated 3D spheroid/organoid-based cytotoxicity assays and now present results from the first fully automated high throughput screening campaign performed with this technology on a robotic platform, against 150,000 compounds of the Scripps Drug Discovery Library (SDDL). The active compounds identified in the primary screen were tested in concentration-response to determine their efficacy and potency in 2D versus 3D cell models. Further SAR of the most interesting compounds will be assessed to reveal the importance of 3D models for the early drug discovery process. These data indicate that a complex 3D cell culture can be adapted for HTS, and further analysis provides a basis for future therapeutic applications given its ease of use and physiological relevance to humans.

Utilizing the Power of NGS and Functional Genomics in Modern Screening

Session Chair: Bryan Davies, University of Texas At Austin

  • Discovery of Next-Generation Antimicrobials through Bacterial Self-Screening of Surface-Displayed Peptide Libraries
    Bryan Davies, University of Texas At Austin

    Many platforms can screen peptides for their ability to bind to bacteria, but there are virtually no platforms that directly assess the phenotypic outcome of the interaction. This limitation is exacerbated when identifying antimicrobial peptides because the phenotype, death, selects against itself. This has caused a bottleneck that confines research to a few naturally occurring classes of antimicrobial peptides. To resolve this bottleneck, we developed Surface Localized Antimicrobial Display (SLAY), a platform that allows screening of unlimited numbers of peptides of any length, composition, and structure in a single tube for antimicrobial activity. Using SLAY, we screened ~800,000 random peptides for antimicrobial function and identified thousands of active sequences. SLAY hits present unique potential mechanisms of action and access areas of antimicrobial physicochemical space beyond what nature has evolved.

  • Next-generation approaches to antibody discovery for treatment and prevention of infections caused by Gram-negative pathogens
    Dante Ricci, Achaogen

    The rising frequency and severity of bacterial infection in neonates causes millions of potentially preventable deaths each year across the globe, particularly in developing nations. Cases of neonatal infection are increasingly attributed to Gram-negative pathogens, with multi-drug-resistant Acinetobacter baumannii emerging as a major underlying cause of neonatal sepsis and consequent mortality, particularly in the Indian subcontinent. As traditional antibiotic therapy is often ineffective in the prevention or treatment of pan-resistant Acinetobacter infection, and as vaccination is not an effective strategy for prevention of neonatal infection, alternative approaches to the control of infection are required. One such alternative is passive immunization, in which antibacterial monoclonal antibodies are administered to newborns to prevent bacterial infection during the critical first 28 days of life. To that end, we have initiated a monoclonal antibody (mAb) discovery effort focused on the isolation of broadly protective IgG mAbs targeting specific pathogens implicated in neonatal infection, with a primary focus on anti-Acinetobacter antibodies.

    Here, we will discuss the advantages of antibody-based approaches to prevention and treatment of Gram-negative infections, and we will describe the co-option of a unique single-B-cell screening and cloning platform for the efficient identification of broadly cross-reactive anti-Acinetobacter mAbs that bind live bacteria and forestall the onset of infection.

  • Single-cell and spatial transcriptomic profiling of mammalian spermatogenesis
    Bo Xia, New York University School of Medicine

    Spermatogenesis is a conserved process throughout the animal kingdom that produces male gametes. This dynamic process entails a series of stages comprising amplification through mitosis, meiosis, and spermiogenic differentiation into functional sperm. The germ cells and their niche are locally well-orchestrated in the epithelium of the seminiferous tubules to generate mature sperm periodically. However, systematic characterization of the sequential transcriptomic changes of spermatogenesis has remained challenging. Here we use single-cell RNA-seq to build a spermatogenesis atlas of both human and mouse. The atlas not only reveals the transcriptomic dynamics of the main spermatogenic cell types, i.e., spermatogonia, spermatocytes, round spermatids, and elongating spermatids, but also distinguishes sub-stages across all spermatogenic stages and subtypes of testicular somatic cells, including Leydig cells. Moreover, our data allow us to systematically identify conserved as well as divergent transcriptome dynamics between human and mouse spermatogenesis. In addition, we applied spatial transcriptomics to mouse testes. By combining single-cell transcriptomes with the spatial transcriptomes, our data reveal synchronized spermatogenesis in local seminiferous tubules, highlighting the potential of spatial transcriptomics for organ-wide studies.

    Beyond the spermatogenesis atlas, we extended our study to how widespread gene expression in male germ cells impacts genome sequence evolution. Consistent with previous reports from bulk RNA-seq, we detected the expression of ~87% of all protein-coding genes in the germ cells. Surprisingly, we found that these genes maintain significantly lower germline mutation rates than the remaining unexpressed genes. Moreover, we found that the transcription template strands of the expressed genes have even lower germline mutation rates than the coding strands, implicating a signature of transcription-coupled repair (TCR). Collectively, these results led us to hypothesize a model in which widespread ‘transcriptional scanning’ in the male germ cells systematically detects and removes DNA damage to safeguard germline genome integrity.

  • SLAS2019 Innovation Award Finalist: High-throughput, CRISPR based 3D screening platform for rapid discovery of novel CDK4/6 inhibitor resistance
    Taraka Sai Pavan Grandhi, Genomics Institute of the Novartis Research Foundation

    As targeted anti-cancer therapies bring hope to millions of cancer patients worldwide, the emergence of resistance to those therapies acts as a cruel reminder of tumor evolution and relapse. While cancer patients initially respond to targeted therapies, tumors reappear in many cases, often as metastases that lead to the patient's death. Identification of drug resistance mechanisms to novel anti-cancer drugs before their emergence could allow for prophylactic treatment regimens. Hence, it is crucial to design novel discovery platforms that allow rapid identification of clinically relevant drug resistance mechanisms. 3D cancer spheroids are good physiological representations of the disease, as they capture many of the hallmarks of actively growing in vivo tumors, such as zones of active proliferation, hypoxia, and necrosis; metabolic and nutrient gradients; and increased cell-cell and cell-ECM interactions, and they have proven extremely useful in detecting resistance against novel anti-cancer drugs. We have developed a novel microphysiological assay platform that allows enrichment and selection of gene-edited resistant clones in the native 3D in vitro environment under standard CDK4/6 inhibitor treatment. Utilizing this platform, we performed a genome-wide knockout, pooled 3D enrichment screen and identified multiple novel resistance mechanisms against CDK4/6 inhibition. Further, we report that the 3D assay is robust enough to enrich even a single resistant cell within a 3D tumor spheroid. This phenotypic 3D cell platform represents a substantial advance over traditional in vitro resistance detection models and opens expanded possibilities for co-cultures, selection pressures, and clonal competition.

Advanced in vitro Culture Systems Including Stem Cell-Based Screening Technologies

Session Chair: Cassiano Carromeu, StemoniX

  • Induced pluripotent stem cell-based high throughput screening for modulators of cardiac hypertrophy
    Stefan Braam, Ncardia

    Our assay is built on Ncardia's proprietary differentiation technology, which allows reproducible manufacturing of highly purified, relatively mature cardiomyocytes with high predictivity in cardiac toxicity and efficacy assays. Large-scale manufacturing in state-of-the-art stirred-tank bioreactor systems enabled the production of cardiomyocyte batch sizes suitable for high throughput screening. We demonstrate the development and validation of a scalable assay for induction of cardiac hypertrophy, using NT-proBNP secretion as a readout.

    To ensure highly reproducible and accurate drug efficacy screening using our hypertrophy assay in a high throughput mode, we have set up and validated an automated platform for cell culturing, assay readout, and data handling. Using verapamil as a reference compound for reducing hypertrophy, we confirmed assay robustness (S/B > 2) and reproducibility (Z'-factor > 0.4). A selected panel of anti-hypertrophic compounds was used for further assay validation and demonstrated a high level of predictivity. Finally, the assay was used to screen a collection of 1280 off-patent drugs (95% approved drugs) with high chemical and pharmacological diversity (Prestwick Chemical Library) for anti-hypertrophic activity.
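    The two quality metrics quoted here are standard plate-based screening statistics: the signal-to-background ratio of the control means, and the Z'-factor of Zhang et al. A minimal sketch with made-up control-well values (not data from this assay):

```python
import statistics as st

def z_prime(pos, neg):
    """Z' = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|."""
    return 1 - 3 * (st.stdev(pos) + st.stdev(neg)) / abs(st.mean(pos) - st.mean(neg))

def signal_to_background(pos, neg):
    return st.mean(pos) / st.mean(neg)

# Hypothetical NT-proBNP readouts from positive and negative control wells
pos = [100, 104, 96, 102, 98]
neg = [40, 42, 38, 41, 39]
print(round(z_prime(pos, neg), 3))     # ~0.763, above the 0.4 threshold
print(signal_to_background(pos, neg))  # 2.5, above the S/B > 2 criterion
```

    A Z' above 0.5 is conventionally considered an excellent assay; the > 0.4 criterion cited here is a slightly more permissive screening cutoff.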

    Altogether, with the current results we demonstrate an end-to-end solution for more efficient and cost-effective drug discovery using a combination of cutting edge hiPSC, bioprocessing and high throughput screening technologies.

  • Cell Stress Biosensors for Rapid, Live-Cell Detection of Neurotoxic and Cardiotoxic Compounds in iPSC-Derived Neurons and Cardiomyocytes
    Kevin Harlen, Montana Molecular

    Nearly two-thirds of drugs fail prior to Phase II clinical testing, many due to adverse toxicity. Further, the side effects associated with many clinical drugs, especially chemotherapeutics, can force patients to end treatment. Early detection of the adverse toxicity of drugs or lead compounds prior to clinical trials would aid rapid, cost-effective drug development; yet few in vitro tools exist to test for cellular toxicity in disease-relevant cell types. Those that do often lack valuable information, such as mechanism of toxicity, or are not applicable to disease-relevant cell types. We created a genetically encoded fluorescent assay to detect chemically induced stress in living cells. We tested the ability of this assay to detect neurotoxic and cardiotoxic effects in iPSC-derived peripheral neurons and cardiomyocytes. To assess the propensity of different chemotherapeutics to cause neurotoxicity, we monitored cell stress in iPSC-derived peripheral neurons in a dose-dependent manner. Cell stress responses were monitored using fluorescence imaging and compared with neurite outgrowth. Increased cell stress correlated with reduced neurite outgrowth. IC50 values calculated from each assessment were highly similar for most compounds. However, for some compounds, like oxaliplatin, the cell stress assay produced IC50 values well below those produced by measuring neurite outgrowth. These results suggest increased sensitivity for neurotoxicity using the cell stress assay. Further, not all compounds tested induced cell stress, indicating that cell stress can be used to glean insight into specific toxicity mechanisms. We next tested the ability of the cell stress assay to identify cardiotoxic side effects from a group of chemotherapeutics known as tyrosine kinase inhibitors, as well as doxorubicin and two cardiac glycosides.
In conjunction with monitoring cell stress, we also monitored cellular calcium levels and cardiomyocyte beating using a genetically encoded calcium indicator in iPSC-derived cardiomyocytes. Cell stress again served as a marker of toxicity that correlated with both changes in cardiomyocyte beating and intracellular calcium levels. Combining these data, we were able to classify the cardiotoxic tendency of chemotherapeutics from minimally to highly cardiotoxic. These results demonstrate the ability of this assay not only to identify drug-induced toxicity, but also to provide details on the mechanisms of toxicity. The latter is extremely important in identifying means to combat neurotoxic and cardiotoxic side effects.
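    The IC50 comparisons above are typically derived by fitting dose-response data to a four-parameter logistic (Hill) model. A minimal sketch of that model (parameter values are illustrative, not the authors' data):

```python
def hill(conc, bottom, top, ic50, slope):
    """Four-parameter logistic response at a given concentration."""
    return bottom + (top - bottom) / (1 + (conc / ic50) ** slope)

# By construction, the response at conc == ic50 is halfway between top and bottom:
print(hill(1.0, bottom=0.0, top=100.0, ic50=1.0, slope=1.2))  # 50.0
```

    In practice the four parameters are estimated from the measured curve (e.g., by nonlinear least squares); a lower fitted IC50 from the cell stress readout than from neurite outgrowth is what indicates the greater sensitivity described above.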

  • Applications of brain-model technology to study neurodevelopmental disorders
    Cleber Trujillo, University of California, San Diego

    The complexity of the human brain permits the development of sophisticated behavioral repertoires, such as language, tool use, self-awareness, and consciousness. Understanding what produces neuronal diversification during brain development has been a longstanding challenge for neuroscientists and may bring insights into the evolution of human cognition. We have been using stem cell-derived brain model technology to gain insights into several biological processes, such as human neurodevelopment and autism spectrum disorders. The reconstruction of human synchronized network activity in a dish can help us understand how neural network oscillations might contribute to the social brain. Here, we developed cortical organoids that exhibit low-frequency network-synchronized oscillations. Periodic and highly regularized oscillatory network events emerged after 4 months, followed by a transition to irregular and spatiotemporally complex activity by 8 months, mimicking features of late-stage preterm infant electroencephalography (EEG) and suggesting network maturation. Furthermore, we found that methyl-CpG-binding protein 2 (MECP2) is essential for the emergence of network oscillations, suggesting that functional maturation might be compromised at early stages of neurodevelopment in MECP2-related disorders, such as Rett syndrome, autism, and schizophrenia. These results show that the development of structured network activity in the human neocortex may follow stable genetic programming, even in the absence of external or subcortical inputs. Our model provides novel opportunities for investigating and manipulating the role of network activity in the developing human cortex.

  • 3-Dimensional Human Cortical Neural Platforms for Drug Discovery in Neurodevelopmental Disorders
    Cassiano Carromeu, StemoniX

    The human Central Nervous System (CNS) has a unique structural organization that is critical to its complex functions. Efforts to model this intricate network in vitro have encountered major bottlenecks. Recently, much work has focused on obtaining 3D brain organoids in an attempt to better recapitulate brain development and function in vitro. Although self-organized 3D organoids can potentially recapitulate key features of the human CNS more closely, current protocols still need major improvements before being implemented in a drug discovery setting. We have recently launched a highly homogeneous human induced Pluripotent Stem Cell (hiPSC)-derived cortical spheroid screening platform in 384-well format, composed of cortical neurons and astrocytes. Using high-throughput calcium flux analysis, we showed the presence of quantifiable, robust and uniform spontaneous calcium oscillations, which correlate with synchronous neuronal activity in the spheroid. Our platform is optimized to have a highly homogeneous and consistent functional signal across wells, plates, and batches. Finally, we demonstrated the feasibility of using this platform to interrogate large libraries of compounds for their ability to modulate human CNS activity. Here, we describe the use of this platform to investigate neurodevelopmental disorders. When implementing hiPSCs from Rett Syndrome (RTT) patients on our platform, a clear functional phenotype emerged: RTT 3D neural cultures displayed slow, large, synchronized calcium signals indicative of a compromised neural network. We also performed a pilot screen of a targeted library of 296 compounds for their ability to alleviate the observed RTT phenotypes in vitro. In summary, we demonstrated the feasibility of incorporating a neurodevelopmental disorder into a high-throughput screening platform. The system presented here has the potential to dramatically change the current drug discovery paradigm for neurodevelopmental disorders and other neural diseases.

Enhanced Lead Finding Prioritization and Hit Optimization

Session Chair: Bridget Wagner, Broad Institute

  • Phenotypic discovery and mechanism-of-action studies: a focus on the beta cell
    Bridget Wagner, Broad Institute

    A loss of beta-cell mass and biologically active insulin is a central feature of both type 1 and type 2 diabetes. A chemical means of promoting beta-cell viability or function could have an enormous impact clinically by providing a disease-modifying therapy. Phenotypic approaches have been enormously useful in identifying new intracellular targets for intervention; in essence, we are allowing the cells to reveal the most efficacious targets for desired phenotypes. A key step in this process is determining the mechanism of action to prioritize small molecules for follow up. Here, I will discuss how my group is identifying small molecules that promote beta-cell regeneration, viability, and function.

    We have developed a number of platforms to enable the phenotypic discovery of small molecules with human beta-cell activity, providing excellent starting points for validating therapeutic hypotheses in diabetes. First, we developed an islet cell culture system suitable for high-throughput screening that has enabled us to screen for compounds to induce human beta-cell proliferation, as well as to perform follow-up studies on small molecules that emerge from other efforts. This work identified DYRK1A inhibition as a relevant mechanism to promote beta-cell proliferation, and 5-iodotubercidin (5-IT) was shown to induce selective beta-cell proliferation in NSG mice. We have collected a number of available DYRK1A kinase inhibitors for kinase profiling and correlation with beta-cell proliferation. It appears that inhibition of DYRK1A and CLK1 are both correlated with the level of proliferation induced. More recently, we have found that this small-molecule approach actually causes the loss of several markers of beta-cell maturity (PDX1, NKX6.1, MAFA) in human beta cells, as measured by immunofluorescence, suggesting that induction of a proliferative state results in beta-cell dedifferentiation. We are investigating whether dedifferentiation enhances the effects of DYRK1A inhibitors, or whether it is sufficient for proliferation in the first place.

    We also developed an assay for high-throughput screening to identify small molecules that can protect beta cells from apoptosis induced by pro-inflammatory cytokines. This work led to the discovery that histone deacetylase 3 (HDAC3) is a relevant target for inhibiting beta-cell apoptosis, with beneficial effects in islets in cell culture. Further, isoform-selective inhibitors can delay (and partially reverse) the onset of autoimmune diabetes in NOD mice. Although we know the target for such compounds, the precise mechanism of action is still unclear. Thus, we have been performing gene expression- and proteomics-based experiments to uncover the pathways in beta cells perturbed by HDAC3 inhibition.

  • Improving the Drug Discovery Tool Box
    Daniel Weaver, PerkinElmer

    The data deluge. The elusive molecule. Activating the receptor. The greater complexity of protein-based therapeutics. Rapidly identifying side effects. The financial impact of success or failure.

    These are only a few considerations in the lengthy series of choices required to select a drug target. The staggering amount of screening data that must be processed in choosing candidates is well known to the industry. Meanwhile the technologies applied to the winnowing of lead candidates emerging from High Throughput Screening (HTS) continue to increase in complexity.

    This discussion will focus on new screening technologies for analyzing and reviewing data from multiple outputs, on facilitating the comparison of data from different assay types, and on preventing workflow bottlenecks when the level of detail required for analysis is so immense.

  • Accelerating cancer drug discovery through accurate safety predictions: one goal of The ATOM consortium
    Sarine Markossian, Small Molecule Discovery Center and Department of Pharmaceutical Chemistry, University of California, San Francisco, CA; The Accelerating Therapeutics for Opportunities in Medicine (ATOM) consortium, San Francisco, CA

    Currently, drug discovery is a slow and sequential process with a high rate of failure. The Accelerating Therapeutics for Opportunities in Medicine (ATOM) Consortium is an academia, industry, and government partnership with the goal of rapidly accelerating drug discovery by integrating modeling, machine learning, and human-relevant complex in vitro models. One of our goals in ATOM is to optimize preclinical safety predictions, so we can incorporate predictive toxicology early in the drug discovery process. Hepatocyte toxicity, or drug induced liver injury (DILI), is a leading cause for attrition during drug development as well as one of the main reasons drugs are withdrawn from the market. We will describe our efforts to profile multiple 2D and 3D High Content assay formats to measure and predict hepatocyte toxicity. These multi-parametric data, coupled with Quantitative Systems Toxicology (QST) tools and deep machine learning, are enabling us to predict DILI from the structure of a proposed drug lead. In summary, by integrating high-performance computing and human-relevant in vitro models, we plan to transform drug discovery into a rapid, integrated, and patient-centric model.

  • Target Identification and Mechanism of Action in Drug Discovery
    Monica Schenone, Pfizer

    The importance of target-identification and mechanism-of-action (MoA) studies remains critical in small-molecule probe and drug discovery. As cell-based screening methods have evolved, academia and the biopharmaceutical industry have shifted from target-based to phenotypic screening, and identifying the MoA of a small-molecule hit has become an important challenge.

    Advances in DNA sequencing are enabling genome-wide association studies (GWAS) that identify genes or loci implicated in disease. In much the same way the shift in screening drives the need to identify small molecule MoAs, these advances create the need to understand “protein MoAs”. Investigating the mechanism by which genes or loci are implicated in disease will widen the landscape for new therapeutic targets and aid lead optimization campaigns for small molecules.

Back to Top



Automation and High Throughput Technologies

Track Chairs: Louis Scampavia, Scripps Institute, FL and Sam Michael, NIH

Advanced imaging technologies to bridge the gap between high-content and high-throughput

Session Chair: Louis Scampavia, Scripps Florida; Department of Molecular Therapeutics

  • High content 3-D light sheet microscopy in 96 and 384-well plate formats: cancer cell morphology and invasiveness and spheroid-based assays for compound safety screening
    Chris Dunsby, Imperial College London

    Light sheet fluorescence microscopy (LSFM) is a 3-D fluorescence imaging method providing low out-of-plane photobleaching and phototoxicity. Standard LSFM requires two microscope objective lenses orientated at 90° to one another in order to image the sample. However, the proximity of the excitation and detection objectives to one another and the sample makes high content imaging in conventional well plates difficult. Oblique plane microscopy (OPM) is an alternative approach that uses a single high numerical aperture microscope objective to provide both fluorescence excitation and detection, whilst maintaining the advantages of LSFM. OPM can be implemented on a commercially available fluorescence microscope frame to rapidly image volumes with subcellular resolution.

    We present the development and application of a stage scanning OPM (ssOPM) approach for light sheet fluorescence imaging in commercially available 96 and 384-well plates, including plastic-bottomed plates. 3D volumetric images are acquired by scanning the sample through the tilted light sheet at a constant velocity and image acquisition is electronically synchronized to the motion of the stage. This approach enables rapid 3-D imaging of volumes spanning many millimetres in their longest dimension whilst maintaining sub-cellular spatial resolution.

    We report two applications of the ssOPM system. In the first, the system was used to perform functional screens for regulators of cancer cell size and 3D invasiveness. In the screen for regulators of size, melanoma cells were grown as 2D cultures in 384-well plates and genes were systematically knocked down with a library of 120 siRNAs. For 3D invasion assays, 9 siRNAs were used and plates were incubated for 24 hours, allowing cells to invade vertically into the gel prior to fixation and staining. For both assays, 100s-1000s of cells were quantified per condition to allow 3D cytometric data analysis. We developed a MATLAB 3D image analysis pipeline for automated segmentation and morphological quantification of the image data. This allowed determination of cell size in 2D and 3D, measurement of cell invasiveness into the collagen matrix, and quantification of cell morphology of invading cells. The ssOPM approach will enable a better understanding of which genes are responsible for cancer cell size determination and invasion in 3D cultures.

    In the second application, the ssOPM system is being used to develop a 3D assay for spheroid-based compound safety screening. Multi-cellular HepG2/C3a spheroids with diameters in the range 100-200 µm were grown in low-attachment U-bottomed dishes and then transferred to 384-well plates for ssOPM imaging. Following initial tests, SYTOX Green and TMRM were chosen to read out spheroid viability and mitochondrial function as indicators of drug toxicity. The assay Z', signal-to-noise ratio and coefficient of variation have been measured using max-min plating protocols and the results compared to conventional wide-field imaging and spinning-disk confocal imaging.
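As background for readers, the plate-quality statistics named above (Z', signal-to-noise ratio, coefficient of variation) have standard definitions computable directly from max/min control wells. The sketch below is illustrative only — not the authors' code — and the well readouts are hypothetical:

```python
import statistics

def z_prime(max_wells, min_wells):
    """Z' = 1 - 3*(sd_max + sd_min) / |mean_max - mean_min|; Z' > 0.5 is
    conventionally taken to indicate an excellent assay window."""
    mu_max, mu_min = statistics.mean(max_wells), statistics.mean(min_wells)
    sd_max, sd_min = statistics.stdev(max_wells), statistics.stdev(min_wells)
    return 1 - 3 * (sd_max + sd_min) / abs(mu_max - mu_min)

def signal_to_noise(max_wells, min_wells):
    """S/N = (mean_max - mean_min) / sd_min."""
    return ((statistics.mean(max_wells) - statistics.mean(min_wells))
            / statistics.stdev(min_wells))

def cv_percent(wells):
    """Coefficient of variation of replicate wells, in percent."""
    return 100 * statistics.stdev(wells) / statistics.mean(wells)

# Hypothetical viability readouts: max (untreated) vs min (toxin) control wells
max_ctrl = [980, 1010, 995, 1005, 990]
min_ctrl = [110, 95, 105, 100, 90]
print(round(z_prime(max_ctrl, min_ctrl), 3))
```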

  • Development and Characterization of Patient Derived 3D Hepatocellular Carcinoma Organoid Models using HCS Methods and their Application to Precision Medicine and Drug Discovery
    Paul Johnston, University of Pittsburgh Dept. Pharmaceutical Sci.

    Hepatocellular carcinoma (HCC) accounts for >90% of primary liver cancers and is the 2nd leading cause of cancer-related death. HCC is associated with a range of environmental and infective etiologies including hepatotropic virus (HBV & HCV) infections, non-alcoholic steatohepatitis, nonalcoholic fatty liver disease, and alcohol and aflatoxin exposure. 80-90% of HCC patients have liver cirrhosis, the most important risk factor. 83% of new HCC cases occur in South Asia, but incidence is rising elsewhere due to the increased prevalence of metabolic syndrome and HCV infections. Surgical resection, liver transplantation, and ablation offer high response rates and potential cures for early-stage HCC. However, due to the asymptomatic nature of HCC, >80% of patients are diagnosed at an inoperable advanced stage. Systemic chemotherapy of advanced HCC with cytotoxic drugs and/or combinations elicits response rates of 10-20% and does not prolong overall survival (OS). The multi-tyrosine kinase inhibitors sorafenib and regorafenib are approved for advanced HCC, and for patients with tumors progressing on sorafenib, respectively. However, sorafenib improves OS by only ~3 months and adverse events are common. Regorafenib prolongs progression-free survival by ~1.5 months and OS by ~3 months. There is an urgent need to develop new and effective HCC therapies. The complex etiologies, pathogenesis and genetic heterogeneity of HCC make drug discovery and development extremely challenging, and the lack of representative HCC models has been a major obstacle. Individual patient-derived 3D tumor organoid (PTO) models are being used for disease-specific drug chemo-sensitivity testing, as part of precision medicine strategies to guide drug selection. PTOs address both inter- and intra-tumoral heterogeneity, and retain the structure and the morphological, genetic and clonal heterogeneity of the original tumor. We describe the characterization of HCC PTO morphologies and proliferation using HCS methods and the application of these cultures to precision medicine and drug discovery strategies.

  • High-Content Imaging of Colorectal Cancer Patient-derived Organoids to study Tumor Microenvironment
    Seungil Kim, Lawrence J. Ellison Institute for Transformative Medicine (USC)

    Complex tumor microenvironments lead to intra- and inter-patient heterogeneity across multiple cancer types, which prevents consistent clinical treatment outcomes. Micro-physiological and cellular changes significantly impact tumor cell survival and disease progression. Therefore, it is critical to understand how tumor cells behave and respond to drugs under different environmental conditions. We have established 3D organoid cultures from colon cancer patient tissues. Organoids were treated with known clinical cancer drugs to examine how they respond to various treatments. To keep organoid structures intact, we performed whole-mount immunostaining with tissue clearing techniques. Changes in Ki-67 (cell proliferation), cleaved caspase-3 (apoptosis), E-cadherin (cell junctions) and F-actin (cytoskeleton) were examined during drug treatments. Cleaved caspase-3-positive cell numbers were significantly increased after drug treatments, but the degree of cell death differed depending on drug type and dose. F-actin was enriched at the border between the outer surface of cells and the inner core in control organoids, but this uniform pattern of actin cytoskeleton was lost in Staurosporine- and Irinotecan-treated organoids. To complement our fixed imaging workflow, we also performed live cell imaging on patient-derived organoids labeled using either lentiviral H2B transduction (stable) or vital dyes (transient). Dynamic imaging of H2B-GFP organoids captured cell division events, which were used to quantify cell proliferation rates in individual organoids. Cell proliferation was significantly reduced with Oxaliplatin treatment, suggesting that this drug affects organoid growth by inhibiting cell division. DRAQ7 (dead cell) staining increased rapidly throughout entire organoids with Staurosporine, whereas Irinotecan caused cell death at the outer surface of the organoids.
Interestingly, we observed organoid contraction and shrinkage with Irinotecan and 5-Fluorouracil (5-FU) treatments, highlighting the advantages of live cell imaging for depicting dynamic drug- and concentration-dependent responses. Given that tumor-stromal cell interactions are key microenvironmental factors influencing drug response, we also isolated cancer-associated fibroblasts (CAFs) from patient tissues for co-culture with patient-matched tumor organoids. H2B-GFP-labeled organoid cultures were seeded on top of the fibroblasts to establish physical contact between the two cell types. Using high-resolution 3D imaging, we examined the impact of CAFs on the drug response of patient-derived organoids. 3D/4D organoid imaging is a powerful method to study the tumor microenvironment and screen preclinical drug compounds using patient tissues. We believe that a high-throughput automated imaging system coupled with a bio-repository of patient-derived organoids will expedite the identification of effective anti-cancer treatment options for individual patients as well as our understanding of biological questions regarding the complex tumor microenvironment.

  • Perfusable 3D angiogenic networks in a high-throughput, microfluidic culture platform
    Remko van Vught, Mimetas

    The transition from 2D to 3D cell culture is a first step towards more physiologically relevant in vitro cancer models. To adequately capture the complex tissue architectures observed in vivo, 3D microfluidic techniques achieve long-term gradient stability, continuous perfusion and patterning of cancer cell layers as stratified co-cultures. We used a standardized high-throughput (n=40) microfluidic 3D tissue culture platform, the OrganoPlate®, to generate precisely controlled gradients without pumps, ideal for growing blood vessels and inducing controlled 3D angiogenic sprouting. The blood vessel is grown against an extracellular matrix (ECM) gel containing cancer cells and is subsequently exposed to pro- and anti-angiogenic compounds to direct sprouting towards 3D cancer cell clusters. Using high-content confocal time-lapse imaging and analysis, angiogenic potential was measured in various cancer models. The exposed vasculature shows many of the important hallmarks of cancer angiogenesis found in vivo, including tip-cell induction and migration and stalk-cell formation. Importantly, the stalk cells develop a perfusable lumen connected to the parental vessel, as demonstrated by perfusion of high-molecular-weight (150 kDa) FITC-dextran through the microvascular structures. This model will be used as an in vitro cancer screening platform to unravel the important drivers of angiogenesis and vasculogenesis and the mechanism of action of anti-angiogenic compounds. By combining this culture platform with mural cells, cell-cell interactions can be studied. In parallel, we will combine this 3D cancer angiogenesis platform with our current Tumor-on-a-Chip models to create tissue models with integrated vasculature.

Automating Screens Using Physiologically-relevant Models

Session Chair: Kristin Fabre, Translation Research Institute for Space Health at Baylor Medical College

  • Using a novel microfluidics device to replicate drug PK profiles in vitro to improve in vivo translation of drug actions
    Clay Scott, AstraZeneca Pharmaceuticals

    Drug effects are traditionally quantified in cell-based assays by testing drugs under static conditions and calculating the drug concentration that modulates the response by 50% (e.g. EC50). In contrast, drug administration in vivo results in dynamic changes in drug concentration due to absorption, biodistribution and elimination. Thus, a more quantitative translational assessment of drug-induced effects can be achieved when in vitro data is coupled to pharmacokinetic/pharmacodynamic (PK/PD) models, which allow translation of time-course and magnitude across biological systems, accounting for differences between in vitro and in vivo physiology. This presentation will describe the development and use of a microfluidics system to replicate in vivo PK profiles with cell-based assays to enable improved in vivo translation of drug actions. The system can be used to compare different PK profiles and dose scheduling paradigms to optimize drug-induced biological responses. The instrument can be used to quantify a desired biological response (efficacy) as well as an undesired side effect to establish an in vitro therapeutic index. This system has the potential to minimize animal studies for both efficacy and safety evaluation and effectively guide clinical use of candidate drugs.
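To make the contrast between static EC50 testing and dynamic exposure concrete, a drug concentration-time profile can be simulated with a simple one-compartment PK model and compared against a static threshold such as an in vitro EC50. This is a generic illustration with hypothetical parameters, not the actual control software of the microfluidics system described above:

```python
import math

def conc(t, dose=100.0, ka=1.0, ke=0.2, vd=10.0):
    """One-compartment oral PK (Bateman equation):
    C(t) = (dose*ka / (vd*(ka - ke))) * (exp(-ke*t) - exp(-ka*t)),
    with first-order absorption (ka) and elimination (ke)."""
    return dose * ka / (vd * (ka - ke)) * (math.exp(-ke * t) - math.exp(-ka * t))

def fraction_above(threshold, t_end=24.0, dt=0.1, **pk):
    """Fraction of the dosing interval during which the concentration stays
    above a threshold (e.g. an in vitro EC50) -- one simple exposure metric
    a dynamic profile provides that a static incubation cannot."""
    times = [i * dt for i in range(int(t_end / dt))]
    return sum(conc(t, **pk) > threshold for t in times) / len(times)

# Hypothetical question: what fraction of a 24 h interval does the profile
# exceed an EC50 of 2 (same arbitrary concentration units as C)?
print(round(fraction_above(2.0), 2))
```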

  • Driven assembly of stem cell-derived human tissues for disease modeling and discovery
    Bill Murphy, University of Wisconsin

    The need for human, organotypic culture models coupled with the requirements of contemporary drug discovery and toxin screening (i.e. reproducibility, high throughput, transferability of data, clear mechanisms of action) frame an opportunity for a paradigm shift. The next generation of high throughput cell-based assay formats will require a broadly applicable set of tools for human tissue assembly and analysis. Toward that end, we have recently focused on: i) generating iPS-derived cells that properly represent the diverse phenotypic characteristics of developing or mature human somatic cells; ii) assembling organotypic cell culture systems that are robust and reproducible; iii) translating organotypic cell culture models to microscale systems for high throughput screening; and iv) combining genomic analyses with bioinformatics to gain insights into organotypic model assembly and the pathways influenced by drugs and toxins. This talk will emphasize recent studies in which we have explored biologically driven assembly of organotypic vascular and neural tissues. These tissues mimic critical aspects of human organs, and can be used for reproducible identification of drug candidates and toxic compounds. The talk will also introduce the use of our assembled human tissues to develop models of rare developmental disorders and degenerative diseases of the brain.

  • Performance of an automated Multi-Organ-Chip based long-term co-culture assay of a liver and an intestine organ model using a new robot platform
    Florian Huber, Technical University of Berlin

    The recent advent of microphysiological systems is envisaged to enable a global paradigm shift in drug development. Combining tissue models of various complexity in a microcirculation makes it possible to investigate systemic organ interactions on the smallest biological scale. In order to ensure physiologically relevant culture conditions and treatment cycles for organ communication, a complex interplay of feeding strategies, substance exposure regimes and observation methods needs to be guaranteed. Currently, this requires a large amount of manual and personnel effort. The opportunity to have complex Multi-Organ-Chip assays performed by an automated robot platform will be the next milestone in acquiring more reliable and reproducible data for preclinical validation.

    Here, we present our first data generated by a novel Multi-Organ-Chip robot. The 2-Organ-Chip used for this study contains two culture circuits, each combining two organ culture compartments with a microfluidic circuit. Various organ models, e.g. 3D spheroids and barrier organs, can be integrated into the 2-Organ-Chip. An on-chip micropump enables the circulation of one common medium, and the organ models can be observed at any time point through a microscopic glass slide. To evaluate the performance of the automated system we performed a co-culture of liver and intestine tissue models. 3D spheroids of HepaRG and stellate cells were pre-formed using ultra-low-attachment plates. The intestine was emulated using EpiIntestinal™ models from MatTek. After initial evaluation of culture conditions, an assay strategy was established and transferred to the Multi-Organ-Chip robot using dedicated, customizable assay planning software. Step by step, the respective culture parameters, such as a 24-hour feeding regimen and defined time points for compound application and sample extraction, were integrated into the robot interface. Furthermore, daily non-invasive in-process controls such as bright-field imaging and fluorescence measurements enabled continuous observation of all tissue models. The Multi-Organ-Chip robot can operate up to 24 Multi-Organ-Chips simultaneously. In the case of the 2-Organ-Chip, we emulated 48 identical intestine-liver co-cultures, assigned to up to 12 different treatment groups. A heating platform kept the 2-Organ-Chips constantly at 37°C. The liquid handling procedures were performed under sterile conditions. Media and substances were pre-heated to 37°C and, if necessary, varying substance concentrations were pre-mixed automatically. Culture supernatants were stored in a built-in refrigerator. Subsequent analysis of tissue integrity and culture supernatants revealed that the automated cultivation of our liver-intestine organ models showed high reproducibility and long-term vitality of the organ models, with corresponding effects due to compound treatment.

    To the best of our knowledge, we here present the first automated platform that can maintain complex multi-organ combinations over prolonged periods of time. The easy transfer of any relevant assay setup to our robot will enlarge the operational area for Multi-Organ-Chip-based assays in the near future.

  • 3D Paper-Based Tumor Models to Characterize the Relationship Between the Tissue Microenvironment and Multidrug Resistant Phenotypes
    Julie McIntosh, UNC Chapel Hill

    Multidrug resistance arises when a small population of cells in a tumor undergoes changes in cellular phenotype that allow them to evade a particular treatment. We are developing a novel 3D paper-based culture screening platform that can stratify cells within tumor-like structures based on either environmental or phenotypic differences. Unlike other 3D culture platforms, we are able to rapidly separate (within seconds) discrete cell populations based on their location.

    Our platform involves stacking cell- or extracellular matrix (ECM)-laden paper scaffolds together to create a 3D tissue-like structure, which models the tumor microenvironment with defined gradients of oxygen, waste, and nutrients. The stacks are disassembled by peeling the paper scaffolds apart to isolate each discrete sub-population for downstream analyses of cell viability, protein and transcript levels, or drug metabolism. Our system is also highly customizable by: i) altering the size, number, or shape of available seeding areas by wax-printing seeding boundaries onto each scaffold; ii) manipulating gradients by changing seeded cell density, stack thickness, or exchange with culture medium; or iii) enabling either monoculture or co-culture setups, which allows the user to simulate a wide variety of in vivo conditions based on the research question of interest.

    Here, we utilize this novel platform to investigate the phenotypic characteristics of chemotherapeutic resistant sub-populations within each culture. With a tumor cross-section model, we utilize flow cytometry to quantify the differences between cells at the drug resistant necrotic core of a tumor and cells at the proliferative outer shell. Specifically, we quantify the phenotypes that lead to passive resistance (e.g., decreased metabolic activity) or active resistance (expression of ATP-consuming drug efflux pumps). With an invasion assay setup, we investigate the relationship between drug resistance and the epithelial to mesenchymal transition (EMT) by isolating cells based on their level of invasiveness; a well-documented, observable phenotypic change that results in mesenchymal cells being more invasive than epithelial cells. The isolated sub-populations are analyzed for changes in i) drug response with dose-response curves, and ii) epithelial and mesenchymal specific protein expression levels with enzyme-linked immunosorbent assays (ELISAs) as a means to connect drug resistance to epithelial or mesenchymal phenotypes. This platform is an inexpensive customizable screening tool, which can be utilized to study specific subsets of cellular populations for advancements in both targeted chemotherapeutic development and understanding of the heterogeneity of tumor populations.
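As a generic illustration of the dose-response comparison described above (with hypothetical parameter values — the authors' actual assay data are not reproduced here), a right-shifted four-parameter logistic curve is how a drug-resistant sub-population typically manifests:

```python
import math

def four_pl(dose, bottom=0.05, top=1.0, ic50=1.0, hill=1.2):
    """Four-parameter logistic: fraction viable as a function of dose."""
    return bottom + (top - bottom) / (1 + (dose / ic50) ** hill)

def ic50_from_curve(doses, responses):
    """Estimate IC50 by log-linear interpolation at the half-maximal response
    of a monotonically decreasing dose-response series."""
    half = (max(responses) + min(responses)) / 2
    pairs = list(zip(doses, responses))
    for (d1, r1), (d2, r2) in zip(pairs, pairs[1:]):
        if r1 >= half >= r2:
            f = (r1 - half) / (r1 - r2)
            return 10 ** (math.log10(d1) + f * (math.log10(d2) - math.log10(d1)))

# Hypothetical: resistant core cells show a right-shifted curve vs outer-shell cells
doses = [10 ** e for e in (-2, -1.5, -1, -0.5, 0, 0.5, 1, 1.5, 2)]
shell = [four_pl(d, ic50=0.5) for d in doses]   # sensitive sub-population
core = [four_pl(d, ic50=5.0) for d in doses]    # resistant sub-population
print(ic50_from_curve(doses, core) > ic50_from_curve(doses, shell))
```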

Automating Target-based and Complex Phenotypic Drug Discovery

Session Chair: Beth Cimini, Broad Institute

  • Discerning function from phenotype with high throughput image analysis
    Beth Cimini, Broad Institute

    As computational analysis of high-throughput imaging screens has advanced and matured, so has our understanding that cell morphological phenotypes have much to teach us about how cells function. Using staining techniques such as the Cell Painting assay (Bray et al 2016), features of cell size, shape, intensity, and texture can be extracted by tools like CellProfiler (McQuin et al 2018); once extracted, these features can be built into a "profile" that allows the researcher to group chemicals by mechanism of action, associate genes into functional pathways (even learning novel pathway-pathway interactions), and even help assess drug resistance for patients in ex vivo assays (reviewed in Caicedo et al 2017). While traditionally this has been done by comparing changes in the per-well average measurements after various treatments, incorporating measurements of the heterogeneity of treated cells can increase the ability to correctly group similar treatments (Rohban et al, 2018). Even in the absence of conventional segmentation and feature extraction, deep networks applied to cropped single-cell images of treated cells can learn to classify treatments successfully (Caicedo et al 2018).

    While the features and relationships learned from such experiments are valuable in and of themselves and can answer questions within a given experiment, once extracted they can also be used for other purposes. High content imaging assays can be used to aid target enrichment by learning relationships between lower throughput functional assays and features extracted from imaging assays that are more easily scaled up. Such relationships can then serve as classifiers to create “virtual screens” that increase hit rates of selected drugs by as much as 50-250X (Simm et al 2018). These advancements may substantially bring down the time and/or cost of drug discovery, making research more efficient and creating a “Rosetta stone” that allows association and comparison across many kinds of data.

    Bray et al, Nature Protocols 2016. doi:
    McQuin et al, PLoS Biology 2018. doi:
    Caicedo et al, Nature Methods 2017. doi:
    Rohban et al, bioRxiv 2018. doi:
    Caicedo et al, 2018 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). doi:
    Simm et al, Cell Chemical Biology 2018. doi:
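The profiling workflow described above — normalizing per-well feature vectors against controls and correlating them to group treatments by mechanism — can be sketched minimally as follows. This is an illustrative toy (hypothetical feature values, three features instead of the thousands CellProfiler produces), not the authors' pipeline:

```python
import math

def normalize(profiles, controls):
    """Z-score each feature against control wells, a common profiling step."""
    n = len(controls)
    means = [sum(c[i] for c in controls) / n for i in range(len(controls[0]))]
    sds = [math.sqrt(sum((c[i] - means[i]) ** 2 for c in controls) / n) or 1.0
           for i in range(len(controls[0]))]
    return {k: [(v[i] - means[i]) / sds[i] for i in range(len(v))]
            for k, v in profiles.items()}

def pearson(a, b):
    """Pearson correlation between two feature profiles."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    return cov / math.sqrt(sum((x - ma) ** 2 for x in a)
                           * sum((y - mb) ** 2 for y in b))

# Hypothetical per-well feature vectors (e.g. size, shape, intensity)
controls = [[10.0, 5.0, 1.0], [11.0, 5.5, 0.9], [9.0, 4.5, 1.1]]
profiles = {"cmpdA": [20.0, 9.0, 0.5],
            "cmpdB": [19.0, 8.5, 0.6],
            "cmpdC": [5.0, 2.0, 2.0]}
z = normalize(profiles, controls)
# Treatments with highly correlated profiles are candidates for a shared mechanism
print(pearson(z["cmpdA"], z["cmpdB"]) > pearson(z["cmpdA"], z["cmpdC"]))
```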

  • An automated ‘Target-class’ assay to identify novel activators of K2P Ion channels
    Paul Wright, LifeArc

    Target-class drug discovery is a complementary approach to disease-focused strategies, which concentrate either on a defined and well-validated target or on a physiologically relevant phenotypic assay. This third approach spreads risk across a family of unprecedented targets, greatly increasing the chance of identifying novel pharmacological agents by driving synergies in assay development, target structure and ligand development. We have developed an automated system for applying target-based discovery to a family of ion channels, the two-pore domain potassium channels (K2Ps). Approximately 12% of marketed drugs are known to target ion channels; however, only a small subset of known channels are the molecular targets of therapies. K2P channels carry background (or leak) potassium current in a variety of cell types and, despite a number of important roles and promising expression profiles, they have proved a difficult target class to modulate with small molecules. This has in turn limited interrogation of the precise physiological function of K2Ps and efforts to generate K2P-targeting therapeutics. We developed a system to rapidly screen the K2P family to assess which channels were amenable to activation (i.e. which channels were ‘druggable’) and to allow the simultaneous analysis of multiple channels for small-molecule selectivity. A novel cell-based assay was developed, with optimized cell culture techniques that significantly reduce assay development times and allow multiple targets to be screened expediently at once. Novel analysis methods were also produced to allow quantification and rapid visualisation of large numbers of compound response curves at multiple channels. This ‘target-class’ approach has allowed the simultaneous interrogation of a family of targets and identified a diverse range of previously undescribed pharmacology.
    Using novel automation and analysis techniques, screening synergies have been maximized and early-phase hit identification timelines minimized. The approach has identified K2P drug targets that are amenable to small-molecule activation, de-risking multiple channels from a technical point of view, and provided selective starting points for chemical optimization. The advantages of a ‘target-class’ approach compared to traditional target-based or phenotypic screens will be discussed, and future follow-up plans for these targets will be presented.
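The quantification step described above, fitting large numbers of compound response curves per channel, can be sketched in a few lines. This is an illustrative Python example using a four-parameter Hill equation and a crude grid search on synthetic data; it is not the authors' actual analysis pipeline, and all concentrations and parameter values are hypothetical.

```python
def hill(conc, bottom, top, ec50, slope):
    """Four-parameter Hill equation for an activator concentration-response curve."""
    return bottom + (top - bottom) / (1 + (ec50 / conc) ** slope)

def fit_ec50(concs, responses, slope=1.0):
    """Crude fit: bottom/top taken from the data, EC50 scanned on a log grid.
    A real pipeline would use proper nonlinear regression per compound/channel."""
    bottom, top = min(responses), max(responses)
    best_ec50, best_sse = None, float("inf")
    for i in range(300):
        ec50 = 10 ** (-10 + i * 0.02)        # scan 1e-10 .. ~1e-4 M
        sse = sum((hill(c, bottom, top, ec50, slope) - r) ** 2
                  for c, r in zip(concs, responses))
        if sse < best_sse:
            best_ec50, best_sse = ec50, sse
    return best_ec50

# Synthetic curve for one compound at one channel (simulated EC50 = 1 uM)
concs = [10 ** e for e in range(-9, -3)]                 # 1 nM .. 100 uM
responses = [hill(c, 0.0, 100.0, 1e-6, 1.0) for c in concs]
ec50 = fit_ec50(concs, responses)
print(f"estimated EC50 = {ec50:.2e} M")  # close to the simulated 1 uM
```

Across a 384-well plate, the same fit would simply be applied well by well, which is what makes rapid visualisation of many curves at once tractable.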

  • A fully automated Multi-Organ-Chip Robot for standardized long-term culture, substance application and microscopic analysis
    Ann-Kristin Muhsmann, Technische Universität Berlin

    Currently, performing long-term in-vitro assays using microphysiological systems, like Multi-Organ-Chips, requires high manual effort and workforce. This still makes it difficult to use them for high-throughput testing and to adapt them to industrial needs, where reproducibility and standardization of microphysiological assays are essential demands. Moreover, automation will facilitate the use of microphysiological systems for regulatory purposes.

    Here we present, for the first time, a fully automated robotic platform for the standardized long-term cultivation of TissUse’s Multi-Organ-Chips. One robot can culture up to 24 Multi-Organ-Chips simultaneously, providing customized incubation, systemic pulsatile media circulation, regular media exchange, substance application and microscopic analysis for each Multi-Organ-Chip. Four levels of interactive handling features were developed to ensure these functions. The Multi-Organ-Chips are inserted into a heating platform providing an adjustable constant temperature from 36 to 40°C. All liquid handling procedures are performed under sterile conditions inside a tailor-made class II safety cabinet. An automated provisioning system integrated into a refrigerator beneath the safety cabinet can hold a minimum of 66 well plates of various sizes and deliver them to the liquid handling system on demand. Cell culture media and substances are pre-heated to 37°C and applied with high pipetting accuracy using single-use tips. The robot runs autonomously for a minimum of 4 days before a restock of disposables (e.g. pipette tips) is necessary. Media samples extracted from the Multi-Organ-Chips are stored at 4°C for further analysis. An actuated full-sized microscope unit beneath the Multi-Organ-Chips allows for routine optical analyses such as bright-field imaging and fluorescence measurements at any chosen time point. Images are analyzed automatically using data processing software. A dedicated robot assay planning software enables intuitive and time-efficient input of new assay information. The software also checks for assay feasibility and provides instructions for equipping the system with the respective Multi-Organ-Chips and consumables. All of these features ensure that the platform will also be able to operate the TissUse Human-on-a-Chip in the near future.

    While requiring fewer personnel resources and enabling more standardized and reproducible operation of the Multi-Organ-Chips, the automated platform will also make it easier to run even more complex culture strategies, leading to closer physiological maintenance of the organs cultured in the Multi-Organ-Chips. Feeding refinements (e.g. several times per day) and individual substance application, together with customizable sample extraction time points and measurement patterns, will further increase the value of Multi-Organ-Chip-derived data for preclinical validation.

  • Development of high-throughput data analysis methods to bridge high-throughput APC assay with ion channel research and drug discovery
    Tianbo Li, Genentech (Roche)

    Ion channels regulate a variety of physiological processes and represent the second largest class of known drug targets. Among the known methods to study ion channels, patch clamp electrophysiology remains the gold standard, with unsurpassed precision across all ion channel functional assays. Automated patch clamp (APC) screening technology has emerged to meet the challenge of scaling up this gold-standard method. However, due to the complexity of electrophysiological data, the fast-increasing APC throughput faces a major challenge when it comes to robustly analyzing the data. Here I will discuss our recent breakthrough in addressing this challenge by co-developing high-throughput screening data analysis methods with Genedata. To match our daily throughput of ~8,000 Nav channel recordings from the Nanion SyncroPatch 768PE, a data reduction strategy was used to reduce the original first-level recording data size by ~1,000-fold. The resulting second-level data, comprising more than 100,000 recordings, can then be analyzed by Genedata Screener® as one experiment. To ensure high data quality, an automated quality control method was developed by optimizing four key parameters: seal resistance, peak current, capacitance, and series resistance. Additionally, a customized SyncroPatch data analysis method was developed for Screener® to directly handle the first-level raw data for kinetic characterization of Nav channel currents. Overall, fast-advancing APC high-throughput technologies, together with robust and high-throughput data analysis methods, will have a significant impact on ion channel research and drug discovery.
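An automated QC step of the kind described, filtering recordings on seal resistance, peak current, capacitance, and series resistance, might look like the following sketch. The threshold values are illustrative assumptions for a Nav assay, not the values used by the authors or by Genedata Screener®.

```python
# Hypothetical per-cell QC limits: (lower bound, upper bound); None = unbounded.
QC_LIMITS = {
    "seal_resistance_MOhm":   (500, None),  # require a tight seal
    "peak_current_pA":        (None, -300), # inward Nav current at least -300 pA
    "capacitance_pF":         (3, 50),      # plausible single-cell size window
    "series_resistance_MOhm": (None, 20),   # cap access resistance
}

def passes_qc(cell):
    """Return True only if every metric falls inside its allowed window."""
    for key, (lo, hi) in QC_LIMITS.items():
        value = cell[key]
        if lo is not None and value < lo:
            return False
        if hi is not None and value > hi:
            return False
    return True

good = {"seal_resistance_MOhm": 900, "peak_current_pA": -850,
        "capacitance_pF": 12, "series_resistance_MOhm": 8}
leaky = dict(good, seal_resistance_MOhm=120)   # poor seal -> rejected
print(passes_qc(good), passes_qc(leaky))       # → True False
```

Applied to ~8,000 recordings a day, a rule table like this is what lets the reduced second-level data be pooled and analyzed as a single experiment.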

Screening Automation: Modular vs. Highly integrated systems

Session Chair: Carleen Klumpp-Thomas, NIH/NCATS

  • Yoga for uHTS: How to be both strong and flexible to adapt to stretching requirements
    Yunyun Hsu, UHTS Group - Discovery Technologies

    High throughput and flexibility are two of the most important attributes of screening automation, and they remain challenging to combine. Highly integrated systems are well known for their high throughput, efficiency and robustness, while modular systems are prized for their flexibility and versatility. Amgen’s automation platform was designed to include the best of both worlds: a highly integrated system containing a versatile and future-proof docking station, as well as a separate modular MuHTS system adapting to different throughput requests. The docking station provides the flexibility needed to adapt different readers for various assay technologies. It also provides the option to run parallel protocols, or to double the throughput of a protocol by docking a second copy of the same reader. The modular MuHTS system, developed in-house, accommodates smaller-throughput or lower-priority requests. This platform consists of a stationary hotel, a barcode scanner and a liquid dispenser with two flexible docking locations to accommodate different offline readers and compound-transfer technologies. This setup represents a meeting point between efficiency and flexibility, covering needs ranging from a few plates for titration to hundreds of plates for a primary screen.

  • Perspectives on the design, deployment and utilisation of modular, mobile screening automation
    Paul Harper, AstraZeneca

    At SLAS2017 we presented on a partnership with High Res Biosolutions to develop a new concept for modular, reconfigurable automation to meet our varied small molecule screening demand. This successful project established the Co-Lab line of products.

    For SLAS2019 we will review the deployment, validation and impact to AstraZeneca’s internal and external facing screening activities. This will include, but is not limited to, our experience with collaborative robotics, CoLab system optimisation and HTS performance across flex cart and core platforms. We will discuss the benefits, learnings and pitfalls of undertaking a complete laboratory re-fit to deliver scientist-friendly HTS automation. In addition, we’ll outline the enhancements made to date with peripheral equipment partners, system scheduling and automated quality control to further improve assay robustness.

    Looking further afield, we’ll present our ongoing activities to integrate screening with our design, make, test and analyse (DMTA) chemistry automation, reviewing our prototype, its functionality and development to date, along with our future concepts and aspirations for an integrated “lab of the future”.

  • Automated System Integration – Some Useful Paradigms
    Paul Hensley, Ionfield Systems

    High-content and high-throughput automation continues to evolve to address the changing requirements of the scientific community. It is reasonable to assume that this evolutionary process will continue, perhaps even accelerate, especially if recent changes in the drug discovery process demonstrate immediate success. Anticipating application changes is already part of many organizations’ planning processes, and the flexibility of new platforms and modules is assuming a high priority. From the perspective of a supplier, we have seen a virtually complete transition from clients having in-house automation expertise to instrument manufacturers and integrators providing upgraded capabilities, applications knowledge and a greater breadth of support services. Flexibility to meet changing discovery processes will result in an expansion of these services. We have developed a set of working paradigms for the integration of multivendor automation that shortens the time from inception to integration and operation. Each installation is a case study that lets us find ways to improve, and these commonalities will apply to many automation projects. Early planning is critical to success. It focuses on identifying the optimal functionality of the hardware for the application: will standard software suffice or will customization be required, how do we design and deliver the user interface, and is the required capability already in the API and drivers? Further along in the process, there is a review of process optimization within the robotic environment, primarily focusing on space utilization and timing/throughput issues. Once the early planning is complete, we study the efficiency of operation of the entire robotic system, ease of operation, access for maintenance, communication of error detection/recovery, the supply of fluids/reagents, and safety.
    The user interface design helps lead the operator through typical day-to-day use with helpful graphics and input prompts. Detailed application control settings are accessed from a click-through screen, easily reached as needed using the module interface. The same interface for automated operation is typically run through the API from the scheduler screens using that software’s design scheme; the detailed settings screens are accessed while at the scheduler screen by screen mirroring. This design eliminates the need to reprogram the API or customize scheduler software when operational settings change. A unique feature of our automation modules is that they are routinely employed to clean labware for repeated reuse. Labware, plus the overhead of ordering, shipping, inventory management, unpackaging and ultimately disposal, can be reduced for most applications without any effect on results. More companies can be expected to offer products and services that address this market. Minimizing waste not only makes financial sense; it also reduces the burden of drug discovery waste on the environment.

  • Reducing the Science Community Plastic Waste Contribution Through Financially Viable Solutions
    Ali Safavi, Grenova

    Plastic waste is a growing issue for our planet. It is estimated that more than 8.3 billion metric tons of plastic have been produced since its introduction, and that less than 10% of total plastics are reused or recycled; the rest enter the waste stream, adding toxins and physical debris to the environment. In 2015, Urbina et al. estimated that labs contribute 5.5 million tons a year to the problem. While many industries work to find innovative solutions to their plastic waste contributions, the science industry has been slow to adopt aggressive abatement or repurposing strategies; at the same time, the amount of plastic in the lab continues to increase. Every sample uses multiple plastic tips and tubes to generate results, and as more labs adopt new technology and automation strategies, laboratory plastic consumables waste will continue to grow. For example, by the end of 2018 it is estimated that over 77 million pounds of pipette tips will have been discarded after a single use into landfills. That’s enough tips to circle the earth more than ten times. Additionally, the electricity used to produce that number of plastic pipette tips could have powered almost 21,000 homes in the US for a year. Multiple technologies and strategies are being employed within automation groups and at the benchtop to help laboratories substantially decrease their plastic waste. Employing these technologies can not only decrease waste but also align with overall laboratory budgetary goals, reducing both financial waste and plastic waste.

Emerging Strategies and Technologies for High-Throughput Automation

Session Chair: Caroline Shamu, Harvard Medical School

  • Toward Standardizing Automated DNA Assembly
    Luis Ortiz, Boston University

    DNA-based technologies have revolutionized research over the past decade, and industry and academia are both keen to leverage these new capabilities as efficiently as possible. A crucial demand is the ability to assemble basic DNA parts into devices capable of executing a desired function. Equally important are the needs to build these devices quickly, robustly, and at scale. Common DNA assembly strategies like Modular Cloning and Gibson Assembly, while well known in the field of Synthetic Biology, often differ greatly in their protocols and execution between research groups, individuals, and even specific DNA constructions. This variability necessitates standardization, especially if the DNA assembly process is to leverage automation. Towards this end, we performed a parameter sweep to identify critical factors, such as DNA part concentration, final assembly size, and the number of DNA parts in a reaction, which most impact DNA assembly efficiency when automating the process. These efforts included the development of a software tool called Puppeteer, which provides high-level in silico specification of assembly products using DNA part repositories and generates both human-readable and machine-readable protocols for DNA assembly reaction preparation. Puppeteer also quantifies the advantage of executing DNA assembly jobs on liquid-handling hardware through a new set of metrics we developed, known as “Q-metrics”. Q-metrics (Q-Time and Q-Cost) quantify the time and cost savings of a particular job done on a liquid-handling robot relative to performing the job manually. We calculated these metrics by running head-to-head experiments comparing the associated hands-on time and costs of performing identical DNA assembly jobs of different scopes both manually and automated with a liquid-handling robot.
    Taken together, we have developed protocols, software, and quantifiable metrics that provide a solid foundation for research groups new to automating the process of DNA assembly. We are continuing to develop our pipeline and expand the assembly strategies used, as well as the number of Puppeteer-compatible liquid-handling devices. It is our hope that we can provide a fully automated, robust, and repeatable DNA assembly package that can be deployed in any academic or industrial setting.
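The Q-metric idea can be illustrated with a minimal sketch. Here Q-Time and Q-Cost are assumed to be simple fold-savings ratios of manual over automated resources; whether Puppeteer defines them exactly this way is not stated in the abstract, and all numbers below are hypothetical.

```python
def q_metrics(manual_minutes, robot_minutes, manual_cost, robot_cost):
    """Return (Q-Time, Q-Cost): fold-savings of automated vs. manual prep,
    assuming a simple ratio definition of the metrics."""
    return manual_minutes / robot_minutes, manual_cost / robot_cost

# Hypothetical 96-reaction assembly job: 180 min of hands-on time manually
# vs. 15 min of setup for a liquid handler; $40 vs. $25 in consumables.
q_time, q_cost = q_metrics(180, 15, 40.0, 25.0)
print(q_time, q_cost)   # → 12.0 1.6
```

A Q-Time above 1 indicates the robot saves hands-on time; values near or below 1 for small jobs would capture the head-to-head comparison the authors describe, where automation only pays off above a certain job scope.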

  • Routine lab assembly line 2.0 – 3D-Inspection and manipulation of biological samples in culture dishes with the PetriJet31X platform
    Felix Lenk, TU Dresden INT

    Digitalization, automation and miniaturization are currently changing the way we live and work. They also affect daily work in laboratories, creating what we perceive as Lab 4.0, or the Lab of the Future. The disruptive development of new technologies such as open-source automation technology, the Internet of Things (IoT) and 3D printing offers endless possibilities for in-house engineering of new laboratory devices that are compact, adaptable and smart. In conjunction with automated 3D image analysis, powerful instruments emerge to create and resolve research data.

    At the SmartLab systems department of the Technische Universitaet Dresden, Germany, approaches for the laboratory of the future have been developed and implemented. These include the PetriJet31X platform technology, which was developed to automate all processes associated with culture dishes in environments such as routine laboratories for microbial screening or blood sample testing, as well as culture development for the next generation of antibiotics.

    Technically, the device is an x-y robot consisting of two linear axes, able to transport all kinds of culture dishes from A to B via a 3D-printed gripper system that can also remove the lid of the culture dish. The core of the programming is a self-learning control software that does not need any teaching – the most time-consuming part of setting up a typical robot. With the presented solution, an experiment conducted on samples is planned only once and executed for all culture dishes in the machine with the right processing station installed, e.g. 3D sample imaging and analysis. It is no longer necessary to specify locations for culture dish piles; treated dishes are allocated dynamically, and user interactions are directed by LED lighting. The system can process more than 1,200 culture dishes in an 8-hour shift and is equipped with a storage unit for these culture dishes. It includes a 3D-imaging station able to capture images of the culture dishes and the biological samples they contain from 60 different positions. From these images, photogrammetric algorithms form a 3D model of the culture dish contents and non-invasively derive properties such as biomass, volume and shape of the biological sample. With this system, the number of samples needed for growth screenings is reduced, as no samples need to be harvested during cultivation.

    The PetriJet31X platform now operates at the Chair of Microbiology at the TU Dresden for screening of new antimicrobial substances and the next generation of antibiotics. The system enables biologists to screen agent combinations faster and to use the acquired image data to feed new deep-learning algorithms.

  • SLAS2019 Innovation Award Finalist: Microfluidic chambers using fluid walls for cell biology
    Cristian Soitu, University of Oxford

    A wide variety of microfluidic platforms aiming to miniaturize and automate biochemical assays have been developed in the last two decades, but the majority have not progressed beyond proof-of-concept experiments due to the complexity of fabricating and operating them. We introduced a novel method for creating arrays of fluidic-walled sessile droplets, the equivalent of conventional microtiter plates, using only standard dishes and fluids [1]. The method consists of displacing a continuous fluid layer into a pattern of isolated chambers overlaid with an immiscible liquid to prevent evaporation. This liquid also acts as a physical barrier, making the system resilient to contamination. The resulting chambers have a rectangular footprint and use the maximum surface available. Pliant, self-healing fluid walls confine volumes as small as 1 nl.

    We now show that the technique is not restricted to arrays but allows quick and flexible prototyping of a wide range of microfluidic circuits with irregular patterns. Chambers of various geometries can be created around a region of interest, which can contain cells, with unprecedented flexibility. A basic set of capabilities is demonstrated with cloning and wound-healing assays. Fluidic-walled chambers offer good optical clarity, making them ideal for imaging the samples contained within. The simplicity of the technique eliminates the need for expensive machinery or equipment; everything is incorporated in a benchtop system that can produce and access arrays of hundreds of fluidic chambers within minutes.

    [1] Soitu, C., A. Feuerborn, A. N. Tan, H. Walker, P. A. Walsh, A. A. Castrejón-Pita, P. R. Cook and E. J. Walsh (2018). "Microfluidic chambers using fluid walls for cell biology." Proceedings of the National Academy of Sciences.

  • Achieving Reproducibility in Biological Experimentation with an IoT-Enabled Lab of the Future
    Ben Miles, Transcriptic

    Reproducing results of published experiments remains a great challenge for the life sciences. Fewer than half of all preclinical findings can be replicated, according to a 2015 meta-analysis of studies from 2011-2014. Many variables contribute to this poor rate of reproducibility including mishandled reagents and reference materials, poorly defined laboratory protocols, and the inability to recreate and compare experiments.

    The consequences are significant. In drug discovery, failure to reproduce experimental outcomes stands in the way of translating discoveries to life-saving therapeutics and revenue-generating products. In economic terms, at least $28 billion in the U.S. alone is spent annually on preclinical research that is not reproducible. Even a small improvement to the rate of reproducibility could make a large impact on speeding the pace and reducing the costs of drug development.

    Most recommendations to address the reproducibility crisis have focused on developing and adopting reporting guidelines, standards, and best practices for biological research.

    Cloud computing combined with the Internet of Things (IoT) offers an opportunity to leapfrog the standards-setting debate and create more precise and reproducible research without adding human capital or increasing process time. An automated, programmatic laboratory reduces error by relying on hands-free experiments that follow coded research protocols. Tracked by an array of sensors and pushed to the cloud, every documented step, measurement, and detail is automatically collected and stored for easy access from anywhere through a web interface. Underpinned by a connected, digital backbone, biological experimentation looks more like an information technology driven by data, computation, and high-throughput robotics, ultimately leading to advances in drug discovery and synthetic biology.
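As a toy illustration of the coded, automatically logged protocols described above (not Transcriptic's actual API), each step can be expressed as data so hardware can execute it while every action is time-stamped into a record pushed to the cloud:

```python
# Illustrative sketch only: step names, fields and the run() helper are
# hypothetical, chosen to show the protocol-as-data idea.
from datetime import datetime, timezone

protocol = [
    {"op": "dispense", "reagent": "buffer_A", "volume_ul": 50, "well": "A1"},
    {"op": "incubate", "temp_c": 37, "minutes": 30},
    {"op": "read", "mode": "absorbance", "wavelength_nm": 600},
]

def run(steps, log):
    """Pretend-execute each step, appending a timestamped record to the log."""
    for step in steps:
        # A real cloud lab would dispatch to instruments and attach sensor data.
        log.append({"step": step, "ts": datetime.now(timezone.utc).isoformat()})
    return log

log = run(protocol, [])
print(len(log), log[0]["step"]["op"])   # → 3 dispense
```

Because the protocol itself is data, two collaborators running `run(protocol, [])` on different days execute byte-identical instructions, which is the core of the reproducibility argument.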

    At Transcriptic, we have developed the Transcriptic Common Lab Environment to address the current state of large-scale experimentation and to unlock a future where software and technology innovation drive high-throughput scientific research. This laboratory of the future is available to users via the internet, allowing them to work in a cloud-based ecosystem where collaborators can traceably fine-tune, debug, and optimize iterative biological experiments to achieve reproducible results.

    As experiments become more complex, datasets larger, and phenotypes more nuanced, transformative technologies like a programmatic robotic cloud lab will be necessary to ensure that these high-value experiments are reproducible, the results can be trusted, and the protocols producing these experiments can be compared. This advance builds on a foundation of IoT-enabled innovations that have already contributed to a smarter, more efficient, and safer world.

In-house and Open Source Automation: Noncommercial products, devices and software used to support laboratory automation

Session Chair: Sam Michael, NIH/NCATS

  • 3D printing to cut out the middle man: rapid prototyping to innovate robots and save money
    J. Colin Cox, Genentech

    3D printing affords highly customized and affordable in-house design solutions for laboratory workflow issues. Previously, addressing a process or vendor shortcoming might have meant working with a plastics vendor to design a new item, such as a reservoir. Doing so could take tens of thousands of dollars, several months of waiting, many tedious back-and-forth meetings with the vendor, statements of work and other legal documents. In this presentation, we show how straightforward ideas, free software, and a thousand-dollar 3D printer can obviate those obstacles. We will review what 3D printing means, how people new to the community can get started, and explain the basic printing process. Next, we will share highlights from our own experience in using 3D printing to rapidly prototype new ideas and demonstrate how a printer can easily pay for itself within a few weeks of use. For instance, we designed customized tube storage racks for thousands of oligonucleotides at only 7% of the cost of a less functional, commercially available storage rack. We will also show how we constructed racks specific to workflows both at the bench and on a robotic work surface, which can be produced on demand. In another use case, we illustrate how microplate bumpers can enhance automated magnetic bead processing. Finally, we will showcase, in a step-by-step fashion, how we rapidly prototyped and iteratively designed reagent-saving reservoirs specific to one of our liquid handlers, saving tens of thousands of dollars each year in wasted reagents. Our solution is fabricated specifically for a particular system and allows reclamation of clean, unused reagents at the end of a method. Overall, this presentation is intended to introduce robotic laboratories to rapid prototyping and 3D printing, and to help labs begin to produce their own solutions without the need for vendors or middlemen.
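The payback arithmetic suggested above can be made concrete. Only the 7% relative cost and the roughly thousand-dollar printer price come from the abstract; the commercial rack price is a hypothetical figure chosen for illustration.

```python
# Hypothetical break-even sketch for in-house printed labware.
commercial_rack = 150.00                  # $ per commercial rack (assumed)
printed_rack = commercial_rack * 0.07     # printed at 7% of commercial cost
saving_per_rack = commercial_rack - printed_rack
printer_cost = 1000.00                    # "a thousand-dollar 3D printer"
racks_to_break_even = printer_cost / saving_per_rack
print(round(printed_rack, 2), round(racks_to_break_even, 1))  # → 10.5 7.2
```

Under these assumptions the printer pays for itself after fewer than ten racks, consistent with the claim that it can pay for itself within a few weeks of use.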

  • An Additive Biomanufacturing Platform to Automate 3D Cell Culture and Screen for 3D Biomaterial Properties
    Sebastian Eggert, Institute of Health and Biomedical Innovation, Queensland University of Technology

    Three-dimensional (3D) cell culture technologies are emerging as a toolbox to engineer 3D models with physiologically relevant characteristics to bridge the gap between in vitro and in vivo. To incorporate sophisticated biochemical and mechanical cues as mimics of the native extracellular matrix, biomaterials – particularly hydrogels – are implemented to design physiologically relevant models for investigating cell physiology and manufacturing disease models.

    Even though the relevance of these 3D models has become recognized and successfully demonstrated for drug screening applications, current 3D cell culture technologies for hydrogels are not well suited to industrial applications. Preparation and manufacturing of 3D models involve many manual steps that increase the probability of user error; errors that can potentially manifest as inaccurate results. Additionally, manufacturing 3D models with standard laboratory equipment, which was originally developed for 2D cell culture, is challenging, since automated solutions for 3D model preparation and manufacturing involving hydrogels are missing. In addition, the low throughput of hydrogel-based 3D cell culture applications hampers widespread adoption and integration into the drug development pipeline. For these reasons, 3D cell culture can benefit greatly from the integration of automation and high-throughput approaches for 3D model manufacturing and screening.

    Herein, an ‘Additive Biomanufacturing Platform’ has been designed and developed for the automated production and testing of hydrogel-based 3D models to screen for 3D biomaterial properties. Using a modular platform design, exchangeable and customizable units have been designed to enable a (semi-)automated workflow for preparation, manufacturing, and screening. The biomanufacturing unit combines conventional pipetting capabilities with emerging bioprinting functionalities, resulting in a closed process workflow covering the preparation of the various hydrogel compositions, the required mixing steps with cells, and finally the production of the 3D models in SBS-compatible or customized formats. In addition to a well plate transportation unit, the integrated inventory and storage units enable a throughput of up to 20 well plates. To demonstrate the applicability of the developed platform to automate the manufacturing of 3D models, the integration of various hydrogel systems will be presented along with the capability to manufacture microdroplets as well as defined geometric structures. The feasibility of manufacturing physiologically relevant 3D models in a reproducible and high-throughput manner will be demonstrated with breast (MCF-7) and prostate (LNCaP) 3D cancer models.

    In this presentation, we discuss the current barriers to successfully translate hydrogel-based 3D models for industrial applications: Automation, throughput, and validation criteria. Building upon our research objectives, the talk presents how the development of automated solutions for manufacturing and screening of 3D models are capable of fostering reproducibility and efficacy. Finally, an outlook will present the capabilities to screen for 3D biomaterial properties to establish a 3D model library.

  • Enabling Endogenous Protein Detection by Acoustic Droplet Ejection: 3D-Printed Labware and Beyond
    Michael Iannotti, NIH/NCATS

    While antibody-based assays such as sandwich ELISA and AlphaLISA can detect unmodified proteins and are routinely used in high-throughput screening (HTS), they have constraints due to the quality and availability of antibodies, excessive costs, and technical considerations. An alternative to these techniques is the reverse phase protein array (RPPA), in which nanoliter spots of cell lysate are arranged at high density onto a protein-binding substrate, typically using nitrocellulose-coated glass slides and pin- or tip-based arrayers, to enable endogenous protein quantification by ubiquitous immunochemical protocols. While RPPA requires only a single antibody, the high cost of coated slides can be prohibitive for large HTS campaigns, and instrumentation compatibility with slides can be an issue.

    To generate new protein-based assays for 1536-well HTS, acoustic droplet ejection (i.e. acoustic dispensing) was used to transfer nanoliter volumes of media from cell-containing source plates to recipient assay plates for quantification of a secreted bioluminescent reporter. We then designed an orthogonal antibody-based methodology by merging the precision and flexibility of acoustic dispensing with the throughput of RPPA. In this technique, which we call acoustic RPPA, costly nitrocellulose-coated slides are replaced by prototype 3D-printed, automation-compatible nitrocellulose membrane plates to enable an acoustically arrayed immunoassay with similar performance to the bioluminescent assay for measuring protein secretion. Incorporating protein standards enables the quantification of native, endogenously secreted protein from 384- and 1536-well cell-based assays with picogram sensitivity, and the non-destructive nature of acoustic RPPA permits multiplexed media sampling with cytotoxicity and imaging assays. Performance and throughput of new iterations of acoustic RPPA assay plates are being further improved by 3D printing and machining additional labware that is commercially unavailable. The acoustic RPPA methodology can therefore be utilized as a powerful standalone proteomic technique or component of a robust drug discovery platform that generates extensive biological profiles from individual wells using relevant cellular systems such as stem and primary cell models.

  • Collaborative product development of equipment for high throughput screening
    Helen Plant, AstraZeneca Pharmaceuticals

    There is a wide range of commercial equipment available for screening purposes, including reagent dispensers, plate washers, and multiparameter plate readers. Both speed and accuracy are key for use in a high throughput screening environment, where equipment can be either used in stand-alone mode, or integrated onto more complex automation platforms.

    AstraZeneca have undertaken an extensive review of this equipment over the last 4 years as part of an infrastructure refresh programme. This has led us to select preferred equipment and to identify gaps in this niche area of science. In turn we have developed positive partnerships with vendors to drive the development of these key pieces of equipment, fulfilling our requirements and delivering better products.

    Within this presentation we will give examples of how the following products have been developed and how they can now improve HTS workflows:

    • The Certus Flex microplate dispenser: microvalve technology for contactless dispensing of a wide range of liquid classes.
    • The BlueCatBio centrifugal plate washer: centrifugal emptying with the individual addition of up to four separate solutions for complex phenotypic assays.

    We will outline the enhancements to date with peripheral equipment partners to collaboratively design and deliver hardware solutions and software enhancements for these devices.

    Moreover we will highlight the future developments in this field and how we envisage these being applied.


Biologics Discovery

Track Chairs: Wade Blair, Viiv Healthcare and Jonah Rainey, Oriole Biotech, Inc

Next-generation technologies: High-throughput miniaturized screening platforms for rapid discovery and development of lead biologics

Session Chair: Veronique Lecault, AbCellera

  • Accelerate Library-to-Lead Triage of Antibody Libraries Using Array SPR
    Yasmina Abdiche, Carterra

    Throughput, speed, resolution, and sample consumption are often key limiting factors for performing detailed kinetic and epitope characterization of antibody libraries destined for use as therapeutics or as supporting diagnostics and reagents. In addition, antibody immune responses can be used to survey the epitope landscape of pathogens and inform the design of better immunogens for developing vaccines. Here, we demonstrate three core applications of Array SPR that can be performed routinely and quickly in a 384-array format: (1) capture kinetics, (2) epitope binning, and (3) epitope mapping. Together, these assays provide a comprehensive characterization of an antibody library with minimal sample consumption, often less than 5 micrograms per antibody for most applications, enabling more confident decisions earlier in the research process and obviating the need for preliminary ELISA screening. We show that Array SPR can generate high-quality capture kinetic data from 384 mAbs in parallel using less than 10 micrograms of purified antigen, making it highly efficient in antigen consumption. Epitope binning in a 384-array format allows a rapid “high-definition” assessment of the depth and breadth of a library’s epitope diversity, providing exquisite resolution between near-identical clones, which can aid in the identification of new epitopes and shed light on mechanism of action. Since antibody consumption does not scale with the size of the panel being investigated when using Array SPR, a 384x384 binning matrix consumes the same amount of antibody sample as a smaller panel. Finally, arraying a peptide library facilitates the epitope mapping of a large panel of antibodies (384+), provided their epitopes can be recapitulated on peptides.

  • When the Power of Nature Meets Technology: Overcoming Tolerance by Deep Mining of Immune Repertoires
    Veronique Lecault, AbCellera

    The majority of monoclonal antibody (mAb) therapeutics have been generated and isolated from natural immune systems, typically using mouse or rat hybridoma methods. As targets become increasingly challenging, the ability to look deeply into natural immune repertoires is often critical to unlocking therapeutic programs.

    AbCellera has developed an end-to-end antibody discovery platform to select monoclonal antibodies with defined properties directly from single B cells. Antibody-secreting cells (ASCs) from patients or immunized animals are loaded into devices containing arrays of nanoliter-volume chambers. Compartmentalization of ASCs in small-volume chambers enables the detection of antibody secretion from single cells within hours. The screening platform is integrated within a fully automated instrument that enables the identification and recovery of selected ASCs based on high-throughput automated fluorescent imaging. ASCs with desired antibody properties are identified based on a wide array of multiplexed assays, including bead-based binding, specificity and competition assays, as well as cell-based assays for binding and functional readouts. Single cells of interest are recovered and processed using a next-generation sequencing approach to retrieve the paired heavy and light chain variable region sequences with high efficiency (> 90%). Using a two-step screening process, this platform allows for multi-parameter antibody selections from any species (e.g. human, mouse, rat, rabbit, llama) and from different immune tissue sources (e.g. blood, spleen, bone marrow, lymph nodes), with a routine throughput between 200,000 and 5,000,000 cells screened per run and recovery of over 200 paired sequences in approximately 5 days.

    While antibodies from natural immune responses are widely regarded as superior to those generated by display technologies, immune tolerance poses a serious challenge for targets with high inter-species homology. Insoluble and poorly immunogenic targets such as membrane proteins exacerbate this challenge. We show how AbCellera’s ultra-deep screening technology overcomes these challenges, producing hundreds of diverse rodent antibodies against targets with 100% rodent-human homology.

  • Single-cell Deposition Technologies to Increase Cloning Efficiency During Cell Line Development
    Mandy Yim, Genentech

    Technologies such as microfluidic drop-on-demand dispensing have gained much attention in the biopharmaceutical industry in recent years, largely driven by health authorities’ requirements to demonstrate that biologics-producing cell lines are clonally derived. Recent technology advances offer advantages such as higher deposition efficiency, additional monoclonality assurance, and the option to integrate the workflow into a robotic system. Here we review our early assessments of emerging single-cell deposition platforms intended to increase cloning efficiency at our cell culture facility. In this talk we will discuss the technology that we adopted and its subsequent integration into our workflow.

  • The Data Journey at GSK: How Biopharm Molecular Discovery has digitalised its workflows to increase data quality, reduce cycle-times and enable data reuse.
    Stephen Ashman, GlaxoSmithKline

    In recent years, the Biopharm Molecular Discovery team at GSK has invested in platform technologies and process optimisation which has dramatically increased the scale and throughput of the organisation. As we have grown, the volume of samples we manage and data we produce has increased greatly and traditional methods of capturing, analysing and sharing data for teams to interpret are no longer adequate to keep pace. At GSK we passionately believe that our data and the organisational knowledge that it encodes is one of our greatest assets. To maximise its value, we have modernised our workflows and processes. By focussing on data standards, workflow automation, non-invasive data governance, data integration and upskilling, the "Data Journey Team" has significantly improved the efficiency of our therapeutic programs, increased access to data and enabled new analyses which previously could not have been considered. To deliver best practice across the organisation, we have ensured that every therapeutic program team has a data analyst whose role is to assure the quality of the team's data and configure dashboards which are accessible to all team members and offer impactful visualisations to facilitate rapid decision-making. To enable this, we have data engineers who integrate data from multiple master data sources including our legacy systems. Our goal is to remove the need for data silos (spreadsheets and slide presentations) and instead provide our teams with the tools to interrogate the source data directly. The efficiency can be remarkable; some critical path data wrangling tasks which previously took weeks can now be achieved in a few minutes with templating and well-designed visualisations. Data integration has also enabled us to reveal learnings across multiple programs and quickly share high quality data with business partners for secondary uses such as modelling, prediction and in silico design. 
The upshot: we have developed a Biopharm Knowledge-base, populated from source data and metadata, accessible across the organisation and generating powerful new insights to drive our science.

Specialized approaches to develop biologic therapeutics for complex targets

Session Chair: Jonah Rainey, Oriole Biotech, Inc

  • Generation of ion channel blocking antibodies by fusing venom-derived 'knottins' into antibody CDR loops
    Damian Bell, IONTAS

    Much effort and expenditure has been spent by pharmaceutical and biotechnology companies with little success in the quest to generate potent and selective antibody inhibitors of ion channels. In contrast, a multitude of venomous animals block ion channels using small cysteine-rich peptides in defence or predation. However, such naturally occurring “knottin” blockers of ion channels often suffer from manufacturing difficulties, short half-lives and a lack of specificity.

    Using phage display we have developed a novel molecular fusion format wherein naturally occurring cysteine-rich peptides are inserted into peripheral CDR loops of an antibody while retaining the folding and function of both molecules. In this novel format (termed a KnotBody), the cysteine-rich peptide enjoys the extended half-life of an antibody molecule and the peripheral CDRs gain additional diversity within a scaffold which is pre-disposed to blockade of ion channels. We have demonstrated functional insertion of multiple cysteine-rich peptides which block the voltage-gated potassium channel Kv1.3 and the acid-sensing ion channel ASIC1a. The modular nature of the KnotBody binding surface and the amenability of this format to phage display technology will facilitate further optimisation of potency and selectivity of ion channel blockade by engineering both knottin and antibody loop sequences.

  • Antibacterial monoclonal antibodies: A strategy to prevent serious bacterial infections
    Christine Tkaczyk, MedImmune

    Staphylococcus aureus is an opportunistic pathogen that causes numerous debilitating infections such as pneumonia, endocarditis, bacteremia, and complicated skin and soft tissue infections. These infections are associated with high levels of morbidity and mortality, especially in immunocompromised patients. Antibiotic resistance has reduced treatment options, and the increasing incidence of multidrug-resistant isolates underscores the need for alternative antibacterial strategies such as immunotherapy. We have developed three human monoclonal antibodies (mAbs) against Staphylococcus aureus secreted alpha toxin (AT; MEDI4893), the four main bi-component leukotoxins (SAN481), and surface-expressed Clumping Factor A (SAR114): six virulence factors exerting different activities on the host. AT and the bi-component leukotoxins (LukSF, LukED, HlgAB and HlgCB) are pore-forming toxins that cause immune dysregulation and cell death and promote bacterial dissemination, whereas ClfA is a fibrinogen-binding protein that promotes biofilm formation, bacterial agglutination and complement evasion. The mAb combination showed broad strain coverage in multiple animal models through complementary mechanisms of action such as toxin neutralization, inhibition of bacterial agglutination, and opsonophagocytic killing. MEDI4893 prophylaxis also demonstrated efficacy in a S. aureus + Gram-negative mixed lung infection model. Together, our data hold promise for an alternative therapy for the prevention of S. aureus disease.

  • Recombinant human B cell repertoires enable screening for rare, specific and natively-paired antibodies
    Saravanan Rajan, MedImmune

    The human antibody repertoire is increasingly being recognized as a valuable source of therapeutic grade antibodies. However, methods for mining primary antibody-expressing B cells are limited in their ability to rapidly isolate rare and antigen-specific binders. Here we show the encapsulation of two million primary B cells into picoliter-sized droplets, where their cognate V genes are fused in frame to form a library of scFv cassettes. We used this approach to construct natively-paired phage-display libraries from healthy donors and drove selection towards cross-reactive antibodies targeting influenza hemagglutinin. Within four weeks we progressed from B cell isolation to a panel of unique monoclonal antibodies, including seven that displayed broad reactivity to different clinically-relevant influenza hemagglutinin subtypes. Most isolated antibody sequences were not detected by next-generation sequencing of the paired repertoire, illustrating how this method can isolate extremely rare leads not likely found by existing technologies.

  • Integration of biotherapeutics in combination screening with small molecule libraries
    Matthew Hall, National Ctr for Advancing Translational Sci

    Biotherapeutics accounted for 50% (11/22) of the new molecular entities approved by the FDA in 2016, with an additional 15 biotherapeutic NMEs approved in 2017, comprising nine antibodies, one antibody-drug conjugate, and two enzymes. This highlights a trend among pharmaceutical companies to invest significantly in modalities beyond small molecules. With over 200 approved recombinant protein products on the market and over 1,500 more in clinical trials, there is a clinical need to identify effective combinations of small molecules and biotherapeutics. We have developed a high-throughput screen of annotated compound libraries in combination with biologics to identify tractable synergistic combinations and elucidate the underlying biology. These initial studies have led to small molecule/biologic combinations that have shown synergy and promise in animal models. Optimization of experimental procedures will be described and case studies will be discussed.

Phenotypic selections and novel assays to enable biologics discovery

Session Chair: Steven Rust, MedImmune

  • A patient centric function F.I.R.S.T™ approach to cancer immunotherapy discovery
    Björn Frendéus, BioInvent

    Identification of novel targets, pathways and drugs to boost activity and overcome resistance to currently available drugs is a formidable challenge in improving cancer patient survival. BioInvent has developed a phenotypic discovery platform, F.I.R.S.T™, that uses primary patient cells, immune-competent and humanized experimental models, and a human recombinant antibody library to identify clinically relevant target:antibody pairs in an integrated manner.

    This talk will introduce and exemplify how F.I.R.S.T™ has been used to identify multiple target:antibody pairs that modulate key pathways critical to overcoming immune suppression in the tumor microenvironment. The focus will be on complementary strategies for identifying novel Treg pathways and on overcoming antibody drug resistance.

  • High Throughput Measurement of Antibody Drug Pharmacokinetics: Assessment of Penetration of Antibodies and Conjugates into 3D Cell Culture Models
    Thomas Villani, Visikol, Inc

    In developing an antibody-based therapeutic for a solid tumor target, it is essential that the therapeutic not only binds the target of interest with high affinity but also penetrates the tissue. The ability of an antibody-based therapeutic to penetrate tissue has a significant impact on its in vivo efficacy, so a reliable and inexpensive assay for screening antibodies for penetration has great value in the therapeutic development process. To address this need, an in vitro assay was developed that combines 3D cell culture models with tissue clearing and fluorescent labeling to quantitatively assess antibody penetration. We demonstrate that this assay allows rapid evaluation of antibody penetration in models that accurately replicate in vivo diffusion kinetics and that whole antibody libraries can be quickly screened. This assay allows antibodies to be quickly screened, altered and optimized in inexpensive in vitro assays prior to expensive in vivo studies, where poor penetration can lead to late-stage failures.

  • High-throughput Analysis of Bispecific Antibodies for Clone Selection using RapidFire Mass Spectrometry
    John Tran, Genentech

    With recent advances, high-throughput label-free screening is now a reality. Among the technologies that have enabled this, the Agilent RapidFire, a trap-and-elute liquid chromatography system coupled to mass spectrometry, has been routinely utilized for screening small molecule, peptide, and protein libraries. However, there are relatively few reported applications of this system for screening larger drugs such as the emerging bispecific antibody therapeutics. In our application, bispecific IgGs were generated by coexpressing two different light and heavy chains in a single host cell, with assembly facilitated by introducing knobs-into-holes mutations in the heavy chains. While this strategy is potentially more efficient, unwanted mispaired IgG species can be produced in addition to the desired bispecific IgG (1). The goal of this assay is to rapidly screen hundreds to thousands of clones by characterizing and quantitating bispecific antibodies to select the best pair match. To achieve the desired throughput, we developed a RapidFire-MS method for this purpose.
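    The mispairing problem can be illustrated with a toy mass calculation: coexpressing two heavy and two light chains yields several assembled IgG species, only one of which is the desired knob/hole bispecific. The chain masses below are invented placeholders; in practice each species is resolved and quantified by intact-mass measurement.

```python
# Hypothetical chain masses (Da) for a knobs-into-holes bispecific.
# Enumerating HC/LC pairings shows why several assembled species arise.
from itertools import combinations_with_replacement

heavy = {"HC_knob": 49000.0, "HC_hole": 49500.0}
light = {"LC1": 23000.0, "LC2": 23400.0}

# All assembled IgG species: two heavy chains plus two light chains.
species = {}
for h1, h2 in combinations_with_replacement(heavy, 2):
    for l1, l2 in combinations_with_replacement(light, 2):
        name = f"{h1}/{l1} + {h2}/{l2}"
        species[name] = heavy[h1] + heavy[h2] + light[l1] + light[l2]

# Desired bispecific: knob and hole HCs, each with its cognate LC.
desired = heavy["HC_knob"] + light["LC1"] + heavy["HC_hole"] + light["LC2"]
print(desired)  # every other combination is a potential mispaired species
```

    Note that some mispaired species can be isobaric with the desired product, which is part of what makes quantitative MS characterization of clones non-trivial.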

  • Antibody Phenotypic Selection Enables the Discovery of Novel Molecular Targets and Therapies
    Steven Rust, MedImmune

    Antibody drug discovery has predominantly relied upon target-specific approaches for clinical success. However, as the search for well-validated, antibody-tractable targets becomes more challenging and target space becomes increasingly competitive, alternative approaches, such as “target-agnostic” antibody phenotypic selection, are gaining favour. In this presentation, successful antibody phenotypic selections developed for the discovery of Oncology therapies will be detailed. Results from screening antibodies for anti-proliferative and pro-apoptotic effects will be discussed. The application of 3D cell cultures and high content imaging to identify antibodies with subtle mechanistic differences will also be detailed. Finally, the requirements for performing an effective antibody phenotypic selection, including cell supply, antibody enrichment, assay development and target deorphanisation, will be summarised.


Cellular Technologies

Track Chairs: Greg Davis, SIAL and Luke Gilbert, UCSF

Advances in Genome Editing Technologies

Session Chair: Gregory Davis, MilliporeSigma

  • Strategies for Addressing Chromatin Limitations in CRISPR-based Genome Editing
    Gregory Davis, MilliporeSigma

    CRISPR systems function as an adaptive immune system in bacteria and archaea, defending against invading viruses and selfish genetic elements. While the effector nucleases from some CRISPR systems have been engineered for heterologous targeting in eukaryotes, practical implementation of most CRISPR systems has met with mixed or little success. Furthermore, within eukaryotic chromosomes, distinct genomic loci are sometimes recalcitrant to modification by even the best-crafted CRISPR tools. Eukaryotic chromatin structure is distinctly different from that of prokaryotes, creating potential molecular incompatibilities, since bacterial genomes represent the context in which all known CRISPR systems have evolved. Mammalian heterochromatin has been hypothesized to limit Cas9 target access, and recently published data suggest that nucleosomes do impede Cas9 target access and cleavage in vitro. This presentation will briefly review literature substantiating chromatin as a barrier to genome editing and describe how this issue has influenced our strategic approaches to improving CRISPR-based genome editing. In particular, data presented will cover our proxy-CRISPR methods for restoring the DNA binding, nuclease, and HDR activity of various CRISPR systems (Chen et al., 2017) and a new, unpublished approach based upon Cas9/Cpf1 fusions to several domains known to destabilize nucleosomes and facilitate decompaction of chromatin.

  • Developing methods to enable precise and specific genome editing in living cells
    Wei-Hsi Yeh, Harvard University & Broad Institute

    Methods to precisely and efficiently edit DNA within the genome of living cells have long been a major goal of the life sciences. The recently established field of base editing enables genome editing without inducing double-stranded DNA breaks. DNA base editors comprise a catalytically disabled nuclease fused to a nucleobase deaminase enzyme and, in some cases, a DNA glycosylase inhibitor. Base editors directly convert one base pair into another, enabling the efficient installation of point mutations in non-dividing cells without generating excess undesired editing byproducts. Thus far, two classes of base editor have been developed: cytidine base editors, which convert a C:G base pair to a T:A base pair, and adenine base editors, which convert an A:T base pair to a G:C base pair.

    The application of base editing, like other forms of genome editing, should ideally proceed with minimal off-target modification at unintended genomic loci. We describe two advances that address this limitation. First, we reduced off-target base editing by installing mutations into our third-generation cytidine base editor (BE3) to generate a high-fidelity base editor (HF-BE3). Second, we purified and delivered BE3 and HF-BE3 as ribonucleoprotein (RNP) complexes into mammalian cells, establishing DNA-free base editing. RNP delivery of BE3 confers higher specificity even than plasmid transfection of HF-BE3, while maintaining comparable on-target editing levels. We applied these advances to deliver BE3 RNPs into both zebrafish embryos and the inner ear of live mice to achieve specific, DNA-free base editing in vivo. We also characterized the specificity of the adenine base editor (ABE) and determined that it displays dramatically improved DNA specificity compared to Cas9 nuclease. Finally, we will discuss other new approaches for precision genome editing that do not rely on the generation of DNA double-stranded breaks.

  • CRISPR-STOP: gene silencing through base-editing-induced nonsense mutations
    Cem Kuscu, University of Tennessee Health Science Center

    CRISPR–Cas9-induced DNA damage may have deleterious effects at high-copy-number genomic regions. Here, we use CRISPR base editors to knock out genes by changing single nucleotides to create stop codons. We show that the CRISPR-STOP method is an efficient and less deleterious alternative to wild-type Cas9 for gene-knockout studies. Early stop codons can be introduced in ~17,000 human genes. CRISPR-STOP-mediated targeted screening demonstrates comparable efficiency to WT Cas9, which indicates the suitability of our approach for genome-wide functional screenings.
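    The core idea, scanning a coding sequence for single C-to-T changes (or, via editing the antisense strand, G-to-A changes) that create an in-frame stop codon, can be sketched as follows. This is an illustrative reimplementation of the concept, not the authors' code, and it ignores the editing-window and PAM constraints a real design tool must apply.

```python
# Sketch of the CRISPR-STOP concept: find codons that a cytidine base
# editor can convert to a stop codon with a single-base change.
# A C->T edit on the antisense strand appears as G->A on the sense strand.
STOPS = {"TAA", "TAG", "TGA"}

def stop_creating_edits(cds):
    """Return (codon_index, original_codon, edited_codon) for each
    single C->T or G->A change that creates an in-frame stop codon."""
    hits = []
    for i in range(0, len(cds) - 2, 3):
        codon = cds[i:i + 3]
        if codon in STOPS:
            continue  # already a stop
        for pos, base in enumerate(codon):
            new = "T" if base == "C" else "A" if base == "G" else None
            if new is None:
                continue  # only C or G positions are editable this way
            edited = codon[:pos] + new + codon[pos + 1:]
            if edited in STOPS:
                hits.append((i // 3, codon, edited))
    return hits

# CAA (Gln) can become TAA; TGG (Trp) can become TAG or TGA.
print(stop_creating_edits("ATGCAATGGTTT"))
```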

  • An Automated Gene-Editing Platform for Breast Cancer
    Angela Quach, Concordia University

    About 70% of breast tumours express estrogen receptor alpha (ERα), a strong driver of breast cancer proliferation. Beyond its complex roles in cancer, ERα also controls a wide range of physiological processes, from regulating the development and function of the female reproductive system to initiating protective functions. Fully understanding the connection between the physiological and molecular functions of the estrogen receptor, however, requires an in-depth understanding of the spectrum of genes regulated in the ER pathway. Hence, studies have sought new small-molecule inhibitors that prevent the upregulation of genes in the estrogenic signaling pathway. Before designing new inhibitors, a continuing challenge is to identify the genes in the ER pathway that regulate the growth and differentiation of the cells, for example by knocking them out using gene-editing techniques. Our group recently developed an automated gene-editing droplet-based microfluidic platform capable of automating the culturing and editing of lung cancer cells using traditional lipid-mediated transfection, with cellular analysis performed through imaging techniques. In the context of breast cancer, however, traditional methods of gene editing are not possible: breast cancer cells (like T47D-KBLuc) are typically hard to transfect, which led us to integrate alternative gene delivery methods (e.g., viral transduction and electroporation) on our microfluidic system to improve delivery. Furthermore, to show the versatility of our platform, we tested different editing methods: RNA interference (RNAi) using short-hairpin RNA (shRNA) and CRISPR using Cas9 protein.

    Using these different methods, we present three main findings: (1) optimization of gene delivery by viral transduction and electroporation, assessing the efficacy of the novel integration of these methods on droplet-based microfluidics; (2) on-device cell proliferation measurements using immunofluorescence, maintaining the integrity of analysis-on-chip; and (3) targeting of genes expressing ERα, p53 and other important oncogenes, to be assessed in a loss-of-function screen. Overall, with the incorporation of alternative gene delivery methods, this platform aims to extend automated gene editing to harder-to-transfect cell lines, further demonstrating the flexibility and efficacy of automated on-device gene-editing tools. We hope this will allow therapeutic targets for breast cancer treatment to be found more easily and rapidly.

Genetic Screens for Target Discovery and Validation

Session Chair: Luke Gilbert, UCSF

  • Mapping the genetic landscape of human cells
    Luke Gilbert, UCSF

    Seminal yeast studies established the value of comprehensively mapping genetic interactions (GIs) for inferring gene function. Efforts in human cells using focused gene sets underscore the utility of this approach, but the feasibility of generating large-scale, diverse human GI maps remains unresolved. We developed a CRISPR interference platform for large-scale quantitative mapping of human GIs. We systematically perturbed 222,784 gene pairs in two cancer cell lines. The resulting maps cluster functionally related genes, assigning function to poorly characterized genes, including TMEM261, a new electron transport chain component. Individual GIs pinpoint unexpected relationships between pathways, exemplified by a specific cholesterol biosynthesis intermediate whose accumulation induces deoxynucleotide depletion, causing replicative DNA damage and a synthetic-lethal interaction with the ATR/9-1-1 DNA repair pathway. Our map provides a broad resource, establishes GI maps as a high-resolution tool for dissecting gene function, and serves as a blueprint for mapping the genetic landscape of human cells.
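    Quantitative GI mapping rests on comparing observed double-perturbation fitness to an expectation from the single perturbations. The multiplicative null model sketched below is the common formulation; the study's exact scoring and normalization may differ.

```python
# A minimal sketch of a genetic-interaction (GI) score under the
# common multiplicative null model (illustrative, not the study's code).
def gi_score(f_a, f_b, f_ab):
    """Deviation of observed double-perturbation fitness (f_ab) from
    the product of single-perturbation fitnesses (f_a * f_b).
    Negative -> synthetic sick/lethal; positive -> buffering/epistasis."""
    return f_ab - f_a * f_b

# Synthetic-lethal example: each single knockdown is mildly sick (0.8),
# but the double is nearly dead (0.1), far below the expected 0.8 * 0.8.
print(gi_score(0.8, 0.8, 0.1))  # strongly negative GI
```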

  • Hierarchical organization of the human cell from a cancer coessentiality network
    Traver Hart, MD Anderson Cancer Center

    Genetic interactions govern the translation of genotype to phenotype at every level, from the function of subcellular molecular machines to the emergence of complex organismal traits. Systematic survey of genetic interactions in yeast showed that genes that operate in the same biological process have highly correlated genetic interaction profiles across a diverse panel of query strains, and this observation has been exploited to infer gene function in model organisms. Systematic surveys of digenic perturbations in human cells are also highly informative, but are not scalable, even with CRISPR-mediated methods.

    Given this difficulty, we developed an indirect method of deriving functional interactions. We hypothesized that genes with correlated knockout fitness profiles across diverse, non-isogenic cell lines are analogous to genes with correlated genetic interaction profiles across isogenic query strains, and would similarly imply shared biological function. We assembled genes with correlated fitness profiles across 400 CRISPR knockout screens in cancer cell lines into a “coessentiality network,” with up to 500-fold enrichment for co-functional gene pairs.

    Functional modules in the network are connected in a layered web that recapitulates the hierarchical organization of the cell. Notable examples include the synthesis of oligosaccharide chains (two distinct modules), connected to the N-linked glycosylation complex (a third module), which glycosylates the EGFR and IGF1R receptors, which are further linked to downstream signaling modules. A second subnetwork delineates amino acid regulation of the mTOR complex via signaling through lysosomal transport, the Ragulator complex, and the GATOR2 complex, while classical TSC1/2 regulation is connected through a separate branch. The network contains high-confidence connections between over 3,000 human genes and provides a powerful platform for inference of gene function and information flow in the cell.
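    The coessentiality idea can be sketched with a toy example: compute the Pearson correlation between gene fitness profiles across screens and link highly correlated pairs. The gene names, fitness values, and correlation threshold below are invented for illustration; the real network is built from roughly 400 screens with enrichment-calibrated thresholds.

```python
# Toy coessentiality edge: genes whose knockout-fitness profiles
# correlate across cell-line screens are linked in the network.
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length profiles."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Rows: fitness effect of each knockout across five cell lines (invented).
profiles = {
    "GENE_A": [-1.0, -0.2, -0.9, -0.1, -0.8],
    "GENE_B": [-0.9, -0.3, -1.0, -0.2, -0.7],  # tracks GENE_A
    "GENE_C": [-0.1, -0.8, -0.2, -0.9, -0.3],  # anticorrelated
}

# Link gene pairs whose profiles correlate above an illustrative cutoff.
edges = [(a, b) for a in profiles for b in profiles
         if a < b and pearson(profiles[a], profiles[b]) > 0.8]
print(edges)  # the correlated pair forms a coessentiality edge
```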

  • Cell growth is an omniphenotype
    Tim Peterson, Washington University

    Genotype-phenotype relationships are at the heart of biology and medicine. Numerous advances in genotyping and phenotyping have accelerated the pace of disease gene and drug discovery, but with so many genes and drugs now available to study, prioritizing them is difficult. Disease model assays are also growing more complex, which is reflected in the growing complexity of research papers and the cost of drug development. Here we propose a way out of this arms race. We argue that synthetic interaction testing in mammalian cells, using cell growth as a readout, is an overlooked approach to judging the potential of a genetic or environmental variable of interest (e.g., a gene or drug). The idea is that if a gene or drug of interest, combined with a known perturbation, causes a strong cell growth phenotype relative to the known perturbation alone, this justifies proceeding with the gene or drug in more complex models, such as mouse models, where the known perturbation is already validated. This recommendation is backed by the following: 1) most genes deregulated in cancer are also deregulated in other complex diseases; 2) diverse non-cancer drug responses on patient cell growth readily predict responses for clinical outcomes of interest; 3) gene and drug effects on cell growth correlate with their effects on most molecular phenotypes of interest. Taken together, these findings suggest cell growth could be a broadly applicable, single-source-of-truth phenotype for gene and drug function. Measuring cell growth is robust and requires little time and money, features long capitalized on by pioneers using model organisms that we hope more mammalian biologists will recognize.

  • Capturing macrophage complexity in vitro: human iPSC models as a platform for disease modelling and drug discovery
    David Brierley, GlaxoSmithKline

    Despite a substantial appreciation of the critical role of macrophages in adaptive and innate immunity, detailed mechanistic understanding of human macrophage biology has been hampered by the lack of reliable and scalable models for cellular and genetic studies. Although commonly used model systems such as macrophage-like leukemic cell lines or primary monocyte-derived macrophages have advanced our understanding of macrophage biology, such cellular systems have differing limitations with regard to human disease relevance, genetic fidelity and practicality of cell supply. Here we discuss the application of human induced pluripotent stem cell (hiPSC)-derived macrophages as an unlimited source of patient genotype-specific cells that provides a powerful and scalable platform for disease modeling and drug screening.

    Through multi-parametric high-content cell-based assays, we demonstrate that hiPSC-derived macrophages can be polarized in vitro to both functionally and molecularly distinct ‘classically activated’ and ‘alternatively activated’ subtypes. Furthermore, we compare the molecular fidelity of such subtype-specific cellular models to primary counterparts through phenotypic, functional, transcriptomic and proteomic profiling. Harnessing our hiPSC-derived macrophage platform with CRISPR/Cas9 based genome editing has further enabled the genetic, molecular and cellular exploration of mechanisms underlying immune cell disease phenotypes.

    Embedding such hiPSC cellular models into the earliest stages of drug discovery, in place of traditional reductionist cellular models, provides both the human physiological relevance and the throughput needed to improve the efficiency of drug discovery and development.

Development of Cellular Models for Phenotypic Screening

Session Chair: Xiaoxia Cui, Washington University in St. Louis

  • Building accurate research models via genome editing
    Xiaoxia Cui, Washington University in St. Louis

    For drug discovery and therapeutic development, the quality of data relies heavily on the model used to mimic either the disease state or a biological process in humans. The models can be immortalized cancer cell lines, induced pluripotent stem cell (iPSC) lines, primary tissues or whole animals. With rapid advances in gene editing technologies, desired modifications are being introduced more precisely into a wide range of genomes. Yet limitations still exist. I will discuss the various types of edits we have created at our center, such as scar-free point mutations, large deletions, large insertions, and conditional alleles; the issues we ran into and our solutions for obtaining specific on-target modifications; and our efforts to increase throughput. We are also examining our large collection of edited single-cell clones for unexpected on-target modifications and off-target events to better understand the genetic accuracy of the models we create.

  • Patient specific human iPSCs to model fetal hematopoietic anomalies associated with infant leukemia
    Todd Druley, Washington University School of Medicine

    Most pediatric cancers arise from immature cell types, and nearly all show a paucity of somatic mutation, indicating a significant link between pediatric cancer, germline variation and aberrant development. Infant leukemia (IL) is a unique and poorly understood pediatric leukemia with a mortality rate >50% and almost no somatic mutation. While IL has a high prevalence of KMT2A rearrangements (MLL-r), these rearrangements, when expressed at physiologic levels in HSCs, fail to induce a short-latency leukemia phenocopying IL in mammalian models. IL arises in utero, suggesting derivation from a fetal progenitor. Hematopoietic development during embryogenesis is regulated spatio-temporally, with hematopoietic stem cell (HSC)-independent progenitors specified extra-embryonically and HSC-dependent, adult-like progenitors specified intra-embryonically. Previously, we have found that IL patients, independent of the presence of MLL-r, possess a significant enrichment of germline variation in COMPASS complex members, which strongly suggests that additional germline factors are required for IL transformation. COMPASS protein complexes are nucleated by members of the SET and MLL/KMT2 family. They have specific, non-overlapping and poorly understood roles in mesoderm and hematopoietic differentiation. In addition, several COMPASS members have been found to be recurrently mutated in various cancers and certain developmental defects.

    Methods: We have established multiple human iPSC lines (hPSC) from infant leukemia germline cells in order to characterize how each child’s germline variation impacts hematopoietic specification compared to human ESCs and hPSCs from healthy individuals with and without COMPASS gene knockouts and inducible MLL-r. Accessing hematopoietic progenitors in a human embryo is very challenging. However, by utilizing a unique, stage-specific human pluripotent stem cell differentiation strategy that can generate the progenitors of each of these programs, we can interrogate genetic and epigenetic mechanisms of normal and IL-associated hematopoietic development and transformation, respectively.

    Results: To date, we have genomic, epigenomic and functional results in these hPSCs demonstrating a crucial role for specific COMPASS members on the endothelial-to-hematopoietic transition (EHT) when CD34+ cells without MLL-r expression lose the ability to specify hematopoietic progenitors. Compared to wild type hPSCs, isogenic hPSCs with single gene COMPASS family knockouts demonstrate quantitative changes in histone modifications, even in the pluripotent state before any directed differentiation has begun.

    Conclusion: Like many pediatric cancers, IL appears to be more a developmental defect, arising from combinations of germline variation and acquired mutations that skew mechanisms of cell fate specification, than an aggregation of genetic errors as seen in adult cancers. The functional consequences of these combinations of variants and mutations can best be modeled in cells from the actual patients, reprogrammed to hPSCs and directed to the relevant cell fates. Future work will focus on xenograft studies with these cells to serve as preclinical models.

  • Magneto-mechanical force enhancement of cell transfection
    Andy Tay, Stanford University

    The American Cancer Society estimates 1.8 million new cases of cancer in 2018. Despite better treatments, the mortality rates of many cancers, including glioblastoma and melanoma, remain high. Chimeric Antigen Receptor (CAR) T-cell therapy, with on-going clinical trials, offers a promising strategy to engineer immune T cells with cell surface receptors that recognize and kill cancer cells. Unfortunately, it is difficult to deliver genes and integrate them into the genome of T-cells (~10% efficiency) for effective cancer immunotherapy. Here, we describe a technique in which we integrated magneto-mechanical modulation of cell deformability and intracellular cargo trafficking with magnetic particles (MPs) during nanostraw-localized electroporation to boost the transfection efficiency of Jurkat- and T-cells from ~12% to ~48%. This method generated minimal cellular stresses, as measured by intracellular calcium levels and RNAseq data, compared to other transfection techniques. As cell proliferation and metabolism were unaffected by nanostraw-localized electroporation, the waiting time to generate large cell numbers (10^9) for therapies was 2-3-fold shorter. The nanostraw-electroporation system (NES) consists of thousands of alumina nano-channels/straws protruding from a polyethylene membrane onto which cells adhere. Localized electric fields are applied through the nanostraws to create transient membrane pores for cargo delivery. Biomechanical forces had previously been found to enhance transfection in adherent cells. To test whether this holds for non-adherent cells, we utilized FDA-approved starch-coated magnetic particles (MPs) with cyto-protective effects to generate magneto-mechanical forces. Starch-coated MPs were biocompatible and were internalized by Jurkat- and T-cells after 24 hrs of incubation.
We applied static magneto-mechanical forces to the Jurkat- and T-cells during nanostraw-electroporation and found that this boosted the transfection efficiency from ~12% to ~30%, owing to a decrease in the physical distance between the NES and the cells that minimized degradation and loss of plasmid cargo. After nanostraw-electroporation, we applied low-frequency (to avoid generating heat) alternating magneto-mechanical forces through the internalized MPs to perturb the cytoskeletal network connected to the nuclear envelope. This increased intracellular cargo trafficking and the opening probability of the nuclear pore complexes, enhancing cargo entry into the nuclei. This modulation step further increased the transfection efficiency from ~30% to ~48%. Next, by monitoring calcium stress signals and RNAseq, we found that, across all the different transfection methods, NES with magneto-mechanical modulation resulted in minimal cellular stresses, and the treatment did not significantly lengthen cell doubling time, unlike other techniques. This is partly attributed to magneto-mechanical depolymerization of the actin cytoskeleton and recruitment of lysosomes to facilitate membrane pore repair.

    Existing FDA-approved T-cell isolation protocols make use of MPs. We plan to develop an integrated on-site platform in which magneto-mechanical modulation with MPs is combined with NES for enhanced T-cell isolation and transfection. We envision this reducing the logistical hurdles of accurately tracking and supplying patient-specific T-cells for therapies.

  • RepliGut: An advanced adult human stem cell screening platform for the pharmaceutical and biotechnology industries
    Nancy Allbritton, Altis Biosystems

    Altis Biosystems, Inc. has recently brought its RepliGut series of human intestinal model platforms to market for pharmaceutical screening and microbiome research. Traditionally, pharmaceutical and biotechnology companies utilize colorectal cancer cell lines (primarily Caco-2 cells and their derivatives) to screen libraries of compounds as part of their drug-discovery efforts. However, there are fundamental differences between the primary small intestinal and colonic epithelia and these cancerous cell lines, including chromosomal instability, altered metabolism, and aberrant functional characteristics. The pharmaceutical industry's reliance on colorectal cancer cell lines for drug discovery produces false positives that waste money during clinical trials and false negatives that result in missed opportunities. To overcome this reliance, we have developed culture conditions for maintaining primary intestinal epithelial tissue derived from adult human stem cells. The RepliGut platforms display the full repertoire of intestinal epithelial cell types (stem cells, transit-amplifying cells, goblet cells, Paneth cells, enteroendocrine cells, enterocytes) in vitro and express key transporters and metabolic enzymes at the gene, protein and functional levels. As a monolayer possessing low permeability and high TEER, the platforms are amenable to standard screening strategies such as toxicity, transport, permeability and cytokine-secretion assays. The RepliGut technology improves on intestinal organoids: it recreates the critical features displayed by organoids, but in a planar, open-faced geometry, while maintaining the cellular composition, integrity, and functional aspects of the intestinal epithelium.
In contrast, the spherical, cystic architecture of intestinal organoids is not compatible with most screening assays because the organoid possesses an enclosed lumen and is buried within a mass of Matrigel (a hydrogel). Thus, unlike the gut surface in vivo, the lumen of the organoid is not accessible to drugs, foodstuffs, pre- and probiotics, and other agents. The RepliGut system also improves upon a number of planar formats by virtue of its proprietary scaffolds, which support stem-cell self-renewal, adult rather than fetal gene expression patterns, and continued culture of primary cell lines. Recent advances have also led to patterned three-dimensional crypt mimics for more precise study of cellular proliferation and differentiation using high-content analysis. Supporting Altis’ service and product lines is a growing biobank of adult stem cells from all six regions of the human intestine (large and small). This biobank is annotated with the demographics and genetics of the donor tissue, which can be expected to provide more accurate data in functional assays.

Back to Top

Data Analysis and Informatics

Track Chairs: Amy Kallmerten, Merck and Yohann Potier, Novartis

The Data Repurposing Challenge: Research Data Sharing & Reuse Strategies

Session Chair: Amy Kallmerten, Merck

  • Towards a PC-Less Laboratory
    Joshua Bishop, Merck & Co

    Within a pharmaceutical research laboratory environment, technology is, by and large, less of a virtue and more of a vice. A typical experience within the confines of Merck Research Laboratories conforms to the following numbers: research scientists are outnumbered by PCs at a rate of >2:1; research scientists interact with more than a dozen interfaces on more than 5 PCs; laboratory computing resources are mandated to support three versions of Windows operating systems; and engagement with more than 500 HW/SW vendors is required to maintain the status quo. Add to this the surface area for a potential cyberattack presented by this volume of technology, and the demand for a paradigm shift in how we do business is clearly articulated. Herein, we present work towards removing the PC from our research laboratory environments through the following efforts: break the common laboratory PC into four key parts (drivers, execution instructions, configuration, operational status); expose as much of these capabilities as possible through services; and layer a generic and delightful user interface over these services that maps to scientific research workflows. These efforts will lead to fewer PCs in the laboratory, improved efficiency through fewer interfaces, less demand on laboratory computing resources, and a more resilient environment through a reduced attack surface.
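
    The four-part decomposition described above lends itself to a thin service facade. The snippet below is a minimal illustration of the idea only; the class, field and driver names are hypothetical, not Merck's actual architecture.

```python
from dataclasses import dataclass, field

@dataclass
class InstrumentService:
    """One lab instrument exposed as a service with the four key parts:
    driver, configuration, execution instructions, operational status."""
    driver: str
    configuration: dict = field(default_factory=dict)
    instructions: list = field(default_factory=list)
    status: str = "idle"

    def submit(self, instruction):
        # Queue an execution instruction and report operational status,
        # as a generic UI layered over the service would see it.
        self.instructions.append(instruction)
        self.status = "running"
        return {"status": self.status, "queued": len(self.instructions)}

svc = InstrumentService(driver="plate-reader-driver", configuration={"plate": "384-well"})
result = svc.submit("read_absorbance")
```

Because every instrument exposes the same four-part interface, one generic user interface can drive them all, which is what removes the per-instrument PC.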

  • The Practical Application of Natural Language Processing to Improving Data Quality in Drug Discovery
    Jay Gill, BFLConsulting LLC

    The new generation of Machine Learning (ML) models are greedy and need lots of high-quality data. Fortunately, there is a lot of data available in the cloud. Unfortunately, the available data is often unstructured (text) and needs to be transformed into a structured format, or is spread across many data sources, which requires linking data across data sets. Linking and mining both structured and unstructured data is complicated because data suffers from lack of context, different words for the same thing, ambiguous words (Hedgehog the animal or the gene?), abbreviations, personal “short cuts” and format issues.

    Our team has focused on named entity recognition (NER) to resolve these issues. NER uses curated dictionaries to relate terms to entities (P2Y2 is a Gene). Dictionaries contain synonyms along with rules for refining and disambiguating terms. So Aripiprazole is treated the same as Abilify; if Hedgehog is in the same sentence as “signaling” it is most likely the gene; and shorthand like IL1/2 is expanded to IL1 and IL2.

    Often simple terms are not enough to detect concepts such as adverse events or the relationship between a gene and an indication. We can group the “cleaned” terms from above into complex queries like “find me documents where a human gene appears near a biological verb that is near an indication,” where biological verbs are things like induces, regulates, suppresses, etc. This approach has dramatically improved the ability to find documents based on complex relationships and concepts, for example identifying adverse events.

    Several groups are using NER to improve data hygiene across the data life cycle. During data entry, NER tools can be used to control data entry by limiting the terms scientists use. Legacy data can be cleaned by replacing synonyms with preferred terms or identifying concepts. An excellent example of NER being used for data mining occurred when a group applied NER to social media and rapidly discovered that people don’t use proper medical terminology: they might enter “my head hurt, and I felt woozy” rather than headache and nausea. We worked with this group to develop a dictionary and pattern detector that was sensitive to common terms and expressions and related them to medical terms.

    This talk will cover the basics of NER and its application to several common issues facing data scientists in drug discovery, where managing and using the disparate forms of data (literature, CROs, HTS, Genomics…) is critical to success.
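
    As a toy illustration of the dictionary-plus-rules approach described above, the sketch below canonicalizes synonyms, expands shorthand like IL1/2, and uses sentence context to disambiguate Hedgehog. The dictionary entries and rules are deliberately tiny and illustrative, not a real curated resource.

```python
import re

# Hypothetical miniature dictionary: lower-cased synonym -> (canonical term, entity type)
DICTIONARY = {
    "abilify": ("Aripiprazole", "Drug"),
    "aripiprazole": ("Aripiprazole", "Drug"),
    "p2y2": ("P2Y2", "Gene"),
}

def expand_shorthand(text):
    """Expand shorthand such as 'IL1/2' into 'IL1 and IL2'."""
    return re.sub(r"\bIL(\d+)/(\d+)\b", r"IL\1 and IL\2", text)

def resolve(term, sentence):
    """Map a term to its entity, using sentence context to disambiguate 'Hedgehog'."""
    if term.lower() == "hedgehog":
        kind = "Gene" if "signaling" in sentence.lower() else "Animal"
        return ("Hedgehog", kind)
    return DICTIONARY.get(term.lower())

def tag_entities(sentence):
    """Return all recognized (canonical term, type) pairs in a sentence."""
    tokens = re.findall(r"[A-Za-z0-9]+", expand_shorthand(sentence))
    return [hit for hit in (resolve(t, sentence) for t in tokens) if hit]
```

With these rules, "Hedgehog signaling induces P2Y2" yields the gene readings of both terms, while "hedgehog" in a zoology sentence resolves to the animal.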

  • SiLA & AnIML: Lab of the Future, meet Data Analytics
    Burkhard Schaefer, BSSN Software GmbH

    Connectivity and seamless interoperability are among the goals of many “lab of the future” projects. Additionally, organizations are looking to analyze and leverage their data in new ways. Cloud technology, machine learning and artificial intelligence tools are coming into focus. To bring such efforts to fruition, a seamless data flow between systems must be established.

    This presentation will explore how standards can provide avenues towards such a digital lab. Standard protocols and data formats serve as infrastructure enablers for the integration of instruments and systems and for a unified data flow. We will review the progress of the open ASTM AnIML and SiLA standardization initiatives, which can provide key building blocks to establish interoperability. While AnIML specifies an XML-based standard data format for analytical data, SiLA provides web-service-based communication standards for interfacing with instruments. Software clients can use SiLA to discover and interact with services inside the lab. These services may be instruments or other systems. All services communicate using a common open protocol built on cutting-edge web technologies. AnIML provides the data format to plan and document the execution of lab experiments. This ensures that data from all experiments is captured in a consistent and easily accessible format, enabling easy consumption of the data via analytics tools and feeding machine learning models. Data becomes readily available and can generate value beyond the original goal of the experiment. This paves the way from the bench to result, and from result to long-term insight.

    By joining forces, a new ecosystem is emerging around SiLA and AnIML that allows end-to-end integration of instrument control, data capture, and enterprise system (ELN, LIMS) connectivity. This presentation reports on the current state of this ecosystem and discusses the rapid rate of adoption in the vendor community.
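
    To make the data-format side concrete, the standard-library sketch below emits a skeletal AnIML-like document. The element names are simplified for illustration; the real ASTM AnIML schema is namespaced and considerably richer, so treat this only as the shape of the idea.

```python
import xml.etree.ElementTree as ET

def build_animl_stub(technique, sample_ids):
    """Assemble a minimal AnIML-style XML tree: a sample set plus one experiment step."""
    root = ET.Element("AnIML", version="0.90")
    sample_set = ET.SubElement(root, "SampleSet")
    for sid in sample_ids:
        ET.SubElement(sample_set, "Sample", name=sid, sampleID=sid)
    step_set = ET.SubElement(root, "ExperimentStepSet")
    step = ET.SubElement(step_set, "ExperimentStep", name=technique)
    ET.SubElement(step, "Technique", name=technique)
    return ET.tostring(root, encoding="unicode")

doc = build_animl_stub("UV/Vis", ["S-001", "S-002"])
```

Because the output is plain XML, any downstream analytics tool that can parse XML can consume it without knowing which instrument produced it, which is exactly the interoperability argument above.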

  • LARA and SiLA 2.0/AnIML – an integrated open source laboratory automation planning and evaluation suite
    Mark Doerr, Institute of Biochemistry Uni Greifswald

    Evaluation and visualisation of high-throughput screening data is a bottleneck on any larger automated/robotic screening platform. This is especially challenging if data from diverse experiments are combined, e.g., cell growth data with enzyme activity data or fluorescence assay data, and even more so if data from several screened microtiter plates or different rounds of experiments are to be summarised into one statistical analysis.

    We developed LARA, a Python-based, free and open-source laboratory automation assistant integrating the latest SiLA 2.0 standard for device and process communication and AnIML for long-term data storage.

    Further components of the LARAsuite are an experiment planning tool; a generic process generator that directly translates molecular biology or biochemical protocols into real robot processes; modules to collect all analytical data generated by various instruments (e.g., plate readers, HPLC, GC, MS, PAGE, ...) into one database; and finally tools to evaluate the data statistically and visualize the results using a Python-based evaluation language. These results, all intermediate process steps and the raw data are explorable via a web interface. The software design paradigms are speed, ease of use, stability and reliability, modularity, flexibility, extensibility and openness of the source code. It is based on the well-established open source packages Python (the Python-Django database and web framework), R, SQLite/PostgreSQL, and Qt5.

    The LARAsuite has been applied very successfully in our high-throughput enzyme screening facility at the University of Greifswald. On this platform we screen up to 5000 enzyme mutants per week at microtiter scale, applying cell growth, protein induction, cell harvesting, cell lysis and optical enzyme assay steps in one process.

    The code of our software is freely available on GitHub. The author is a member of the SiLA 2.0 core development team.
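
    The kind of cross-plate summarisation described above reduces to a few lines in principle. This is a standard-library-only sketch with invented example data, not code from the LARAsuite itself.

```python
from collections import defaultdict
from statistics import mean, stdev

# Hypothetical readouts pooled from two plates: (plate, well, mutant, activity)
readings = [
    ("P1", "A01", "mutA", 0.91), ("P1", "A02", "mutB", 0.42),
    ("P2", "A01", "mutA", 0.88), ("P2", "A02", "mutB", 0.47),
]

def summarize(rows):
    """Pool replicate measurements of each mutant across plates into (mean, sd)."""
    by_mutant = defaultdict(list)
    for _plate, _well, mutant, activity in rows:
        by_mutant[mutant].append(activity)
    return {m: (mean(v), stdev(v) if len(v) > 1 else 0.0)
            for m, v in by_mutant.items()}

summary = summarize(readings)
```

The point of keying on the mutant rather than the plate is that replicates from different plates and experimental rounds land in one statistical analysis automatically.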

The Lab of the Future: Automation in the Digital Age

Session Chair: Nicola Richmond, GlaxoSmithKline

  • The Digital Lab: Enhancing Data Integrity and Adherence to FAIR Guiding Principles
    Dana Vanderwall, Bristol-Myers Squibb

    Data is the product of any science-based R&D organization, and since data is created in the lab, the lab itself is the very beginning of the data management and stewardship lifecycle. In this talk we will discuss how embedding data standards at a foundational level allows for the creation of a 100% “digital lab” that removes numerous error modes from laboratory processes. The digital lab expands the scope of current automation capabilities and self-documents the relevant context of the process in real time, capturing enriched metadata during the execution of an experiment, data reduction and analysis, and decision capture, including details of the materials and instruments used. Entropy is removed from the process by using normative terms provided by the Allotrope Foundation Ontologies in the laboratory software, enabling correct, consistent and complete metadata capture. Conformance to SOPs and analytical methods is also enforced digitally using standard input (instruction sets) described in a semantic graph. The completed data set connects equipment, materials, processes, results and decisions, and is stored in a semantic graph (RDF triples) along with “raw data” (W3C Data Cube) in an open, portable, vendor-neutral format. The resulting standardized, highly annotated body of data is more consistent, more searchable and more easily integrated across domains, and also enables the automation of reports and other structured documents, with documented provenance to the original source. When completed, the digital lab approach will greatly enhance data integrity (ALCOA-CCEA rules) and adherence to the FAIR Guiding Principles for scientific data management and stewardship.
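
    The semantic-graph storage described here can be pictured with a toy triple store. A production system would use an RDF library and Allotrope ontology IRIs; the subjects and predicates below are invented purely for illustration.

```python
# Each fact is a (subject, predicate, object) triple, as in RDF.
triples = {
    ("run42", "usedInstrument", "HPLC-7"),
    ("run42", "usedMaterial", "lot-123"),
    ("run42", "performedBy", "analyst-9"),
    ("HPLC-7", "hasType", "Chromatograph"),
}

def query(subject=None, predicate=None, obj=None):
    """Return triples matching a pattern; None acts as a wildcard."""
    return sorted(
        t for t in triples
        if (subject is None or t[0] == subject)
        and (predicate is None or t[1] == predicate)
        and (obj is None or t[2] == obj)
    )
```

A pattern such as query(subject="run42") recovers the full recorded context of an experiment, which is what makes automated report generation with documented provenance possible.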

  • Building the Laboratory of the Future: How Digital Tools can Augment Human Scientific Output
    Charles Fracchia, BioBright

    For many decades mankind has looked to automate tedious and error-prone steps of biomedical research in an attempt to improve reproducibility and throughput. While automation has had a large beneficial effect on the overall throughput of biomedical research, reproducibility is still facing a crisis. In this talk, we will discuss automation trends in data generation, collection and analysis in the biomedical sciences and show how carefully designed digital tools can improve reproducibility while keeping humans in the loop. We will review both currently available and future-looking technologies that are key to the laboratory of the future (e.g., augmented reality, machine/statistical learning, voice assistants). We will also discuss the need for a new framework that focuses on using machines and software to augment human scientists in the laboratory.

  • A Cloud-based Solution for Automated Device Data Collection, Validation, and LIMS Integration for Analytical QC
    Scott Rehlander, StackWave

    Most modern labs continue to employ time-consuming and error-prone manual processes for collecting, validating, and publishing raw device data. Although there are a number of self-hosted Scientific Data Management Systems (SDMS) that claim to alleviate at least one aspect of these processes, most of these solutions are built on outdated technology, have no support for cloud deployments, and do not facilitate the use of validation and export pipelines. At the Broad Institute, we set out to build a modern, cloud-hosted informatics solution for the automation of our analytical data that does more than file aggregation, enabling scientists to work smarter by automatically collecting, validating, and exporting all of their data through a flexible, cloud-hosted pipeline. This solution minimizes errors, saves valuable scientist time, and provides a comprehensive, fully auditable record of data as it moves from the device to the LIMS.

    The analytical lab in the CDoT (Center for Development of Therapeutics) group at the Broad Institute has a set of UPLC devices that can be used interchangeably. Attached to each device is a computer that, by default, serves as the primary storage of both raw and processed data files. To begin a run, an input file is generated by the LIMS and placed into a directory accessible by the device computer. The operator uses this input file to initiate a run on the device. The result of the run is a set of raw result files, some of which store transient data and are ignored. Once the run is complete, the operator performs an integration, effectively producing a chromatogram for each sample in the run. The data collection solution sends all of these files, along with the summarized run output, to the cloud in real time. Collected data is then automatically checked for completeness, and valid results are subsequently transformed and exported to the LIMS in a LIMS-ready format.
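
    The collect-validate-export flow described above can be sketched as a small pipeline. The field names and validation rules here are hypothetical, not the Broad Institute's actual schema.

```python
def validate(run):
    """Check a collected run for completeness before export."""
    errors = []
    chromatograms = run.get("chromatograms", [])
    if not chromatograms:
        errors.append("no chromatograms collected")
    for c in chromatograms:
        if "sample_id" not in c or not c.get("peaks"):
            errors.append("incomplete chromatogram: %r" % (c,))
    return errors

def to_lims_records(run):
    """Transform a validated run into flat, LIMS-ready records."""
    return [{"run": run["run_id"], "sample": c["sample_id"], "peaks": len(c["peaks"])}
            for c in run["chromatograms"]]

def pipeline(run):
    # Invalid runs are quarantined for the lab manager; valid ones are exported.
    errors = validate(run)
    if errors:
        return {"status": "quarantined", "errors": errors}
    return {"status": "exported", "records": to_lims_records(run)}
```

Segregating failed runs instead of exporting them is what keeps erroneous data out of the LIMS while still leaving a fully auditable trail.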

    Over the course of 5 months, our automated solution has processed chromatograms for more than 7,400 samples, saving countless hours for operators and analysts. The system has captured at least 624 invalid documents, segregating them and informing the lab manager to prevent erroneous data from being imported into the LIMS.

  • 'Self-Driving Experiments': How IoT, AI, and Robotics Can Enable the Future of Science
    Sridhar Iyengar, Elemental Machines

    Many industries have rapidly adopted technologies integrating connected devices (IoT, or the "Internet of Things"), AI-based analysis, and robotics to form closed-loop systems for improving quality, increasing throughput, and reducing costs. In the life sciences, such closed-loop integrations have not been as readily adopted due to the inherently heterogeneous nature of biological research: robotics must handle sensitive biological materials, gases, and fluids; generations of knowledge are still archived in text-based records; there are countless different manufacturers, each either having no cloud-connected solution or a proprietary one; and finally one must decipher the effects of myriad variables that can affect an experimental run (temperature, humidity, storage conditions, etc.). In this talk, we present a novel system that integrates and provides a unified interface for this heterogeneous world. We demonstrate how seemingly unrelated information and variables can combine to confound experimental results and how a globally integrated informatics platform can identify and highlight these issues before they become prohibitively complex. We further delve into examples of closing the loop and feeding these insights into automated/robotic systems to auto-correct an experiment in real time to achieve a "self-driving experiment" akin to a self-driving car.
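
    A "self-driving experiment" is, at its core, a closed monitor-decide-act loop. The fragment below is a deliberately simplified sketch of one iteration for a single variable (temperature); the names and thresholds are invented, and a real deployment would feed the loop from an IoT sensor stream and could replace the fixed threshold with a learned model.

```python
def control_step(reading, setpoint, tolerance, actuate):
    """One iteration of a closed loop: compare a sensor reading to the
    setpoint and, if out of tolerance, command a corrective action."""
    error = reading - setpoint
    if abs(error) <= tolerance:
        return None  # within spec: no intervention needed
    action = "cool" if error > 0 else "heat"
    actuate(action)  # e.g., command an automated/robotic subsystem
    return action

actions = []
control_step(39.2, setpoint=37.0, tolerance=0.5, actuate=actions.append)
```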

Creating and Maintaining a Data Culture Across Laboratories and Disciplines

Session Chair: Paul Clemons, Broad Institute

  • Shared data ecosystem of NCI’s Cancer Target Discovery and Development (CTD^2) Network
    Paul Clemons, Broad Institute

    The National Cancer Institute (NCI) supports, among many initiatives, the Cancer Target Discovery and Development (CTD2) Network, which currently comprises 12 Centers nationwide. The primary goal of the Network is “to bridge the knowledge gap between large-scale genomic datasets and the underlying etiology of cancer development, progression, and/or metastasis”. To make progress toward this ambitious goal, member Centers of the Network are necessarily quite diverse in their biological specialties, experimental methods, and informatic or computational sophistication. To encourage sharing and re-use of data between member Centers, and data dissemination to the wider cancer research community, the CTD2 Network has evolved an interconnected ecosystem of web-based tools and data services, including the CTD2 Dashboard, CTD2 Data Portal, and several investigator-initiated web-based resources. Each of these resources serves distinct purposes and audiences, and accordingly requires different compromises to implement and different processes to maintain. I will provide an overview of these resources, lessons learned during their evolution, and prospects for future data integration within and beyond the CTD2 Network.

  • One of these things is not like the other… or is it? How to know when to repurpose software solutions for your laboratory
    Abbey Vangeloff, Yahara Software

    Often when trying to solve a problem, we find ourselves looking in familiar places for the answer. But when it comes to data and informatics, relying on surface similarities might prove problematic. How is one to know when a familiar technology is appropriate for a new application or when an entirely new technology is needed? To answer these questions, we developed a methodology to review and analyze requirements and assess existing solutions for fitness to repurpose in new areas. This method led to some surprising findings about when and what types of software solutions are appropriate to reuse for new applications and which are not. In this talk, we will outline the steps to review technology and determine fitness to repurpose for new applications. To illustrate the method, we will review a case study of a production software platform used to track trucking data that was repurposed to collect and store global public health surveillance data. Participants will learn actionable steps to analyze, review and determine fitness of their current software solutions for new applications as well as some rules of thumb about when software can be repurposed and when to consider new solutions.

  • A longitudinal analysis over 10 years of enabling technologies for early drug discovery using open source tools
    Pierre Baillargeon, The Scripps Research Institute

    Progress in 3D printing technology and the evolution of open source hardware/software tools over the past decade have made the development of custom automation much more accessible to laboratory staff. These efforts have democratized access to many tools and technologies that were previously the exclusive domain of engineering and fabrication firms. In addition to the cost reductions due to open source efforts, many domain experts have produced a wealth of free, high-quality training material and documentation, which further improves the accessibility of these tools and technologies to end users with little to no formal education in related fields. The result of these efforts can be seen in the fabrication of adaptive/customized technologies that leverage 3D printers, microcontrollers, electronics design and fabrication tools, and the explosion of open source software.

    The Lead Identification team at Scripps Florida has leveraged many of these open source tools to deploy a variety of custom in-house platforms that increase productivity, further automate quality control tasks and enable novel HTS workflows. These platforms include a pipetting light guide built around custom 96- and 384-well LED matrices that match the SBS footprint of microtiter plates; the matrices were designed using the KiCad suite of open source electronic design automation tools and fabricated through the OSH Park printed circuit board production service. Coupled with an Arduino microcontroller, these custom LED matrices provide users with guided step-by-step, well-by-well pipetting indicators to eliminate errors, assist with new employee training and increase productivity when laboratory automation cannot be used.

    Other platforms developed by the Lead Identification team include an Arduino- and Raspberry Pi-based real-time liquid dispenser QC system that has been fully integrated into the Scripps uHTS platform, custom 3D printed magnetic incubator shelving for HTS spheroid projects and an automated platform for the QC of compound plates. These custom in-house platforms have positively impacted screening and compound management operations by reducing staff workload for QC processes, replacing manual spheroid screening with automated routines and improving quality assurance in compound management: tens of millions of compound wells have been imaged and verified against the Scripps corporate LIMS. The advantages of developing in-house automation using open source tools, and lessons learned over a decade of doing so, are presented.
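
    The core logic of a guided pipetting light guide like the one described above can be sketched in a few lines: map SBS well names onto indices of a row-major LED matrix, then step through a worklist well by well. This is an illustrative sketch only, assuming a row-major 8 x 12 (96-well) layout; it is not the Scripps implementation, and the function names are hypothetical.

```python
# Hypothetical sketch of a pipetting light guide's well-to-LED mapping:
# convert SBS well names (e.g. "B07") to 0-based indices in a row-major
# 96-well (8 x 12) LED matrix, then walk a worklist in pipetting order.

def well_to_index(well: str, n_cols: int = 12) -> int:
    """Map an SBS well name like 'B07' to a 0-based row-major LED index."""
    row = ord(well[0].upper()) - ord("A")   # 'A' -> row 0
    col = int(well[1:]) - 1                 # '07' -> column 6
    return row * n_cols + col

def guided_sequence(worklist):
    """Yield (well, led_index) pairs in pipetting order."""
    for well in worklist:
        yield well, well_to_index(well)

if __name__ == "__main__":
    for well, led in guided_sequence(["A01", "B07", "H12"]):
        print(f"light LED {led} for well {well}")
```

In practice such a table would drive a microcontroller's LED matrix one step at a time, advancing only when the user confirms a transfer.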

  • Can automated reasoning over large data sets accelerate translational science?
    Mathias Wawer, Broad Institute

    “Our ability to generate molecular and clinical research data has outpaced our ability to analyze them.” Touting the untapped value of large and fragmented data sets, this statement is often found in abstracts and sales pitches alike. If only we could make all these data accessible and interpretable, the story goes, we could make research more efficient and translate findings to the clinic faster. The NIH Biomedical Translator program has set out to test this promise. By bringing together biologists, clinicians, and engineers, the program asks: what does it take to make it work? What data do we need, which algorithms to analyze them? How can we better connect experts? What value can we generate? Over the last two years, the program has employed innovative and sometimes unusual approaches to the application process, funding, and community organization to build a first prototype for a universal biomedical translator - a computational tool that can autonomously answer biomedical and translational questions by combining data integration with automated reasoning and artificial intelligence. I will provide an overview of the program, lessons learned over the past two years, and the challenges ahead.

Drug Target Strategies

Track Chairs: Peter Hodder, Amgen and Margaret Scott, Genentech

Inducing Protein Degradation

Session Chair: Mei-Chu Lo, Amgen

  • The Degrader Strategy Rule Book
    Robert Blake, Genentech

    The classic drug receptor occupancy model states that the magnitude of the response to a drug is directly proportional to the amount of drug bound. In this scenario the effect of a drug is limited by its affinity and its residence time on its target. However, a new paradigm in drug mechanism of action is emerging, in which a drug modulates the proteostasis of its target, resulting in decreased protein levels. Degrader theory predicts that when a drug induces the proteolytic degradation of its target protein, one drug molecule can degrade multiple target protein molecules (catalytic degradation). Thus, the magnitude of the drug’s effect is no longer limited by drug affinity or receptor occupancy, and the duration of effect can exceed that of the drug exposure. The first drug known to act through a degrader mechanism was fulvestrant, a selective estrogen receptor degrader (SERD), which consists of an estrogen receptor ligand and a “degradation tail”. We will discuss the development of other novel protein degraders that also use the degradation tail strategy. Another systematic approach to the development of protein degraders is based on bivalent molecules (Chemical Inducers of Degradation; CIDEs) that bring the target protein into close proximity to ubiquitin ligases, resulting in its ubiquitination and subsequent degradation. We will also discuss factors that influence the ability of these types of molecules to induce protein degradation.

  • Activity of CRBN and VHL Derived PROTACs Across a Cancer Cell Line Panel
    Dane Mohl, Amgen - Oncology Research

    The success of small molecule therapeutics such as fulvestrant and lenalidomide, which promote ubiquitin-mediated degradation of cancer-related targets, has fueled an intense effort to mimic their activities with larger bispecific molecules called PROTACs (proteolysis targeting chimeras). PROTACs are engineered from two separate ligands joined by a flexible linker; one half engages the E3 ligase while the other half binds the disease-relevant target protein. Proximity of a target to an E3 ligase results in ubiquitination and ultimately degradation of the disease-causing protein. Recent examples of BRD4-degrading PROTACs utilizing the ubiquitin ligases CRBN or VHL are quite potent and possess cellular activity that exceeds that of the parent ligands. Thus, many new PROTACs targeting different proteins for degradation are being generated with the same or similar ligase binders, and the search for new PROTAC ligases is underway in both academia and industry.

    Hypothetically, the PROTAC strategy can target nearly any intracellular protein for degradation with many different ubiquitin ligases, provided small molecule ligands for each half of the chimera are discovered and both the target and the hijacked ligase are present within the same compartment of a relevant cell type. A difficult and necessary task is to identify the best PROTAC ligases from the hundreds of ligases encoded by the human genome, but little is known about what makes a ligase particularly viable in the PROTAC strategy or whether the PROTAC-mediated activity of a ligase can be predicted by its RNA or protein expression in normal and diseased tissues. To better understand both the limits and the drivers of ligase activity in cancer cells, we screened a panel of 50 cancer cell lines with BRD4-degrading PROTACs that engage either CRBN or VHL. By comparing the activity of the PROTACs to the expression and mutation of genes that make up the E3 ligase complexes, we drew guiding conclusions about the selection of future platform ligases. To rapidly validate new PROTACs made from novel target and ligase pairs, we developed a click chemistry platform that allows efficient construction of novel PROTACs in small quantities from a library of ligase binders.

  • Small-molecule induced protein degradation with proteolysis targeting chimeric molecules (PROTACs)
    Markus Queisser, GlaxoSmithKline

    Targeted protein degradation using bifunctional small molecules known as proteolysis targeting chimeric molecules (PROTACs) is emerging as a novel therapeutic modality. PROTACs redirect ubiquitin ligases to target specific proteins for ubiquitination and proteasomal degradation. The advantages of the PROTAC technology lie in its modular, rationally designed molecules, capable of producing potent, selective and reversible protein knockdown, as demonstrated in both cellular and in vivo settings. This approach combines the efficacy typically associated with nucleic acid-based approaches with the flexibility, titratability, temporal control and drug-like properties associated with small-molecule agents. The removal of a disease-causing protein is an attractive therapeutic option. PROTACs that achieve efficient cellular degradation of target proteins with functional activity have been described in the literature, with some examples showing in vivo efficacy in disease-relevant models. This presentation aims to highlight the potential of PROTACs in drug discovery, with a focus on their challenges from our perspective.

  • Expanding the Tool-Box of Ligands for Targeted Protein Degradation
    Gwenn Hansen, Nurix Therapeutics

    Targeted protein degradation is emerging as a novel strategy for drug discovery. In this modality, bifunctional small molecules are used to redirect specific proteins to the ubiquitin proteasome system, where they can be broken down and eliminated from the cell. These bifunctional molecules link together two ligands, one that binds a target protein and another that binds an E3 ligase, such that the two proteins are drawn into proximity to allow ubiquitylation and ultimately degradation of the protein of interest. Removal of proteins from cells or tissues, as opposed to direct inhibition of function, may afford a differentiated therapeutic profile and could offer particular value when eliminating pathological proteins in disease states such as cancer. Furthermore, this approach may broaden the spectrum of proteins amenable to small molecule therapy, owing to the fact that the target ligand needs only to bind the protein surface, presumably a less challenging hurdle for discovery. This presentation will describe Nurix’s approach to the discovery and evaluation of bifunctional degraders and will discuss our emphasis on novel ligand discovery. Specifically, this talk will highlight our use of DNA-encoded library technology to better enable identification of degradation-compatible ligands across a range of protein targets and E3 ubiquitin ligases.

Applying Genomic & Proteomic Technologies to Target Identification and Validation

Session Chair: William Hahn, Dana-Farber Cancer Institute

  • Defining a cancer dependency map
    William Hahn, Dana-Farber Cancer Institute

    We now have a draft view of the genetic alterations that occur in human cancer, and molecular profiling of cancers provides the means to identify therapeutic agents for some patients. However, the number of mutations found at low frequency, the molecular heterogeneity of most cancers and the lack of therapeutic agents specific for many cancer drivers limit the application of precision medicine in many patients. To complement genome characterization studies, we have used genome scale gain- and loss-of-function approaches to identify genes required for cell survival and transformation. Specifically, we have performed systematic studies to interrogate rare alleles found altered in cancer genomes and used advances in synthetic gene synthesis to prospectively interrogate all possible alleles of known cancer genes. In parallel, we have performed both genome scale RNAi and CRISPR-Cas9 screens in more than 800 cancer cell lines to identify differentially essential genes and the contexts that specify gene dependency. These studies allow us to define a global cancer dependency map, which will help further operationalize precision cancer medicine.

  • Guiding the use of small molecule inhibitors in cancer using chemical-genetic interaction maps
    Sourav Bandyopadhyay, University of California, San Francisco

    Small molecule inhibitors often have unpredictable mechanisms of action and sensitivity profiles in cancer. We describe new approaches that leverage rapid screening of collections of genetically defined, isogenic cells to map the ways by which tumor genetics can modulate the drug responses of cells in culture. This concept has been applied to both small molecule inhibitors (Martins et al. Cancer Discovery 2015) and DNA damaging chemotherapy (Hu et al. Cell Reports 2018) to identify new synthetic lethal relationships as well as genetic causes of resistance to PARP inhibitors. Our studies define a systems approach using in vitro screening of cell lines for the rational design and optimization of small molecule inhibitors for clinical development.

  • High throughput antigen and antibody production at multiple scales, in support of an open source antibody discovery platform
    James Love, Institute for Protein Innovation

    The importance of antibodies in biomedical research, and their use as therapies, is driving efforts to discover new molecules. New efforts also seek to correct the flood of poorly validated antibodies that has given rise, in part, to the reproducibility crisis in research. At the Institute for Protein Innovation, we are developing novel technologies to discover, using high throughput liquid handling automation, new antibodies that will be distributed in an open source effort to aid the entire community. Our antibody initiative seeks to generate antibodies to all human cell surface proteins: approximately 6,000 targets. To generate these antigens and the antibodies that bind them, we rely heavily on automation that has been customized for eukaryotic expression systems. Thousands of proteins can be expressed and purified in parallel per week at small scale, and, because larger amounts of product are required, we have developed methods to produce hundreds of purified proteins at larger scale. Transient transfection of HEK293 or CHO cells is carried out robotically at the 384-, 96- and 24-well scales, and in tube and flask format at the 30 mL, 90 mL and 150 mL scales. Ninety-six proteins in parallel are purified in plate (1 mL) and tube (30 mL and greater) formats using filter-based affinity resin capture, or via magnetic affinity resins, which removes the need for clarification by centrifugation. Techniques that have been difficult to speed up and parallelize, such as desalting at the milliliter scale and size exclusion chromatography to separate aggregate from monodisperse material, have been developed and successfully implemented. Downstream assays to validate and triage antibodies, removing non-ideal candidates, have been miniaturized and implemented at 384-well scale. All samples are tracked via barcodes, and a LIMS system is under development to easily monitor progress and productivity of the pipeline.
We will describe how we have approached these challenges using off the shelf automation, custom integrations and ‘home made’ solutions.

    The Institute for Protein Innovation is a not-for-profit organization.

  • TARGATT™: Rapid and Efficient Transgene Integration Technology to Develop Large Mammalian Cell-Based Screening Libraries
    Andrew Hilmer

    Mammalian cell-based libraries offer an affordable and physiologically relevant context for high throughput screening in applied biomedical research. Current technologies for developing mammalian cell display libraries either result in multiply transduced cells or are too inefficient to prepare large library screens, and may carry vector backbone. Here, we introduce the TARGATT™ site-specific, integrase-mediated gene integration technology, which provides an efficient strategy for integrating a gene of interest into a preselected, transcriptionally active locus. The TARGATT™ system provides a 1:1 variant-to-cell ratio, with each cell containing a single copy of the transgene inserted at a single docking site. This provides very efficient, uniform transgene expression and reproducibility for constructing large isogenic cell libraries for rapid, high throughput screening. We engineered a TARGATT™-HEK293 Master Cell Line containing an integrase recognition landing pad in the Hipp11 safe harbor locus of HEK293 cells. In combination with HSV-TK/GCV negative selection or promoter-less reporter genes to enrich for cells with integrated library plasmid, this master cell line showed a >20% knock-in efficiency without selection. With selection, virtually 100% efficiency can be achieved. The TARGATT™-HEK293 Master Cell Line thus enables stable cell line generation and 4-8x larger library sizes than currently available technologies. The TARGATT™ technology is an efficient system for creating large cell-based library screens for applications in directed evolution (such as vaccine development and drug screening), genome-wide screening, and biotherapeutic drug discovery and biomanufacturing.

Multidimensional approaches to determining mechanism-of-action (MOA)

Session Chair: Margaret Porter Scott, Genentech

  • Critical Decision Making using Biophysics
    John Quinn, Genentech

    Biophysical techniques that provide direct, label-free functional binding assays continue to evolve, enriching the information available to medicinal chemistry throughout the drug-discovery process. While biophysical techniques tend to require significant investment in technology and expertise, they are of proven value in many applications. Techniques such as nuclear magnetic resonance (NMR) and surface plasmon resonance (SPR) have become standard approaches to finding new chemical matter from small (<10k) compound collections, while MS-based methods can now process larger collections (>100k). Biophysical methods provide direct on-target, label-free binding measurements with improved data quality relative to semi-quantitative reporter-based assays, which generally suffer from higher levels of interference. While a small subset of techniques such as SPR may dominate for routine applications, we have found that other techniques often prove valuable, including differential scanning fluorimetry (DSF), microscale thermophoresis (MST) and switchSENSE. Many of these technologies provide complementary information, and orthogonal confirmation is usually a prerequisite to compound prioritization. Using state-of-the-art technology and associated methods, it is now possible to drive structure-activity relationships (SAR), assign mode-of-action (MoA) and estimate structure-kinetic relationships (SKR) with sufficient throughput to support drug-discovery project teams. This is invaluable for progressing challenging targets where tool compounds may be unavailable and binding MoA unknown. Real-time, label-free detection techniques, such as SPR, are ideal for kinetic studies guiding SKR from hits to leads. Here we provide an overview of recent biophysical work supporting pipeline projects and illustrate how we apply these techniques productively to maximize project impact using selected case studies.

  • A Unique Automated Hydrogen / Deuterium Exchange Mass Spectrometry Platform for the Characterization of Protein Ligand Interactions
    Michael Chalmers, Eli Lilly and Company

    Hydrogen / Deuterium Exchange Mass Spectrometry (HDX-MS) is an information rich technique for the characterization of protein ligand interactions in solution. Upon ligand binding, alterations to protein structure/dynamics are reflected in altered backbone amide exchange kinetics. These changes can be measured via HDX-MS, and provide a detailed profile of any ligand induced changes to protein structure or dynamics. We apply HDX-MS at different stages of the small molecule discovery process, including hit characterization, binding site identification and mechanism of action studies. To facilitate these experiments, we have implemented an end-to-end HDX-MS platform comprised of unique automation and data analysis tools.

    Over the last decade, the LEAP Technologies CTC-PAL system has become the automation platform of choice for HDX-MS experiments. Pioneered by ExSAR and the Griffin laboratory, this system operates in a “real-time” mode in which the entire HDX-MS experiment workflow is integrated. Although successful, the “real-time” experiment has limitations, the most significant of which is inefficient use of MS acquisition time. We therefore designed, built and deployed a new automation platform for HDX-MS experiments. Following a “decoupled” workflow, we automated the sample preparation in 96-well plate format, followed by freezing at -60°C. Each frozen plate is then transferred to a second automation platform, where individual samples are thawed and injected into the MS. This “decoupled” format allowed us to increase our HDX capacity threefold without additional MS instruments. Other benefits include increased flexibility, the use of disposable tips for sample preparation and reduced sample preparation time. The output LC-MS data are processed in HDX Workbench and then evaluated with a Bayesian analysis tool designed to extract rate constant data for each amide within the target protein.

    In this presentation we will describe our automation platform in detail, and provide examples of how we apply HDX-MS in our small molecule discovery efforts.
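
    The rate-constant extraction step described above can be illustrated with a much simpler model than the Bayesian per-amide tool the authors use: deuterium uptake for a peptide is often approximated as a saturating exponential, D(t) = N(1 - e^(-kt)). The sketch below fits that model to simulated data; the numbers and the single-exponential assumption are illustrative only, not Lilly's method.

```python
# Minimal sketch of extracting an exchange rate constant from
# deuterium-uptake data with a single-exponential model,
# D(t) = N * (1 - exp(-k * t)). Simulated data, illustrative only.
import numpy as np
from scipy.optimize import curve_fit

def uptake(t, n_amides, k):
    """Deuterium uptake (Da) for n_amides exchanging at rate k (1/s)."""
    return n_amides * (1.0 - np.exp(-k * t))

# Simulated uptake for a peptide with 5 exchanging amides, k = 0.1 / s
t = np.array([10.0, 30.0, 60.0, 300.0, 900.0, 3600.0])
d = uptake(t, 5.0, 0.1) + np.random.default_rng(0).normal(0, 0.05, t.size)

(n_fit, k_fit), _ = curve_fit(uptake, t, d, p0=[4.0, 0.01])
print(f"N = {n_fit:.2f} amides, k = {k_fit:.3f} 1/s")
```

A real analysis must account for back-exchange, multiple exchange regimes per peptide and intrinsic exchange rates, which is what motivates the Bayesian treatment mentioned in the abstract.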

  • High-throughput agonist shift assay development, automation of global curve fit analysis and visualization of positive allosteric modulators (PAM) to support Drug Discovery
    Sony Agrawal, Merck Research Labs

    In recent years, allosteric modulation of 7 transmembrane receptors (7TMRs) has become a highly productive and exciting field of receptor pharmacology and drug discovery efforts. Targeting the less conserved and spatially distinct allosteric binding site has been a strategy adopted to overcome selectivity issues with orthosteric ligands. For example, in the case of the five muscarinic receptors, this lack of subtype selectivity has made it extremely difficult to probe the therapeutic potential of activating the predominantly CNS-expressed M4 receptor due to simultaneous activation of peripherally expressed M2 and M3 subtypes, resulting in severe side effects. To characterize positive allosteric modulation (PAM) pharmacology at the M4 receptor, a PAM shift assay was developed that measures the ability of compounds to shift the dose-response relationship of the native ligand, acetylcholine (ACh), thereby enhancing receptor activity.

    In response to increased demand, an automated high-throughput agonist shift assay was established to characterize PAM molecules for the human muscarinic acetylcholine receptors 4 and 2 (M4 and M2) in several species (human, rat and rhesus). Global curve-fit analysis of PAM shift assays is complex, and manual data handling required more time and resources than is feasible in the course of a lead optimization program. Data analysis was therefore streamlined using an automated workflow manager (AWM), ActivityBase, to populate the data points and antagonist concentrations, and a customized BIOVIA Pipeline Pilot script to automate GraphPad Prism fitting. Visualization was accomplished with Spotfire, in which the number of compounds is not limited and data are analyzed in an automated manner to produce quantifiable measurements of allosterism. In parallel, these assays were transferred to a HighRes Biosolutions robotic system, resulting in a throughput increase of 300%. The improved and automated workflow provided greater reproducibility owing to higher fidelity of the data. The overall automation of the assay and of the global curve-fit analysis has saved significant time in assay execution, simplified data analysis and reporting, and eliminated errors due to manual work. The derived allosteric parameters can be used to prioritize compounds for in vivo studies and preliminary human dose prediction.
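
    The quantity a shift assay measures can be sketched with a simple allosteric ternary-complex model: a PAM at concentration B, with affinity KB and cooperativity alpha, shifts the agonist EC50 leftward as EC50_obs = EC50 * (1 + B/KB) / (1 + alpha*B/KB). The sketch below is a hedged illustration of that relationship with hypothetical values; it is not the Merck analysis pipeline.

```python
# Sketch of the dose-response shift a PAM produces under a simple
# allosteric ternary-complex model. All numbers are hypothetical.

def ec50_shift(ec50, b, kb, alpha):
    """Observed agonist EC50 in the presence of modulator at conc. b."""
    return ec50 * (1.0 + b / kb) / (1.0 + alpha * b / kb)

ach_ec50 = 100.0  # nM, baseline ACh potency (hypothetical)
for b in (0.0, 100.0, 1000.0):  # PAM concentrations, nM
    obs = ec50_shift(ach_ec50, b, kb=300.0, alpha=30.0)
    print(f"[PAM] = {b:7.1f} nM -> ACh EC50 = {obs:6.1f} nM "
          f"({ach_ec50 / obs:.1f}-fold shift)")
```

Fitting this expression globally across PAM concentrations yields KB and alpha, the allosteric parameters the abstract uses to prioritize compounds.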

  • Advantages of applying a new biophysical approach to HTS for lead ID, lead validation and optimization
    Amit Gupta, NanoTemper Technologies

    In a known trend in HTS-driven drug discovery, biophysical methods are becoming an integral part of the workflow as targets become more diverse, demand for fragment-based screening projects increases and the need to identify ligands with different mechanisms of action continues to rise. Membrane proteins and targets previously considered undruggable, or that do not exhibit enzymatic activity, have particularly benefited from this trend: these methods can overcome obstacles often faced with such challenging targets, succeed in lead identification and remain a crucial tool in lead validation and optimization. Despite their many advantages, biophysical methods suffer from limitations, particularly in the assay development, throughput and automation needed for lead ID and validation.

    Temperature Related Intensity Change (TRIC) technology addresses these shortcomings. Recent developments show that measuring TRIC can make the characterization of binding events for drug discovery:

    • More robust with improved S/N ratios
    • More sensitive to binding events from fragments to antibodies
    • More efficient with faster assay development and reduced measurement times, effectively increasing the throughput

    Here we present the latest findings in TRIC-based detection and biophysical characterization of binding events and the direct application of TRIC for HTS of diverse protein targets.

    Gupta, A.J., Duhr, S., Baaske, P. MicroScale Thermophoresis. Encyclopedia of Biophysics. doi:10.1007/978-3-642-35943-9_10063-1
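
    The binding characterization that MST/TRIC measurements feed into commonly uses the exact 1:1 ("tight binding") quadratic, which stays valid when the labeled target concentration is not negligible relative to Kd. The sketch below illustrates that model with hypothetical concentrations; it is a generic binding calculation, not NanoTemper's analysis software.

```python
# Sketch of the exact 1:1 binding model used for titration analysis when
# target concentration is comparable to Kd:
#   [TL] = ((T + L + Kd) - sqrt((T + L + Kd)^2 - 4*T*L)) / 2
# All concentrations (nM) are hypothetical.
import math

def fraction_bound(target, ligand, kd):
    """Exact fraction of target bound at 1:1 stoichiometry."""
    s = target + ligand + kd
    complex_conc = (s - math.sqrt(s * s - 4.0 * target * ligand)) / 2.0
    return complex_conc / target

# 10 nM labeled target titrated with ligand; Kd = 50 nM
for ligand in (1.0, 50.0, 500.0, 5000.0):
    fb = fraction_bound(10.0, ligand, 50.0)
    print(f"[L] = {ligand:7.1f} nM -> fraction bound = {fb:.2f}")
```

Fitting measured TRIC amplitudes against this curve over a ligand titration yields Kd directly.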

High Definition Biotechnology

Track Chairs: Tony Dickherber, NIH and Kristen Brennand, MSSM

Emerging Technology Platforms for Single Cell Analysis

Session Chair: Peter Smibert, New York Genome Center

  • SLAS2019 Innovation Award Finalist: Dissecting tumor cell plasticity and population interactions supporting metastasis using single cell genomics
    Ashley Laughney, MSKCC

    Most cancer deaths are caused by metastasis—recurrence of disease in distant organs, which is seeded by disseminated tumor cells (DTCs). The invasion-metastasis cascade releases large numbers of DTCs throughout the body, even during the early stages of tumor growth. A minority survives as potential seeds for future metastatic outbreaks. They often evade immune surveillance and resist anti-cancer therapies, which primarily target dividing cells. The biology underlying these adaptive abilities remains poorly understood. We exploited the droplet-based barcoding technology, inDrop, to transcriptionally profile carcinoma and tumor-infiltrating cells at the single cell level from primary and metastatic patient lung adenocarcinomas. Our data show that primary tumors are replete with metastasis-initiating cells and demonstrate a striking level of developmental plasticity, recapitulating most epithelial lineages of the alveolar and bronchial airway and expressing key embryonic lineage-determining transcription factors, SOX2 and SOX9. Conversely, metastases exhibit a significant reduction in lineage diversity and are predominantly restricted to a Wnt-responsive, SOX9high endoderm-like progenitor. To dissect potential mechanisms underlying this lineage-specific escape, > 30,000 cancer cells from a mouse model of delayed lung cancer metastasis were transcriptionally profiled at different stages of disease progression to identify factors that regulate the phenotypic switch from quiescence to a proliferative, immune-exposed state. We compared the development of DTCs trapped in a quiescent state to those that drive metastatic outbreak, either spontaneously or upon depletion of Natural Killer cells, using PhenoGraph clustering, imputation of drop-out noise, and identification of diffusion trajectories. 
Lineage constraint was alleviated upon natural killer (NK)-cell depletion, demonstrating a dynamic interplay between developmental plasticity and immune-mediated pruning during lung cancer progression and the evolution of metastasis.

  • CITE-seq and related technologies for high content single cell phenotyping
    Peter Smibert, New York Genome Center

    High-throughput single-cell RNA sequencing has transformed our understanding of complex cell populations, but it cannot measure the single cell phenotypic information provided by protein levels. We recently described cellular indexing of transcriptomes and epitopes by sequencing (CITE-seq), a method in which oligonucleotide-labeled antibodies are used to integrate cellular protein and transcriptome measurements into an efficient, single-cell readout. Individual antibodies are labeled with oligos carrying different sequence barcodes, enabling the number of individual markers that can be measured simultaneously to far surpass what cytometry-based approaches allow, while also measuring the transcriptome. CITE-seq is compatible with existing single-cell sequencing approaches and will readily scale as the throughput of these methods increases. A technically related approach, Cell Hashing, allows sample multiplexing and confident multiplet identification in high throughput scRNA-seq approaches. We have combined these methods to deeply profile the human immune system. We present an in-depth characterization of the human immune system using a panel of ~80 antibodies, revealing complexity not previously observed by scRNA-seq. Finally, we present recent additions to the CITE-seq toolkit for additional single cell omics applications and compatibility with other single cell analysis platforms.

  • Amphiphilic Particle-templated Monodisperse Droplet Generation for Democratized Digital Bioassays
    Chueh-Yu Wu, University of California, Los Angeles

    The capability to partition a fluid sample into nanoliter-scale volumes of uniform size plays an important role in achieving uniform reactions, enabling massive, high-throughput single-cell and single-molecule analysis and pushing the limits of nucleic acid sequencing, drug discovery and personalized medicine. Microfluidics-based droplet technology has enabled a number of applications, including digital ELISA/PCR and the InDrops/Drop-seq techniques for single-cell RNA sequencing. However, microfluidics requires expensive liquid handling equipment, and Poisson encapsulation statistics limit the efficiency of approaches that use encapsulated solid phases, limiting adoption by non-centralized clinical labs and the efficiency of data acquisition in research settings, respectively. We demonstrate an economical, microfluidic-device-free droplet generation technology with high monodispersity using 3D-shaped amphiphilic particles that promote uniform aqueous drop formation through simple manual mixing and centrifuging steps. We engineered microscale particles, called drop-carrier particles, constructed with a void, a hydrophilic inner layer and a hydrophobic shell, which thermodynamically stabilize a specific aqueous fluid volume associated with each particle. The particles were manufactured using a modified high throughput optical Transient Liquid Molding (OTLM) setup, which photo-crosslinks microparticles by illuminating patterned UV light onto a pre-deformed flow stream of polymer precursor. We designed a microchannel to deform a co-flow of poly(ethylene glycol) diacrylate and hydrophobic external precursor into a pattern with concentric layers, and illuminated an array of UV patterns, simple rectangular slits here, onto the flow stream to reach a high production rate, up to ~36,000 particles/hour.
We resuspended purified drop-carrier particles in oil, mixed them with aqueous solutions, and centrifuged to generate particle-drops with identical size and shape across the population. We discovered that, unlike droplets stabilized by surfactants, the size and shape of the particle-drops can be determined by the 3D structure and interfacial tensions of the drop-carrier particles. The controlled shape was non-spherical, with a circularity of ~0.4, and the size was narrowly distributed, with a diameter of ~260 µm, allowing efficient image processing with a shape or size cut-off. Importantly, the drops remained stable for more than 4 days without significant coalescence or size/shape change. Moreover, we found that the transport of fluorescent molecules between particle-drops can be permitted, slowed or prohibited depending on the surfactant used and the molecular weight of the target molecule. Finally, we demonstrated proof-of-concept applications of particle-drops: solid-phase reactions and microgel encapsulation. Uniform reaction conditions in each particle-drop across the population were shown by incubating internally biotinylated particles with fluorescently labeled streptavidin in the associated drops, while single microgels with diameters ranging from 50 to 160 µm were successfully encapsulated in individual particle-drops following a Poisson distribution.
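
    The Poisson loading statistics invoked above are easy to make concrete: if cells (or microgels) arrive in drops with mean occupancy lambda, the occupancy distribution is P(k) = lambda^k e^(-lambda) / k!, so dilute loading leaves most drops empty while dense loading creates multiplets. The sketch below is a generic illustration of that trade-off, not the authors' analysis.

```python
# Illustration of Poisson encapsulation statistics: fraction of drops that
# are empty, singly occupied, or multiplets at a given mean loading lambda.
import math

def poisson(k, lam):
    """P(exactly k cells in a drop) at mean occupancy lam."""
    return lam ** k * math.exp(-lam) / math.factorial(k)

for lam in (0.1, 0.3, 1.0):
    empty = poisson(0, lam)
    single = poisson(1, lam)
    multi = 1.0 - empty - single
    print(f"lambda = {lam:.1f}: empty {empty:.2f}, "
          f"single {single:.2f}, multiplet {multi:.2f}")
```

At lambda = 0.1, roughly 90% of drops are empty for ~9% singles, which is why Poisson-limited encapsulation of solid phases is inefficient.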

  • Droplet-Microarray - a universal miniaturized platform for screening of live cells in 2D and 3D environments
    Anna Popova, Karlsruhe Institute of Technology

    Phenotypic screenings of live cells are essential in drug development, fundamental science and diagnostics. The majority of screenings are performed in state-of-the-art 96-, 384- and, less commonly, 1536-well plates, where the working volume ranges from 2 to 200 µL. Further miniaturization of microtiter plates is not possible due to capillary effects arising from the physical barriers between the wells. However, further miniaturization is a clear demand of the modern screening community. Cost savings are important, but not the only issue. The need to perform large screenings on minute amounts of physiologically relevant cells, such as human primary and stem cells, is acute for many applications. In addition, dramatically reducing the cultivation time (from days to hours) needed to accumulate substances released from the cells, such as antibodies, is an important issue in many research and industrial projects.

    With these needs of the screening community in mind, our goal is to develop a platform that allows cell-based screenings to be performed as in microtiter plates, but in nanoliter volumes. Droplet-Microarray is a technology based on precise micro-patterning of different materials, such as glass or plastic, to create an array of hydrophilic spots on a superhydrophobic background. Such arrays trap droplets of aqueous solutions, becoming a wall-less 2D array of separated, stable, homogeneous droplets. Due to the absence of physical barriers, droplets on the Droplet-Microarray can be miniaturized far beyond microtiter-plate wells. The design and size of the spots, and correspondingly the volume of the trapped droplets, are easily controlled and can be adapted for a particular application. Such arrays are compatible with cell screenings, where each droplet serves as a well of a microtiter plate. Droplet-Microarray allows screening in 3 to 200 nL droplets with 1 to 200 cells per experiment. At the same time, it is a universal platform and can be used for various applications, similar to microtiter plates. Cells can be cultured either in 2D as a monolayer or suspension, or in 3D, either in hanging droplets as spheroids or in hydrogels. Droplet-Microarray is not restricted to mammalian cell culture and is compatible with bacteria, yeast and even small model organisms such as zebrafish. Finally, it can be used with existing equipment, including microscopes, non-contact dispensers and cell printers, making Droplet-Microarray compatible with automated screening workflows.

    We will present proof-of-principle studies demonstrating the use of the Droplet-Microarray platform for miniaturized screening of cells in 2D and 3D environments. We will also present new results on automation of the Droplet-Microarray platform and a step-by-step comparison of screening a library of anti-cancer compounds on the Droplet-Microarray and in microtiter plates, using commonly used cell lines and primary patient-derived tumor cells.

High Definition Tools to Overcome Disease Complexity

Session Chair: Amy Brock, University of Texas at Austin

  • SLAS2019 Innovation Award Finalist: Lineage-resolved molecular and functional analyses of cancer cell populations
    Amy Brock, University of Texas at Austin

    Heterogeneity across individual cancer cells and their descendants (clonally-derived lineages) impacts growth rate, tumor composition, and response to therapy. To improve treatment, new tools are required to measure and control the contributions of diverse cell subpopulations. Recent studies have demonstrated the utility of DNA barcodes (random, unique, heritable sequences) in monitoring heterogeneous cell populations. Sequencing a barcode ensemble reveals clonal dynamics that may change with progression or treatment. Beyond observing and quantifying the lineage dynamics of a population, here we set out to manipulate specific cell lineages within the context of the whole cell population. To address this challenge, we developed a platform technology, Control of Lineages by Barcode Enabled Recombinant Transcription (COLBERT). We demonstrate that by tagging cells with a library of constitutively expressed barcode guide-RNAs, this approach allows us to: 1) identify and quantify changes in the ensemble of lineages, 2) perform lineage-resolved genome and transcriptome analyses (scRNA-Seq), and 3) readily extract one or more specific lineages from the population by lineage-specific gene activation and FACS (using the expressed barcode gRNA with dCas9-VPR to drive expression of a fluorescent marker). Here we utilize the platform to quantify and isolate chemoresistant lineages from multiple cancer cell lines and from populations of primary patient-derived cells. Using COLBERT to analyze a CLL cell line with del(13q), we were able to define the trajectories of specific resistant lineages after treatment with first-line chemotherapy (fludarabine/mafosfamide) and the second-line targeted BCL2 inhibitor venetoclax. We report on clonal fitness dynamics and lineage-resolved transcriptome/genome signatures of cells that failed to respond to one or both of these treatments, and compare these responses across multiple evolutionary trajectories.
COLBERT provides a means by which we can integrate diverse layers of genomic and functional data to study cancer cells on a lineage-by-lineage basis over the course of treatment.
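    At its simplest, quantifying lineage dynamics from a sequenced barcode ensemble reduces to counting barcode abundances before and after treatment. A minimal sketch of that bookkeeping; the barcode strings and fold-change threshold are hypothetical, and the real COLBERT analysis involves far more (error correction, scRNA-Seq integration, lineage extraction):

```python
from collections import Counter

def lineage_fractions(barcode_reads):
    """Collapse a list of sequenced barcode reads into per-lineage fractions."""
    counts = Counter(barcode_reads)
    total = sum(counts.values())
    return {bc: n / total for bc, n in counts.items()}

def enriched_lineages(pre, post, fold=2.0):
    """Barcodes whose fractional abundance rose at least `fold` after treatment."""
    return {bc for bc, f in post.items() if f >= fold * pre.get(bc, 0.0)}

# Hypothetical 3-mer barcodes; real expressed barcodes are gRNA sequences.
pre = lineage_fractions(["AAC", "AAC", "GGT", "TTA"])
post = lineage_fractions(["GGT", "GGT", "GGT", "TTA"])
print(enriched_lineages(pre, post))  # the lineage that expanded under selection
```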

  • Integrative analysis of complex host-microbial interactions in the gut
    Ken Lau, Vanderbilt University

    Tissue complexity emerges from interactions of components across various biological systems, such as exogenous factors from the microbiota and different host cellular entities. These interactions can be characterized in multiple domains including genetic, spatial, and biochemical. Here we will discuss current tools that enable spatial and transcriptional profiling of individual cells, multiplex microscopy and single-cell RNA-seq, respectively, and their integration to dissect tissue-level complexity. We present several tools to interrogate these data types and demonstrate their applications in characterizing the function of intestinal tuft cells, the sentinel cell type linking the microbiome and the host. These emerging techniques will play a key role in understanding the role of complexity, for example, microbiome-host interactions, in dictating tissue function in homeostasis and dysfunction in human diseases.

  • Miniaturization of Illumina library preparation for high throughput, plate-based next generation sequencing studies
    Stuart Levine, Massachusetts Institute of Technology

    The high cost of Illumina library production is a critical factor driving the use of underpowered experiments. Several studies have demonstrated that increasing biological replicates, even without increasing the total amount of sequencing, significantly improves experimental power. Automated liquid handlers can address this issue through increased throughput and miniaturization of reaction volumes. However, integrating these robots in centralized facilities can be challenging due to the diverse projects, sample types, and sample qualities submitted for analysis. Here we demonstrate the miniaturization of a diverse set of Illumina library protocols on the TTP Mosquito HV and their application to a broad variety of biological sample types. Methods for both RNA and DNA library production, from intact as well as fragmented samples, have been successfully implemented and are in routine use with either Illumina or NEB reagent sets. The miniaturization of these protocols has resulted in both significant savings in library production cost and a notable increase in the average number of replicates performed by the scientists.

  • A novel, robust, label free system for measuring molecular interactions in solution, with picomolar sensitivity
    Darryl Bornhop, Vanderbilt University

    Biology works through molecular interactions in a complex milieu, without labels. Measuring biological interactions without labels for drug discovery and development has proven difficult.

    The Bornhop lab has developed a robust, label free, solution-based detection system that is media independent, mass independent and can detect binding affinities into the low single digit picomolar range.

    The system measures changes in fundamental molecular properties directly in solution. Reproducible quantification of numerous types of binding interactions has been demonstrated in a wide range of matrices, including whole blood. The system can quantitate the binding of molecules ranging from cations and small organic compounds to peptides, nucleic acids and large proteins. Further, measurements can be made in solution or on membranes in suspension, making the approach broadly applicable.

    Data will be presented showing the range of binders that can be measured, the sensitivity advantages and the matrix flexibility. Examples include: drugs-of-abuse testing, demonstrating pg/mL limits of quantitation; detection of key cancer biomarkers, with a 30-fold improved limit of detection (1 pM) compared to the current clinical-standard electrochemiluminescence; the infection of red blood cells by malaria parasites; the detection of DNA binding at the attomole level; and allosteric binding of baclofen to the membrane-bound CXCR4 receptor.
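    Affinity figures like those above come from fitting binding signals to an interaction model. A minimal stdlib sketch of a 1:1 Langmuir isotherm fitted by grid search; the titration values and function names are hypothetical illustrations, not the Bornhop lab's actual analysis:

```python
def fraction_bound(ligand_conc: float, kd: float) -> float:
    """1:1 Langmuir binding isotherm: theta = [L] / (Kd + [L])."""
    return ligand_conc / (kd + ligand_conc)

def fit_kd(concs, signals, kd_grid):
    """Least-squares grid search for Kd (stdlib-only sketch)."""
    def sse(kd):
        return sum((s - fraction_bound(c, kd)) ** 2 for c, s in zip(concs, signals))
    return min(kd_grid, key=sse)

# Hypothetical titration of a picomolar-affinity binder; signal = fraction bound.
concs = [0.5e-12, 1e-12, 2e-12, 5e-12, 2e-11]   # molar
true_kd = 2e-12
signals = [fraction_bound(c, true_kd) for c in concs]
grid = [k * 1e-13 for k in range(1, 200)]
print(f"fitted Kd ~ {fit_kd(concs, signals, grid):.1e} M")
```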

Systemic Approaches for Precision Medicine

Session Chair: Nancy Cox, Vanderbilt University Medical Center

  • SLAS2019 Innovation Award Finalist: AID.One: An Artificial Intelligence-Driven Digital Health Application for Clinical Optimization of Combination Therapy
    Dean Ho, SINAPSE, Biomedical Engineering and Pharmacology, BIGHEART, National University of Singapore

    Conventional combination therapy is administered at high drug dosages, and these dosages are commonly fixed. These combinations are designed to achieve drug synergy and enhanced treatment efficacy. However, drug synergy undergoes constant evolution when interacting with patient physiology, and is dependent on time, drug dosage, and patient heterogeneity. As a result, patient response to treatment on a population-wide scale can vary substantially, and many patients will not respond at all to combination therapies with unsuitable drug dosages. The inability to dynamically reconcile the time/dose-driven basis of synergy is a major driver of treatment failure and high costs of drug development and failed trials. Therefore, fixed dose administration presents a substantial challenge to the global optimization of combination therapy efficacy and safety. To overcome this challenge, we have developed CURATE.AI, a powerful artificial intelligence platform that dynamically modulates drug dosages in time, and is capable of optimizing clinical combination therapy for the entire duration of care, an unprecedented advance. CURATE.AI has been validated through multiple in-human studies ranging from solid cancer treatment to post-transplant immunosuppression and infectious diseases. In these studies, CURATE.AI implementation has markedly enhanced patient treatment outcomes compared to the standard of care. To scale the implementation of CURATE.AI, we have developed a digital health application, AID.One (AI Dosing for N-of-1 Medicine) in collaboration with a community of oncologists, surgeons, engineers, and additional researchers at the interface of medicine and technology. AID.One is capable of calibrating a patient's response to combination therapy to create individualized CURATE.AI profiles. 
These profiles correlate the patient's drug dosages with quantifiable measures of efficacy and safety in a 3-dimensional map that can be used to immediately pinpoint optimal drug doses for a specific patient at a specific time point. As the patient's response to treatment changes over time, the CURATE.AI profile also changes, so that drug dosages can be modulated to ensure that optimal efficacy and safety are constantly maintained by the combination therapy regimen. Importantly, AID.One and CURATE.AI use only the patient's own data and do not rely on population-based algorithms to estimate treatment parameters. This talk will highlight the process of building AID.One, prospective benchmarking results in multiple oncology and transplant immunosuppression studies, and clinical development progress.

  • Genetic analysis at scale
    Benjamin Neale, Massachusetts General Hospital

    Large-scale population cohorts such as the UK Biobank are transforming the way we consider and pursue genetic analyses, offering the opportunity to learn about a whole range of human traits and diseases. Here I will describe genome-wide association analyses of over 4,000 distinct traits and diseases from the UK Biobank, including estimation of heritability. Further, we performed sex-stratified analyses, enabling us to probe the degree to which genetic influences on complex traits are shared across men and women. Previous work on sex-stratified analyses has shown that height and BMI are largely similar but that waist-hip ratio shows clear evidence of sex-specific effects (Randall et al., PLoS Genetics, 2013). Analysis of the initial wave of UK Biobank shows differences in sex-specific estimates of heritability for waist circumference, blood pressure, and skin and hair color (Ge et al., PLoS Genetics, 2017). Initial results suggest that the overwhelming majority of genetic influences are shared across the sexes, with an average genetic correlation estimated to be in excess of 90%. Nevertheless, a number of phenotypes do show sex-specific genetic correlations clearly less than 1, including fat percentage traits, exercise-related traits such as pulse rate, haemoglobin concentration, and prior history of smoking. These kinds of analyses illustrate the growing potential for biobanks to help us understand the impact of genetic variation across all of human health and disease.
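    As a toy illustration of the genetic correlation concept, one can correlate per-SNP effect sizes estimated separately in men and women. Real estimates use methods such as bivariate LD score regression or GREML, which account for sampling noise and linkage disequilibrium; the effect sizes below are hypothetical:

```python
from math import sqrt
from statistics import mean

def pearson(x, y):
    """Pearson correlation of two equal-length numeric sequences."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / sqrt(sum((a - mx) ** 2 for a in x) *
                      sum((b - my) ** 2 for b in y))

# Hypothetical per-SNP effect sizes estimated in each sex separately.
beta_male   = [0.10, -0.05, 0.22, 0.00, -0.13]
beta_female = [0.09, -0.06, 0.20, 0.02, -0.12]
print(f"crude cross-sex correlation proxy: {pearson(beta_male, beta_female):.2f}")
```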

  • Cloud-based Data Analytics Brings the Power of Artificial Intelligence to Biologists for High Content Analysis
    David Egan, Core Life Analytics

    While the technologies for the generation of high content data are now widely accessible, many biologists still struggle to make full use of the extracted numeric data. Often this is due to a lack of suitable data analytics tools that they can use independently, without having to engage the services of a data scientist.

    Here we show how a cloud-based data analytics application can give biologists the ability to analyze the most complex high content data sets. The application allows users to analyze these large numeric data sets with unsupervised analytical methods to identify novel cellular phenotypes. They can then use integrated Artificial Intelligence functionality to build AI models based on these phenotypes, which can be used to identify similar phenotypes in new data sets.

    Until recently, another problem in the high content analysis field was a lack of publicly available screening data sets that biologists could use for training and analytical method validation. We demonstrate the utility of intuitive web-based data analytics tools in the analysis of data from a publicly available "Cell Painting" chemical screen hosted in the public GigaDB data repository.

    Phenotypic outliers were identified, which could then be rapidly viewed in the IDR (a public image data repository) for validation of the phenotypes. Hierarchical clustering revealed a group of hits that were used to generate a Random Forest model, which was then applied to identify similar phenotypes.
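    The outlier-identification step can be illustrated with a simple z-score screen over per-well feature profiles. This stdlib sketch is not Core Life Analytics' actual pipeline, and the well names and feature values are hypothetical:

```python
from statistics import mean, stdev

def phenotypic_outliers(profiles, z_cut=2.5):
    """Flag wells whose feature z-score exceeds z_cut in any dimension."""
    n_feat = len(next(iter(profiles.values())))
    mu = [mean(p[i] for p in profiles.values()) for i in range(n_feat)]
    sd = [stdev(p[i] for p in profiles.values()) for i in range(n_feat)]
    return {
        well
        for well, p in profiles.items()
        if any(sd[i] > 0 and abs(p[i] - mu[i]) / sd[i] > z_cut
               for i in range(n_feat))
    }

# Hypothetical per-well morphology features (a single feature for brevity).
profiles = {f"A{i:02d}": [v] for i, v in enumerate(
    [0.0, 0.1, -0.1, 0.05, -0.05, 0.02, -0.02, 0.08, -0.08, 0.0])}
profiles["B03"] = [5.0]   # one well with a strong phenotype
print(phenotypic_outliers(profiles))
```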

    Our data analysis strategy can greatly accelerate high content screening projects by giving the biologist the ability to carry out their own data analysis. The ability to apply advanced AI algorithms without having to engage the services of a data scientist is especially valuable, and the application promises to make advanced high content analytical methods accessible to a far wider audience.

  • Data Integration: Genome X Transcriptome X EHR
    Nancy Cox, Vanderbilt University Medical Center

    Large-scale biobanks linked to electronic health records (EHRs) offer unique opportunities for discovery and translation. I will describe how we impute transcript levels using publicly available genomic resources such as GTEx, by building SNP-based prediction models from the measured transcript levels and genotype data in GTEx and then applying these models to the genotype data in BioVU, the biobank at Vanderbilt University. BioVU has DNA on more than 250,000 samples linked to EHRs going back 10-15 years on average, and more than 20 years for some individuals. In addition to BioVU, we also utilize the medical phenome data on more than 2.8 million subjects. We have genome interrogation on 120,000 of the BioVU samples and test ~1200 phenome codes against ~18,000 genes with high-quality out-of-sample prediction performance. Studies in BioVU are providing evidence for a continuum from Mendelian to common, complex diseases, and I will summarize some of the most recent evidence.
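    The imputation step can be sketched as a weighted sum of SNP dosages, in the spirit of PrediXcan-style models trained on reference data such as GTEx. The SNP identifiers and weights below are hypothetical:

```python
def impute_expression(dosages, weights):
    """Predicted transcript level = weighted sum of SNP dosages (0/1/2).

    `weights` stands in for a model trained on reference transcriptome and
    genotype data; SNPs absent from the model contribute nothing.
    """
    return sum(weights.get(snp, 0.0) * d for snp, d in dosages.items())

weights = {"rs0001": 0.30, "rs0002": -0.12}       # hypothetical trained model
person  = {"rs0001": 2, "rs0002": 1, "rs0003": 0} # hypothetical genotypes
print(impute_expression(person, weights))          # ~0.48
```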


Micro- and Nano Technologies

Track Chairs: Sammy Datwani, Labcyte and Amar Basu, Wayne State

Commercialization of Micro and Nanofluidic Technologies

Session Chair: Sumita Pennathur, UCSB

  • Presentation Title TBD
    Andrew Lynn, Fluidic Analytics

    Presentation information will be posted shortly

  • Recent Developments in Microfluidics and Microtechnologies for Applications in Life Science Research, in vitro Diagnostics and Medical Devices
    Mark Olde Riekerink, Micronit Microtechnologies

    This presentation will provide insight into the dynamics and benefits of rapidly evolving microfluidics and microtechnologies, technologies that may have an enormous impact on progress in science and healthcare.

    Microfluidics is the science of manipulating small volumes of liquids, usually on microchips made with semiconductor manufacturing techniques and containing small channels that enable accurate control of liquids and chemical reactions. Because of the small volumes used, quicker temperature shifts and faster liquid displacement are made possible. Moreover, microfluidics enables the automation and integration of complex operations on-chip, with reduced volumes of sample and expensive reagents. With such properties, microfluidics, along with other microtechnologies, is a natural fit for applications in healthcare.

    Breakthroughs in the fields of life science research, in vitro diagnostics and medical devices are only possible because of the progress made by both top designers and top manufacturers in these technology fields. With the right capabilities and growing experience in the field, microfluidics and microtechnology companies offer a wealth of possibilities that may lead to improvements in healthcare. For critical applications, where the stakes can be as high as saving a patient's life, settling for average quality is not an option.

    For this, it will be key to successfully implement microfabrication technologies in industrial products. This does, however, require a multidisciplinary approach that usually proceeds from conceptual design to prototyping and then to a final design and process transfer for volume manufacturing. Moreover, a flexible approach allowing a choice among a variety of fabrication materials and hybrid material combinations, as well as among various fabrication technologies, is key to the successful integration of multiple required functions into next-generation products. This approach is of particular interest for developing relatively complex designs with multiple micro- and nanostructures and functionalities on board. In this way, development costs and time-to-market can be reduced and the success rate of cost-effective commercialization increased.

    During this presentation, examples of developing industrial solutions, together with recent technology developments, will be presented and explained that may have a significant impact on applications in the fields of Genomics, Point-of-Care diagnostics, Organ-on-a-Chip and cell culturing.

  • Application of Multi-Organ-Chips to Enhance Safety and Efficacy Assessment in Drug Discovery
    Reyk Horland, TissUse GmbH

    Microphysiological systems have proven to be a powerful tool for recreating human tissue- and organ-like functions at the research level, providing the basis for the establishment of qualified preclinical assays with improved predictive power. However, industrial adoption of microphysiological systems and the respective assays is progressing slowly due to their complexity. In the first part of the presentation, examples of established single-organ, two-organ and four-organ chip solutions are highlighted. The underlying universal microfluidic Multi-Organ-Chip (MOC) platform, the size of a microscope slide, integrating an on-chip micro-pump and capable of interconnecting different organ equivalents, will be presented. Issues in ensuring long-term performance and industrial acceptance of microphysiological systems, such as design criteria, tissue supply and on-chip tissue homeostasis, will be discussed. The second part of the presentation focusses on the establishment of automated MOC-based assays as a robust tool for safety and efficacy testing of drug candidates. These automated assays will allow for increased throughput and higher inter-laboratory reproducibility, eventually enabling broad industrial implementation. Finally, a roadmap into the future is outlined to further bring these assays into regulatory-accepted drug testing on a global scale.

  • Electrokinetic micro- and nanofluidic technologies for point-of-care devices
    Sumita Pennathur, UCSB

    Commercialization of micro- and nanofluidic technologies for lab-on-a-chip based assays has been of interest for decades, ever since microfabrication techniques made it possible to miniaturize electronic devices. Initially there were many attempts to simply miniaturize existing technologies, which led to many commercialization failures, since physics is not linear across spatial scales. Specifically, miniaturized versions of macroscale technologies hit fundamental limits (sample size, abundance of chemical species, absorbance limits, etc.), and thus inventing a micro- or nanofluidic technology often requires fundamental breakthroughs in physics, chemistry, fabrication, and materials engineering. Therefore, for micro- and nanofluidic technologies to truly be commercialized, fundamentally new technologies that combine physics and chemistry must be invented and employed, which can not only match current assay performance but perhaps even revolutionize the way samples are processed. Certainly, the last decade has brought about many such inventions, and in this talk I will highlight a few innovations from my laboratory at UCSB and/or companies I have founded. In particular, I will highlight how fluids at the nanoscale behave differently from their micro- and macroscale counterparts because of the influence of the surface and, in electric-field-driven systems in particular, because of the electric double layer inherent at the solid-liquid interface of an electrolyte-based analyte system (i.e., biofluids). In addition, I will show how these differences lead to unique physics couplings that allow for the detection of ions with a selectivity and sensitivity never before achieved. Furthermore, this sort of detection enables indirect detection of biomolecules, as ion concentrations change as biomolecular reactions occur.
Finally, I will show results from the latest company I founded, which uses unique physics at the micro- and nanoscale to overcome the technological barriers to developing a painless, wearable glucose sensor that can be rapidly commercialized for patients with Type I diabetes.
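    The double-layer effect described above can be made concrete by computing the Debye screening length, the characteristic thickness of the electric double layer. A short sketch using the standard formula for a 1:1 electrolyte (the helper name is ours):

```python
from math import sqrt

# Physical constants (SI)
EPS0 = 8.854e-12   # vacuum permittivity, F/m
KB   = 1.381e-23   # Boltzmann constant, J/K
E    = 1.602e-19   # elementary charge, C
NA   = 6.022e23    # Avogadro constant, 1/mol

def debye_length(ionic_strength_molar, eps_r=78.5, temp_k=298.15):
    """Debye length (m) for a 1:1 electrolyte in water at the given conditions."""
    n_ions = ionic_strength_molar * 1000.0 * NA   # mol/L -> ions per m^3
    return sqrt(EPS0 * eps_r * KB * temp_k / (2.0 * E**2 * n_ions))

# In a ~1 mM buffer the double layer is ~10 nm thick, comparable to a
# nanochannel height; this is why nanoscale transport differs from microscale.
print(f"{debye_length(1e-3) * 1e9:.1f} nm")
```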

Micro and Nano Technologies for Digital, High Throughput, and Single-Cell Assays

Session Chair: Valerie Taly, UMRS1147- University Paris Descartes/INSERM

  • Single cell analysis in Microfluidics for cancer research
    Valerie Taly, UMRS1147- University Paris Descartes/INSERM

    Droplet-based microfluidics has led to the development of highly powerful systems that represent a new paradigm in high-throughput screening, in which individual assays are compartmentalized within microdroplet microreactors. By combining a decrease in assay volume with an increase in throughput, this technology goes beyond the capabilities of conventional screening systems. Droplets can be produced as independent microreactors that can be further actuated and analyzed at very high throughput (several thousand to tens of thousands per second). Added to the flexibility and versatility of platform designs, such progress in sub-nanoliter droplet manipulation allows a level of control that was hitherto impossible.

    We will show how droplet-based microfluidics allows single-cell experiments to be performed with high efficiency and high throughput, by presenting several recently developed droplet-based strategies; in particular, the capabilities of newly developed microfluidic platforms for single-cell encapsulation, phenotypic characterization and multiplex sorting. Finally, we will emphasize the potential applications of such platforms for cancer research and the analysis of resistance to treatment.

  • ELASTO-TWEEZERS - a novel platform for high-precision cell elasticity measurements
    Severine Le Gac, University of Twente

    The mechanical and elastic properties of cells are essential for their function. Certain diseases, e.g., some types of cancer and cardiomyopathies, are associated with a change in the cell mechanical properties, and in turn, in an alteration of their behavior. Characterizing the mechanical and elastic properties of cells therefore has a diagnostic value to detect these diseases, but also to get better insights into the patho-mechanisms of these diseases.

    For high-throughput evaluation of cell mechanical and elastic properties, a dual-beam optical tweezers setup with video-based force detection is used, which allows measuring the forces applied to a cell with piconewton resolution and the cell deformation with sub-micrometer resolution. However, for such stretching measurements using optical tweezers, the cell must be coupled to two micrometer-sized beads acting as handles, and this coupling step drastically limits the throughput of the measurements. Here, we report an innovative microfluidic cartridge to automate and parallelize the formation of such bead-cell-bead complexes prior to elasticity measurements. Specifically, beads and cells are sequentially captured and patterned in the platform using dedicated trapping structures before being assembled into complexes with the help of the two laser beams. Aiming at a throughput of hundreds of measurements per hour, the current design allows the formation of 100 individual complexes in the microfluidic platform.

    In my talk, I will present different iterations of the microfluidic platform for its optimization, as well as its validation using both micrometer-sized beads and cells. Next, I will present preliminary data on elasticity measurements of HEK293 (human embryonic kidney) cells, as well as skin fibroblasts.
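    From each stretching measurement, an effective cell stiffness can be extracted as the slope of the force-deformation curve. A minimal sketch assuming a linear (Hookean) response over small deformations; the data values and function name are hypothetical:

```python
def effective_stiffness(deformations_um, forces_pn):
    """Least-squares slope through the origin of force vs deformation.

    Returns an effective spring constant in pN/µm; a stiffer cell deforms
    less for the same optical force.
    """
    num = sum(f * x for f, x in zip(forces_pn, deformations_um))
    den = sum(x * x for x in deformations_um)
    return num / den

# Hypothetical stretching data from one bead-cell-bead complex.
deformations = [0.2, 0.4, 0.6]     # µm
forces       = [10.0, 20.5, 29.5]  # pN
print(f"{effective_stiffness(deformations, forces):.1f} pN/um")
```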

  • Unique Sensor Platform for Monitoring of Inflammation in Multi Body Fluids
    Badrinath Jagannath, The University of Texas At Dallas

    Inflammatory diseases are associated with high diagnostic costs and add to the economic burden. Current blood-based techniques are labor intensive and not feasible for rapid monitoring of inflammation. Sweat and saliva serve as valuable biofluids for wearable and point-of-care biosensing, rich in diagnostic information. However, there is currently no single standard integrated platform that can detect biomarkers from both sweat and saliva. In this work, we have developed a novel, rapid and highly sensitive single electrochemical sensor system that can detect the pro-inflammatory cytokine interleukin-6 (IL-6) in both sweat and saliva for inflammation monitoring. The sensor leverages a novel electrolyte system, a room temperature ionic liquid (RTIL), with excellent electrochemical properties that make the sensor system easily integrable across biofluids. The RTIL prevents pH drifts due to sweat and saliva and ensures the sensor is highly sensitive and specific to the target analyte. Sensing was performed on a flexible nanoporous membrane of wearable form factor using an immunoassay-based method in which a highly specific IL-6 capture-probe antibody in RTIL is immobilized on the sensor surface via a thiol crosslinker functionalized on the electrode. Binding interactions were captured using label-free electrochemical impedance spectroscopy (EIS), as this detection modality directly probes the electrode/buffer interface. A very low detection limit of 0.2 pg/mL and a wide dynamic range over 3 logarithmic orders (0.2-200 pg/mL) were achieved in both sweat and saliva. The sensor also demonstrated excellent selectivity and specificity, with no signal response observed for the non-specific molecules glucose and lactate even at 10-fold higher concentrations.
These enhanced sensor performance metrics are due to the use of the RTIL in conjunction with EIS, which prevents non-specific interactions with the capture-probe antibody. Additionally, the insensitivity to pH variation and the stability maintained by the RTIL make this detection system robust and ideal for wearable and point-of-care sensing applications with sweat and saliva.
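    Recovering a concentration from an EIS response typically runs through a log-linear calibration across the dynamic range. A stdlib sketch; the concentration points match the reported 0.2-200 pg/mL range, but the signal values and function names are hypothetical:

```python
from math import log10

# Hypothetical calibration points spanning the reported dynamic range.
concs   = [0.2, 2.0, 20.0, 200.0]   # pg/mL
signals = [5.1, 11.8, 19.2, 26.0]   # hypothetical impedance change (%)

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

slope, intercept = fit_line([log10(c) for c in concs], signals)

def concentration(signal_pct):
    """Invert the log-linear calibration to recover pg/mL."""
    return 10 ** ((signal_pct - intercept) / slope)

print(f"{concentration(15.0):.1f} pg/mL")
```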

  • A Microfluidic Trap Array with the Ability to Perform Serial Operations on Nano-liter Samples
    Pulak Nath, Los Alamos National Laboratory

    Despite significant advances in microfluidic single-cell analysis, serial processing of isolated cells is not trivial. In this work, we present a novel microfluidic platform that can facilitate in situ processing and analysis of single cells in nanoliter volumes by trapping, delivering reagents, and mixing in a seamless operation. Furthermore, it is possible to deterministically release the sample for further off-chip processing and analysis. The platform is composed of microfluidic trap arrays placed along a central channel of a microfluidic chip. The traps are capped by a thin semipermeable membrane, which is exposed to a pneumatic control channel. Vacuum can be applied to the pneumatic control channel to actuate the membrane and withdraw single-cell samples from the central microchannel. The semipermeable membrane allows air caught within the trap during withdrawal to escape into the pneumatic control channel, thereby filling all the traps with the same volume of fluid. Plugs of aqueous solutions separated by air are created in the central channel such that nanoliter samples can be withdrawn into the traps by actuation of the membrane. This enables serial operations on the nanoliter samples. Furthermore, filling the central channel with fluorinated oils can create isolated droplets containing the trapped cells/cellular components, which can be released by applying pressure to the pneumatic control channels. The multilayered device is fabricated using a laser-based micro-patterning and lamination method. Commercially available silicone films were integrated as the stretchable, semipermeable membranes. Following proof of principle with food-coloring dye, trapping of single fluorescent beads and single cells, and serial operations such as bleaching, labeling, and lysis will be demonstrated.
With its ability to integrate multiple serial operation steps on a single trapped cell, applications toward single-cell toxicological analysis will be presented.


Session Chair: Daniel Levner, Emulate, Inc.

  • Organs-on-Chips: A Platform for Drug Development and Disease Modeling
    Daniel Levner, Emulate, Inc.

    Organ-Chips - such as the Lung-, Liver-, Brain- and Intestine-Chips - are micro-engineered systems that display physiological functions consistent with the corresponding human organs in vivo. Each Organ-Chip is composed of a clear, flexible polymer about the size of an AA battery that contains hollow channels lined by living human cells; these cells are cultured under continuous flow and mechanical forces, which recreate key aspects of the in vivo cellular microenvironment. We have found that cells cultured within the engineered 3D microenvironments of Organ-Chips go beyond conventional in vitro models, making Organ-Chips more predictive of in vivo physiology. Accordingly, Organ-Chips enable the study of normal physiology, pathophysiology, and mechanisms of action or toxicity in an organ-specific context. In this presentation, we will highlight collaborative studies conducted on our Human Emulation System with various academic and industry partners to demonstrate the utility of the system as a more predictive, human-relevant platform for efficacy, safety and mechanistic studies.

  • SLAS2019 Innovation Award Finalist: A High-Throughput Organ-on-Chip Platform for Drug Screening with Applications in Immuno-oncology
    Jeffrey Borenstein, Draper

    Screening for therapeutic effects in humans can be made more predictive by creating in vitro models that express organ- or tissue-specific function on a platform with a high level of throughput. The organ- or tissue-specific function can then be evaluated across many samples, doses, and replicates, and extrapolated to a clinical measure impacted by that function. We created a microfluidics-based model that controls micro-environmental parameters to generate tissue with organ-specific functions of kidney, liver, vasculature, gut, and immune-tumor interactions. The model utilizes tools including controlled microfluidic fluid flow, cell-substrate topography, immune cell flow and tumor fragment trapping technology, and cell-cell cues to control and influence tissue function. To achieve higher levels of throughput, the model is replicated within a 96-well format while maintaining the unique characteristic of controlled flow. To improve data collection, integrated electrical traces measure trans-epithelial/endothelial electrical resistance (TEER) in near real-time and provide a means to create additional sensing capabilities in the future. The microfluidic model demonstrates the ability to generate tissue in vitro with tissue-specific function, provides near real-time feedback on tissue barrier function, and can scale to relevant levels of throughput, yielding a tool for efficacy screening of immuno-oncology and combination therapies.

    A particular application of this high-throughput in vitro model technology is in the field of immuno-oncology, which is advancing rapidly as a new treatment modality for tumors that are often resistant to conventional chemotherapy. However, a major challenge in this field is the unpredictable response to immunotherapeutic compounds between patients and between different cancer types. Current methods for predicting efficacy such as syngeneic mouse models and patient-derived xenografts are low throughput and do not permit mechanistic studies. Here we describe the application of higher throughput organ-on-chip platform technology for evaluation of the efficacy of immune checkpoint inhibitors and combination therapies in the treatment of various human cancers. The model system permits dynamic perfusion of tumor biopsy samples to extend the window of ex vivo viability beyond current static models, and delivers lymphocytes to the tumor to recapitulate interactions which govern the function of immunotherapeutic treatment. Tumor killing and lymphocyte infiltration are monitored in real time under high-resolution confocal microscopy, enabling mechanistic studies of how the tumor microenvironment and immune system function are affected by experimental treatments. These high resolution images are processed using image analytic and machine learning algorithms to quantify results, providing a powerful new tool in assessing preclinical efficacy.

  • The next decade of pre-clinical drug discovery: Integrating disease-tunable 3D human microtissues into higher order systems
    Patrick Guye, InSphero AG

    The success of drug development programs depends significantly on having in-vitro models that reflect the human patient as closely as possible, while being highly practical in their implementation. This practicability should encompass reproducibility and availability of the model in time and space, on-demand production, scalability for screening, amenability to meaningful treatment windows, ease of access to as many relevant endpoints as possible, and compatibility with standard lab processes.

    We engineered a scalable and SBS-compatible platform family based on human 3D microtissue models of various organ systems that can be induced into disease states: fibrosis, steatosis/NAFLD and NASH for liver; T1D and T2D for pancreatic islets; and various tumor models. These 3D microtissues are not only immune-competent but also highly dynamic, as demonstrated by their capability to relax and remodel following injury (e.g., liver fibrosis). In addition, 3D microtissues are organotypic in their cell composition and accessible to all current endpoints, ranging from biochemical assays to single cell extractions to high content imaging.

    We leveraged the power of these 3D organotypic microtissues and combined them in a highly robust microphysiological device for establishing meaningful organ-organ interactions. We will discuss the challenges of combining 3D microtissues representing various organ systems in screening-friendly devices, as well as the unique advantages of such multi-tissue systems, including higher functionality, ad-hoc medium conditioning, in-situ compound metabolization and the ability to combine efficacy assessment with toxicity detection in the same system.

    Highly modular and scalable, such human in-vitro modeling platforms for efficacy and safety fit the needs and requirements of many stakeholders/departments in today’s pharmaceutical and biotech enterprises and will greatly simplify and accelerate drug discovery.

  • Towards a personalized drug development using autologous iPSC-derived Multi-Organ-Chips
    Anja Ramme, TissUse GmbH

    Until now, the efficacy and toxicity of newly developed drugs have been analyzed in preclinical studies with the help of animal testing and static cell culture studies. However, animal models are known not to adequately represent the human organism. Moreover, static cell culture experiments are not complex enough to represent important physiological interactions between human organs. Due to these limitations, along with high failure rates and development costs, the drug development pipeline needs to adapt to a more efficient, cost-effective, animal-free process that applies human cells under physiological conditions.

    Methods: An alternative to the traditional models is microphysiological systems. These enable the combination of human-derived miniaturized tissues cultivated in physiological communication with each other and allow for complex tissue-tissue communication as present in the human body. The Multi-Organ-Chip is such a microphysiological system, currently combining the culture of up to four human organ models. A cell culture medium is circulated within the channel system, which supplies the organ models with nutrients and oxygen and assures their communication via messenger metabolites. So far, predominantly cell lines or primary tissues from different donors have been used for co-cultivation in microphysiological systems. However, the differing genetic backgrounds of the tissues during co-cultivation are a drawback. This problem was solved by generating several organ models with the same genetic background using induced pluripotent stem cells from a single donor. Additionally, we introduce a newly designed proprietary PBPK-compliant 4-Organ-Chip to generate human-relevant ADME and toxicity profiles.

    Results: A successful co-cultivation combining a small intestine, liver, kidney and brain model was achieved in the PBPK-compliant 4-Organ-Chip. All four tissue models were pre-differentiated from induced pluripotent stem cells from a single healthy donor. Further differentiation into the organ models was achieved by utilizing the technological advantages of the 4-Organ-Chip. The medium circulation through the microfluidic channel system enabled cross talk between the tissues. This organ cross talk led to further differentiation of the tissues over a 14-day co-cultivation, despite the fact that all tissues were cultured in one common medium in the absence of any growth factors. This maturation was conclusively detected by qPCR, immunofluorescence and RNA sequencing endpoint analyses.

    Conclusions: The autologous 4-Organ-Chip is the first step towards the development of a Patient-on-a-Chip. Genome editing or the use of blood donations from patients can help to generate disease models on the 4-Organ-Chip in the future. The Patient-on-a-Chip provides the unique opportunity to accelerate drug development and to intensify research into individualized therapies. As a result, most laboratory animals, as well as healthy volunteers in Phase I clinical trials, could be replaced by the Patient-on-a-Chip.


Molecular Libraries

Track Chairs: Jonathan O'Connell, Forma Therapeutics and Andrew Alt, University of Michigan

Small Molecule Libraries

Session Chair: James Breitenbucher, University of California, San Francisco

  • Prospective Biologically Biased HTS Library Generation
    James Breitenbucher, University of California, San Francisco

    Historical attempts to fill “Diversity Space” and create the ultimate universal HTS library have largely resulted in an inefficient expenditure of resources. This is perhaps not unexpected given the vastness of chemical space and the likely relatively small number of biologically meaningful compounds. Recently the concept of “Dark Matter” has entered the HTS discourse: a concept which suggests that the majority of compounds in HTS collections will never hit a biological target, and that compounds that hit a single biological target are in fact biased to hit other biological targets, relative to random chemical selections. With these concepts in mind, Dart Neuroscience embarked on an effort to create a “Biologically Relevant” screening library by prospectively designing combinatorial libraries with target information driving library design. We will present data on the design of this library, as well as screening results and commentary on lessons learned using this prospective library design approach.

  • Ultra-large library docking for ligand discovery
    John Irwin, UCSF

    Despite intense interest in expanding chemical space, libraries of hundreds-of-millions to billions of diverse molecules have remained inaccessible. In principle, structure-based docking can address such large libraries, but to do so the molecules must be readily obtained, the computation must be tractable, and new potential ligands must outscore the inevitable decoys. Here, we investigate docking screens of over 170 million make-on-demand, lead-like compounds. The molecules derive largely from 109 well-characterized two-component reactions with a >85% synthesis success rate. The resulting library is diverse, representing over 10.7M scaffolds not found in “in-stock” commercial collections. In benchmarking against well-behaved targets, the enrichment of ligands versus decoys improved with library size, suggesting that more ligand-like molecules exist to be found as the library grows. To test this prospectively, 99 and 138 million molecules were docked against the soluble enzyme AmpC β-lactamase and the membrane-bound D4 dopamine receptor, respectively. From among the top-ranking docking hits for AmpC, 44 diverse molecules were synthesized and tested. Five inhibited, including an unprecedented 1.3 µM phenolate that is the most potent AmpC inhibitor found from any screen. Within-library optimization revealed a 110 nM analog, the most potent non-covalent AmpC inhibitor known. Crystal structures of these inhibitors confirmed their fidelity to the docking prediction. For the dopamine receptor, 549 molecules were synthesized and tested from among top docking ranks, and also from intermediate and low ranks. On testing, hit rates fell monotonically with score, ranging from 24% for the highest ranking, declining through intermediate scores, and dropping to a 0% hit rate for the lower ranks. Integrating across the resulting hit-rate curve predicts 481,000 D4 active molecules in 72,600 scaffolds. Of the 81 new D4 actives found here, 30 had Ki values < 1 µM. The most potent was a 180 pM Gi-biased, selective, full agonist, among the most potent sub-type selective agonists known for this receptor. Prospects for making over 1 billion lead-like molecules community-accessible, and for their prioritization in structure-based screens, will be considered.
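    The extrapolation step (integrating an empirically measured hit-rate curve over the score-ranked library) can be sketched as follows. This is an illustrative estimate only: the bin sizes and hit rates below are hypothetical placeholders, not the values from the study.

```python
# Illustrative sketch: estimate the total number of actives in a docked
# library by summing (molecules per score bin) x (hit rate measured for
# that bin). All numbers below are hypothetical.

def estimate_actives(bin_sizes, hit_rates):
    """Integrate a stepwise hit-rate curve over score bins."""
    if len(bin_sizes) != len(hit_rates):
        raise ValueError("need one hit rate per score bin")
    return sum(n * r for n, r in zip(bin_sizes, hit_rates))

# Four score bins, best-ranked first; hit rates fall monotonically with
# worsening docking score, as observed experimentally.
bin_sizes = [1_000_000, 10_000_000, 50_000_000, 77_000_000]
hit_rates = [0.24, 0.05, 0.01, 0.0]

print(estimate_actives(bin_sizes, hit_rates))
```

    In the study, the same integration performed over the experimentally determined hit-rate curve produced the quoted estimate of total actives and scaffolds.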

  • Data Directed Approach for Diversity-based HTS library design
    Louis Scampavia, Scripps Florida; Department of Molecular Therapeutics

    High-throughput screening (HTS) remains the principal engine of small-molecule probe and early drug discovery. However, recent shifts in drug screening toward nontraditional and/or orphan-ligand targets have proved challenging, often resulting in large expenditures and poor prospects from traditional HTS approaches. In an ever-evolving drug development landscape, there is a need to strike a balance between chemical structure diversity, for diverse biological performance, and target-class diversity aimed at generating efficient and productive screening collections for HTS efforts.

    In this presentation, data-driven approaches for compound library design and hit triage are examined. An overview of current developments in data-driven approaches formulated at Scripps and other institutes is presented, demonstrating increased hit rates using iterative screening strategies that also provide early insight into structure-activity relationships (SAR). The Scripps Research Molecular Screening Center, having served as one of four principal HTS screening centers during the NIH-funded Molecular Libraries Production Centers Network program, has performed well over 300 HTS campaigns on a large variety of traditional and nontraditional targets. Informatics analyses of the NIH/MLPCN library reveal representative scaffolds for hierarchically related compounds correlated to various target classes. Data mining the HTS results of various campaigns demonstrates that compounds hierarchically related to these scaffolds have enhanced hit rates (up to ~50X), serving as a hit predictor and guide for compound library selection. Presented are the informatics that support a novel approach to HTS library design aimed at mitigating risk when screening difficult targets, requiring only small pilot screens for early chemical landscape discovery while providing guidance for formulating larger HTS efforts. Hits discovered from a targeted pilot screen allow for the selection and retesting of chemically related super-structures, which results in enhanced hit discovery while providing preliminary SAR heredity. Lessons learned in data-driven library selection allow investigators to cover a larger amount of appropriate chemical space, increase the chances of identifying quality hits, provide guidance in library acquisition and even provide a foundation for combinatorial chemistry design.
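    The iterative, scaffold-guided expansion idea can be illustrated with a minimal sketch. The compound names and scaffold table are hypothetical; in practice scaffold assignment would come from a cheminformatics toolkit (e.g. Murcko scaffolds in RDKit), and selection would also weigh the observed per-scaffold hit-rate enhancement.

```python
# Minimal sketch of pilot-screen-guided library expansion: after a small
# pilot screen, prioritize untested compounds whose scaffold matches a
# scaffold seen among the pilot hits. All names are hypothetical.

def expand_from_pilot(pilot_hits, compound_scaffolds):
    """Return untested compounds sharing a scaffold with any pilot hit."""
    hit_scaffolds = {compound_scaffolds[c] for c in pilot_hits}
    return sorted(
        c for c, s in compound_scaffolds.items()
        if s in hit_scaffolds and c not in pilot_hits
    )

# Hypothetical compound -> scaffold table.
scaffolds = {
    "cmpd-1": "quinazoline", "cmpd-2": "quinazoline",
    "cmpd-3": "indole",      "cmpd-4": "biphenyl",
}
print(expand_from_pilot({"cmpd-1"}, scaffolds))
```

    Because hierarchically related compounds showed up to ~50X enhanced hit rates in the MLPCN data, this kind of expansion concentrates follow-up screening on the most productive regions of chemical space.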

  • A novel European Compound Screening Library for EU-OPENSCREEN
    Wolfgang Fecke, EU-OPENSCREEN ERIC

    The European research infrastructure EU-OPENSCREEN was founded with the support of its member countries and the European Commission. Its distributed character offers complementary knowledge, expertise and instrumentation in the field of chemical biology from 20 European partner institutes, while its open working model ensures that academia and industry can readily access EU-OPENSCREEN's collection of chemical compounds, equipment and associated screening data.

    This work describes our collaborative effort between different medicinal and computational chemistry sites in Europe to set up a protocol for the rational design of a general-purpose screening library against novel biological targets, consisting of about 100,000 commercial small molecules. The molecules were first pre-selected to ensure chemical stability, solubility and other screening-compliant physicochemical properties, as well as the absence of reactive compounds. The collaborating groups then applied different methods to create diverse sub-libraries according to their own in-house expertise, aimed at providing hits for a wide variety of druggable targets. Quality of approved vendors, cost and compound availability were other factors which shaped the content of the final screening library.

DNA-Encoded Libraries

Session Chair: Christopher Kollmann, FORMA Therapeutics

  • Establishing a DNA Encoded Library Screening Platform: Challenges and Learnings
    Christopher Kollmann, FORMA Therapeutics

    DNA encoded libraries (DEL) are increasingly used as part of an integrated hit identification strategy, with many companies initiating external or internal efforts. The technology allows extremely large numbers of encoded compounds to be screened in parallel. Publications regarding the technology’s development have primarily focused on developing chemical reactions compatible with DNA or specific case studies of successful targets. This presentation will focus on the complex interplay of selection methodology, library design, and data analysis that leads to a successful DEL hit identification campaign using case studies from FORMA experience establishing a DEL screening platform.

  • The value of normalization and impact of library size on identifying hits from DNA Encoded Library screens
    Christopher Phelps, GlaxoSmithKline

    DNA encoded small molecule libraries (DELs) can be synthesized by several means (i.e., DNA recorded, DNA templated, self-assembling), but regardless of approach most DELs rely on affinity selection to identify hits. Affinity selections coupled with chemically diverse DNA encoded libraries have demonstrated their value in identifying small molecule hits for the development of chemical probes and clinical candidates. Selection outputs are interrogated through sequencing of the DNA tags, and analysis of the copy counts of selected molecules helps to identify the ligands enriched specifically for the target of interest.

    Analysis based on the copy counts presents two challenges: 1) Copy counts do not correlate with activity of DEL hits synthesized on- or off-DNA and 2) Hits from very diverse libraries (>100 million) are rarely identified. At GSK we have adapted several different strategies for selection, sequencing, and analysis tailored to different DEL diversities that enable us to identify and rank hits from our entire library collection.
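    One common normalization step in DEL data analysis (a generic sketch, not GSK's actual pipeline) is to scale raw copy counts by per-selection sequencing depth and rank library members by enrichment over a no-target control, with a pseudocount guarding against zeros:

```python
# Generic sketch of copy-count normalization for a DEL selection: scale
# counts by total reads per selection, then score each library member by
# its enrichment over a no-target control. A pseudocount avoids division
# by zero for members unseen in one of the selections.

def enrichment(target_counts, control_counts, pseudocount=1.0):
    t_total = sum(target_counts.values())
    c_total = sum(control_counts.values())
    members = set(target_counts) | set(control_counts)
    scores = {}
    for m in members:
        t = (target_counts.get(m, 0) + pseudocount) / t_total
        c = (control_counts.get(m, 0) + pseudocount) / c_total
        scores[m] = t / c
    return scores

target = {"A": 90, "B": 5, "C": 5}     # counts after target selection
control = {"A": 10, "B": 40, "C": 50}  # counts from no-target control
scores = enrichment(target, control)
best = max(scores, key=scores.get)
print(best)
```

    Depth normalization prevents a more deeply sequenced selection from inflating apparent enrichment, and the pseudocount keeps rarely sequenced members from producing unstable ratios; both issues grow with library diversity, which is part of why very diverse libraries are harder to mine by raw copy count alone.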

  • Nanomolar inhibitor for hexamer DNA helicase RuvBL1/2 identified in DEL screen
    Jacob Andersen, Vipergen

    RuvBL1 and RuvBL2 are related members of the AAA+ superfamily of conserved proteins, shown to possess both in vitro ATPase and helicase activity. They form a double hexamer complex involved in many essential cellular processes, among these transcriptional regulation of several important oncogenes [1,2]. The complex is over-expressed in a variety of human tumor types [3], which makes RuvBL1/2 an attractive target for chemical intervention. Unfortunately, finding suitable chemical matter via traditional high-throughput screening methods has proven challenging. Due to its complex multi-subunit structure, RuvBL1/2 has many potential binding sites and modes, and we therefore speculated that stabilization of a relevant conformation would be required to obtain hits. We screened under several different binding conditions, but hits were obtained only upon addition of the non-hydrolysable ATP analogue ATPγS. Herein we report the identification of nanomolar inhibitors of RuvBL1/2 derived from a DNA-encoded chemical library (DEL) screen. We used YoctoReactor technology to create high-fidelity DELs and screened these using the Binder Trap Enrichment technology. We identified several unique series potent against RuvBL1/2, including a hit with a biochemical IC50 of 340 nM. A follow-up structure-activity relationship (SAR) investigation quickly resulted in a 10-fold improvement in biochemical potency along with a marked improvement in cellular activity. This effort ultimately afforded a compound with a cellular IC50 of 290 nM. In summary, the DEL screening campaign provided several attractive hits from which a lead series was identified that proved readily amenable to lead optimization of both physicochemical properties (MW, PSA, logP) and on-target activity.

    1. Kanemaki M, Kurokawa Y, Matsu-ura T, Makino Y, Masani A, Okazaki K, et al. TIP49b, a new RuvB-like DNA helicase, is included in a complex together with another RuvB-like DNA helicase, TIP49a. J Biol Chem 1999;274:22437-44.
    2. Puri T, Wendler P, Sigala B, Saibil H, Tsaneva IR. Dodecameric structure and ATPase activity of the human TIP48/TIP49 complex. J Mol Biol 2007;366:179-92.
    3. Mao Y, Houry WA. The Role of Pontin and Reptin in Cellular Physiology and Cancer Etiology. Front Mol Biosci 2017;4:58.

  • SLAS2019 Innovation Award Finalist: Microfluidic Functional Screening of DNA-Encoded Bead Libraries
    Brian Paegel, Scripps

    Combinatorial synthesis was originally conceived and pitched to solve one of the most vexing problems in high-throughput screening: library generation. Using split-and-pool methods, million-member compound collections could be accessed quickly and cheaply, but early combinatorial library chemotype mainstays (peptides) were not drug-like, adapting the libraries to robotic automation proved problematic, and screening hit deconvolution invariably lacked the necessary statistical power. DNA-encoded library (DEL) technology has resurrected combinatorial chemistry by enabling the construction of vast compound collections (>10⁷) of drug-like compounds that are evaluated en masse by affinity selection and deconvoluted with high statistical power via next-generation DNA sequencing (NGS). My laboratory has expanded on this powerful lead generation concept, engineering highly integrated microfluidic droplet-based circuits and bead-based DEL synthesis strategies that enable the direct functional interrogation of DELs. The circuits load library beads into picoliter-scale activity assay droplets, photochemically cleave the library member to dose the droplet, incubate the droplets, detect each droplet's assay result at high speed, and sort droplets that signal the presence of a bioactive compound for subsequent NGS. I will discuss the design, synthesis, and characterization of 3 drug-like bead DELs ranging in diversity (10-100k members) and the microfluidic functional screening of these libraries using classic enzyme/substrate assays (HIV protease, autotaxin). I will also discuss a new in-droplet fluorescence polarization (FP) detection system for microfluidic DEL screening when a fluorometric activity assay is unavailable. Finally, as an example of a target that would be difficult to prosecute using conventional DEL technology, I will also present library screens of a complex metabolizing system (bacterial reporter-gene in vitro translation) and several families of bacterial translation inhibitors that were discovered in the screen. This bead-based DNA-encoded library and accompanying functional screening technology now raises the possibility of conducting DEL screens against a variety of complex, metabolizing systems, such as lysates, cellular targets, and tissues.

Speciality Libraries & Innovations in High-throughput Screening

Session Chair: Ashootosh Tripathi, University of Michigan

  • Compound Interest: Accessing the LifeArc compound bank
    Andy Merritt, LifeArc

    LifeArc, formerly MRC Technology, is a medical research charity with over 25 years’ experience in helping scientists turn their research into potential treatments. Our Centre for Therapeutics Discovery (CTD) works on drug discovery projects in collaboration with early-stage academic research for medical conditions where there is a clear need for new treatments. With over 80 scientists in CTD we support early stage target validation, hit identification and lead optimisation for both small molecule and antibody projects.

    One area of support where we have focused significant resource is enabling academic access to our chemistry resources and chemical collections. For many academic researchers access to compounds to support target validation, pathway elucidation and lead discovery can often be a limiting factor in the progress of their research; whether looking for tool compounds with well annotated and understood pharmacology or collections of compounds for varied screening approaches the cost and logistics of obtaining such compounds can be significant. In addition, the choice of which tool or set of compounds to use can need an understanding of drug discovery and medicinal chemistry that may not be immediately available to a researcher locally. LifeArc has established a simple, minimal cost model to enable academic access to our drug discovery expertise and if relevant obtain whichever sets of compounds from our collections that can potentially further their research.

    This presentation will highlight the development of the full range of the LifeArc compound collection, with emphasis on the successful application of compound sets to academic drug discovery programmes. Established and continually refined through collaboration between LifeArc’s computational and medicinal chemists, with many years of drug discovery experience across the team, the LifeArc collection is not just a single deck of compounds. Rather, there are sets of compounds to support multiple strategies for target validation or lead discovery, scaled to support all levels of screening capability and capacity, from single compound analysis through to multiple robotic systems. In this talk we will cover diversity strategies from small numbers through to 100,000-plus compound approaches, follow-up mechanisms for various screening strategies (e.g. fragments and annotated collections), and LifeArc strategies outside of the ‘drug/hit-like’ space necessary for prosecuting targets such as protein-protein interactions or Gram-negative antibacterial target research.

  • Reinvigorating old medicines: Discovering adjuvants that rescue antibiotic activity against resistant pathogens
    Andrew Lowell, Virginia Tech

    Microbial resistance to approved clinical treatments demands creative new approaches to medicine, a search that can be significantly augmented by the use of automation. One underutilized avenue is the incorporation of antibiotic adjuvants co-administered with their partner antibiotic. The only significant commercial example to date is Augmentin, a penicillin-class drug that targets cell-wall synthesis in pathogenic bacteria. Augmentin is a combination drug containing the beta-lactam antibiotic amoxicillin along with clavulanic acid, an adjuvant that functions as an inhibitor of beta-lactamase, the bacterial enzyme responsible for resistance to this class of antibiotic. Thus, this combination therapy is capable of killing both resistant and susceptible bacteria whereas the antibiotic alone is only active against susceptible pathogens.

    Bacteria overcome different classes of antibiotics using a variety of resistance mechanisms. Remarkably, this specificity of resistance mechanism to certain classes is also true for those that act on the same cellular function; that is, two antibiotics that target the bacterial ribosome are not overcome by the same resistance mechanism. This orthogonality has continued to drive new antibiotic development; despite the finite number of established targets, a structure that targets a novel binding site often avoids established modes of resistance. Still, novel class discovery is difficult and time consuming, and an approach such as adjuvant development is a promising alternative because of the potential to reinvigorate approved drugs that are becoming ineffective.

    To identify new potential antibiotic adjuvants, we are repurposing an established transcription/translation-based assay that was developed to detect novel inhibitors of protein synthesis. By modifying this assay to include a ribosome-targeting antibiotic and the resistance mechanism that prevents its action, we can search for compounds that specifically target the resistance mechanism and thus have the potential to rescue the partner antibiotic, again making it effective against resistant pathogens.

    Natural product libraries are an important resource for this assay. Many antibiotics are natural-product derived and burgeoning knowledge about the communication and symbiosis between organisms suggests that different bacteria or higher-order species may work together to produce more effective mixtures in their ongoing chemical arms race. Therefore, the likelihood of identifying compounds that have evolved to be antibiotic adjuvants from natural product libraries is high, not just among bacterial extracts, but also from plants and fungi functioning in concert with endophytes or other bacteria. By supplementing our search with traditional libraries, rapid progress can be achieved alongside novel entity characterization and subsequent medicinal chemistry development.

  • Maximizing Chemical Diversity in a Natural Product Screening Library
    John Beutler, Molecular Targets Program, Center for Cancer Research, National Cancer Institute

    High throughput screening of natural products is a valuable part of drug discovery, as seen by the numerous examples of natural product hits which have become clinically useful drugs. For our screening library within the NCI Molecular Targets Program we have sought to enhance the base NCI collection of tropical plant and marine invertebrate pre-fractionated samples with extremophilic bacteria and unusual fungi from temperate locations, as well as natural product-like diversity-oriented synthesis libraries from synthetic academic collaborators. Challenges include working with a wide variety of countries while respecting the Convention on Biological Diversity, keeping track of and analyzing tens of thousands of samples from a multitude of sources, and sharing data and collaborating with source organizations. Examples of work in Kazakhstan, Turkey, and Brazil will be discussed, as well as work with academic synthesis groups and US microbial research groups.

    Funded in part by NCI Contract No. HHSN261200800001E, and by the Intramural Program of the US National Cancer Institute (Project 1 ZIA-BC011469-06)

  • Development of a Millennial Drug Discovery Platform using Microbial Treasure Trove
    Ashootosh Tripathi, University of Michigan

    If discovery of new antibiotics continues to vacillate while the ability of pathogenic microbes to develop resistance continues to surge, society’s medicine chest will soon lack effective treatments against a multitude of serious infections. To put the situation into context, no new class of antibiotics has been introduced over the last 30 years. Moreover, the majority of pharmaceutical efforts during the past six decades have focused on the synthetic enhancement of a limited set of unique core scaffolds. From this perspective, we envisioned that a more sustainable route to combating antibiotic resistance is the discovery of novel classes of antimicrobials, which would require greatly improved antibiotic husbandry and a more effective list of unique microbial targets. Herein, we describe a robust high throughput antibacterial discovery platform targeting key virulence and resistance mechanisms in both gram-negative (A. baumannii, uropathogenic E. coli) and gram-positive (MRSA) pathogenic microbes.

    In one recent effort, we set out to tackle the urgent challenge of uropathogenic E. coli (UPEC), the cause of recurrent UTIs in women. It has previously been demonstrated that UPEC iron acquisition systems are upregulated during infection. We therefore screened a library of natural product extracts (NPEs) for molecules that inhibit the growth of UPEC in low-iron media, identified a novel class of metabolites, the nicoyamycins, and demonstrated dose-dependent activity in bioassays.

    Acinetobacter baumannii can attach to a surface and build a complex matrix in which it colonizes to form a biofilm. To combat this widespread problem, we designed an in vitro screen to identify inhibitors of A. baumannii biofilms using NPEs derived from marine microbes. This strategy provided access to three novel metabolites, the cahuitamycins, with sub-micromolar efficacy. Efforts to diversify the starter unit of their biosynthetic pathway led to the production of the unnatural analogues cahuitamycins D and E, with increased potency.

    We also previously envisaged that targeted inhibition of the siderophore staphyloferrin B of Staphylococcus aureus holds considerable potential as a single or combined treatment for methicillin-resistant S. aureus (MRSA). We therefore developed a biochemical assay against the non-ribosomal peptide synthetase-independent siderophore (NIS) synthetase in the staphyloferrin biosynthetic pathway. Analysis of NPEs led to the isolation of a novel class of antibiotics, the baulamycins, which act as reversible competitive inhibitors of the SbnE enzyme.

    The success of these studies provides proof of concept for a new-age discovery platform that addresses two major bottlenecks impeding the drug pipeline: identifying novel drug leads and overcoming resistance through indirect microbial targeting that does not attempt to kill the pathogen outright.
