A Simple and Efficient Pipeline for Construction, Merging, Expansion, and Simulation of Large-Scale, Single-Cell Mechanistic Models

Cemal Erdem1,*, Ethan M. Bensman2, Arnab Mutsuddy1, Michael M. Saint-Antoine3, Mehdi Bouhaddou4, Robert C. Blake5, Will Dodd1, Sean M. Gross6, Laura M. Heiser6, F. Alex Feltus7,8,9, and Marc R. Birtwistle1,10,*

1 Department of Chemical & Biomolecular Engineering, Clemson University, Clemson, SC
2 Computer Science, School of Computing, Clemson University, Clemson, SC
3 Center for Bioinformatics and Computational Biology, University of Delaware, Newark, DE
4 Department of Cellular and Molecular Pharmacology, University of California San Francisco, San Francisco, CA
5 Center for Applied Scientific Computing, Lawrence Livermore National Laboratory, Livermore, CA
6 Department of Biomedical Engineering, Oregon Health & Science University, Portland, OR
7 Department of Genetics and Biochemistry, Clemson University, Clemson, SC
8 Biomedical Data Science and Informatics Program, Clemson University, Clemson, SC
9 Center for Human Genetics, Clemson University, Clemson, SC
10 Lead Contact

* Correspondence:
[email protected] (C.E.) and
[email protected] (M.R.B.)

ABSTRACT

The current era of big biomedical data accumulation and availability brings opportunities to integrate these data and leverage their totality to make new discoveries and to build clinically predictive models.