Expansion Microscopy for Brain Mapping

by

Jeong Seuk Kang

B.S. Electrical Engineering and Computer Sciences / Materials Science and Engineering, University of California, Berkeley (2015)

S.M. Applied Physics, Harvard University (2017)

Submitted to the Harvard John A. Paulson School of Engineering and Applied Sciences in Partial Fulfillment of the Requirements for the Degree of

Doctor of Philosophy in Applied Physics

Harvard University, May 2019

© 2019 Jeong Seuk Kang. All rights reserved.

Dissertation Advisor: Professor Edward S. Boyden

Jeong Seuk Kang

Expansion Microscopy for Brain Mapping

Abstract

More than one billion people in the world suffer from brain disorders. More than one trillion US dollars are spent on developing drugs to address them, yet ~92% of drug candidates fail to receive clinical approval. Among the many potential reasons why treating brain disorders has been so strikingly difficult, one is that the complexity of the brain's neural circuitry and molecular composition remains poorly understood. New innovations in brain mapping are therefore needed, and expansion microscopy (ExM) is proposed throughout this thesis as the candidate that can most effectively meet the needs of these efforts. First introduced in 2015, ExM allows nanometer-scale resolution to be achieved on a conventional microscope. By constructing an expanding polymer network inside the biological specimen, conjugating the biomolecules of interest to the matrix, and letting it expand after removing everything else that is not of interest for imaging, the physical distance between the biomolecules anchored to the polymer matrix increases, effectively overcoming the diffraction limit of the conventional confocal microscope and thereby increasing the effective resolution of the microscope down to the nanometer scale. Over the course of my graduate studies, I worked on three improvements to this modality: (1) applying ExM iteratively to the specimen to increase the effective resolution exponentially, (2) developing intercalating lipid probes for visualizing lipid membranes in the context of ExM, and (3) devising an ExM-compatible approach to visualize the extracellular space of a whole larval zebrafish. In addition, in an effort to understand what type of infrastructural help is needed to map the brain within the foreseeable future, I provide toward the end of this thesis an overview of current practices pursued by governments, industry, and academia to achieve scientific discoveries.

Table of Contents

Introduction
Expansion Microscopy
Cookbook Style Expansion Microscopy Protocol
Materials List
Protocol Steps
Reagents and Solutions
Iterative Expansion Microscopy
Resolution Validation for Expansion Microscopy
Membrane Expansion Microscopy (MxM)
Intercalating Lipid Staining Probes: palm-GeLy
Immunohistochemistry-compatible MxM
Directly Anchoring Lipids to ExM Gel
Iterative MxM (iMxM)
iMxM with Immunohistochemistry
MxM with Fluorescent in-situ Hybridization for RNA Imaging
Methods
Extracellular Space Labeling of the Zebrafish using Expansion Microscopy
Whole Larval Zebrafish Expansion Microscopy
Extracellular Space Labeling with Whole Larval Zebrafish Expansion
Bridging the Gap between Research and Impact: Past and Future
Government Agencies
Accelerator Programs (Private Sector)
Academic Institutions
Conclusion
Bibliography

Acknowledgement

Without Professor Ed Boyden, I wouldn't be writing this thesis right now. When I first emailed Ed for a meeting back in 2015, I was on the verge of quitting my PhD for various personal reasons. During our short meeting (I was 25 minutes late to our 30-minute meeting because I am bad with directions, and MIT buildings are, still, so confusing to me), he not only inspired me to continue my PhD, but also allowed me to join his neurobiology lab even though I had no prior experience in this field. He then went through all the trouble so that I could work in his MIT lab as a Harvard student. Really, I owe Ed a billion thanks.

Professor Jennifer Lewis, my Harvard academic advisor, has also been extremely generous to me. She had no practical reason to be my academic advisor, but she volunteered and saved my academic career. Professor Federico Capasso and John Girash were also there for me when I had to figure out the complications involved in working at MIT as a Harvard student. More recently, Professor George Church and Professor Samir Mitragotri so willingly agreed to be my committee members, and I couldn't thank them more. These are the people who made me rethink altruism, and I am sincerely grateful for their support.

And there are Manos Karagiannis and Jae-Byum Chang, postdocs in our group who so patiently walked me through the world of bioengineering research on a daily basis. They taught me the practical know-how and theoretical insights involved in every step of our projects. In addition to Manos and Jae-Byum, many members of our lab have been amazingly helpful on a personal and professional level: Anu, Nick, Monique, Jay, Fei, Cristina, Daniel, Alexi, Tay, Dan, Oz, Adam, to name a few. We spent many sleepless nights and fun-filled days together, and I am very lucky to be able to call them both my colleagues and friends. Outside the lab, I want to extend my gratitude to my supportive partner Rachel.

Finally, I would like to thank the Samsung Scholarship for supporting my graduate studies, and my parents Shichul Kang and Hyewon Oh, as well as my brother Mok Kang and our beloved canine sister Poolibi, for all their infinite support. As for my next career, I will be working with Joichi Ito and Jessica Traynor on a new initiative, and I owe them a billion thanks as well for making my transition and my new journey possible. I hope I can make everyone who believes in me proud. I will try my best!



Chapter 1

Introduction

More than one billion people in the world suffer from brain disorders, afflicting 1 in 6 of the world's population. More than one trillion US dollars are spent on addressing these disorders, but the results have been discouraging. There were 267 programs in the portfolios of large pharmaceutical companies (e.g., Sanofi/Genzyme, Novartis, Johnson & Johnson) in 2010, but five years later that number had decreased to 129. It costs more than one billion dollars to develop a drug, yet during the process of clinical trials mandated by the Food and Drug Administration, ~92% fail to receive clinical approval [1,2]. For example, just a few weeks ago, Biogen announced the end of its Alzheimer's drug trials after years of hard work, and as a result its stock price plunged by 26% [3]. We see people suffering from brain disorders within our immediate vicinity: depression, stroke, anxiety, addiction, autism, Parkinson's disease, multiple sclerosis, epilepsy, attention deficit disorder, eating disorders, sleep disorders, schizophrenia, Alzheimer's disease, vision loss, spinal cord injury, and the list goes on.

Why is treating brain disorders so much more challenging than treating other diseases?

There are multiple reasons. The brain has a unique structure for each person, as the cellular structure of every single neuron in the brain is a product of both genetic instruction and personal experience over time. At the same time, we still do not have a complete understanding of how information processing works in the brain [4]. There are many mechanisms of electrical and chemical communication that are still being revisited and newly discovered; to mention a few, these include direct electrical connections, retrograde signaling by diffusible messengers, ephaptic coupling via extracellular electric fields, and other new modes of computation routinely being discovered [5-7]. These two factors alone already significantly complicate the effort toward understanding the brain. However, there is more. Neuronal network connectivity is extremely complex: by virtue of their complicated geometry, neural circuits and their patterns are highly diverse, unlike the marked structural redundancy in other organs such as muscles and bones. Neuronal cell types are also far more diverse than those of other systems [8-10]. The retina alone has more than 50 different cell types, whereas the liver has close to five, and the complete number of cell types in the brain is yet to be determined. Finally, the sheer scale of effort needed to study the brain is staggering. There are on average 86 billion neurons in the human brain, forming roughly 100 trillion synapses.

Although mapping the complexity of neural circuitry and molecular composition is an extremely challenging task, it is a much needed effort. The fact that brain disorders are deemed difficult to treat can be attributed to the absence of a framework with which we can rationalize and quantify the structural and functional aspects of the brain. When tracing nerves, they have to be imaged at nanometer resolution and followed over long distances to understand how they connect to other nerves. The traditional high-resolution imaging technique used for this task has been electron microscopy (EM), which demonstrated early successes in attaining the connectome of C. elegans and small portions of the mammalian brain [11]. However, despite tremendous innovations in this area, EM has its own limitations with scalability. It can only be used on thin specimen slices, in the range of hundreds of nanometers, whereas the brains of more advanced organisms, such as humans, are orders of magnitude thicker. In addition, the imaging systems needed to perform such studies are very expensive and require years of training. Moreover, even if we succeed in obtaining the data needed for brain mapping, analyzing them is also tremendously demanding. For mapping a complete human cortex, we would have to process a zettabyte of data, an amount equivalent to all the information available globally today [12]. To give some perspective, decoding the neuronal map of the C. elegans nematode (roundworm) alone took almost 8 years, and it has only 302 neurons [13]. There is a clear need to significantly improve the tools used for brain mapping, such that the global research community can tackle this issue collaboratively and efficiently. There are two areas where the current practice can be improved: the addition of molecular and genetic information, and scalability.

Addition of molecular and genetic information can be achieved in various ways, but ultimately, I believe that attaining it in situ would be the most useful and facile way to access and utilize the information. For instance, the most widely practiced way to perform genetic sequencing is to extract genetic material from biological samples and read it out without the contextual information associated with its native environment [14]. If one were to study localized epigenetics across the brain, a series of analyses and sample processing steps would be required to reach a "statistically convincing" conclusion. Instead, if one attains, for example, RNA sequences and their information in situ, the information can be directly attributed to the regions, cell types, and many other contextual features in question, subsequently revealing their phenotypic as well as functional associations in the brain [15].

One modality that enables this pursuit is super-resolution imaging with probes that reveal the information of interest. Super-resolution imaging allows biomolecules of interest at the nanometer scale to be visualized and stored as quantified information, all in situ [16-25]. There are probes and genetic approaches for extracting signals from live neurons, for understanding the spatial interaction of DNA or RNA with certain proteins that are associated with their functionality, as well as probes for the glycocalyx on the cell surface, which effectively acts as an identification tag for different cell types in the brain. Examples of these pursuits include imaging super-enhancers and their spatial positions relative to the nuclear membrane and nearby intracellular organelles, imaging synaptic proteins that denote the chemical communication between different neurons at presynaptic and postsynaptic regions, and locating specific RNA strands relevant to various biological studies along neuronal projections and processes. These super-resolution techniques are especially useful for this pursuit compared to electron microscopy, as EM sample preparation destroys the biomolecules, which have proven extremely challenging to probe by EM. An added benefit of super-resolution microscopy is its multi-color imaging capability, which allows information to be processed in a multiplexed fashion, allowing, for example, proteins and RNAs to be imaged at the same time.

Table 1. Super-resolution light microscopy methods. Recently, several super-resolution (nanoscopy) methods have been proposed, built, and matured, from the breadboard stage to commercially available imaging systems. This table provides an overview of the different technologies being used in imaging research today [16].

However, as shown in Table 1 above, conventional super-resolution techniques have scalability limitations. Stochastic optical reconstruction microscopy (STORM), for example, provides resolution down to 20 nm, but it can only image very thin samples, in the range of 100 nm to a few µm in z. STORM also requires fluorophores with STORM-specific characteristics, compromising its broader adaptability. Due to the way the image is constructed, heavy image processing is required for every image slice, and sometimes more than 1,000 raw images are needed to construct one final image of a slice. Stimulated emission depletion (STED) microscopy is a little better: it can image samples as thick as 20 µm [16]. However, considering that brains are orders of magnitude thicker than the thickness scales mentioned above, a new imaging modality is needed that addresses the limitations of the super-resolution technologies available right now.

For this reason, we are proposing expansion microscopy, a new imaging modality that allows an effective nanometer-scale resolution to be achieved on a conventional confocal microscope, as the suitable tool for brain mapping [26-42]. By constructing an expanding polymer network inside the biological specimen, conjugating the biomolecules of interest to the matrix, and letting it expand after removing everything else that is not of interest for imaging, the physical distance between the biomolecules anchored to the polymer matrix increases, effectively overcoming the diffraction limit of the conventional confocal microscope and thereby increasing the effective resolution of the microscope down to the nanometer scale. Due to the way expansion microscopy is carried out (described in Chapter 2), the biological sample under study is made transparent, and the effective imaging depth becomes limited only by the objective lens itself.

Chapter 2 goes over the basics of expansion microscopy and suggests a cookbook-style, step-by-step protocol to demonstrate its low barrier to utilization and its scalability [40,42]. Chapter 3 introduces a recent innovation in expansion microscopy that allows it to be repeated on the same specimen in an iterative fashion, such that the effective resolution can be scaled exponentially; we have validated this technology down to ~20-25 nm effective resolution [38]. Chapter 4 describes my core work pursued during my graduate studies. As electron microscopy reveals structural information of biological specimens at nanometer resolution, my colleague Manos Karagiannis created intercalating lipid membrane probes in the context of expansion microscopy to make lipid membrane imaging of the brain possible. With this, we aim to attain EM-like imaging of the brain with increased scalability and a broader range of functionalities, as this technology can be integrated with ease into standard immunohistochemistry, enabling visualizations that have not been possible with electron microscopy. Chapter 5 shows our first attempt at addressing the extent to which expansion microscopy can be applied to map the brain. Although this project is at a preliminary stage, we modified the technology such that we can expand and image whole larval zebrafish at age 4-6 dpf, with thickness in the range of a few hundred micrometers. In addition, we have devised a way to image the extracellular space in the zebrafish using biocytin, which stays outside the neuronal membranes due to charge interactions. Extracellular space imaging is particularly useful for brain mapping purposes, as it can serve as a contrasting agent to the membrane or intracellular information that is traditionally used to trace neurons. Finally, Chapter 6 describes the current state of how academic research funding is deployed and how scientific innovations are carried toward impact, across three different sectors: government, private, and academia. As brain mapping is the type of research that requires a significant amount of capital and resources to achieve, throughout my graduate studies I was motivated to spend a nontrivial amount of time thinking through the structure of scientific discovery and its landscape. Upon graduation, I aim to further develop my academic career in this domain and continue to make a meaningful contribution to society in a scalable manner.


Chapter 2

Expansion Microscopy

Expansion microscopy (ExM) is an emerging technology that uses conventional, diffraction-limited microscopes, standard lab equipment, and commercially available, inexpensive reagents to achieve nanoscale resolution. During ExM, a swellable polymer network is constructed throughout a biological specimen, and biomolecules of interest, such as proteins and RNAs, are anchored to the polymer network via simple chemical steps. Then, after homogenization by enzymatic digestion, the specimen undergoes isotropic three-dimensional physical expansion when immersed in water. As a result, the biomolecules of interest are spatially separated, and the effective resolution of the microscope is increased; for example, after ~4.5 times linear expansion of the specimen, the effective resolution of a lens with 300-nm diffraction-limited resolution becomes ~60-70 nm (~300 nm divided by 4.5). Since the basic concept behind ExM was established a few years ago [42], many different versions of ExM and their improvements have been introduced. To name a few: protein retention expansion microscopy (proExM; Tillberg et al., 2016) [40] is most optimized for anchoring proteins to the network; expansion fluorescence in situ hybridization (ExFISH; Chen et al., 2016) [39] allows for expansion and visualization of both proteins and RNA; iterative expansion microscopy (iExM; Chang et al., 2017) [38] allows ExM to be performed repeatedly on the same specimen, achieving an exponential increase in effective resolution (scaled by ~4.5^N for N rounds of expansion); and expansion lattice light sheet microscopy (ExLLSM; Gao et al., 2019) [26] achieves fast, volumetric imaging with nanoscale resolution.
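As a quick sanity check on the resolution arithmetic above, the following minimal Python sketch (illustrative only; the function name and defaults are ours, not part of any published ExM protocol) computes the effective resolution for a given diffraction limit, expansion factor, and number of expansion rounds.

```python
def effective_resolution(diffraction_limit_nm: float,
                         expansion_factor: float = 4.5,
                         rounds: int = 1) -> float:
    """Effective resolution after isotropic physical expansion.

    Each round of expansion multiplies the physical magnification by
    `expansion_factor`, so N rounds divide the diffraction-limited
    resolution by expansion_factor ** N.
    """
    return diffraction_limit_nm / (expansion_factor ** rounds)

# One round of ExM: 300 nm confocal limit / 4.5 -> ~67 nm
print(effective_resolution(300))             # ~66.7

# Two rounds (iExM): 300 / 4.5**2 -> ~14.8 nm (the text rounds 4.5**2
# to ~20x, quoting ~15 nm effective resolution)
print(effective_resolution(300, rounds=2))   # ~14.8
```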

Figure 1. Expansion process. (a) After free-radical polymerization, the gel keeps its collapsed state via electrostatic interactions between the anionic side chains and the cations in the solution buffer. (b) As the gel is put into water, the cations are carried away from the gel, and the anions repel each other, driving the gel to the expanded state. (c) The gel keeps the expanded state in water, as shown schematically here. (d) The expansion factor scales linearly with the cross-linker concentration. Modified from Chen et al. (2015) [42].

In terms of the factors that determine the hydrogel properties and subsequently the expansion factor, there are five important components: monomer type and concentration, cross-linker type and concentration, inhibitor type and concentration, catalyst type and concentration, and the temperature at which free-radical polymerization happens. These factors ultimately determine the chain length and pore size of the gel, which in turn determine the expansion factor and the hydrogel's mechanical properties. To list a few, these are the key considerations [43-45]:

● A low concentration of initiators leads to longer polymer chain lengths, larger pore size, and greater elasticity, due to slower polymerization: oxygen has a longer time to diffuse in and inhibit polymerization.

● High temperature leads to shorter polymer chain lengths, smaller pore size, and greater inelasticity, due to increased polymerization termination before chain growth, since the initiator decomposes much faster.

● A lower crosslinker concentration leads to bigger pore size, a higher expansion factor, and greater elasticity, due to bigger spacing between the polymer chains.

All these factors were optimized to produce the expansion microscopy protocols practiced today, which ensure an expansion factor of at least ~4.5 times in linear dimensions, with robust structural integrity that prevents sample deformation after expansion in water. For example, by lowering the cross-linker concentration, we could achieve a ~7 times expansion factor due to larger pore sizes, but the final gel we attained after expansion had weak structural integrity, such that when the gel was not in water, the sample could not withstand gravity and would collapse and deform. Images taken with this gel showed distorted microtubule structures when compared to images taken with conventional super-resolution microscopy techniques, such as STORM [16-42].

Cookbook Style Expansion Microscopy Protocol

Expansion Microscopy is a technology developed for scalability and ease of use.

However, over the course of the past few years since we developed and further optimized the technology, our group has received feedback from a nontrivial number of scientists that certain steps involved in the process are confusing. To address this issue and further emphasize this aspect of ExM, we set out to optimize the expansion microscopy procedures such that they can be followed like a cookbook. Below, I describe the proExM protocol for cells grown on glass coverslips. This allows readers to expand cells and image the microtubules in them after the cells are expanded 4.5 times (~90 nm effective resolution); microtubules are considered the stereotypical structure for evaluating the quality of super-resolution microscopy.

In proExM, primary amine groups of proteins, such as fluorescent proteins and antibodies, are modified with a commercially available small molecule (6-((acryloyl)amino)hexanoic acid, succinimidyl ester, or AcX for short; Tillberg et al., 2016) that has an amine-reactive NHS-ester and an acryloyl group. Through AcX modification, antibodies and fluorescent proteins are anchored to the gel network. Then, the polymer-embedded specimen is enzymatically digested using proteinase K (ProK). The digestion step mechanically homogenizes the tightly knit network of proteins and allows the specimen to become swellable when immersed in water. Three workflows initially designed and proposed in the original proExM paper by Tillberg et al. are reproduced below in Figure 2. The protocol described below follows the steps in the first row of Figure 2 [40].


Figure 2. Three workflows for expansion microscopy with protein retention. Samples are chemically fixed and stained with antibodies, using conventional immunostaining protocols, before AcX treatment at room temperature (RT) and subsequent ExM processing (gelation, proteinase K treatment, and expansion in water; top). Samples expressing FPs are chemically fixed (and optionally permeabilized) before AcX treatment and subsequent ExM processing (middle). Samples treated with AcX, followed by gelation, are then processed with a gentle homogenization procedure (for example, alkaline hydrolysis and denaturation, or digestion with LysC), and finally antibody staining in the expanded state (bottom). Modified from Tillberg et al. (2016) [40].

Materials List

Phosphate-buffered saline (PBS, Corning, catalog no. 21040CV)

Double-distilled water (referenced as “water” in steps below)

Reagents and Solutions for Culturing and Fixing HeLa Cells

1. HeLa cells (ATCC, catalog no. CCL-2)

2. Cell culture medium (Dulbecco's Modified Eagle Medium (DMEM, Corning, catalog no. 10013CV) supplemented with 10% (v/v) heat-inactivated fetal bovine serum (FBS, ThermoFisher, catalog no. A3840001) and penicillin-streptomycin (100 U/mL, ThermoFisher, catalog no. 15140122)). This solution was made by mixing 445 mL of DMEM, 50 mL of heat-inactivated FBS, and 5 mL of 10,000 U/mL penicillin-streptomycin.

3. 13-mm diameter coverslip (ThermoFisher, catalog no. 174950) (used as the removable cell culture substrate in a 24-well plate; note that the coverslips need to be autoclaved for sterilization before use in cell culture)

4. 24-well cell culture plate (Greiner Bio-One, catalog no. 662160)

5. Fixative (4% (w/v) formaldehyde in PBS). This solution was made by diluting 16% (w/v) formaldehyde (ThermoFisher, catalog no. 28906) (see Reagents and Solutions).

6. Quenching solution (100 mM glycine in PBS; see Reagents and Solutions)

7. Permeabilization solution (0.1% (w/v) Triton X-100 in PBS; see Reagents and Solutions)

Reagents for Immunostaining of Microtubules

1. Blocking buffer (1x PBS with 0.1% (w/v) Triton X-100 and 5% (v/v) normal donkey serum; see Reagents and Solutions)

2. Primary antibody (rabbit polyclonal antibody to β-tubulin, Abcam, catalog no. ab6046)

3. Secondary antibody (donkey anti-rabbit IgG (H+L) highly cross-adsorbed secondary antibody, Alexa Fluor 546 conjugate, ThermoFisher, catalog no. A11040)

Reagents for Expansion Microscopy

1. 6-((acryloyl)amino)hexanoic acid, succinimidyl ester (AcX) stock solution (10 mg/mL in dimethylsulfoxide (DMSO)). This solution was made by dissolving 5 mg AcX (ThermoFisher, catalog no. A20770) in 500 μl anhydrous DMSO (ThermoFisher, catalog no. D12345).

NOTE: The AcX/anhydrous DMSO stock solution can be divided into 10- to 20-μl aliquots and stored at −20 °C for at least a month. The aliquots should be stored in a sealed container with drying agents (e.g., Drierite) or in a desiccator to avoid hydration.

2. Monomer solution (Stock X; 8.6% (w/v) sodium acrylate, 2.5% (w/v) acrylamide, 0.15% (w/v) N,N′-methylenebisacrylamide, 11.7% (w/v) sodium chloride, in PBS). See Reagents and Solutions for details on making and storing this solution.

3. 4-Hydroxy-TEMPO (4HT) stock solution (0.5% (w/v) in water, per Table 3; see Reagents and Solutions)

4. N,N,N′,N′-Tetramethylethylenediamine (TEMED) stock solution (10% (w/v) in water; see Reagents and Solutions)

5. Ammonium persulfate (APS) stock solution (10% (w/v) in water; see Reagents and Solutions)

6. Digestion buffer (see Reagents and Solutions)

7. Proteinase K (ProK; NEB, catalog no. P8107S; concentration: 800 U/mL)

Consumables and Equipment

1. Size 1 paintbrush (Utrecht, catalog no. 09311-1001)

2. 24 x 60 mm, rectangular, No. 1 coverslips (VWR, catalog no. 48404-455)

3. 22 x 22 mm, square, No. 1.5 coverslips (VWR, catalog no. 48366-227)

4. 13-mm diameter, circular coverslips (ThermoFisher, catalog no. 174950)

5. Petri dishes

6. Razor blades

7. Forceps

8. Parafilm

9. 50 mL conical tubes

10. 6-well glass-bottom plate (CellVis, catalog no. P06-1.5H-N)

11. Ice bath or cold block chilled to 4 ºC

12. 37 ºC incubator

13. Low-melting agarose (Sigma Aldrich, catalog no. A9045-25G)

Protocol Steps

Key steps involved in carrying out expansion microscopy on cultured adherent cells are described below with step-by-step instructions, as well as the reagents and solutions accompanying them. Note that all steps up to gelation (step 18) are performed on cells cultured on a 13-mm diameter coverslip placed in a 24-well plate. All the measures described below are for one well of a 24-well plate.

Choosing the fluorescent proteins and dyes

To ensure maximum possible compatibility with expansion microscopy, we recommend fluorescent proteins that form a β-barrel, and we recommend avoiding cyanine-family dyes such as Cy3 and Cy5, which are poorly retained through proExM processing. Please refer to Figure 3 for fluorescence retention after proExM when planning the experiment [40].

Figure 3. (a) Retention of fluorescence for selected dyes conjugated to antibodies, after proExM treatment (mean ± s.d., n = 3 samples each), in mouse brain slice. (b) Quantified fluorescence of fluorescent proteins after proExM treatment (mean ± s.d.; n = 4 transfection replicates each). Open bars, literature values of the brightness of these fluorophores, normalized to that of EGFP; literature values are used to avoid confounds due to differential expression, dependent on the user's application, of different fluorophores. Crosshatched bars, literature values of brightness multiplied by the observed retention fraction for each fluorophore (e.g., the ratio of proExM to live fluorescence, as in a). Modified from Tillberg et al. (2016) [40].


*Please refer to the Reagents and Solutions section for all solutions described in this protocol.

Part 1: Cell culture, fixation, and immunostaining

Day 1 (estimated experimental time is 1 h)

1. Culture HeLa cells on a 13-mm diameter coverslip placed in a 24-well plate, following the protocol provided by ATCC. We usually plate 50k HeLa cells per well and incubate the plate in a humidified incubator with 5% CO2 the day before the experiment, so that the cells reach 70% confluency on the day of the experiment.

Day 2 (estimated experimental time is 6 h)

2. Aspirate the cell culture media, then add 300 µL of PBS to the well at room temperature (RT), incubate briefly for 30 s, then proceed to the next step.

3. Aspirate the PBS, then add 300 µL of fixative to the well; incubate at RT for 10 min.

4. Aspirate the fixative, then add 300 µL of quenching solution; incubate at RT for 5 min.

5. Aspirate the quenching solution, add 300 µL of PBS, incubate at RT for 5 min. Repeat this step once more.

6. Aspirate the PBS, add 300 µL of permeabilization solution, incubate at RT for 15 min.

7. Aspirate the permeabilization solution, add 300 µL of blocking buffer, incubate at RT for 15 min. During incubation, make the primary antibody solution by diluting 2 µL of primary antibody with 200 µL of blocking buffer in a 1.5 mL microcentrifuge tube (1:100 dilution); mix the solution by pipetting up and down 10 times, 200 µL of volume each time.

8. Aspirate the blocking buffer, then add the primary antibody solution. Place the 24-well plate on a shaker and shake at 60 rpm at RT for 1 h.

9. Aspirate the primary antibody solution, add 300 µL of blocking buffer, and shake for 5 min at RT. Then aspirate the blocking buffer, add 300 µL of blocking buffer, and shake for 5 min at RT; repeat this two more times to wash the sample. During the last blocking buffer incubation, make the secondary antibody solution by diluting 2 µL of secondary antibody with 200 µL of blocking buffer in a 1.5 mL microcentrifuge tube (1:100 dilution); mix the solution by pipetting up and down 10 times using a p200 pipette.

10. Aspirate the blocking buffer, then add the diluted secondary antibody solution. Place the 24-well plate on an orbital shaker and shake at 60 rpm at RT for 2 h.

11. Aspirate the diluted secondary antibody solution, add 300 µL of blocking buffer, and shake for 5 min at RT. Then aspirate the blocking buffer, add 300 µL of blocking buffer, and shake for 5 min at RT; repeat this two more times.

Part 2: Gelation


12. During the last 5-min incubation with blocking buffer in step 11, dilute the AcX/DMSO stock solution in PBS as follows: add 3 µL of AcX/DMSO stock solution to 297 µL of PBS for a total volume of 300 µL (1:100 dilution) in a microcentrifuge tube placed in a cold block pre-chilled to 4 ºC.

13. Aspirate the blocking buffer, add the diluted AcX solution, and incubate overnight at 4 ºC in the dark (e.g., by wrapping the plate with aluminum foil).

Day 3 (estimated experimental time is 3 h)

14. Aspirate the diluted AcX solution from the 24-well plate, add 300 µL of PBS (pre-chilled on ice), incubate briefly for 30 s, then aspirate the PBS, add 300 µL of PBS, and place the 24-well plate on ice until step 16.

15. Prepare the gelation solution by mixing the Stock X, 4HT, TEMED, and APS stock solutions in a 47:1:1:1 ratio (please also refer to Reagents and Solutions). After mixing Stock X, 4HT, and TEMED, mix the solution by pipetting up and down twice (1 mL at a time), add APS, then mix again by pipetting up and down twice (1 mL at a time). For this step, keep all the solutions and the plate chilled (close to 4 ºC) on ice. This solution must be used immediately after preparation.

16. Aspirate the PBS, add 300 µL of gelation solution, and incubate for 30 minutes at 4 ºC in the dark.

17. During the 30 min incubation in step 16, prepare one gelation chamber for each well. First, cover a No. 1 coverslip (the "bottom coverslip") with a piece of parafilm of the same size (24 x 60 mm) and press the parafilm gently so that it sticks to the surface. Then, place a No. 1.5 coverslip (22 x 22 mm) as a spacer at each long end of the parafilm. Move the spacers toward the center, such that the width of the chamber is smaller than the diameter of the 13-mm cell culture coverslip. Gently press the spacers so that they adhere to the sticky surface of the parafilm. The side profile should resemble a U shape. The length of the chamber should be 22 mm, and the height of the chamber should be around 0.17 mm, the thickness of a No. 1.5 coverslip.

18. After the 30 min incubation from step 16, apply 10 µL of gelation solution from the well onto the center of the bottom coverslip of the gelation chamber. When performing this step, keep the chamber chilled by placing it on a cold block kept at 4 ºC. The free-radical polymerization involved in gelling is temperature-sensitive, and any external factor that warms up the gelling solution (e.g., holding and warming the tube containing the gelling solution with your hands) can lead to premature gelation.

19. Then, use a pair of forceps to carefully lift and pick up the cell culture coverslip containing the cultured cells out of the 24-well plate. When picking up the cell culture coverslip, sharp tweezers with bent tips help. Push the coverslip to the edge of the well using the tweezers, place the tweezer tip underneath the coverslip, lift the coverslip up, and then pick it up.

20. Invert the cell culture coverslip such that the cells face the bottom coverslip, and gently place it on top of the spacers. Fill the space between the bottom coverslip, spacers, and inverted cell culture coverslip by carefully pipetting gelation solution at the edge of the cell culture coverslip; the chamber will fill with gelation solution by capillary force. The cells are now effectively encased between coverslips, and the chamber is complete, ready for free-radical polymerization.

21. Transfer the chambers carefully to an incubator at 37 ºC. Leave for 1 h as the gelation solution polymerizes.

Part 3: Digestion

22. During the 1 h polymerization time in step 21, and prior to taking the gelled chamber out of the incubator, prepare the digestion buffer (see Reagents and Solutions). Mix proteinase K (ProK) with digestion buffer at a 1:100 (v/v) ratio at RT. Prepare 1.5 mL of ProK/digestion buffer mixture per cell culture coverslip in a microcentrifuge tube.

23. Take the gelation chambers out of the 37 ºC incubator to RT. Fill the chamber and wet the gelled sample formed in the chamber by pipetting 10 µL of the ProK/digestion buffer mixture around the edge of the chamber.

24. Slowly insert a razor blade from the side between the cell culture coverslip ("the top coverslip") and the bottom coverslip to pry the cell culture coverslip off. Due to the parafilm layer on the bottom coverslip, the gelled sample should be attached to the top coverslip. Once the top coverslip is off, trim the gelled sample to approximately 10 x 10 mm using a razor blade and immerse the gelled sample and the accompanying coverslip in a 50 mL falcon tube containing 10 mL of the ProK/digestion buffer mixture (prepared in advance). Keep the sample on a shaker at 60 rpm at RT overnight in the dark.

Day 4 (estimated experimental time is 4 h)

25. After digestion is complete, the coverslip should have detached from the sample. Remove the coverslip from the digestion buffer, pick up the gelled sample with a paintbrush, and transfer the gelled sample to a 1.5 mL microcentrifuge tube containing PBS.

26. Wash the gelled sample with PBS on a shaker at RT for 15 minutes. Aspirate the PBS, and repeat for three PBS washes in total. Note that during these incubations, the tube containing the sample should be protected from light by wrapping it with aluminum foil. Gels can be stored in PBS at 4 ºC in the dark before expansion and imaging.

Part 4: Expansion and Imaging using an inverted confocal microscope

27. Scoop with a paintbrush to transfer the gelled sample from the 1.5 mL microcentrifuge tube into a well of a 6-well glass-bottom plate containing 10 mL of water. Keep the sample immersed in water for 30 minutes at RT. Replace the liquid in the well with 10 mL of fresh distilled water, and let the gel expand for another 30 min at RT. Repeat this step. The gel should now be fully expanded.

28. Mount and immobilize the sample in the well using 0.5% (w/v) low-melting agarose solution, diluted in water. Prior to use, the agarose solution should be kept in a 40 ºC water bath to prevent premature hardening. Place the 6-well plate containing the sample on a bed of ice and squirt the agarose solution at the edges of the gel using a pipette. Avoid getting agarose underneath the gel, so that the gel is not elevated from the bottom glass plate by the agarose layer. Ice on the bottom surface enables nearly instant hardening of the agarose solution.

29. After the agarose has hardened, add 5 mL of water to the well using a pipette. The gel immobilized with agarose should be covered in water to keep it hydrated during imaging and further storage.

30. Image the sample using an inverted confocal microscope.


Reagents and Solutions

Monomer Solution ("Stock X")

Table 2. Monomer Solution Recipe

| Component | Stock solution concentration (g/100 mL) | Amount (mL) | Final concentration (g/100 mL) |
| Sodium acrylate | 38 (33 wt% due to higher density), diluted in water | 2.25 | 8.6 |
| Acrylamide | 50, diluted in water | 0.5 | 2.5 |
| N,N′-Methylenebisacrylamide | 2, diluted in water | 0.75 | 0.15 |
| Sodium chloride | 29.2 (5 M), diluted in water | 4 | 11.7 |
| PBS, 10x stock | 10x | 1 | 1x |
| Water | - | 0.9 | - |
| Total | - | 9.4 | - |

Note that the final concentrations are given relative to the complete gelling solution (Stock X makes up 940 µL of every 1000 µL of gelling solution; see Table 6).

The Stock X solution can be divided into aliquots of 940 µL each (10 aliquots from the recipe above) and stored at −20 °C for a month. The catalog numbers and storage conditions for the chemicals used to prepare Stock X are as follows:

● Sodium acrylate (Sigma Aldrich, catalog no. 408220): stored at −20 °C for 6 months

● Acrylamide (Sigma Aldrich, catalog no. A8887-500G): stored at RT indefinitely

● N,N′-Methylenebisacrylamide (Sigma Aldrich, catalog no. M7279-25G): stored at RT indefinitely

● Sodium chloride (Thermo Fisher, catalog no. BP358-212): stored at RT indefinitely

● PBS, 10x stock (Thermo Fisher, catalog no. 70011-044): stored at RT indefinitely

4-Hydroxy-TEMPO (4HT)

Table 4HT Recipe (Table 3)

| 4HT stock solution | Stock solution concentration (g/100 mL) |
| 4HT | 0.5, diluted in water |

The 4HT stock solution can be prepared in aliquots of 50 µL each and stored at −20 °C for at least a month. The catalog number and storage condition for 4HT are as follows:

● 4-Hydroxy-TEMPO (Sigma Aldrich, catalog no. 176141): stored at 4 °C for 6 months

Tetramethylethylenediamine (TEMED)

Table 4. TEMED Recipe

| TEMED stock solution | Stock solution concentration (g/100 mL) |
| TEMED | 10, diluted in water |

The TEMED stock solution can be prepared in aliquots of 50 µL each and stored at −20 °C for at least a month. The catalog number and storage condition for TEMED are as follows:

● Tetramethylethylenediamine (Thermo Fisher, catalog no. 17919): stored at RT for 6 months

Ammonium Persulfate (APS)

Table 5. APS Recipe

| APS stock solution | Stock solution concentration (g/100 mL) |
| APS | 10, diluted in water |

The APS stock solution can be prepared in aliquots of 50 µL each and stored at −20 °C for at least a month. The catalog number and storage condition for APS are as follows:

● Ammonium persulfate (Thermo Fisher, catalog no. 17874): stored at RT for 6 months

Gelling Solution

Table 6. Gelling Solution Recipe

| Component | Stock solution concentration (g/100 mL) | Amount (µL) | Final concentration (g/100 mL) |
| Stock X (see recipe above) | - | 940 | - |
| 4HT (see recipe above) | 0.5 | 20 | 0.01 |
| TEMED (see recipe above) | 10 | 20 | 0.2 |
| APS (see recipe above) | 10 | 20 | 0.2 |
| Total | - | 1000 | - |

The gelling solution must be prepared at 4 °C on ice or in a cold block, to prevent temperature-driven premature gelation. As mentioned before, mix Stock X, 4HT, and TEMED first before adding APS, as APS is a very reactive free-radical polymerization initiator.
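To make the 47:1:1:1 mixing ratio concrete, here is a minimal Python sketch (our own illustrative helper, with values taken from Table 6, not part of the published protocol) that scales the recipe to an arbitrary total volume and verifies the final concentrations.

```python
def gelling_solution_volumes(total_ul: float = 1000.0) -> dict:
    """Scale the 47:1:1:1 (Stock X : 4HT : TEMED : APS) recipe of Table 6.

    For 1000 uL total this reproduces the 940/20/20/20 uL split.
    """
    parts = {"Stock X": 47, "4HT": 1, "TEMED": 1, "APS": 1}
    total_parts = sum(parts.values())  # 50
    return {name: total_ul * n / total_parts for name, n in parts.items()}

def final_concentration(stock_g_per_100ml: float, component_ul: float,
                        total_ul: float) -> float:
    """Final concentration (g/100 mL) of one component after mixing."""
    return stock_g_per_100ml * component_ul / total_ul

print(gelling_solution_volumes(1000))
# {'Stock X': 940.0, '4HT': 20.0, 'TEMED': 20.0, 'APS': 20.0}
print(final_concentration(0.5, 20, 1000))  # 4HT: 0.01 g/100 mL (Table 6)
print(final_concentration(10, 20, 1000))   # TEMED/APS: 0.2 g/100 mL (Table 6)
```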

Digestion Buffer

Table 7. Digestion Buffer Recipe

| Component | Amount | Final concentration (per 100 mL) |
| Triton X-100 | 2.5 g | 0.50 g |
| EDTA, disodium (0.5 M, pH 8) | 1 mL | 0.2 mL |
| Tris-Cl (1 M) aqueous solution, pH 8 | 25 mL | 5 mL |
| Sodium chloride | 23.38 g | 4.67 g |
| Water | add up to a total volume of 500 mL | - |
| Proteinase K (800 U/mL) | 1:100 dilution | 8 U/mL |
| Total | 500 mL | - |

The digestion buffer can be stored at RT without proteinase K, which is stored at −20 °C. The catalog numbers and storage conditions for the chemicals used to prepare the digestion buffer are as follows:

● Triton X-100 (Sigma Aldrich, catalog no. 93426): stored at RT indefinitely

● EDTA, disodium, 0.5 M, pH 8 (Thermo Fisher, catalog no. 15575020): stored at RT indefinitely

● Tris-Cl (1 M) aqueous solution, pH 8 (Life Technologies, catalog no. AM9855): stored at RT indefinitely

● Sodium chloride (Thermo Fisher, catalog no. BP358-212): stored at RT indefinitely

● Proteinase K (New England Biolabs, catalog no. P8107S): stored at −20 °C for 6 months
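Several steps in this protocol call for 1:100 dilutions (primary and secondary antibodies, AcX, and ProK), and the steps above use two slightly different readings of that ratio. The short Python sketch below (hypothetical helper functions of our own; volumes are taken from the steps above) makes the two conventions explicit.

```python
def one_in_n_total(total_ul: float, n: int = 100) -> tuple:
    """'1:n' read as 1 part stock in n parts TOTAL volume.

    Step 12 uses this convention: 3 uL AcX + 297 uL PBS = 300 uL total.
    """
    stock = total_ul / n
    return stock, total_ul - stock

def one_to_n_added(stock_ul: float, n: int = 100) -> tuple:
    """'1:n' read as 1 part stock ADDED TO n parts diluent.

    Steps 7 and 9 use this convention: 2 uL antibody + 200 uL buffer.
    """
    return stock_ul, stock_ul * n

print(one_in_n_total(300))  # (3.0, 297.0) -> AcX in PBS, step 12
print(one_to_n_added(2))    # (2, 200)     -> antibody in blocking buffer
# ProK (800 U/mL) diluted 1:100 into digestion buffer gives 800 / 100
# = 8 U/mL, matching Table 7.
```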


Chapter 3

Iterative Expansion Microscopy

One of the very first projects I worked on as a graduate student was developing and validating a new type of expansion microscopy protocol that allows ExM to be performed multiple times on the same biological specimen. This technique is called iterative expansion microscopy (iExM), and theoretically, it achieves an exponential increase in effective resolution (scaled by ~4.5^N for N rounds of expansion) [38]. Instead of using the chemically robust, non-cleavable crosslinker (N,N′-methylenebis(acrylamide), BIS) from the original ExM protocol published by Chen et al. in Science (2015) [42], chemoselectively cleavable crosslinkers, such as N,N′-(1,2-dihydroxyethylene)bisacrylamide (DHEBA) and (+)-N,N′-diallyltartramide (DATD), are used to expand a sample ~4.5 times initially [46,47]. After this step, the expanded sample is re-embedded with a polymer formulation similar to that used for the first expansion, but without sodium acrylate, the component that drives osmotic-pressure-driven expansion when the gel is immersed in water. The mental picture that helps here is that the re-embedding gel "fills up" the space that has become newly available after the specimen expands. This step ensures that the expanded gel can be treated with high-salt buffers that may be required for subsequent sample treatment steps, while preventing the gel from shrinking back to its original unexpanded state. The re-embedded gel inside the expanded gel from the first round of ExM physically maintains the expanded state and competes against the osmotic pressure in the presence of high-salt buffers. However, the re-embedding gel formulation still contains some salt, mainly attributable to the ammonium persulfate used for initiating free-radical polymerization, and we have noticed that the expanded gel shrinks by ~10% in linear dimensions during this gelation step. After the re-embedding step, a new swellable polymer network (made with crosslinkers that are chemically orthogonal to the first crosslinker in terms of cleavability) is formed inside the re-embedded sample. Similar to how the biological information was transferred to the polymer network using chemical modifiers and linkers (i.e., functional oligonucleotides) during the first ExM step, the information from the first gel is transferred to the second gel, and by cleaving the first gel with appropriate means that do not interfere with the second gel's structural integrity, the gel can be expanded ~4.5 times once more. For example, if DHEBA is used as the cleavable crosslinker for the first expansion, BIS can be used as the second gel's crosslinker, and DHEBA can then be cleaved using 0.2 M NaOH without degrading the BIS crosslinker. If expansion microscopy were to be performed three times, N,N′-cystaminebisacrylamide (BAC) would be used as the first crosslinker, and DHEBA and BIS would be used as the second and third crosslinkers, respectively. Here, BAC is cleaved using 0.25 M tris(2-carboxyethyl)phosphine (TCEP; 1 M stock solution of TCEP diluted in 1 M Tris-HCl, pH 8.0), and this crosslinker-reducing step does not affect the integrity of DHEBA and BIS; likewise, the DHEBA-cleaving step does not affect the integrity of BAC. If the sample were to be expanded more times, BAC and DHEBA could be used interchangeably throughout first gel formation, second gel formation, cleaving, third gel formation, cleaving, and so on, and as a final step, BIS could be used as the crosslinker for the final gel [46-48]. Figure 4 depicts the steps involved in iterative expansion microscopy (iExM) graphically. Using this method, Brainbow AAV-labeled mouse brain samples were expanded 20 times, and microtubules of cultured BS-C-1 cells were expanded ~53 times, to name a few examples [38].
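A minimal sketch of the crosslinker-scheduling logic described above, in illustrative Python (the crosslinker/cleavage pairings follow the text; the scheduling function itself is our own construction, not published iExM code):

```python
# Cleavage chemistry for each crosslinker, as described in the text:
CLEAVAGE = {
    "DHEBA": "0.2 M NaOH",
    "BAC": "0.25 M TCEP in 1 M Tris-HCl, pH 8.0",
    "BIS": None,  # non-cleavable; used only for the final gel
}

def crosslinker_schedule(rounds: int) -> list:
    """Finish with BIS; alternate DHEBA/BAC backwards so the gel just
    before BIS is DHEBA, matching the 2- and 3-round examples above.

    Each gel's crosslinker must be cleavable by a reagent that leaves
    the next gel intact, which the pairings in CLEAVAGE satisfy.
    """
    cleavable = ["DHEBA" if (rounds - 2 - i) % 2 == 0 else "BAC"
                 for i in range(rounds - 1)]
    return cleavable + ["BIS"]

for rounds in (2, 3):
    print(rounds, crosslinker_schedule(rounds))
# 2 ['DHEBA', 'BIS']        -> DHEBA cleaved with NaOH, as in the text
# 3 ['BAC', 'DHEBA', 'BIS'] -> BAC cleaved with TCEP first
```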


Figure 4. Iterative expansion microscopy (iExM) concept. (a-e) Schematic of iterative expansion. (b) First, a swellable polyelectrolyte gel network containing a cleavable crosslinker is formed throughout a specimen; then (c) it is mechanically homogenized and expanded. (d) After expansion, a second swellable polyelectrolyte gel network is formed throughout the first, and then (e) it is expanded after dissolving the first gel. (f-j) Molecular view of the iExM process. (f) Biomolecules of interest (gray circles) are first labeled with a primary antibody (also in gray), followed by a secondary antibody conjugated to a DNA molecule (purple, sequence A′), then a complementary DNA (green, sequence A) bearing a gel-anchoring moiety (acrydite, black dot), as in our original ExM procedure. (g) The sample (two example biomolecules are labeled "1" and "2," to be followed throughout subsequent diagram panels) is embedded in a cleavable swellable polyelectrolyte gel (blue mesh). This gel incorporates the DNA of sequence A at the gel-anchoring site, and it is expanded. (h) A DNA oligo with the original A′ sequence (purple strand), bearing a fluorophore (yellow star) and a new gel-anchoring moiety (acrydite, black dot), is hybridized to the anchored sequence A DNA (green). (i) A second swellable gel (orange mesh) is formed that incorporates the final fluorophore-bearing DNA oligo (sequence A′, purple). (j) The gel expands the labels away from each other after the first and re-embedding gels are dissolved through crosslinker cleavage. Modified from Chang et al. (2017) [38].


Resolution Validation for Expansion Microscopy

To re-emphasize, the effective resolution of iExM scales exponentially. Theoretically, a diffraction-limited microscope with a resolution of 300 nm, in conjunction with the 20x physical magnification of iExM (two rounds of expansion), provides (300/20 =) ~15 nm effective resolution [38]. To validate this claim, three key analytical steps can be carried out. First, registering pre- and post-expansion images and calculating the resultant deformation vector fields quantifies the degree of deformation and the isotropy of iExM. Second, the effective resolution error added by the gel-anchoring linkers and the expansion protocol can be quantified by comparing measurement statistics of known, stereotyped features imaged with different microscopy techniques. Third, this error can also be estimated via mathematical modelling. Throughout these studies, microtubules are used as the model biological feature for analysis, since they are prevalently expressed in many different biological specimens and have nanoscale, stereotyped structures (for this reason, they are widely used by other super-resolution techniques for resolution quantification as well). Depending on the resolution in question, other biological features may also be used for analysis, including clathrin-coated pits, nuclear pore complexes, and synthetic DNA strands diffused into samples prior to performing the iExM protocol.
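At its core, the first step reduces to computing root-mean-square statistics over a deformation vector field. A simplified illustrative sketch follows (Python with NumPy; the field here is synthetic, and the published analysis computes length-measurement errors over feature pairs rather than this bare RMS):

```python
import numpy as np

def rms_deformation(deformation_field: np.ndarray) -> float:
    """RMS magnitude of a deformation vector field.

    `deformation_field` has shape (H, W, 2): per-pixel (dx, dy)
    displacement between the rigidly registered pre- and post-expansion
    images, after dividing post-expansion coordinates by the expansion
    factor so both images share one coordinate system.
    """
    magnitudes = np.linalg.norm(deformation_field, axis=-1)
    return float(np.sqrt(np.mean(magnitudes ** 2)))

# Example on a synthetic 64x64 field of unit-sigma random distortions:
rng = np.random.default_rng(0)
field = rng.normal(0, 1.0, size=(64, 64, 2))
print(rms_deformation(field))  # ~1.41 (= sqrt(2) for unit-sigma components)
```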


Figure 5. Empirical validation of the effective resolution of iExM. (a) Overlay, using only a rigid registration, of a STORM image (magenta) of cultured BS-C-1 cells stained with anti-tubulin pre-expansion with a confocal image (green) of the same sample post-expansion. (b) RMS length measurement error of biological measurements calculated using the distortion vector field method, using STORM microscopy pre-expansion followed by confocal imaging of iExM-processed samples (~20× expanded) (blue line, mean; shaded area, ±1 s.d.; n = 3 samples). (c,d) Confocal imaging of cultured BS-C-1 cells with labeled microtubules after ~20-fold expansion via iExM. (c) Single xy-plane image at the bottom of the cell. The inset in the upper right zooms in on the small box at left. (d, left) Transverse profile of microtubules in the boxed region (dotted lines) of the inset in (c) after averaging down the long axis of the box and normalizing to the peak value (blue dots), with a superimposed fit of a sum of two Gaussians (red lines). (d, right) Population data for 110 microtubule segments from two samples (mean ± s.d.), showing a histogram of peak-to-peak distances. Modified from Chang et al. (2017) [38].

For the first analytical step, microtubules of BS-C-1 cells are stained with primary antibodies and secondary antibodies conjugated to fluorophores. They are then imaged with a super-resolution microscope of choice; for validating the effective resolution of iExM, stochastic optical reconstruction microscopy (STORM) was used. After this first imaging, the iExM protocol is carried out, and the exact same microtubules are imaged using a conventional, diffraction-limited confocal microscope. The pre- and post-expansion images are then co-registered, as shown in Figure 5(a) in magenta and green, respectively, via either rigid or non-rigid transformation using commercial image processing software and custom algorithms. The resultant deformation vector fields across the registered image are then quantified to provide the root-mean-square (RMS) alignment error, which demonstrates the degree of deformation and the isotropy of expansion [40,42]. For iExM, this error came out to be only about 2% over scales of several microns, as shown in Figure 5(b). For the second step, the fluorescence signal profile is measured across microtubule segments after deconvolution, as shown in Figure 5(c). This profile is then fitted with a sum of two Gaussians to approximate the experimentally measured microtubule diameter. In the original iExM paper, hundreds of segments were measured, giving a diameter of 58.7 ± 10.3 nm, as shown in Figure 5(d). Before expansion, the variation in diameter along the microtubule axis is attributed to actual variations in microtubule size and to how primary and secondary antibodies are conjugated to them. Therefore, by comparing the standard deviation of the diameter measured by STORM with that measured by iExM, one can determine how much additional error is added by the gel-anchoring linkers and the ExM protocol. Over a distance of 400 nm along the microtubule axis, the standard deviation of the measured diameter was 4.7 nm for STORM. The difference of the variances, (10.3² − 4.7²)^(1/2) = 9.2 nm, is thus attributed to the additional error from iExM [38].
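This error decomposition assumes the iExM-specific error is independent of the variability already captured by STORM, so the variances subtract in quadrature. A minimal check of the arithmetic (illustrative Python):

```python
import math

sigma_iexm = 10.3   # nm; s.d. of diameter measured after iExM (Fig. 5d)
sigma_storm = 4.7   # nm; s.d. of diameter measured pre-expansion by STORM

# Independent error sources add in quadrature, so the iExM-specific
# contribution is the quadrature difference of the two spreads:
sigma_added = math.sqrt(sigma_iexm**2 - sigma_storm**2)
print(round(sigma_added, 1))  # 9.2 nm, the value quoted above
```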


Figure 6. Mathematical modelling for error estimation. (a) Schematic of the simulation strategy, along with a depiction of the calculated point-spread function of the microscope we used (a spinning disk confocal microscope with a pinhole size of 50 μm, equipped with a 40x NA 1.15 objective lens). Scale bar, 200 nm. (b) Confocal microscope image of microtubules from cultured BS-C-1 cells after 20-fold expansion via iExM. (c) Cross section of a cylindrical protein complex of interest with a diameter of 25 nm, labeled from all sides. (d) Schematic of antibodies and DNA around a microtubule. The 5′ acrydites of the DNA oligos are distributed in the purple shaded region, corresponding to the cylinder of panel (a). We constructed a detailed schematic showing a possible arrangement of antibodies and DNA around microtubules. As shown in panel (c), the radius of the microtubule is 12.5 nm, as previously measured by transmission electron microscopy (TEM). The radius of an immunostained microtubule (stained with conventional primary and secondary antibodies) is 30 nm, as measured by previous TEM imaging. The 5′ acrydites of the DNA are distributed in a cylinder with an inner radius of 26.7 nm and outer radius of 33.5 nm, as derived in Chang et al. (2017). As can be seen in panel (c), the DNA-conjugated secondary antibody makes the radius of the microtubule 3.5 nm larger than a microtubule labeled with regular antibodies (outer radius of 30 nm). Modified from Chang et al. (2017) [38].


Considering that the diameter of microtubules measured by transmission electron microscopy (TEM) and STORM are 25 nm and 38 nm, respectively, the error from the gel-anchoring linkers and expansion protocol must be further understood and accounted for, when discussing the effective resolution of iExM. The third step utilizes mathematical modelling to estimate this error. For microtubules labeled with primary and fluorophore-conjugated secondary antibodies, the peak-to-peak distance of the fluorescence signal turns out to be 35 nm, based on the simulation shown in the iExM paper. This is much smaller than one might expect, since unlabeled microtubules imaged with TEM has a diameter of 25 nm and antibodies are 8.75 nm long. However, for cylindrical structures, such as microtubules, the fluorescence peak positions appear to move inward, due to the extra signals from the secondary antibodies attached to the top and bottom of the protein complex. The extra signals are denoted as blue ovals in

Figure 6(a). This explains why the diameter of STORM-imaged microtubules is typically

38 nm. With this understanding, similar modelling can be replicated for iExM-processed samples with gel-anchoring linkers, namely DNA strands and DNA-antibody complexes.

By modeling the 5' acrydite tags (hybridized/conjugated to other linkers), which ultimately determine the resolution of iExM, as lying between two shells of a cylinder, their contribution to the error in effective resolution can be quantified. Points representing the positions of the 5' acrydite tags can be randomly distributed between the two shells. In

Figure 6(b), these two shells are identified by Ri and Ro, which denote the inner and outer shell radius, respectively. The points are then convolved with the point-spread function of the microscope used for imaging, and fit to the experimental data by best matching their peak-to-peak


positions. An example of this process is shown in Figure 6(c), where blue dots represent the experimental values, the red curve represents the Gaussian fit, and the green curve represents the simulated profile. After applying this process to

166 segments of microtubules, the average thickness of the 5' acrydite layer is shown to be 12.5 ± 5.3 nm. Then, the aforementioned model can be drawn, where the tags are distributed in a cylinder with an inner diameter of 23.5 nm and outer diameter of 35.8 nm, as shown in Figure 6(d). Here, the purple shaded region represents the 5' acrydite layer derived before. When compared to STORM-imaged microtubules labelled with regular antibodies, the DNA-conjugated antibodies turn out to add up to 4 nm of positional error to a protein complex. Following the above three steps, expansion uniformity and effective resolution can be thoroughly quantified for iExM. For 20x expansion, it is shown with microtubules that the resolution is ~26 nm, as opposed to the theoretical ~15 nm, as argued before. Using custom-synthesized modular DNA strands, one can validate finer resolutions for the given technique, as the DNA strands can be varied in length, depending on the resolution to be tested and validated38.
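The shell model described above also lends itself to a compact Monte Carlo sketch. The radii below come from Figure 6, while the Gaussian PSF width, point count, and binning are assumed values for demonstration; this is an illustration of the approach, not the simulation code from the iExM paper.

import numpy as np

def simulated_profile(r_i, r_o, psf_sigma, n_points=100_000, bins=200):
    # Sample radii uniformly over the annulus area (p(r) proportional to r).
    r = np.sqrt(np.random.uniform(r_i ** 2, r_o ** 2, n_points))
    theta = np.random.uniform(0, 2 * np.pi, n_points)
    x = r * np.cos(theta)  # project tag positions onto the imaging axis
    hist, edges = np.histogram(x, bins=bins, range=(-3 * r_o, 3 * r_o))
    centers = 0.5 * (edges[:-1] + edges[1:])
    # Blur with a Gaussian stand-in for the microscope PSF (pre-expansion nm).
    kernel = np.exp(-(centers - centers.mean()) ** 2 / (2 * psf_sigma ** 2))
    return centers, np.convolve(hist, kernel / kernel.sum(), mode="same")

centers, profile = simulated_profile(r_i=26.7, r_o=33.5, psf_sigma=10.0)
left = centers[centers < 0][np.argmax(profile[centers < 0])]
right = centers[centers > 0][np.argmax(profile[centers > 0])]
print(f"simulated peak-to-peak distance: {right - left:.1f} nm")

Sweeping Ri and Ro until the simulated peak-to-peak distance best matches the measured profiles is, in essence, how the thickness of the 5' acrydite layer is extracted.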


Chapter 3

Membrane Expansion Microscopy (MxM)

Due to the aforementioned advantages, the ExM technologies could be the favorable toolset for mapping the brain. To further facilitate this vision, we set out to design and develop small molecule chemical stains that can be densely applied to visualize the boundaries of neurons, such as the lipid membranes, in the context of expansion microscopy. By combining such chemical stains with recent improvements in the ExM technologies that are aimed towards iterative expansion and exponentially increasing effective resolution, detailed structural information of the brain can be recorded while also being able to study its molecular composition. For this reason, we created intercalating lipid staining probes, which we call palm-GeLy, that enable membrane-visualizing expansion microscopy (MxM) together with improvements in the current ExM schemes. We aim to achieve molecularly-annotated

EM-like multicolor imaging of the brain by integrating the extensive toolset of ExM technologies and our recent innovation with MxM.

Below, the overall scheme for performing MxM for lipid visualization is described both in words and schematically for cells and mouse brain slices. MxM can also be performed iteratively, and this process is named Iterative Membrane Expansion Microscopy

(iMxM). Finally, we also demonstrated that MxM and iMxM are compatible with standard immunohistochemistry, where proteomic information can be attained at nanometer effective resolution using only a conventional microscope in the context of MxM-driven


lipid membrane imaging. In addition, our early efforts toward combining MxM with

ExFISH are showcased with Thy1-YFP mouse brain slices. We envision that continued improvement of this technology can lead to the aforementioned multi-color

EM-like imaging of brain samples over hundreds of micrometer scales at nanometer effective resolution.

Intercalating Lipid Staining Probes: palm-GeLy

Traditional membrane-labeling molecular probes (e.g., DiI and DiO) consist of long hydrophobic chains with fluorophores attached to them. Such labels intercalate in lipid membranes and exhibit a two-dimensional diffusion pattern along the surface of membranes, limiting their transport to the neurons where they are deposited and not throughout a larger volume of tissue49. Such labels have traditionally been used for neuronal tracing in both live and fixed brain tissue samples. Recent iterations of such molecules (e.g., FM 1-43FX and mCLING) include hydrophilic moieties, like primary amines, that permit their three-dimensional diffusion throughout the tissue50-55. Such labels, though, include a fluorophore in their structure, which gets degraded during the free radical polymerization process in ExM. Furthermore, they contain a limited number of chemical handles that can be used to perform ExM iteratively for a higher expansion factor. To address the limitations that the existing membrane-intercalating probes exhibit, we designed a novel membrane molecular probe. The design criteria for the probe include: (1) optimal amphiphilicity between the hydrophobic chains and hydrophilic primary amines for lipid membrane intercalation and diffusion in 3D space, (2) presence


of chemical handles for the chemoselective conjugation of fluorophores post gelation, and (3) presence of gel-anchorable sites for incorporating the probe in the ExM-gel matrix and subsequently performing ExM iteratively. Considering the aforementioned criteria, our novel membrane-intercalating probe is constructed on a peptidic backbone of oligo-lysines. Lysines contain primary amines on their side chains that allow the conjugation of gel-anchorable sites, such as acryloyl-X (AcX), which enables the incorporation of the label in the ExM gels. The amines are also positively charged, conferring an additional attractive interaction of the probe with negatively charged plasma membranes, attributed to the presence of cell-surface proteoglycans. To achieve lipid membrane intercalation, we placed a lipid chain on the amine terminus of the peptide backbone. Following common lipid post-translational protein modifications, we chose to use two different types of lipid tails, a palmitoyl tail and a myristoyl tail, separated from the lysines by a glycine. Between these two options, we observed that the membrane coverage with the probes containing a palmitoyl lipid group was denser than the one we achieved with myristoyl, which yielded a discontinuous and sparsely labeled tissue.

Between the palmitoyl tail and the lysines, a glycine exists to provide mechanical flexibility to the lipid tail at the amine terminus. Furthermore, we limited the probe's size to a molecular weight lower than 1 kDa. This serves two purposes: to allow fast diffusion throughout the tissue and to achieve dense labeling of membranes. Membrane probes of larger molecular weight and sizable hydrodynamic radius would create non-continuous patterns of labeling on membranes. To limit the molecular weight of the construct to 1 kDa, we used five lysines and a glycine as a backbone. By having five lysines,


we also increase the probability that our probe reacts with AcX during the subsequent steps of ExM. Here, we constructed the backbone out of D-amino acids (D-lysines; glycine has no chiral center) to prevent its degradation during the proteolytic tissue digestion and homogenization steps of ExM. To enable the chemoselective attachment of a fluorophore to the probe post gelation, we also introduced a chemical handle at the carboxy-terminus of the peptidic backbone, at the terminal lysine. We explored two such handles, an azide and a biotin. The azide allows the one-to-one attachment of a fluorophore following a dibenzocyclooctyne-based click-chemistry procedure, whereas the biotin attaches to one of the four active sites of a fluorescently labeled streptavidin, which can also later react with biotinylated fluorophores. The latter is a more scalable scheme for fluorescence amplification.

Fluorescently labeled streptavidin contains 3-4 fluorophores attached to sites other than the biotin-binding sites, and if one of those sites is attached to the probe, there are 3 sites left to attach 3 extra fluorophores, yielding a total of 6-7 fluorophores per lipid probe. We initially constructed backbones containing both an azide and a biotin, but it later became evident that the 7-fold signal amplification achieved with the biotin was crucial for signal detection in the expanded gels, especially during iterative ExM.
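The amplification arithmetic above can be written out explicitly; the counts below are the approximate numbers quoted in the text rather than measured values.

fluors_on_streptavidin = 3   # 3-4 dyes pre-conjugated to the streptavidin
free_biotin_sites = 3        # 4 binding sites minus the one bound to the probe
dyes_per_biotinylated_fluor = 1
total = fluors_on_streptavidin + free_biotin_sites * dyes_per_biotinylated_fluor
print(total)                 # ~6, i.e., 6-7 fluorophores per lipid probe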

Bringing all of this together, the probe that we deemed optimal and decided to work with is a glycine and penta-lysine D-peptidic backbone with a palmitoyl lipid group on the amino-terminus and a biotin on the carboxy-terminus, which we call palm-GeLy. The structure of palm-GeLy is shown in Figure 7(a).


Figure 7. Intercalating lipid staining probe, palm-GeLy, and its expansion microscopy workflow, Membrane Expansion Microscopy (MxM). (a) The structure of palm-GeLy consists of three components: (1) a lipid chain on (2) the amine terminus of the peptidic backbone, which also has (3) a chemoselective biotin handle for fluorescence amplification. Between the palmitoyl tail and the lysines, a glycine exists to provide mechanical flexibility to the lipid tail at the amine terminus. (b) Electron microscopy image of a mouse brain slice stained with palm-GeLy. palm-GeLy was applied to a 200 µm mouse brain slice, which was later stained with streptavidin and biotin-conjugated gold nanoparticles. The gold nanoparticles were then used as proxies for imaging palm-GeLy probes in the brain slice to determine the density of the probes. (c-i) palm-GeLy-compatible expansion microscopy (MxM) workflow. (c) After perfusing a mouse, the brain is taken out and sliced at 4°C. (d) The slice is then incubated in the palm-GeLy probes overnight at 4°C. During this step, the probes intercalate throughout lipid membranes in the tissue.


(Continued). AcX is then applied to the palm-GeLy-treated sample to functionalize the probes to be gel-anchorable. (e) Either with a cleavable crosslinker or a non-cleavable crosslinker, the slice undergoes the standard expansion microscopy free-radical polymerization process. (f) The gelled sample is then incubated in the digestion buffer for tissue homogenization and subsequent expansion without distortion. If the proteinase K digestion buffer is used for this step, the expansion microscopy workflow ends here and the sample can be expanded in water for imaging. If the immunohistochemistry-compatible FR buffer is used, the workflow proceeds to the next step. (g) Standard primary antibody staining is then performed on the sample digested with the FR buffer. (h) Standard secondary antibody staining is performed next on the sample. (i) The expanded gel can now be imaged for lipids and for proteins tagged with fluorescent secondary antibodies, allowing structural morphological information to be attained at nanometer scale with a conventional diffraction-limited microscopy technique, while also attaining proteomic information together with it.

To validate the functionality of our membrane probe and examine the cell membrane coverage that we can achieve in brain tissue, we incubated a 200 μm thick fixed brain tissue section with 100 μM palm-GeLy, labeled it with streptavidin, post-labeled it with 3 nm biotinylated gold nanoparticles, and imaged it with an electron microscope. To evaluate the efficacy of our probe in labeling membranes, we counter-stained the tissue with osmium tetroxide, and we are in the process of calculating the percent co-localization and coverage of membranes with palm-GeLy, considering that with osmium we can achieve total labeling of lipids. Among the lipid tail candidates we tested, the palmitoyl lipid group turned out to show the highest labeling density in the EM data we have generated so far, which confirmed the palmitoyl lipid group as the best selection for our palm-GeLy intercalating lipid probes. The electron microscopy (EM) image produced from a palm-GeLy-labelled and gold-nanoparticle-labelled mouse brain slice is shown in Figure 7(b). In addition to validating the efficacy of our probe, it became


evident that palm-GeLy can serve as an alternative EM stain for membranes and can be used in parallel with established EM stains for multicolor electron microscopy.
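As a sketch of the co-localization analysis mentioned above (not code from this work), the coverage estimate reduces to comparing two aligned, binarized masks; the toy masks below are hypothetical stand-ins for segmented EM images.

import numpy as np

def membrane_coverage(osmium_mask, probe_mask):
    """Fraction of membrane pixels (osmium channel) that also carry probe signal."""
    membrane = np.asarray(osmium_mask, dtype=bool)
    probe = np.asarray(probe_mask, dtype=bool)
    if membrane.sum() == 0:
        return float("nan")
    return (membrane & probe).sum() / membrane.sum()

# Toy example; real masks would come from thresholded, registered EM images.
osmium = np.array([[1, 1, 0], [0, 1, 1], [0, 0, 1]])
probe = np.array([[1, 0, 0], [0, 1, 1], [0, 0, 0]])
print(f"coverage: {membrane_coverage(osmium, probe):.0%}")  # 60%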

Once we established that the lipid probes label tissue membranes with high efficacy, we applied the probe to fixed cells and brain tissue. We identified that the optimal tissue fixation condition for lipid retention in the context of expansion microscopy can be achieved if we fix samples with 4% paraformaldehyde (PFA) and 0.1% glutaraldehyde buffered in PBS. By increasing the glutaraldehyde concentration we can potentially achieve better lipid retention, but this limits the degree of expansion of the sample, as higher glutaraldehyde concentrations like the ones used for electron microscopy (for example, 2.5%) fix the sample too strongly and do not allow the proteolytic digestion step to homogenize the tissue during expansion. Furthermore, when we performed sample labeling at room or higher temperatures, we observed deterioration of the labeling. We hypothesized that this is attributed to loss of lipids from lateral diffusion, which is accelerated at higher temperatures. Thus, we performed all of our labeling procedures at 4°C. At this temperature, lipid membranes exist in a semi-crystalline state, limiting their diffusivity and thus preserving most of the lipids in the sample. Therefore, we fixed cells with 4% PFA and 0.1% glutaraldehyde at 4°C, or perfused mice with the same fixative in ice-cold buffer, and applied the palm-GeLy probe at 4°C. After functionalizing the samples with AcX at 4°C to make them gel-anchorable, the samples are pre-incubated with polymer monomers at 4°C, such that the monomers diffuse throughout the samples. Next, the pre-incubated samples are moved to a 37°C


incubator, and the free-radical polymerization is initiated in the sample at this temperature. Finally, the gelled samples are homogenized with the proteinase-K digestion buffer and post-labeled with fluorescent streptavidin.

In both cases of cell and tissue staining, we achieved dense labeling of all of the membranes, including structures that morphologically resemble organelles such as mitochondria, endoplasmic reticulum, and Golgi apparatus. Using palm-GeLy, lipid membrane-visualizing expansion microscopy is possible, and we call this membrane expansion microscopy (MxM). The MxM workflow is also chemically described in Figure 7(c)-(i). Depending on which digestion buffer is used, MxM can be either compatible or not compatible with standard immunohistochemistry. The former case is described further in the section titled Immunohistochemistry-compatible MxM.

The FR buffer allows the tissue sample inside the gel to be mechanically disrupted in a homogenized fashion for isotropic expansion, while maintaining its antigenicity for standard antibody labelling processes. Otherwise, the original ExM proteolytic digestion buffer that utilizes the proteinase K enzyme can be used for lipid-only MxM and imaging, which is shown in Figure 8 below. Figure 8(a) shows three images across the z-axis of a stack depicting a lipid-labelled HeLa cell after MxM, whereas Figure 8(b) shows nine images across the z-axis of a stack depicting a lipid-labelled neuron from a mouse brain slice after MxM. Both sets of images were taken with a conventional confocal microscope after the sample was expanded roughly ~4.5 times using the proteolytic digestion buffer.

In the images, detailed structural information of organelles, such as mitochondria, Golgi apparatus, and endoplasmic reticulum, can be seen in the cells and neurons. Here, being

"lipid-labelled" means labelling was done with palm-GeLy intercalating lipid probes on the aforementioned biological specimens. Currently, palm-GeLy-enabled MxM schemes are being tested on other organisms and specimens, including human clinical samples, larval zebrafish, and nematodes, to fully address the scalability and versatility of the

MxM technologies.

Figure 8. MxM performed on HeLa cells and a 100 µm mouse brain slice, using the standard ExM proteolytic proteinase K digestion buffer. (a) Three images across the z-axis stack of HeLa cells after MxM. (b) Nine images across the z-axis stack of a neuron in the mouse brain slice after MxM. All of the images were taken with a conventional confocal microscope after ~4.5x expansion (~50 nm effective resolution). Key intracellular organelles shown here are mitochondria, Golgi apparatus, and endoplasmic reticulum, among many others. For all images, lipid membranes labelled with palm-GeLy are colored in either grayscale or inverted grayscale. Scale bars (in white): (a) 10 µm; (b) 40 µm.


Immunohistochemistry-compatible MxM

To validate that we can achieve labeling of the membranes of all of these organelles, we used antibodies against surface markers of the organelles to estimate the co-localization of the markers with the lipid probes. Combining antibody and lipid labeling is not trivial.

Common immunostaining protocols require permeabilization of the sample with detergents, which enables the penetration of the antibodies into the cells and throughout the tissue. Detergents, though, have a detrimental effect on the preservation of lipids, as they solubilize them and permeabilize all of the membranes, rendering lipid labeling impossible. We tried a variety of methods to resolve this issue. Such methods included, for example, the use of milder detergents like saponin, which, although it preserved the lipids better than any other detergent, created labeling artifacts and incomplete expansion. We also tried to immobilize the lipid probes in the tissue before antibody labeling by creating a chemical mesh containing the probes. This was achieved in two ways.

First, the palm-GeLy probes can be post-fixed mildly with 0.1% PFA; thus, the tissue is double-fixed. Second, prior to formulating the ExM gel, a cleavable non-expanding gel is created inside the tissue in situ in the absence of sodium acrylate, such that the lipid labels are initially preserved in the cleavable non-expanding gel. Then, lipids that are not incorporated in the cleavable gel matrix can be removed via the standard detergent-based immunohistochemistry permeabilization step, making room for antibodies to diffuse through the samples. After performing immunohistochemistry, a non-cleavable


expanding gel ("the ExM gel") can be formed in the existing sample cast in the cleavable gel, during which the lipid labels, as well as the antibodies, become part of the expanding gel network. In both cases, although we had complete preservation of the lipids after antibody labeling, the expansion quality, which includes the expansion factor and homogeneity, among other factors, was suboptimal.

Thus, we developed a post-expansion antibody labeling protocol where we first label the tissue with the lipid probes, gel it, and, instead of homogenizing the tissue with the proteinase-K digestion buffer, thermally treat the sample to preserve the antibody-binding epitopes while loosening the mechanical integrity of the tissue for homogenization. This protocol is similar to the one introduced previously in the post-expansion protein ExM (proExM) paper; the only difference here is that we use a buffer that contains DTT and SDS in a pH 8 Tris buffer, as opposed to the antigen retrieval buffer40,56,57. This way of processing the sample was inspired by the practices traditionally used in tissue proteomics to achieve fixation reversal in formalin-fixed paraffin-embedded (FFPE) tissues. We boil the sample in this fixation reversal (FR) buffer for half an hour at 100°C and then for 2 hours at 80°C. After staining the fixed tissue with palm-GeLy and formulating the ExM gel, we thermally processed the samples in the FR buffer, blocked, and labelled with primary antibodies and fluorophore-conjugated secondary antibodies against organelle surface markers.


Figure 9. Immunohistochemistry-compatible MxM demonstrated over six proteins of interest. For all images shown in Figure 9, the FR digestion buffer, instead of the proteolytic proteinase K digestion buffer, was used to homogenize the sample after the polymer matrix was formed inside it. (a-f) show (i) the lipid channel (grayscale), (ii) proteins of interest (red), (iii) composite, and (iv) zoomed-in region of interest, the dotted box inset in the composite (iii), for Yellow Fluorescent Protein (YFP), Calnexin, Myelin Basic Protein (MBP), Tom20, NUP98, and Giantin, respectively. Scale bars (in black) are 20 µm for all images here.


These markers include Tom20 for mitochondria, Giantin for the Golgi apparatus, Calnexin for the endoplasmic reticulum, and NUP98, a member of the Nuclear Pore Complex, for the nuclear membrane. In addition, we labeled myelin using a Myelin Basic Protein antibody, and in the case of tissue derived from a mouse model sparsely expressing YFP in the cytoplasm (Thy1-YFP mouse), we labeled with an anti-GFP antibody. As shown in Figure

9, it has been demonstrated with these antibodies that standard immunohistochemistry is compatible with our MxM scheme utilizing the FR buffer. We have also observed that protein labelling done this way results in a higher labelling density compared to other ExM techniques, such as proExM. We believe this is attributable to the fact that antibody labelling is done after gel digestion (after the gel goes through the digestion step, it stays ~1.5x expanded), so there is more room for antibodies to diffuse through the tissue-gel composite. Prior to moving forward with the FR buffer and the aforementioned tissue treatment conditions, we screened a variety of buffers, including common antigen retrieval buffers (e.g., those containing detergents and β-mercaptoethanol in Tris buffer at varying pH levels) and commercially available protein extraction buffers (e.g., RIPA buffer and the Liquid Tissue MS Protein Prep Kit), to determine the best-working condition.

Regarding the optimization of the heat treatment condition, we processed the samples at multiple temperatures, including autoclaving them for different amounts of time or boiling them in temperature gradients. We observed that with the commercially available protein extraction buffers we could not achieve tissue expansion, and when performing antigen retrieval with autoclaving, there was only a limited number of


antibodies that remained functional, making autoclaving useful only on a case-by-case basis.

Directly Anchoring Lipids to the ExM Gel

Other than using the intercalating lipid staining probes, we also tried directly modifying the native lipids in the tissue chemically and conjugating them to the ExM gel. Chemical modification of lipids yields denser labeling of the cellular membranes for ExM, similarly to osmium tetroxide labeling for electron microscopy. To achieve this, we peroxidized unsaturated lipids of the membranes via free radical oxidation, inducing peroxidation to carbonyls and homolytic β-scission of their chains to aldehydes58. Later, we conjugated hydrazide-modified linkers that specifically react with the oxidized lipids. In the latest iteration of the lipid labeling scheme, following a brief oxidation of the tissue with hydrogen peroxide, we conjugate the carbonyls with hydrazide alkynes. To achieve further immobilization of the hydrazide alkyne in the polyelectrolyte gel, we have developed a trifunctional linker, which contains an azide as the chemoselective moiety needed for performing expansion microscopy iteratively, an acrylic group that gives the lipids gel-anchorable sites, and a biotin for further fluorescence amplification via a fluorophore-containing streptavidin (the same scheme described for palm-GeLy). The hydrazide alkyne is conjugated to the azide of the trifunctional linker via copper-catalyzed azide/alkyne cycloaddition. Such cycloaddition requires monovalent copper cations; it utilizes copper sulfate as a source of bivalent copper cations and ascorbic acid as a reductant to generate the catalytic monovalent copper species in situ.


To confine the reducing activity of the ascorbic acid, and hence the generation of catalytic monovalent copper, to the lipid membrane, we used a lipid-containing version of the acid (ascorbic acid 6-palmitate), thus limiting the cycloaddition to the lipid membranes. Although directly anchoring the lipids to the ExM gel matrix provides, theoretically speaking, the densest level of lipid retention, the palm-GeLy approach produced comparable results in terms of image quality. Also, the lipid peroxidation step needed for directly anchoring the lipids was deemed destructive to antibody epitope retention and also modified the carbohydrates, such that designing an MxM protocol that selectively visualizes the lipid membranes while being compatible with immunohistochemistry was shown to be difficult. For this reason, palm-GeLy was chosen as the primary approach for carrying out MxM reproducibly, and the direct anchoring approach was not pursued for more careful studies and optimizations.

Iterative MxM (iMxM)

After performing the first round of MxM with either the proteinase-K digestion buffer or the FR buffer, iterative expansion microscopy (iExM) techniques can be performed38. The first gel, formed to carry out the initial MxM procedure, utilizes a chemically-cleavable crosslinker called (+)-N,N′-diallyltartramide (DATD). Following the fashion in which iExM is carried out, a non-expanding gel consisting of DATD and acrylamide can then be formed in the expanded gel. Due to the small amount of salt attributed to the free radical polymerization initiator ammonium persulfate (APS) in the


monomer solution, the expanded gel shrinks to about 90% of its size in linear dimensions. This process is called "re-embedding," and it allows us to proceed with subsequent steps in salt-buffered conditions, such as AcX treatment and signal amplification, without shrinking the gel further. After the re-embedding process, the gel is then treated once again with

AcX, so that the remaining amines on the palm-GeLy, the streptavidin, and, for the immunohistochemistry-compatible MxM, the primary antibodies and fluorophore-conjugated secondary antibodies are activated to have gel-anchorable sites. Then, the second gel is cast inside the re-embedded gel with the non-cleavable crosslinker bisacrylamide (BIS). Shortly after this step, the DATD-based gels are degraded with an acetate buffer containing sodium meta-periodate, and the sample is then put into water for further expansion. Throughout the process, the fluorophores attached to palm-GeLy and to the secondary antibodies are chosen carefully by considering their ability to retain fluorescence through multiple rounds of free radical polymerization, such that they are still bright enough for imaging. This process is called iterative MxM (iMxM), and it allows us to visualize lipid membranes and organelles in finer detail; the overall expansion arithmetic is sketched below. As shown in Figure 10(b), the neuronal membrane, mitochondria, and endoplasmic reticulum can be roughly traced and annotated manually. Furthermore, to address the multi-color capability of iterative expansion microscopy, we have also designed and implemented an iMxM that is compatible with standard immunohistochemistry, an extension of the MxM version described prior to this section. We challenged ourselves to visualize synaptic proteins in one color and the lipid membranes in another color, in the context of iMxM.
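As an illustration of that arithmetic (using assumed round values rather than measured data), the overall linear expansion of iMxM is roughly the product of the two rounds' expansion factors, reduced by the slight shrinkage during re-embedding:

first_round = 4.5         # first MxM expansion in water
reembed_retention = 0.90  # gel retains ~90% of its size after re-embedding
second_round = 4.5        # expansion after the second, non-cleavable gel
total = first_round * reembed_retention * second_round
print(f"overall linear expansion: ~{total:.0f}x")  # ~18x, i.e., roughly 20x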


Figure 10. iMxM on 100 µm mouse brain slices. (a) One field-of-view image depicting a ~20x expanded 100 µm mouse brain slice labelled with palm-GeLy. (b) Image in (a) manually annotated for cell membrane (magenta), mitochondria (red), and endoplasmic reticulum (green). (c) Image in (a) with inverted grayscale. (d) Zoomed-in image of the red-boxed inset in (c). At the interface between two neurons in the image, the cell membrane can be seen. (e-g) Three images of the z-axis stack taken for a ~20x expanded mouse brain slice after iMxM was performed on the 100 µm mouse brain slice (~20 nm effective resolution). For all images, lipid membranes labelled with palm-GeLy are colored in either grayscale or inverted grayscale. Scale bars (in white): (a-g) 20 µm.

iMxM with Immunohistochemistry

Making iMxM compatible with standard immunohistochemistry was a straightforward extension of the immunohistochemistry-compatible MxM scheme introduced previously. After staining the proteins of interest with primary and secondary antibodies in the gelled sample that has gone through the FR buffer digestion step, streptavidin is applied to provide biotin-binding sites on the palm-GeLy anchored to the gel matrix. Then, the sample is re-embedded, and the re-embedded gel is treated with AcX to give


gel-anchorable handles to both the palm-GeLy-anchored streptavidin and the antibodies. The second gel is formed next, and after the digestion of the first gel, the final gel expands ~20 times compared to the initial size of the sample. In the final gel, both secondary antibodies and streptavidin are conjugated to the matrix, serving during imaging as proxies for the proteins of interest and the lipid membranes, respectively. The fluorophores conjugated to the secondary antibodies are carefully chosen, such that their fluorescence survives multiple rounds of free-radical polymerization (first gelling, re-embedding, and second gelling), as well as the first-gel digestion step involving 20 mM sodium meta-periodate at pH 5.5. The fluorophores we have identified as robust are Alexa Fluor 488, Alexa Fluor 546, and CF 633. The degradation of fluorescence for the palm-GeLy probes is not a concern here because fluorophore-conjugated biotins are brought in after the final expansion and attach to the streptavidin that carried palm-GeLy all the way from the first gel matrix to the second gel matrix. Consequently, both proteomic information and structural morphological information from lipid membranes can be attained and imaged in the same field of view at nanometer effective resolution, using just a conventional confocal microscope, as shown in Figure 11. We can identify the nuclear pore complex protein

NUP98 along the nuclear membrane, the postsynaptic density protein PSD95 in its stereotypical shapes and arrangement at this level of effective resolution, and finally the synaptobrevin VAMP2 along the axonal membranes of the neurons. Although more work needs to be done to further optimize this process, the current results serve as a promising


proof-of-concept towards multi-color EM-like imaging of both lipid membranes and proteins at ~20 nm effective resolution.

Figure 11. Immunohistochemistry-compatible iMxM. The arrangement of images in Figure 11 is similar to that in Figure 9. (a-c) show (i) the lipid channel (grayscale), (ii) proteins of interest (red), (iii) composite, and (iv) zoomed-in region of interest, the dotted box inset in the composite (iii), for NUP98, PSD95, and VAMP2 antibodies, respectively. Scale bars (in white) are 10 µm for all images here.

MxM with Fluorescent in-situ Hybridization for RNA imaging

Beyond imaging proteomic information in the context of MxM, we have also started experimenting with combining MxM with expansion fluorescent in-situ hybridization

(ExFISH) for imaging both lipid membranes and RNA together in one field of view. In order to achieve this, the aforementioned MxM protocol had to be modified slightly. For

ExFISH to work, it is necessary to permeabilize the biological


specimen and apply LabelX tags that give the RNA inside the specimen gel-anchorable handles, since without LabelX the current ExFISH scheme does not allow RNA information to be retained after digestion and expansion39. Figure 12(a) shows how LabelX is constructed by reacting commercially available AcX and Label-IT together. Following this is Figure 12(b), which shows the overall ExFISH steps that are uniquely different from the original expansion microscopy protocol. Prior to performing

ExFISH, the palm-GeLy intercalating lipid probes are first applied to a mouse brain slice of the thickness of interest. In this case, a 100 µm Thy1-YFP slice was used. Then, by mildly fixing the slice again in 0.1% PFA buffered in PBS, we can fix the palm-GeLy in place, as the probes have amines on the backbone. This way, a tight-knit palm-GeLy network is formed along the lipid membranes, which allows us to permeabilize the tissue samples without significantly disrupting the lipid membrane information we want to image after expansion. After this, LabelX is applied, as conventionally done in the ExFISH protocol paper, followed by the AcX treatment step. Then, a gel matrix is formed, and after proteolytic proteinase K digestion, the sample expands. After hybridization chain reaction (HCR) probes target and amplify the signal from the RNA of interest, the sample can be imaged together with the morphological information provided by the intercalating lipid tags. One caveat of performing MxM this way is that, after the lipid network is tight-knitted via double fixation

(a mild fixation after the application of the palm-GeLy probes in the tissue), the sample expands less than ~4.5 times due to the increased mechanical integrity of the sample.

Considering that the ExFISH protocol alone does not achieve ~4.5x expansion, since the expansion occurs in a salted environment, the final expansion we were able to achieve from these preliminary efforts toward combining MxM with ExFISH was ~2.5 times, providing an effective resolution close to ~110 nm. As shown in Figure 12, we have done this preliminary study by co-localizing YFP RNA, native YFP fluorescence, and lipid membrane signals all in the same field of view. This serves as the first attempt at combining proteomic imaging, lipid membrane imaging, and ExFISH together, in the context of expansion microscopy.
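As a rough rule of thumb (with an assumed diffraction-limited resolution; the exact value depends on the objective and pinhole), the effective resolution scales as the microscope's diffraction-limited resolution divided by the linear expansion factor, before accounting for label and linker error:

diffraction_limit_nm = 275  # assumed confocal resolution, roughly 250-300 nm
for expansion in (2.5, 4.5, 20):
    print(f"{expansion:>4}x -> ~{diffraction_limit_nm / expansion:.0f} nm")
# 2.5x -> ~110 nm; 4.5x -> ~61 nm; 20x -> ~14 nm (before linker error)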

Figure 12. ExFISH-compatible MxM performed on a 100 µm Thy1-YFP mouse brain slice. The standard ExFISH protocol was combined with slight modifications of the MxM protocol, as described in the main text. (a) Acryloyl-X SE (top left) is reacted with Label-IT amine (top right) via NHS-ester chemistry to form LabelX (middle), which serves to make RNA gel-anchorable by alkylating its bases (e.g., the N7 position of guanines) (bottom). (b) Workflow for ExFISH: biological specimens are treated with LabelX (left), which enables RNA to be anchored to the ExM gel (middle). Anchored RNA can be probed via hybridization (right) after gelation, digestion, and expansion. Modified from Chen et al (2016)39. Images in Figure 12 depict (c) YFP RNA (red), (d) lipid membranes visualized with palm-GeLy (grayscale), (e) native YFP fluorescence (green), and (f) the composite of images (c-e). The ExFISH-treated sample only expanded ~2.5 times, giving this set of samples an effective resolution close to ~110 nm. Scale bars (in white): (c-f) 10 µm.


Methods

Mouse tissue slice preparation

Mice are anesthetized using isoflurane in oxygen. They are first perfused with 1x PBS at 4°C (stored on ice) until the blood runs clear, and then perfused with 15 mL of fixative solution (4% paraformaldehyde and 0.1% glutaraldehyde in 1x PBS), again at 4°C (stored on ice). After the perfusion step, the brain is harvested and stored in the same fixative at 4°C overnight. Next, the brain is sliced on a vibratome (Leica VT1000s) to a thickness of 100 µm, and the slices are kept at 4°C until use.

palm-GeLy staining and AcX incubation

Our custom-synthesized palm-GeLy probes are stored in 53% DMSO and 47% water at a concentration of 10 mg/mL. The probes are then diluted 1:100 in 1x PBS, and 100 µm mouse brain slices are incubated in the diluted palm-GeLy solution for 8 hours at 4°C. The slices are then treated with AcX (stored at -20°C; 10 mg/mL) mixed into 1x PBS at a 1:100 dilution, after which every biomolecule with amines in the tissue, including the palm-GeLy probes, gains gel-anchorable sites. At this point, the tissue is ready to go through the membrane expansion microscopy procedures. Again, all of these procedures are performed at 4°C to ensure that lipids in the membranes do not diffuse away due to thermodynamic instability.
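As a quick sanity check (a sketch assuming the ~1 kDa probe molecular weight targeted in the design described in Chapter 3), the 1:100 dilution of the 10 mg/mL stock reproduces the ~100 μM working concentration quoted earlier:

stock_mg_per_ml = 10.0
dilution = 1 / 100
mw_g_per_mol = 1000.0                                 # assumed ~1 kDa
working_mg_per_ml = stock_mg_per_ml * dilution        # 0.1 mg/mL = 0.1 g/L
working_uM = working_mg_per_ml / mw_g_per_mol * 1e6   # (g/L)/(g/mol) -> mol/L -> µM
print(f"~{working_uM:.0f} µM")                        # ~100 µM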

Membrane Expansion Microscopy (MxM)


After treating the fixed tissue slices with the palm-GeLy probes and AcX solution as described above, the tissue is first pre-incubated in the monomer solution at 4°C for 30 minutes to ensure that the monomers diffuse throughout the tissue prior to free radical polymerization. The monomer solution consists of 1x PBS, 2.5% (w/w) acrylamide, 7.5% (w/w) sodium acrylate, 0.2% (w/v) tetramethylethylenediamine (TEMED), 0.01% (w/w) 4-hydroxy-2,2,6,6-tetramethylpiperidin-1-oxyl (H-TEMPO), 0.05% (w/w) ammonium persulfate (APS), 2 M NaCl, and 0.5% (w/w) (+)-N,N′-diallyltartramide (DATD). When preparing the monomer solution, extra care must be taken to keep the temperature of the solution at 4°C, as described in the second chapter of this thesis, since an increase in temperature can lead to premature gelation that can compromise the entire expansion microscopy process. After incubation in the monomer solution at 4°C, the tissue is placed between two pieces of #1 coverglass, separated by another piece of #1 coverglass, and incubated at 37°C for two hours.

Following this is the tissue homogenization step, using the proteolytic digestion buffer (50 mM Tris pH 8, 1 mM EDTA, 0.5% Triton X-100, 1 M NaCl) containing proteinase K at 8 units/mL (1:100 dilution). The tissue is kept in the digestion buffer overnight at 37°C with gentle shaking. After the digestion is completed, the sample is washed in 1x PBS four times, 30 minutes each at room temperature, to effectively stop the reaction. This procedure does not allow immunohistochemistry to be combined with MxM, as the proteolytic step causes the sample to lose its antigenicity.

Immunohistochemistry-compatible MxM procedures can be found below.


MxM with Immunohistochemistry

Every step involved in this protocol is the same as above; the only difference is that, instead of the proteolytic digestion buffer with proteinase K used for tissue homogenization, we use the fixation reversal (FR) buffer for this purpose. Following the 37°C gelation step, the tissue is incubated in the FR buffer for half an hour at 100°C and then for 2 hours at 80°C. The FR buffer consists of DTT and SDS in pH 8 Tris buffer in water, and it allows the byproducts of fixation to be reversed (except for the methylene bridges), thereby restoring the antigenicity of the sample. The sample is then washed in 1x PBS four times, 30 minutes each at room temperature, to effectively stop the reaction. Next, the gelled sample is incubated in blocking buffer (MAXBlock blocking buffer) at room temperature overnight with gentle shaking. Then, the standard immunohistochemistry steps follow. Primary antibodies of interest, diluted 1:200 in the staining buffer (MAXBlock staining buffer), are applied to the sample for 6 hours at room temperature, and the sample is then washed in the washing buffer (MAXBlock washing buffer) four times, 30 minutes each at room temperature. Finally, secondary antibodies conjugated to either Alexa Fluor 488, Alexa Fluor 546, or CF 633, diluted 1:200 in the same staining buffer used for primary antibody staining, are applied to the sample. It is important to note that the aforementioned fluorophores were tested against other commercially available options and chosen because they have been shown to survive the additional steps involved in iterative membrane expansion microscopy (iMxM), which degrade fluorescence. Therefore, we highly encourage these fluorophores to be


used if further iMxM procedures are considered in advance. If this is not the case, other fluorophores of interest can be used.

Conjugating fluorophores to palm-GeLy probes after MxM

After the digestion step using either the proteolytic buffer or the FR buffer, the sample is treated with streptavidin (stock concentration 5 mg/mL in 1x PBS) diluted in 1x PBS for 8 hours at room temperature. The sample is then washed in 1x PBS four times, 30 minutes each at room temperature. Next, biotin conjugated to Atto 565 (stock concentration 10 mg/mL in DMSO), diluted 1:1000 in 1x PBS, is applied to the sample for 8 hours at room temperature. Again, the sample is washed in 1x PBS four times, 30 minutes each at room temperature. If imaging the lipid membranes without immunohistochemistry is the primary goal, the sample is then put in water for 30 minutes twice at room temperature for osmotic-pressure-driven expansion. For immunohistochemistry-compatible MxM, this step is done after secondary antibody staining. If the sample is intended to go through the iMxM protocols, the biotin conjugation step should be skipped, since biotin-fluorophore conjugation can still happen after the fluorescence-degrading re-embedding and second gelling, without compromising the signal intensity of the fluorophores.

Iterative Membrane Expansion Microscopy (iMxM)

● Re-embedding: As mentioned before, the biotin-fluorophore conjugation step should be skipped from the above protocol if the sample is to go through the iMxM protocol described below. After MxM or MxM with immunohistochemistry, the sample is incubated with the streptavidin solution and subsequently expanded in water as described above, and the sample is then re-embedded: the sample is incubated in the re-embedding solution at room temperature for 30 minutes with gentle shaking. Here, the re-embedding solution consists of 10% (w/v) acrylamide, 0.75% (w/v) DATD, 0.2% (w/v) tetramethylethylenediamine (TEMED), and 0.2% (w/v) APS. After this room temperature incubation step, the sample is placed between two pieces of #1 coverglass, placed in a nitrogen-filled chamber, and then incubated at 37°C for 1.5 hours. Then, the sample is removed from the nitrogen-filled chamber and incubated in AcX buffered in 1x PBS overnight at 4°C to get ready for the transfer of the antibodies and the streptavidin-conjugated palm-GeLy to the second gel.

● Second gelling: The re-embedded sample is incubated at 4°C for 30 minutes in fresh monomer solution (called Stock X) made with the non-cleavable crosslinker N,N′-methylenebisacrylamide. The composition of Stock X is as follows: 7.425% (w/v) sodium acrylate, 2.5% (w/v) acrylamide, 0.15% (w/v) N,N′-methylenebisacrylamide (BIS), 2 M NaCl, 1x PBS, 0.2% (w/v) tetramethylethylenediamine (TEMED), 0.01% (w/w) 4-hydroxy-2,2,6,6-tetramethylpiperidin-1-oxyl (H-TEMPO), and 0.05% (w/w) ammonium persulfate (APS). Then, the sample is placed between two pieces of #1 coverglass, placed in a nitrogen-filled chamber, and incubated at 37°C for 2 hours. Next, the sample is removed from the nitrogen-filled chamber and incubated in 20 mM sodium meta-periodate buffered in 1x PBS at pH 5.5 for 30 minutes at room temperature, with gentle shaking. The latter step cleaves the first DATD gel and the DATD-based re-embedding gel, allowing the BIS-gelled sample to expand further. The final sample is then washed in 1x PBS four times, 30 minutes each at room temperature, and stained with biotin conjugated to Atto 565, buffered in 1x PBS, overnight at room temperature, with gentle shaking. Finally, the sample is washed in water four times, 30 minutes each at room temperature, during which the sample expands and becomes ready for imaging under the confocal microscope.

Confocal Imaging

Imaging was performed on an Andor spinning disk confocal microscope with a 40x 1.15 NA water-immersion objective, or on a Nikon Eclipse Ti inverted microscope with the same objective. When needed (e.g., for longer time-scale imaging), the sample was put inside a 6-well glass-bottom plate and immobilized with 0.5% (w/v) low-melting agarose solution diluted in water. Prior to use, the agarose solution should be kept in a 40°C water bath to prevent premature hardening. The 6-well plate containing the sample was placed on a bed of ice, and the agarose solution was squirted at the edges of the sample using a pipette. Here, getting agarose underneath the sample must be avoided, so that the sample is not elevated from the bottom glass plate by the agarose layer. The ice on the bottom surface enables close-to-instant hardening of the agarose solution.

Expansion factor measurement.

To determine the expansion factors for each round of expansion, the whole specimens were imaged with a widefield microscope before and after the expansion of the first gel. The expansion factor for the first round was then determined by measuring the distance between two landmarks in the specimen before and after the first round of expansion. The expansion factor of the second round was determined in the same way.
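A minimal sketch of this measurement (with hypothetical landmark coordinates; real values would be read off the pre- and post-expansion widefield images):

import math

def distance(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

# Landmark coordinates in microns (illustrative values only).
pre_a, pre_b = (120.0, 85.0), (340.0, 410.0)        # before expansion
post_a, post_b = (515.0, 360.0), (1480.0, 1790.0)   # after expansion

expansion_factor = distance(post_a, post_b) / distance(pre_a, pre_b)
print(f"linear expansion factor: ~{expansion_factor:.1f}x")  # ~4.4x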


Chapter 4

Extracellular Space Labeling of the Zebrafish using Expansion Microscopy

The morphology of neurons in brain circuits has classically been reconstructed from lipid stains or intracellular filling (e.g., via biocytin fills or Brainbow expression). However, in principle, neurons could be reconstructed by imaging the surrounding extracellular space (ECS), given a suitable contrast agent. That is, the cellular space is the negative space constrained by the labeled extracellular volume. In such a case, the labeling of the extracellular space could be synergistic with, and facilitate the analysis of, cellular labeling with a small-molecule or genetically encoded lipid or cytoplasmic marker. That is, the labeled extracellular space could help with error correction in the tracing of the lipid or cytoplasmic stain. As a method of nanoscale-resolution imaging of intact brain circuits, we utilize our recently invented expansion microscopy, which permits 3-D imaging of intact neural circuits with down to 20 nm precision in its most recently published form. In short, ExM enables few-nanometer resolution imaging through the physical magnification of biological specimens, followed by imaging on conventional, high-speed, diffraction-limited optics.

Beyond the development of MxM, the lipid-labelling expansion microscopy technique described before, we have also been screening through libraries of chemicals, including small molecules as well as proteins, to determine which ones optimally enable the labeling of the extracellular space in the brain of the larval zebrafish, an important model organism for which we recently developed expansion microscopy protocols, described by Freifeld et al (2017)32. By identifying dense labels of the extracellular space that are compatible with larval zebrafish brains, and chemically compatible with ExM, we aim to enable connectomics of entire vertebrate brains to become a routine and feasible technique for deployment into everyday neuroscience. It is our hope that someday all the ExM technologies, namely MxM and ExFISH, can be integrated with the extracellular space staining approach to help with the tracing of neurons in a scalable fashion, while also bringing out important information regarding the biological composition of neurons.

Whole Larval Zebrafish Expansion Microscopy

As expansion microscopy has a special emphasis on scalability, one of the key challenges we aim to address is evaluating whether or not expansion microscopy technologies can be utilized for investigating a whole organism, instead of only parts of it. As mentioned before, zebrafish are a popularly studied organism for many aspects of neuroscience and disease modelling because, to mention a few reasons, they can be genetically engineered to have transparent skin (for behavioral studies, this helps with real-time imaging of internal processes) and they have a faster reproduction timeline (for genetic engineering, researchers can test and iterate in a timely fashion). For these reasons, we chose the larval zebrafish as a model organism and undertook this challenge. When thinking about performing expansion microscopy on zebrafish, the most immediate hurdle that faced us was mechanically homogenizing its tissue during the digestion step. This was the only step in the original ExM protocol that had to be modified to make sure that a whole larval


zebrafish could be expanded and imaged as a complete organism at nanometer-scale effective resolution.

That said, we were able to solve this challenge by combining a bone-dissolving step with the existing expansion microscopy protocol. After fixing the fish with 4% PFA buffered in PBS, we immersed the fish in 50 mM EDTA with 0.75% Triton X-100 for 6 hours at room temperature, on a shaker. Calcification of bones starts after fish reach 4 days post-fertilization (dpf), and without dissolving the bones, expanded fish have distorted physical structures59. After the decalcification step, the fish can then be treated with standard immunohistochemistry, if needed, and the other subsequent expansion microscopy steps, namely AcX treatment, gelation, and proteolytic proteinase K digestion. As described in the PNAS paper written by Freifeld et al in 2017 that was referenced earlier, the digestion step for fish requires proteinase K at a 1:50 dilution, as opposed to the 1:100 dilution mentioned in the original ExM paper written by

Chen et al in 2015. This is because fish in the 4-6 dpf range can be as thick as 300-500 µm, and it is more difficult to digest the tissue inside, compared to the 100 µm mouse brain slices. This process workflow is schematically depicted in Figure 13(a), with which a whole larval zebrafish can be expanded ~4 times in linear dimensions (~90 nm effective resolution). Figure 13(b) shows one region of the fish hindbrain that expresses green fluorescent proteins, imaged after expansion. Figures 13(c) and (d) are images of the 4 dpf zebrafish taken before and after expansion, showing a great deal of isotropy without any noticeable distortion in its morphology.


Figure 13: Expansion microscopy procedure for a 4 dpf whole zebrafish larva. (a) The fish is fixed after being incubated in extracellular space labels. Chemoselective anchors necessary for expansion microscopy are then applied to the fish, which is subsequently gelled, digested, and expanded ~4 times in linear dimensions. (b) FP-expressing (membrane-specific) neurons in the zebrafish, imaged after expansion with a high-NA, water-immersion 40x objective. (c) Larval zebrafish before expansion. (d) Larval zebrafish after expansion. Scale bars: (b) 50 µm, (c) 200 µm, and (d) 600 µm.

Extracellular Space Labelling with Whole Larval Zebrafish Expansion

After devising an approach to expand a whole larval zebrafish with the modified expansion microscopy scheme, we screened through various types of tags for visualizing the extracellular space in the context of ExM. So far, we have investigated (1) species-bioorthogonal, membrane-impermeant, fixable small molecules and proteins to be administered to the organisms, (2) those that do not cross the membranes but instead remain intact in the extracellular space (e.g., biocytin, horseradish peroxidase, and other molecules), and (3) molecules that either freely diffuse through the extracellular space until fixation or bind to key molecules in the extracellular matrix (e.g., stains of hyaluronan, glycosaminoglycans, and proteoglycans), either before or during fixation. Although we are still in the process of thoroughly evaluating these candidates, we have achieved early success with utilizing biocytin for filling the extracellular space of a larval zebrafish, as shown in Figure 14. We fed the fish 0.5 mg/mL of biocytin while it was still alive, and due to charge exclusion, biocytin remains outside the cell membranes. After a few hours, we then fixed the fish with 4% PFA, during which the amines on the biocytins are also fixed and conjugated to the extracellular matrix proteins. Then, the fish that is filled with biocytin and fixed with 4% PFA can go through the expansion microscopy protocol described above, and the biocytins can later be visualized by conjugating fluorescent streptavidin to them after the digestion and expansion steps.

Taken with a 10x air objective on a conventional confocal microscope, Figures 14(a-c) show the forebrain, midbrain, and hindbrain of the ~4x expanded zebrafish, respectively. Here, regions with a high density of neuropil appear saturated with biocytin;

Figures 14(d-e) show the tail regions of the expanded zebrafish, and the distinct periodic patterns in the muscles are preserved after expansion. Figure 14(f) shows the hindbrain region (shown as a red-boxed inset) taken with a 40x water-immersion


objective, and it can be seen that, visually speaking, our extracellular space stains are suitable for use as a contrast agent, as proposed before; we are now integrating this approach with the MxM protocol to assess whether the images obtained from integrating the two can make manual or computational neuronal tracing and annotation more scalable.

One noticeable concern is that, during fixation, the osmolarity difference between the fixative and the native cellular environment equilibrates, and this results in cell membrane rupturing as water flows into the cells and swells them. For this reason, some regions show that biocytin had entered the intracellular parts of the neurons, making it an imperfect contrast agent for discerning the inside and outside of the cell membrane.

More investigation is needed to balance the osmolarity difference and eliminate this problem. On a related note, it is also important to preserve the native extracellular space volume by optimizing the fixative. Studies of the structure of rapidly frozen mouse brain tissue suggest that the total extracellular space volume is about 15-25% of the total brain volume across different brain regions. However, when the brain is fixed with aldehyde-based fixatives, the total extracellular space volume reduces to below 5%60-68. Other colleagues in the lab and I have also been actively addressing this issue by experimenting with different types of fixatives, for both mouse brains and whole zebrafish larvae. Based on the literature demonstrating other groups' attempts and success with addressing this issue, we have been adding sucrose, sodium cacodylate, and mannitol at different concentrations to understand the optimal osmotic


balance in the hippocampus region of the mouse brains and forebrain region of the zebrafish larvae.

Figure 14. ~4x expanded whole zebrafish larva stained with custom-synthesized ECS labels. (a-c) show the forebrain, midbrain, and hindbrain of the expanded zebrafish, respectively. Here, regions with a high density of neuropil appear saturated with ECS labels. (d-e) show the tail regions of the expanded zebrafish, and the distinct periodic patterns in the muscles are preserved after expansion. (a-c) were all taken with a 10x air objective, on a conventional confocal microscope. (f) shows one region of the zebrafish's hindbrain, taken with a high-NA, 40x water-immersion objective. The density of applied labels needs to be further optimized to ensure continuity in the ECS/membrane boundaries. No major distortions are apparent along the expanded zebrafish. (g-h) ~4x expanded whole zebrafish larva stained with both ECS labels and GFP antibodies. GFP is expressed sparsely along the zebrafish body. (g) and (h) were both acquired from the tail, taken with a high-NA, 40x water-immersion objective. Scale bars: (a-e) 200 µm, (f-h) 50 µm.

As a next step, we also combined the expansion microscopy technique that visualizes the extracellular space in a whole zebrafish larva with the conventional immunohistochemistry involving primary antibodies and secondary antibodies. We repeated the aforementioned ECS labelling (with biocytin tags) and ExM procedures


with a transgenic fish that sparsely expresses green fluorescent proteins (GFP) on the membranes. After the bone dissolving step involving high concentrations of EDTA and

Triton X-100, the zebrafish was washed thoroughly with PBS. Anti-GFP antibodies and the corresponding secondary antibodies with fluorophores were then applied to the fish, which were both later treated with AcX to become gel-anchorable. As shown in Figure

14(g-h), it can be seen that GFP (in green) and ECS (in red) can be co-localized and be imaged together in one field of view. The images were taken at the tail region of the

4dpf larval zebrafish.
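As one simple way to quantify the co-localization described qualitatively above, a per-pixel Pearson correlation between the two channels can be computed over a field of view. This is a generic illustration under assumed inputs (two same-shape channel arrays), not the analysis pipeline actually used for Figure 14:

    import numpy as np

    def pearson_colocalization(ch_green, ch_red, mask=None):
        """Pearson correlation coefficient between two image channels.

        ch_green, ch_red: same-shape arrays (e.g., GFP and ECS channels).
        mask: optional boolean array restricting the computation to a
              region of interest (e.g., tissue-only pixels).
        """
        g = np.asarray(ch_green, dtype=float)
        r = np.asarray(ch_red, dtype=float)
        if mask is not None:
            g, r = g[mask], r[mask]
        # Mean-center each channel, then normalize the covariance.
        g = g - g.mean()
        r = r - r.mean()
        return float((g * r).sum() / np.sqrt((g**2).sum() * (r**2).sum()))

    # Toy usage with synthetic data standing in for the two channels:
    rng = np.random.default_rng(0)
    shared = rng.random((256, 256))
    gfp = shared + 0.1 * rng.random((256, 256))
    ecs = shared + 0.1 * rng.random((256, 256))
    print(f"Pearson r = {pearson_colocalization(gfp, ecs):.2f}")

Since the ECS label outlines membranes from the outside while the membrane-bound GFP sits on the membranes themselves, a high correlation here reflects overlapping boundary signal in the field of view rather than molecular identity.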


Chapter 5

Bridging the Gap between Research and Impact: Past and Future

Throughout my graduate studies, I had the enormously fortunate experience of tackling some of the most pressing challenges - understanding how the brain works at the molecular level and thinking about how to connect this back to useful applications that benefit the world. However, it did not take long for me to understand the immensity and complexity of this challenge, which discouraged me from time to time. I questioned and doubted whether I was really capable of making a meaningful contribution to addressing it. We talk a lot about "positive impact", but it is hard to put that concept into concrete execution. The term itself embodies an infinite spectrum of societal issues, all of which need sincere attention and effort from capable individuals.

Energy and environment, mental health, physical health, poverty, equality - and the list goes on. Taking a step back, I also wondered whether the conventional academic environment and the infrastructure built around it were optimal for value creation. When not in the lab, I spent a significant amount of time investigating different designs and systems that are making great progress in connecting basic research to social impact. The excerpt below summarizes my preliminary thoughts on this conundrum, considering and comparing multiple government, industry, and academic initiatives in terms of what can be further optimized or modified to produce maximum benefit to society. Ultimately, with my background in science and engineering, I wish to be part of designing a new system and creating an optimized environment for accelerating innovative scientific discoveries and their beneficial applications in the real world.

From human health to social inequality to the environment, our greatest challenges require a concerted, persistent effort from a wide array of disciplines and fields. Over time, we have excelled at addressing some of the pieces involved in this process. For example, on the research side, one could even argue that we already have the portfolio of tech innovations necessary to solve the climate crisis[69]. On the capital side, we have accumulated trillions of dollars for this purpose. What's missing?

Application is one of the key gaps between research and impact. Brilliant ideas developed in laboratories struggle to leave the academy. These groundbreaking innovations fail to attract the necessary execution vehicles, such as monetary support from the public and private sectors or synergistic partnerships. Owing to limited networks and administrative experience, scientists are not necessarily equipped to shepherd their ideas from lab to society.

The misallocation of resources is also a notable problem. Philanthropists are undoubtedly committed to helping hard-earned research innovations overcome the "Valley of Death" stage in their development cycles. And yet, they continue to face the challenge of identifying and evaluating early-stage research projects that hold promise for impact. We need a mechanism that bridges funding, opportunity, and real impact. Having this in place will facilitate bidirectional knowledge transfer and iterative development for greater impact.

This is not to say that such measures don’t exist already. In fact, we have systems that demonstrate success in nurturing paradigm-shifting ideas in three major domains: governments, accelerator programs in the private sector, and academic institutions.

Government agencies like the Defense Advanced Research Projects Agency (DARPA), the Department of Energy (DOE), and the Small Business Vouchers and Small Business Innovation Research (SBIR) initiatives are extremely good at sourcing and funding high-potential research projects that are burgeoning at laboratories around the nation. These agencies already have tight relationships with academic institutions and a deep understanding of research and development cycles. Small Business Vouchers and SBIR grants provide support for early-stage projects to meet commercial milestones.

Nevertheless, bringing research to the market-ready stage is one thing, and catalyzing it toward real-life applications is another. Insights and inputs from the private sector play a critical role in the latter. Accelerator programs in the private sector, backed largely by philanthropic investors, bring expertise in business formation and growth to the table.

Among these, Breakout Labs (BOL) and PRIME Coalition (PC) stand out as notable examples. For instance, BOL provides leadership training and public relations support on a regular basis to scientists-turned-entrepreneurs. However, BOL falls a little short in its capability to be hands-on with the individuals in action, so it is hard to say that such programs address the full research-to-impact cycle as of now. Although BOL and PC have strong academic relationships (BOL recruits scientists to be its ambassadors at major universities, and PC is located right between Harvard and MIT), their proximity to where research is really happening may not be completely sufficient.

In that regard, academic institutes have also been implementing radical changes in how they take research to market and maximize its impact. The Whitehead Institute, the Broad Institute, and the Wyss Institute at MIT and Harvard are doing exceptionally well. In-house administrative specialists efficiently coordinate efforts among researchers, government agencies, and other organizations, utilizing the right resources at the right times. They also have their own in-house tech transfer experts, so they can be laser-focused on determining the best custom strategies for the research discoveries made there. As a result, hundreds of licensing agreements have been granted, and thousands of publications have been produced. Many of the companies originating from these institutes have exited successfully, through either acquisition at a hefty valuation or an Initial Public Offering, reaching millions of people around the world.

Still, their practice of achieving accelerated impact has been on a case-by-case basis, made possible only via a concentrated, tight-knit network - similar to the earlier versions of the internet. Impact generation should be scalable and repeatable across different verticals, regardless of the individuals, organizations, and entities pursuing it. The aforementioned initiatives fall under the Impact 1.0 category, whereas Impact 2.0 should resemble something akin to cloud computing, and therefore be more accessible to a greater audience with a minimized barrier to practice. In this review, we take a closer look at Impact 1.0 initiatives and examine their founding philosophies, structures, sources of funding, and key strategies. By doing so, we hope to identify areas of improvement that can ultimately inspire a new, rational design for our Impact 2.0 pursuits.

Government Agencies

When designing an impact-generation formula that can be deployed for repeated success, government agencies cannot be overlooked. Over the past decades, they have been crafting their know-how into a structured practice that can be deployed at labs across the nation. The following agencies are particularly interesting: the Defense Advanced Research Projects Agency (DARPA), the Advanced Research Projects Agency-Energy (ARPA-E), and the Small Business Innovation Research (SBIR) program, as well as the National Science Foundation (NSF).

I am particularly fascinated by DARPA's model and practice and will review only DARPA for now, as this review is not intended to be exhaustive. Nevertheless, it should be noted that many other agencies, such as the Department of Energy (DOE), the National Institutes of Health (NIH), and the National Aeronautics and Space Administration (NASA), have also been incredibly successful at impact generation and are an inseparable variable in the scientific discovery and market adoption formula.


Defense Advanced Research Projects Agency (DARPA)

Formed in 1958 and headquartered in Virginia, DARPA is responsible for "the development of emerging technologies for use by the military." Although it now has the words "defense" and "military" in its name and mission statement, DARPA was once called ARPA, without the D. President Eisenhower established (D)ARPA to keep the United States at the forefront of global tech innovations[70]. That said, many of DARPA's projects do not stem from immediate military needs. Currently, it has 6 technical offices for basic research and 2 support offices for special projects and transition efforts, listed as follows:

● Adaptive Execution Office (AEO): focuses on ensuring bidirectional knowledge transfer between research (innovations captured by the other offices) and DoD capabilities (through robust connections to the warfighter community)

● Defense Sciences Office (DSO): pursues high-risk, high-reward fundamental research initiatives without discipline specificity

● Information Innovation Office (I2O): pursues cyber security, data analytics, and human-machine symbiosis as core research areas

● Microsystems Technology Office (MTO): pursues heterogeneous microchip-scale integration of electronics, photonics, and microelectromechanical systems as core research areas

● Biological Technologies Office (BTO): pursues gene editing, biotechnologies, neurosciences, and synthetic biology as core research areas

● Strategic Technology Office (STO): pursues basic and applied research regarding fighting strategy development

● Tactical Technology Office (TTO): pursues ground, maritime, air, and space systems as core research areas

Over the past 60 years, DARPA has produced an exceptional number of radical inventions, including the internet, global positioning satellites, stealth technology, and the gallium arsenide semiconductor devices found in our phones. As Regina Dugan and Kaigham Gabriel pointed out, "What makes DARPA's long list of accomplishments even more impressive is the agency's swiftness, relatively tiny organization, and comparatively modest budget." With just 240 employees (of which close to half are support staff) and a ~$3B annual budget, DARPA funds around 200 programs with its "special forces" model of innovation[70].

This model consists of three key elements:

(1) Ambitious Goals: when choosing the projects to pursue, DARPA considers two key questions. Has a scientific field emerged or reached an inflection point where it can provide a practical solution for an important problem? Are there urgent needs that current innovations cannot address? DARPA pursues ambitious goals with realistic measures for execution, as determined by technical program managers (drawn from academic and/or industry leaders in the relevant fields).


(2) Independence: unlike other government entities, DARPA has a flat structure and the autonomy to move fast and bring in the brightest talent as project leaders. The majority of these leaders have PhDs in their domains of expertise, yet still possess the skills and mindsets of successful startup CEOs. They implement their own styles of managing the nitty-gritty details of the programs, while following DARPA's overall organizational timeline.

(3) Temporary project terms: lastly, DARPA projects have a finite time limit (up to five years). This encourages those involved in the projects to set a clear vision and milestones, act quickly, and recruit whoever is needed, with a high turnover rate - all qualities required for productive project execution.

Still, the DARPA model is not without pitfalls. Many of the "high-risk, high-payoff" projects it pursues may require more than five years just to show proof of concept - let alone the further development needed for great impact generation. The fixed time imposed on the projects may discourage project leaders from choosing visions and milestones that would take more than five years to meet. A sense of urgency is valuable for project execution, but it should not compromise the factors involved in addressing an issue of such gravity. To be fair, DARPA inventions and projects were originally intended to demonstrate accelerated proof of concept, not full-on market implementation (despite the fact that DARPA has two offices in place to address the research-to-impact transition). Undoubtedly, some portion of the DARPA model could be adopted for certain types of early-stage research, but in order for groundbreaking ideas to mature in a broader sense, a more flexible model is necessary.

Accelerator Programs (Private Sector)

Recently, there have been exciting new value creation models from the private sector, in the form of what they call "acceleration" programs. These entities sometimes come up with their own ideas to address various societal problems, or they invest in and incubate early-stage companies (sometimes as early-stage as an idea written on a piece of paper). Notable entities I have been looking into are Breakout Labs (a revolving fund built on Silicon Valley investor Peter Thiel's tax-advantaged capital to help hard-tech scientific companies de-risk their technologies before they raise funding from traditional institutional investors), PRIME Coalition (a Cambridge-based donor-advised fund that is actively making philanthropic investments in energy startups at various growth stages), PureTech (a venture creation and incubation lab and funding vehicle founded by MIT Professor Robert Langer), Flagship Pioneering (similar to PureTech; it has incubated and exited more than 100 companies so far), and finally the 100&Change initiative at the MacArthur Foundation (an initiative that gives out $100M grants to projects addressing important cause areas). All of these programs are fascinating to me, as they are tackling, as private labs, what has traditionally been addressed by academia - which, without strong ties to corporate industry, can be limited in its impact-maximizing agenda.


Here, I describe only Breakout Labs in detail, but when given a chance, I intend to look into the other entities mentioned above in more depth. As I have always been interested in approaching the impact-maximizing strategy from various angles, I actually worked at Breakout Labs prior to starting my PhD at Harvard University. I was involved in writing their energy investment thesis and created the campus ambassadors program - we would recruit graduate students, postdocs, and sometimes early-career principal investigators to help us with deal sourcing. Working at Breakout Labs was very intriguing: I got to experience first-hand the infrastructural needs between research and impact. For example, Breakout Labs funded a startup founded by three faculty members at a prominent research university, and I personally had the impression that these high-profile individuals would know what they were doing. However, the startup eventually proved unable to achieve its key milestones due to a lack of directional guidance regarding venture formation and development. As a result, Breakout Labs revisited its strategies and decided to offer most of the administrative support and training needed for scientists to excel in their entrepreneurial pursuits. We looked at very early-stage companies; the only requirement for a startup to receive our funding was that it was incorporated in Delaware as a C Corp. From that point on, IP specialists, legal counsel, HR experts, partnerships with various industry and government entities that provide laboratory space (i.e., LabCentral, Cyclotron Road, etc.), and further infrastructural help were provided in a streamlined fashion.

In addition, I have been a huge supporter of Sarah Kearney, the founder of PRIME Coalition, ever since we met a few years ago. PRIME Coalition is redefining how philanthropic investment is done by designing carefully crafted impact metrics and success measures that create a win-win situation for both investors/donors and the companies they invest in. Lastly, PureTech and Flagship Pioneering are different from Breakout Labs and PRIME Coalition, as they address value creation all the way from idea inception to commercial, societal impact. As far as I understand, Flagship Pioneering provides US$1 million as a seed investment for an idea to pursue initial de-risking, and if the technology passes the proof-of-concept stage, follow-on funding of US$10-20 million is put in as a Series A investment. Both PureTech and Flagship Pioneering have devised and optimized a systematic way to bring technologies out of the lab for real-life applications, and I am very excited to learn more about their strategies.

Academic Institutions

On the more practical side (as I am interested in pursuing my next career within academia), the following academic institutes at Harvard and MIT have been my topic of interest, since they have been demonstrating their prowess as innovation powerhouses - as Impact 1.0 labs: the Whitehead Institute for Biomedical Research, the Eli and Edythe L. Broad Institute of MIT and Harvard, and the Wyss Institute for Biologically Inspired Engineering.

Other Impact 1.0 labs worth noting are the Allen Institute (close ties with the University of Washington) and the Chan Zuckerberg Initiative (CZI; close ties with UC San Francisco, UC Berkeley, and Stanford). CZI is particularly interesting because it is registered as a limited liability company, in which the hub has the exclusive right to commercialize inventions. The rationale behind this decision is still playing out, and I am closely following their updates[71].

Whitehead Institute for Biomedical Research

Founded in 1982, the Whitehead Institute was built as a financially independent entity from its home university, MIT, to address "improving human health through basic biomedical research." What's special about the Whitehead Institute is that it was a pioneer among early Impact 1.0 labs that stressed the importance of establishing a wholly independent and self-governing entity. The founding members, philanthropist Edwin Whitehead (who made his wealth in the clinical diagnostics industry) and Nobel Laureate David Baltimore, believed that:

"Such a center would encourage collaboration among young scientists without the constraints inherent in a large teaching institution. New ideas would come to fruition more rapidly because of the shared mission: The pursuit of excellence in biomedical research."

After securing a handful of senior scientists from around the world, they established the Whitehead Fellows Program and went on to recruit promising young scientists. Just like Junior Fellows at Harvard, Whitehead Fellows receive the utmost freedom to pursue research of their interest for 3-5 years. The program continues to be a breeding ground for the next generation of world-class game-changers, including Eric Lander (a key founding member of the Broad Institute)[72].


In deep dedication to its core philosophy, the Whitehead Institute has demonstrated a tremendous amount of success. It has published more than 120 high-impact peer-reviewed publications and has granted 100 licensing agreements so far. One of the most impressive achievements Whitehead pursued is the establishment of the Center for Genome Research, which became the single largest contributor to the Human Genome Project. In addition, many companies, such as Sanofi Genzyme and Alnylam Pharmaceuticals, have originated from intellectual property developed there. What's notable is that Whitehead achieved all of this with merely 16 principal investigators and an annual budget under $100M. According to its 2014 Annual Report, the Whitehead Institute financials were as shown in Figure 15[73]:

Figure 15. Figure from the 2014 Annual Report on Whitehead Institute financials[73].


Leadership at Whitehead believes that its edge as an innovation powerhouse comes from the following approach to solving hard problems in the biomedical sciences:

“We’ve begun by asking ourselves fundamental questions about, for example, the emerging research challenges we should address; how to continue attracting the best young researchers and mentor them in the most effective way; and what new resources our building should have.”

Regarding its tech transfer efforts, Whitehead created its own Intellectual Property Office almost two decades ago - separate from MIT's Technology Licensing Office. Under the notion that having in-house, domain-specific experts who work side-by-side with the scientists would optimize the research-to-impact transition, Whitehead's IP Office has been working tirelessly to augment the transfer efforts traditionally practiced by MIT. Moreover, after realizing that licensing efforts could be streamlined if IP could be shared and managed online, Whitehead also developed WIIPS™: the Whitehead Institute Intellectual Property System (a relational database for IP management and technology transfer). This free database assists technology transfer offices in developing countries[74].

Overall, Whitehead is a true pioneer among the Impact 1.0 labs and constantly strives to be the leading impact generator. Beyond its ambitious goals, it places huge emphasis on educating the next generation of scientists via the famed Whitehead Fellows Program, and on guiding bidirectional inspiration between the young and the established. Its development of WIIPS also indicates that it has been conceiving the idea of becoming an Impact 2.0 lab in some fashion as well.


Eli and Edythe L. Broad Institute of MIT and Harvard

Founded in 2004, the Eli and Edythe L. Broad Institute of MIT and Harvard ("Broad Institute") is a biomedical and genomic research center affiliated with the following institutions: MIT, Harvard, the Beth Israel Deaconess Medical Center, Boston Children's Hospital, Brigham and Women's Hospital, the Dana-Farber Cancer Institute, and Massachusetts General Hospital. The Broad Institute was founded under real pressure from MIT and Harvard to create an institute that is "open, collaborative, cross-disciplinary, and able to organize projects at any scale" between the two universities - especially around the genomic research that was happening at the Whitehead Institute[75]. In fact, the Broad Institute is an offspring of the Whitehead Institute's Center for Genome Research. That said, Broad aims to:

"Improve human health by using genomics to advance our understanding of the biology and treatment of human disease, and to help lay the groundwork for a new generation of therapies."

The institution strives to create an environment in which multidisciplinary teams from biology, chemistry, mathematics, computation, and engineering come together to address the need for accelerated advancement in genomic research. Broad considers itself an "experiment" in the sense that it strives to demonstrate a collaborative yet integrated approach to scientific discovery and impact generation at industry scale. The Broad Institute is home to a community of more than 4,300 members, including world-class scientists, physicians, biologists, chemists, computer scientists, engineers, and administrative staff. Similar to the Whitehead Institute, Broad also offers Broad Fellowships to young scientists, who gain a level of autonomy similar to that which Whitehead Fellows receive. One may argue that Broad is the "energized" version of Whitehead. Broad has published more than 8,715 publications so far and has a significantly larger revenue stream, as depicted in Figure 16:

Figure 16. Figure from the 2018 Annual Report on Broad Institute financials[75].

Its achievements can be attributed to the following seven values[75]: (1) Propelling the understanding and treatment of disease, (2) Collaborating deeply, (3) Reaching globally, (4) Empowering scientists, (5) Building partnerships, (6) Sharing data and knowledge, and (7) Promoting inclusion. Considering that genomic research is increasingly becoming more computational, more accessible to global communities, and more collaborative, Broad places special emphasis on sharing data and knowledge.

Its "Principles for Disseminating Scientific Innovations" state the following[75]:


● Large foundational datasets: Broad continually strives to make the discoveries it produces as accessible as possible, so that thousands of labs around the world can stay up-to-date with the most recent breakthroughs made in genomic research. For example, all of its data and computational tools are available online and free of charge.

● Sharing with academic institutions: Along the same lines as above, Broad's work is available for use by other academic and non-profit research institutions at no cost.

● Interactions with industry: Broad clearly understands that industry plays an essential role in maximizing the impact of research discoveries, as industry brings in "funding at a scale that can typically be obtained only from private investment; specialized scientific expertise about drug development that is not typically available in academia; or the ability and infrastructure to run large clinical trials." Therefore, it maintains very strong relationships with various industry partners via its own tech transfer management team. There are two notable points to be made here:

○ Recently, it formed a "first-of-its-kind partnership" with investment firm Deerfield Management, in which Deerfield is committing more than $50M over 5 years to advance therapeutics research. Follow-on funding is also being discussed to support the creation of new entities originating from the research. Moreover, Deerfield will dedicate all profits made from its investment to its healthcare philanthropic arm, the Deerfield Foundation[76].

○ Broad has been facing some controversy regarding licensing. It grants both non-exclusive and exclusive licenses to companies, depending on what Broad believes is the best way to maximize impact without compromising other important values, such as safety. "An example is the composition-of-matter of a drug. Without an exclusive license, a company would be reluctant to invest hundreds of millions of dollars in a clinical trial to demonstrate safety and efficacy, because competitors could subsequently 'free-ride' on their results to bring the same product to market." In addition, Broad recently gave exclusive licensing rights to Editas Inc. for the CRISPR technologies developed there, raking in millions of dollars in revenue. Of course, the money goes back to Broad and will fund other research projects, but whether Broad, as a nonprofit organization, is fulfilling its duty to maximize public benefits (what is expected from a non-profit organization) is being questioned[76-78]. Broad's argument has a compelling point, though: it says the exclusive license ensures sustainable growth of the technology (avoiding misuse by entrusting it to the people who have the best understanding of the tech). This is a one-time deal that Broad must have negotiated hard with MIT and Harvard, but it is being seen as an act that is redefining what a non-profit can be[75].

● Policy considerations: because genome editing technologies are subject to mal-manipulation and sometimes require cautious measures, Broad places a heavy priority on policy guidelines regarding its technologies before dissemination[78].

Indubitably, Broad is doing an excellent job at creating unprecedented worldwide impact. Its impact generation formula is unique, and many aspects of it can serve as a source of inspiration for designing Impact 2.0 labs, after careful consideration.

Wyss Institute for Biologically Inspired Engineering

Founded in 2009 and headquartered at Harvard, the Wyss Institute for Biologically Inspired Engineering ("Wyss Institute") is a cross-disciplinary, inter-departmental research center with a mission to "discover the biological principles that Nature uses to build living things, and to harness these insights to create biologically inspired engineering innovations to advance human health and create a more sustainable world."

Besides its research mission, it also has a unique agenda for impact generation, geared toward addressing the translational challenges between research and commercialization[79]. When establishing the center, the founding philanthropist and medical device businessman Hansjörg Wyss (who donated $250M to the Wyss Institute) noted that:

"Big companies will not take risks. I try to convince my people to change directions, and they do not respond; it's like trying to turn a huge tanker. You academics do innovative things, but all you do is publish papers and make widgets. What I would like to see is an innovative startup in the world's greatest academic environment that will take risks and have positive near-term impact on the world."

Rephrasing Wyss' argument in a more organized manner, the problem the Wyss Institute identifies is as follows[79]:

● "Companies tend to be optimized to generate intellectual property, protect its value, and develop products efficiently; however, as they are primarily focused on maximizing shareholder value, they often do not support pursuit of high-risk ideas that arise in their organizations."

● "Difficulties in fostering innovative technology development are typically compounded in industry by most companies' unwillingness to change market direction and refocus operations due to the large capital investments they often have made for scale up and manufacturing of existing products in already well-defined market segments."

● "In contrast, academia is a cauldron of innovation and creativity, but the focus is on publications with little focus or investment in intellectual property creation and its translation into useful products; thus, only few research advances have any direct impact on society at large."

Clearly taking this as inevitable homework for accelerated impact generation, the Wyss Institute meticulously broke apart existing structures in academia, industry, and government, and devised a new technology translation model that is unique and effective, as evidenced by its impressive track record of success. Within eight years, the members of the Wyss Institute published more than 1,700 articles, submitted close to 2,000 patent filings, and launched more than 19 startups, including Prof. George Church's ReadCoor.

Discussing its technology translation model starts with breaking down all the translational activities Wyss conceives to take an idea from conception to commercialization. The diagram below, taken from Wyss, depicts the "Technology Innovation Funnel," which involves six key stages: (1) idea conception, (2) early technical development, (3) technical de-risking, (4) business development, (5) intellectual property protection, and (6) eventual commercialization. This is a very systematic approach with room for creative flexibility. At every stage, Wyss provides relevant support to the inventors of technologies to go from one stage to the next[79]:

● During the "Skunkworks" stage, (1) idea conception and (2) early technical development are carried out. This stage is optimized for harnessing the creative freedom found in academia, and Wyss provides each faculty member with salary and funding to hire two students and fellows to best support this pursuit. Wyss also has practical education programs set up within the institute, so that researchers are taught how to write and disclose reports of invention (ROIs) and navigate patent filing processes as early in the development process as possible.

Figure 17. The institute's innovation funnel. The funnel maps the institute's technology development and translational activities from idea conception through commercialization. Modified from Tolikas et al. (2017)[79].

● During the "Concept Validation" stage, (3) technical de-risking, (4) business development, and early initiatives for (5) intellectual property protection take place. Wyss provides much-needed support and human resources to make overcoming these steps far more efficient than what researchers in a traditional academic setting would imagine:

○ "Advanced Technology Team" (ATT): among >350 full-time staff members, close to 40 are scientists and engineers from industry who bring in domain-specific knowledge and commercial development experience to the institute. They are closely integrated with the institute's researchers and make sure bidirectional knowledge transfer between research and industry happens at all times for accelerated technology development.

○ Business development staff: working closely with the Harvard Office of Technology Development, business development staff provide practical feedback on the various factors involved in the research-to-market process and work hands-on with researchers on business formation (i.e., strategic partnership formation with hospitals).

○ In-house intellectual property attorneys: they help review ROIs, even as early as when they are just idea pieces, and provide any legal feedback deemed necessary for successful technology development.

● During the "Technology Refinement" stage, pursuits carried out during the "Concept Validation" stage meet an elevated level of intensity. Creative flexibility happens here the most. Sometimes Wyss attracts institutional or philanthropic investors who are willing to provide funding for technologies at this stage to overcome the "Valley of Death" and pursue further de-risking. Wyss also goes out and identifies potential end-users who are willing to provide resources and funding for the technologies of their interest. If needed, Wyss recruits Entrepreneurs-In-Residence (EIRs) to incubate these promising inventions as "virtual startups" within the institute until they are mature enough to compete effectively in the traditional venture capital or private investment market.

Other infrastructural support provided by Wyss throughout the process includes, to name a few items, an industry-grade manufacturing and prototyping facility, startup-minded administrative staff, and a constantly expanding list of academic collaborators, government agencies, and strategic partnerships with hospitals, companies, investors, and various other types of organizations. The Wyss Institute is doubtlessly taking bold steps and reinventing the way academia perceives research-to-market translation.


Chapter 6

Conclusion

Upon entering Professor Edward Boyden's group at MIT, I had an ambitious dream to demonstrate and pioneer proof-of-concept work toward the feasibility of brain mapping in a scalable fashion. I believe integrated expansion microscopy technologies - namely proExM, iExM, ExFISH, MxM, and ExM-compatible extracellular space filling - have the highest potential to produce the type of data that can be best utilized for neuronal segmentation and, ultimately, the molecularly annotated connectome. As we develop and integrate these technologies to full maturity, there will be many collaborative efforts from researchers with different expertise. We will need continued use of other super-resolution techniques, such as STORM, to further cross-validate our expansion microscopy technologies; we will need increased support from electron microscopy specialists to troubleshoot our intercalating lipid probes palm-GeLy and MxM; we will need genomic sequencing experts to help us understand and debug our in-situ sequencing data gained from ExFISH. These are just on the imaging side. Our group has been collaborating significantly with computer science and mathematics labs and industry partners across the world to determine which molecular annotations and image qualities are needed to facilitate neuronal segmentation in the brain images we collect. And it is also very plausible that the technologies available today may not be able to provide us with the complete picture.

As I write this thesis, I am going in and out of our lab, continuing to experiment with new types of probes that illuminate the glycocalyx on cellular membranes, as well as probes that allow for imaging of multiple proteins at once (considering spectral overlap, we oftentimes limit ourselves to imaging four colors at a time). The bottom line, however, is that expansion microscopy is an easily accessible and cost-efficient super-resolution microscopy technique that allows the brain mapping effort to be democratized. Imaging the brain over length scales that conventional biology laboratories do not regularly deal with will definitely require concerted inputs from multiple research groups and centers all over the world. When doing so, these collaborating partners should not be limited in fully exercising their ability to contribute meaningfully toward solving one of the greatest challenges we face today - understanding the brain. For this reason, I am most excited about my next career trajectory. I want to spend time thinking about and designing the right monetary and infrastructural vehicles for the world to work together toward this important challenge. The way scientific funding is distributed and vetted may need to be re-engineered. Impact generated per dollar spent may need to be re-imagined - not only to provide the clearest transparency and motivation to the scientists working on these problems, but also to attract a larger pool of donors and raise more funds for future growth. I am truly excited about the future.


Bibliography

1. Miller, G. Beyond DSM: Seeking a Brain-Based Classification of Mental Illness. ​ ​ Science 327, (2010) ​ ​

2. Choi, D. et al. Medicines for the Mind: Policy-Based “Pull” Incentives for Creating ​ ​ Breakthrough CNS Drugs. Neuron 84, (2014). ​ ​

3. Feuerstein, A. How does Biogen move beyond its Alzheimer’s drug blowup? Listen for the clues. STAT, (2019).

4. Marblestone, A & Boyden, E. Designing Tools for Assumption-Proof Brain Mapping. Neuron 83, (2014). ​ ​

5. Buzsáki, G. et al. The origin of extracellular fields and currents -- EEG, ECoG, ​ ​ LFP and spikes. Nat. Rev. Neurosci. 13, (2012). ​ ​

6. Brown, S. & Hestrin, S. Cell-type identity: a key to unlocking the function of neocortical circuits. Curr. Opin. Neurobiol. 19, (2009). ​ ​

7. Hardingham, N. et al. The role of nitric oxide in pre-synaptic plasticity and ​ ​ homeostasis. Front. Cell. Neurosci. 7, (2013). ​ ​

8. Morgan, J. & Lichtman, J. Why not connectomics? Nat Methods 10, (2013). ​ ​

9. Engert, F. The Big Data Problem: Turning Maps into Knowledge. Neuron 83, ​ ​ (2014).

10. Laughlin, R. Physics, Emergence, and the Connectome. Neuron 83, (2014).

11.White, J. et al. The structure of the nervous system of the nematode ​ ​ Caenorhabditis elegans. Royal Society 314, (1986). ​ ​ ​

12.Lichtman, J. et al. The big data challenges of connectomics. Nat. Neurosci. 17, ​ ​ ​ ​ (2014).

13.Emmons, S. The beginning of connectomics: a commentary on White et al. (1986) ‘The structure of the nervous system of the nematode Caenorhabditis ​ elegans.’ Royal Society 370, (2015). ​ ​ ​

14.Venter, J. et al. The Sequence of the Human Genome. Science 291, (2001). ​ ​ ​ ​

15.Lee, J. et al. Fluorescent in situ sequencing (FISSEQ) of RNA for gene ​ ​ expression profiling in intact cells and tissues. Nature Protocols 10, (2015). ​ ​


16.Schermelleh, L. et al. A guide to super-resolution fluorescence microscopy. J. ​ ​ Cell. Biol. 190, (2010). ​ ​

17.Baddeley, D. et al. Measurement of replication structures at the nanometer scale ​ ​ using super-resolution light microscopy. Nucleic Acids Res. 38, (2010). ​ ​

18.Bates, M. et al. Multicolor superresolution imaging with photo-switchable ​ ​ fluorescent probes. Science 317, (2007). ​ ​

19.Betzig, E. & Trautman, J. Near-field optics: microscopy, spectroscopy, and surface modification beyond the diffraction limit. Science 257, (1992). ​ ​

20.Hell, S. Far-field optical nanoscopy. Science 316, (2007). ​ ​

21.Hell, S. & Kroug, M. Ground-state depletion fluorescence microscopy, a concept ​ for breaking the diffraction resolution limit. Appl. Phys. B. 60, (1995). ​ ​

22.Hell, S. & Wichmann, J. Breaking the diffraction resolution limit by stimulated emission: stimulated-emission-depletion fluorescence microscopy. Opt. Lett. 19, ​ ​ (1994).

23.Hell, S. et al. Measurement of the 4pi-confocal point spread function proves 75 ​ ​ nm axial resolution. Appl. Phys. Lett. 64, (1994). ​ ​

24.Hell, S. et al. Confocal microscopy with an increased detection aperture: type-B ​ ​ 4Pi confocal microscopy. Opt. Lett. 19, (1994). ​ ​

25.Hess, S. et al. Ultra-high resolution imaging by fluorescence photoactivation ​ ​ localization microscopy. Biophys. J. 91, (2006). ​ ​

26.Gao, R. et al. Cortical column and whole-brain imaging with molecular contrast ​ ​ and nanoscale resolution. Science 363, (2019). ​ ​

27.Wassie, A. et al. Expansion microscopy: principles and uses in biological ​ ​ research. Nature Methods 16, (2018). ​ ​

28.Gambarotto, D. et al. Imaging cellular ultrastructures using expansion ​ ​ microscopy (U-ExM). Nature Methods 16, (2019). ​ ​

29.Igarashi, M. et al. New observations in neuroscience using superresolution ​ ​ microscopy. Journal of Neuroscience 38, (2018). ​ ​

30.Asano, S. et al. Expansion Microscopy: Protocols for Imaging Proteins and RNA ​ ​ in Cells and Tissues. Curr. Proto. in Cell Bio. 80, (2018). ​ ​


31.Karagiannis, E. & Boyden, E. Expansion microscopy: development and neuroscience applications. Curr. Opinion in Neurobiology 50, (2018). ​ ​

32.Freifeld, L. et al. Expansion microscopy of zebrafish for neuroscience and ​ ​ developmental biology studies. Proceedings of the National Academy of Sciences 114, (2017). ​ ​

33.Richter, K. et al. Glyoxal as an alternative fixative to formaldehyde in ​ ​ immunostaining and super-resolution microscopy. The EMBO Journal 37, (2017). ​ ​

34.Yoon, Y. et al. Feasibility of 3D Reconstruction of Neural Morphology Using ​ ​ Expansion Microscopy and Barcode-Guided Agglomeration. Frontiers in Computational Neuroscience 11, (2017). ​ ​

35. Wang, Y. et al. Rapid Sequential in Situ Multiplexing with DNA Exchange Imaging in Neuronal Cells and Tissues. Nano Letters 17, (2017).

36.Zhao, Y. et al. Nanoscale imaging of clinical specimens using ​ ​ pathology-optimized expansion microscopy. Nature Biotechnology 35, (2017). ​ ​

37.Gao, R. et al. Q&A: Expansion microscopy. BMC Biology 15, (2017). ​ ​ ​ ​

38.Chang, J. et al. Iterative expansion microscopy. Nature Methods 14, (2017). ​ ​ ​ ​

39.Chen, F. et al. Nanoscale Imaging of RNA with Expansion Microscopy. Nature ​ ​ Methods 13, (2016). ​ ​

40.Tillberg, P. et al. Protein-retention expansion microscopy of cells and tissues ​ ​ labeled using standard fluorescent proteins and antibodies. Nature Biotechnology 34, (2016). ​

41.Zhang, Y. et al. Hybrid Microscopy: Enabling Inexpensive High-Performance ​ ​ Imaging through Combined Physical and Optical Magnifications. Scientific Reports 6, (2016). ​ ​

42. Chen, F. et al. Expansion Microscopy. Science 347, (2015).

43.Tanaka, T. et al. Phase Transitions in Ionic Gels. Phys. Rev. Lett. 45, (1980). ​ ​ ​ ​

44.Ohmine, I. Salt effects on the phase transition of ionic gels. J. Chem. Phys. 77, ​ ​ (1982).

45. Buchholz, F. Trends in the Development of Superabsorbents for Diapers. Superabsorbent Polymers 573, (1994).


46.O’Connell, P. & Brady, C. Polyacrylamide gels with modified crosslinkages. Anal. Biochem. 76, (1976). ​ ​

47.Kurenkov, V. et al. Alkaline hydrolysis of polyacrylamide. Russ. J. Appl. Chem. ​ ​ 74, (2001). ​

48.Cipriano, B.H. et al. Superabsorbent hydrogels that are robust and highly ​ ​ stretchable. Macromolecules 47, (2014). ​ ​

49.Gan, W. et al. Multicolor “DiOlistic” labeling of the nervous system using lipophilic ​ ​ dye combinations. Neuron 27, (2000). ​ ​

50.Cochilla, A. et al. Monitoring secretory membrane with FM1–43 fluorescence. ​ ​ Annu Rev Neurosci. 22, (1999). ​ ​

51.Revelo, N. et al. A new probe for super-resolution imaging of membranes ​ ​ elucidates trafficking pathways. J. Cell. Biol. 205, (2014). ​ ​

52.Fujiwara T. et al. Phospholipids undergo hop diffusion in compartmentalized cell ​ ​ membrane. J. Cell. Biol. 157, (2002). ​ ​

53.Reis, A. & Spickett, C. Chemistry of phospholipid oxidation. Biochim. Biophys. Acta. Biomembr. 1818, (2012). ​ ​

54.Tomida, J. et al. Detection of phosphorylation on large proteins by western ​ ​ blotting using Phos-tag containing gel. Nat Protoc 232, (2008). ​ ​

55.Yeung, T. et al. Membrane Phosphatidylserine Regulates Surface Charge and ​ ​ Protein Localization. Science 319, (2008). ​ ​

56.Vollert, C. et al. Formaldehyde scavengers function as novel antigen retrieval ​ ​ agents. Scientific Reports 5, (2015). ​ ​

57.Thavarajah, R. et al. Chemical and physical basics of routine formaldehyde ​ ​ fixation. J. Oral. Maxillofac. Pathol. 16, (2012). ​ ​

58. Kasthuri, N. et al. Saturated Reconstruction of a Volume of Neocortex. Cell 162, (2015).

59.Greenbaum, A. et al. Bone CLARITY: Clearing, imaging, and computational ​ ​ analysis of osteoprogenitors within intact bone marrow. Science Trans. Medicine 9, (2017). ​


60.Van Harreveld, A. & Khattab, F. Perfusion fixation with glutaraldehyde and post-fixation with osmium tetroxide for electron microscopy. Journal of Cell Science 3, (1968). ​ ​

61.Van Harreveld, A. & Fifkova, E. Rapid freezing of deep cerebral structures for electron microscopy. The Anatomical Record 182, (1975). ​ ​

62.Pallotto, M. et al. Extracellular space preservation aids the connectomic analysis ​ of neural circuits. eLife 4, (2015). ​ ​

63.Van Harreveld, A. & S. K. Malhotra. Extracellular space in the cerebral cortex of the mouse. Journal of Anatomy 101, (1967). ​ ​

64.Cragg, B. Brain extracellular space fixed for electron microscopy. Neuroscience Letters 15, (1979). ​ ​

65.Rapoport, S. Opening of the blood-brain barrier by acute hypertension. Experimental Neurology 52, (1976). ​ ​

66.Horikawa, K. & Armstrong, W. A versatile means of intracellular labeling: injection of biocytin and its detection with avidin conjugates. Journal of Neuroscience Methods 25, (1988). ​ ​

67.Diamandis, P. & Christopoulos, T. The biotin-(strept) avidin system: principles and applications in biotechnology. Clinical Chemistry 37, (1991). ​ ​

68.Ruoslahti, E. Brain extracellular matrix. Glycobiology 6, (1996). ​ ​

69.Burger, S. et al. The Investment Gap that Threatens the Planet. Stanford Social ​ ​ Innovation Review, (2018).

70.Dugan, R. & Gabriel, K. “Special Forces” Innovation: How DARPA Attacks Problems. Harvard Business Review, (2013).

71.Kosnitzky, M. The Chan Zuckerberg Initiative: LLC for Philanthropy. Pillsbury, (2016).

72.Paradigm. Whitehead Institute, (2017).

73.Whitehead Institute Annual Report, (2017).

74.Hamzaoui, A. WIIPS: Whitehead Institute Intellectual Property System: A Relational Database of IP Management and Technology Transfer. In Intellectual Property Management in Health and Agricultural Innovation: A Handbook of Best Practices, (2007).


75.Broad Institute, (2019).

76.Schwartz, D. Broad Institute in first-of-its-kind partnership with investment firm to tackle serious unmet medical needs. University-Industry Engagement Week, (2017).

77.Kozubek, J. The Broad Institute is testing the limits of what ‘nonprofit’ means. STAT, (2017).

78.Contreras, J. & Sherkow, J. CRISPR, surrogate licensing, and scientific discovery. Science 355, (2017). ​ ​

79. Tolikas, M. et al. The Wyss Institute: A new model for medical technology innovation and translation across the academic-industrial interface. Bioengineering & Translational Medicine 2, (2017).
