COSYNE 2012 Workshops February 27 & 28, 2012 Snowbird, Utah


Monday, February 27

1. Coding and computation in visual short-term memory.
   Organizer(s): W.J. Ma. Location: Superior A
2. Neuromodulation: beyond the wiring diagram, adding functional flexibility to neural circuits.
   Organizer(s): M. Higley. Location: Wasatch B
3. Sensorimotor processes reflected in spatiotemporal patterns of neuronal activity.
   Organizer(s): Z. Kilpatrick, J.-Y. Wu. Location: Superior B
4. Characterizing neural responses to structured and naturalistic stimuli.
   Organizer(s): K. Rajan, W. Bialek. Location: Wasatch A
5. Neurophysiological and computational mechanisms of categorization.
   Organizer(s): D. Freedman, X.-J. Wang. Location: Magpie A
6. Perception and decision making in rodents.
   Organizer(s): S. Jaramillo, A. Zador. Location: Magpie B
7. Is it time for theory in olfaction?
   Organizer(s): V. Murthy, N. Uchida, G. Otazu, C. Poo. Location: Maybird

Workshop Co-Chairs:
Brent Doiron, University of Pittsburgh ([email protected], cell 412-576-5237)
Jess Cardin, Yale University ([email protected], cell 267-235-0462)

Maps of Snowbird are at the end of this booklet (page 32).

Tuesday, February 28

1. Understanding heterogeneous cortical activity: the quest for structure and randomness.
   Organizer(s): S. Ardid, A. Bernacchia, T. Engel. Location: Wasatch A
2. Humans, neurons, and machines: how can psychophysics, physiology, and modeling collaborate to ask better questions in biological vision?
   Organizer(s): N. Majaj, E. Issa, J. DiCarlo. Location: Wasatch B
3. Inhibitory synaptic plasticity.
   Organizer(s): T. Vogels, H. Sprekeler, R. Froemke. Location: Magpie A
4. Functions of identified microcircuits.
   Organizer(s): A. Hasenstaub, V. Sohal. Location: Superior B
5. Promise and peril: genetic approaches for systems neuroscience revisited.
   Organizer(s): K. Nagel, D. Schoppik. Location: Superior A
6. Perception and decision making in rodents.
   Organizer(s): S. Jaramillo, A. Zador. Location: Magpie B
7. Is it time for theory in olfaction?
   Organizer(s): V. Murthy, N. Uchida, G. Otazu, C. Poo. Location: Maybird

Schedule

Each workshop group will meet in two sessions, from 8:00 to 11:00 am and from 4:30 to 7:30 pm. Workshop summaries and schedules are available starting on page 3 of this booklet.

Transportation

Marriott Downtown to Snowbird: a free shuttle is provided for registered attendees (departs at 5:00 pm on Sunday, February 26).
Snowbird to Salt Lake City Airport: a shuttle can also be arranged at Snowbird, or online at:
https://store.snowbird.com/products/index.php?product_category_idx=2
Further information about transportation to and from Snowbird is available at:
http://www.snowbird.com/about/accessibility.html
For further information on transportation or other logistics, please contact Denise Soudan ([email protected]).

Lift tickets

Discounted workshop rates:
Snowbird chairlifts only: $62
Snowbird tram & chairlifts: $72
Pick up tickets at the Cliff ticket window (level 1 of the Cliff Lodge, next to the ski rental shop) or at the ticket window on the top level of the Snowbird Center (the plaza deck).

Meals included with registration

Breakfast (Day 1 and Day 2): The Cliff Ballroom
Dinner (Day 2): The Cliff Ballroom
Coffee breaks during morning and afternoon sessions

Monday, Feb 27

1. Coding and computation in visual short-term memory

Organizer: Wei Ji Ma, Baylor College of Medicine

Visual short-term memory (VSTM) is essential for detecting changes between visual scenes, integrating information across eye fixations, and planning reaching movements. Like other forms of short-term memory, it is generally thought to be resource-limited: as the number of objects to be remembered increases, observer performance on VSTM tasks typically decreases rapidly. A key theoretical debate centers on whether VSTM resource is a discrete or a continuous quantity and, relatedly, on how the encoding precision of stimuli in VSTM should be modeled.
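The set-size effect and the resource debate sketched above can be made concrete with a toy simulation. The sketch below is purely illustrative and is not any speaker's published model: it assumes a continuous-resource scheme in which a fixed precision budget (`total_resource`, an arbitrary value) is split evenly among the items in memory, and an ideal observer who reports "change" whenever the posterior probability of a change exceeds 0.5.

```python
import numpy as np

rng = np.random.default_rng(1)

def gauss(x, mu, sigma):
    """Gaussian density, used for the observer's likelihoods."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def change_detection_accuracy(set_size, total_resource=10.0, n_trials=20000):
    # Hypothetical continuous-resource assumption: per-item precision is
    # total_resource / set_size, so memory noise grows with memory load.
    sigma_mem = np.sqrt(set_size / total_resource)
    sigma_eff = np.sqrt(2.0) * sigma_mem      # memory noise plus test-display noise
    change = rng.random(n_trials) < 0.5       # a change occurs on half the trials
    delta = np.where(change, 1.0, 0.0)        # fixed change magnitude (arbitrary)
    x = delta + sigma_eff * rng.standard_normal(n_trials)  # observed feature difference
    # Ideal observer: posterior probability of "change" under a 50/50 prior.
    p_change = gauss(x, 1.0, sigma_eff) / (gauss(x, 1.0, sigma_eff) + gauss(x, 0.0, sigma_eff))
    return np.mean((p_change > 0.5) == change)

for n in (2, 4, 8):
    print(n, round(change_detection_accuracy(n), 3))
```

Accuracy falls as set size grows because per-item precision falls, qualitatively reproducing the rapid performance decline noted above; a discrete-resource (item-limit) variant would instead assume perfect encoding of a fixed number of items and pure guessing on the rest.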
In recent years, converging efforts in psychophysics, systems neuroscience, and mathematical modeling have furthered our understanding of VSTM coding and limitations. Single-neuron recordings in parietal and frontal areas have revealed neural signatures of limited resources, new developments in fMRI are making it possible to decode the contents of VSTM, and models have reconceptualized VSTM tasks such as change detection as probabilistic inference problems. The workshop will present a broad overview of recent developments in the study of VSTM.

Morning Session
8:20-8:30am  Introduction
8:30-9:00am  Ed Awh. Discrete resource limits during encoding, selection, and storage
9:00-9:30am  George Alvarez. The structure and content of working memory representations
9:30-10:00am  Coffee break
10:00-10:30am  Wei Ji Ma. Visual short-term memory limitations from variable, continuous resources
10:30-11:00am  Earl Miller. Neural dynamics for working memory and cognitive capacity

Afternoon Session
4:30-5:00pm  Martin Pare. A change detection approach to study visual working memory in the macaque monkey
5:00-5:30pm  Frank Tong. Decoding reveals the contents of visual working memory in early visual areas
5:30-6:00pm  Chris Sims. Information theory and visual working memory: a new (old) theoretical foundation for modeling capacity limits
6:00-6:30pm  Coffee break
6:30-7:00pm  Robert Jacobs. A probabilistic clustering theory of the organization of visual short-term memory
7:00-7:30pm  General discussion

Monday, Feb 27

2. Neuromodulation: beyond the wiring diagram, adding functional flexibility to neural circuits

Organizer: Michael J. Higley, Yale University

Flexibility is a critical property of neural circuits, allowing adaptive changes in activity over a range of time scales and enabling context-dependent behavior.
While technological advances have enhanced our understanding of the brain's "wiring diagram," increasing attention is being paid to processes that confer dynamic flexibility on established circuits. One such example is neuromodulation, a ubiquitous nervous system phenomenon that spans the phylogenetic spectrum of research models. In contrast to classical fast neurotransmitters that directly excite or inhibit postsynaptic neurons, neuromodulators alter the intrinsic membrane properties of cells and modify synaptic transmission. The actions of modulators such as acetylcholine, dopamine, and norepinephrine are critical for cognitive functions including working memory, focused attention, and action selection. Additionally, perturbation of modulatory systems is strongly implicated in psychiatric and neurological illnesses such as schizophrenia, depression, and Parkinson's and Alzheimer's diseases. Despite overwhelming evidence for their general importance, the specific cellular and network mechanisms by which neuromodulators influence behavior remain elusive. However, a surge of technological, experimental, and computational advances has opened up new avenues toward understanding the actions of neuromodulators in vitro and in vivo. In this one-day workshop, we bring together neuroscientists working across multiple levels of analysis to discuss recent findings from cellular, circuit, behavioral, and computational perspectives. Key focal questions for discussion will include:
(1) What are the relevant time scales for neuromodulatory actions?
(2) Is neuromodulation best understood at the cell, circuit, or organismal level?
(3) Can activity in distinct neuromodulatory systems be causally linked to specific behaviors?
(4) What new methodologies are being developed for the study of neuromodulation?
(5) How are neuromodulatory processes best incorporated into computational models?
Morning Session
8:15-8:30am  Michael Higley. Introduction
8:30-9:00am  Daniel Durstewitz. Dopamine modulation of prefrontal cortical attractor states revealed by multiple single-unit recordings
9:00-9:30am  Jeremy Seamans. What do we really know about phasic signaling in the mesocorticolimbic pathway?
9:30-10:00am  Coffee break
10:00-10:30am  Xiao-Jing Wang. Complexity of neuromodulation in the prefrontal cortex
10:30-11:00am  Randy Bruno. Modulating local circuit dynamics versus long-range inputs

Afternoon Session
4:30-5:00pm  Alexander Thiele. Cholinergic and glutamatergic control of attention in V1: a double dissociation
5:00-5:30pm  Martin Sarter. Cholinergic double duty: top-down control of cortical glutamatergic-cholinergic transients to optimize cue detection
5:30-6:00pm  Michael Higley. Co-release of GABA and acetylcholine: bidirectional control of cortical interneurons
6:00-6:30pm  Coffee break
6:30-7:00pm  Karl Deisseroth. Optogenetic neuromodulation: tools and applications
7:00-7:30pm  General discussion and wrap-up

Monday, Feb 27

3. Sensorimotor processes reflected in spatiotemporal patterns of neuronal activity

Organizers: Zachary Kilpatrick, University of Pittsburgh; Jian-Young Wu, Georgetown University

Abstract

Cortical sensory and motor processes are often subserved by spatiotemporal patterns of neuronal activity. Analysis of such activity often considers coarse-grained statistics of the discharges of the neuronal networks involved. Recent advances in voltage-sensitive dyes, optical imaging, and multi-electrode recording allow population activity to be studied both in vivo and in slice preparations with excellent spatiotemporal resolution. The genesis of coherent neural activity can then be explored using external stimuli, receptor antagonists, and even optogenetic manipulation.
Results from these studies then guide experimental and computational models that consider the role such dynamics play in sensorimotor function.
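As a loose illustration of the kind of spatiotemporal pattern this workshop addresses, the minimal rate-model sketch below (hypothetical equations and arbitrary parameters, not taken from any speaker's work) shows a brief stimulus at one end of a chain of excitatorily coupled threshold-linear units propagating as a traveling pulse, summarized by each unit's peak-activity latency.

```python
import numpy as np

def simulate_wave(n_units=40, n_steps=200, w=2.0, tau=5.0, theta=0.2):
    """Chain of leaky rate units in which each unit excites its right
    neighbor; all parameter values are arbitrary choices for illustration."""
    u = np.zeros((n_steps, n_units))
    for t in range(1, n_steps):
        rate = np.clip(u[t - 1] - theta, 0.0, 1.0)   # thresholded, saturating output
        drive = np.zeros(n_units)
        drive[0] = 1.0 if t < 10 else 0.0            # brief stimulus to unit 0
        drive[1:] += w * rate[:-1]                   # one-way excitatory coupling
        u[t] = u[t - 1] + (1.0 / tau) * (drive - u[t - 1])   # leaky integration
    return u

activity = simulate_wave()
peak_times = activity.argmax(axis=0)   # latency of each unit's activity peak
print(peak_times[:8])                  # latencies increase along the chain
```

In real data, an analogous latency gradient across recording sites, measured with voltage-sensitive dye imaging or multi-electrode arrays, is one simple way to quantify propagating activity.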