My Project

So, I'm starting an independent research project. A while ago I wrote a post (http://neuraldatascience.blogspot.com/2019/09/repurposing-blog_2.html) about how I would like to document my research and turn this blog into a kind of lab notebook, recording my ideas and what I'm doing. Well, it starts now. Here's what I wrote today as the beginning of a research proposal.

Disentangling the functional patterns in sensory neural circuits through simulations implanted with neural recording data

Maria Kesa


Neural systems consist of two basic types of units: excitatory and inhibitory neurons. These two groups can be further subdivided according to their morphology, electrophysiology and molecular composition, but at the most basic level they are units that cause their postsynaptic targets either to depolarize (excitatory neurons) or to hyperpolarize (inhibitory neurons).


We propose to investigate the curious feedback loops that emerge in networks of inhibitory and excitatory neurons in a sensory circuit. The first feedback loop is how the activity of functionally specialized excitatory neurons recruits widely tuned inhibition, and how this inhibitory feedback in turn shapes the activity of the neural circuit. If the brain consisted only of excitatory neurons, it would be subject to runaway excitation, with the activity of the recurrent circuit amplified exponentially. Inhibition stabilizes the activity of neuronal networks and allows for selective amplification of particular sensory-evoked activity profiles.
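The stabilization argument above can be sketched in a minimal rate-based model with one excitatory and one inhibitory unit. All parameters here are illustrative choices, not values from the proposal or the data: the E→E gain above 1 makes the excitatory unit unstable on its own, and the E↔I loop restores stability.

```python
import numpy as np

# Minimal rate model: r[0] is an excitatory unit, r[1] an inhibitory unit.
# Euler integration of tau * dr/dt = -r + [W r + input]_+ (rectified rates).

def simulate(w_ei, w_ie, steps=2000, dt=0.001, tau=0.02):
    W = np.array([[1.5, -w_ei],   # E->E recurrent gain > 1: unstable alone
                  [w_ie, 0.0]])   # E drives I; I inhibits E with weight w_ei
    r = np.zeros(2)
    inp = np.array([1.0, 0.0])    # constant feedforward drive to E
    for _ in range(steps):
        r = r + dt / tau * (-r + np.maximum(W @ r + inp, 0))
        if not np.isfinite(r).all() or r.max() > 1e6:
            return None           # runaway excitation: no stable state
    return r

print(simulate(w_ei=0.0, w_ie=0.0))  # no inhibitory feedback
print(simulate(w_ei=2.0, w_ie=2.0))  # with inhibitory feedback
```

Without inhibition the excitatory rate grows without bound; with the inhibitory loop the circuit settles to a finite fixed point.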


Secondly, we are interested in how the activity patterns of neural circuits influence the synaptic connectivity matrix when the network is endowed with learning rules. We can study Hebbian, anti-Hebbian, homeostatic scaling and inhibitory plasticity rules. Activity patterns shape the connectivity between neurons, and the states of the synapses in turn influence the activity patterns. This bidirectional causality is difficult to disentangle using experiments alone, because we have no way to measure plasticity dynamics at large scale.
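The four rule families mentioned above can each be written as a single-synapse weight update. The exact forms below (including the threshold-style inhibitory rule) are illustrative textbook variants, not the project's final equations:

```python
# Hedged sketch of the plasticity rule families as weight updates.
# pre/post are firing rates; eta is a learning rate; forms are illustrative.

def hebbian(w, pre, post, eta=0.01):
    return w + eta * pre * post               # correlated activity strengthens

def anti_hebbian(w, pre, post, eta=0.01):
    return w - eta * pre * post               # correlated activity weakens

def homeostatic_scaling(w, post, target=1.0, eta=0.01):
    return w * (1 + eta * (target - post))    # scale weights to keep firing near target

def inhibitory_plasticity(w, pre, post, target=1.0, eta=0.01):
    return w + eta * pre * (post - target)    # inhibition grows when post fires above target
```

Writing the rules this way makes the bidirectional loop concrete: each update reads the current rates, and the updated weights then determine the next rates.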


Our strategy for studying these questions is to combine simulations with real data. We inject real activity patterns from excitatory and inhibitory neurons into rate-based model neurons and endow the network with well-studied plasticity rules. We use an open data set with more than 10,000 cortical neurons recorded from mouse V1 while the mouse was either in the dark, exposed to 2800 natural images, or shown patterns of white noise. Our motive for injecting real data into a circuit endowed with learning rules is to understand how the firing statistics of real neurons shape the weight dynamics of synapses.
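A minimal version of this injection strategy: clamp an input layer to activity traces (here a random nonnegative stand-in for the deconvolved calcium data, with a correlated subgroup mimicking a functional ensemble) and watch a downstream rate neuron's weights evolve under a Hebbian rule. The shapes, bin structure, and normalization scheme are placeholder assumptions, not the project's final design:

```python
import numpy as np

rng = np.random.default_rng(0)
n_bins, n_inputs = 5000, 100
traces = np.abs(rng.normal(size=(n_bins, n_inputs)))  # nonneg "firing" per time bin

# Add a shared component to the first 20 inputs: a stand-in functional ensemble.
shared = np.abs(rng.normal(size=(n_bins, 1)))
traces[:, :20] += shared

w = np.full(n_inputs, 1.0 / n_inputs)   # uniform initial synaptic weights
eta = 1e-3
for x in traces:
    y = w @ x                           # rate of the downstream model neuron
    w += eta * y * x                    # Hebbian update driven by the injected data
    w /= w.sum()                        # normalization prevents runaway weight growth

# Weights onto the correlated subgroup should outgrow the rest.
print(w[:20].mean(), w[20:].mean())
```

Even this toy version shows the phenomenon of interest: the statistics of the injected activity, not the model architecture, determine which synapses win.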

It all started with an idea that I had. I'm working with large-scale calcium imaging recordings from the Pachitariu lab at Janelia. The data is publicly available: https://figshare.com/authors/Carsen_Stringer/5183561

I had the idea of making a Hebbian neuron that would receive 1000 inputs of deconvolved calcium imaging data, which roughly represents the intensity of firing in a cell within a 400 ms time bin. I was reading Dayan and Abbott's chapters on synaptic plasticity, so I made a very basic Hebbian neuron: https://github.com/mariakesa/DayanAbbottExplorations/blob/master/HebbianExploration.ipynb That notebook has results from randomly selected neurons.

But then I read "Functional organization of excitatory synaptic strength in primary visual cortex", Cossell et al., Nature 2015 (http://mouse.vision/pdfs/Cossell_2015_Nat.pdf). They found that neurons with similar receptive fields are strongly connected, while the rest of the connections are weak. That gave me the idea of using EnsemblePursuit, the algorithm I've been working on, to extract the neurons in one sensory ensemble, as in this notebook: https://github.com/MouseLand/EnsemblePursuit/blob/master/Notebooks/BehaviorStimulusCorrsWithSpatialSpreadOfEnsemble.ipynb

EnsemblePursuit is a matrix factorization algorithm invented by Marius Pachitariu at Janelia (my boss). It uses a sparsity penalty to find correlated neurons in large-scale recordings. You can see the poster we made about the algorithm for the Statistical Analysis of Neural Data workshop at Carnegie Mellon here: https://github.com/MouseLand/EnsemblePursuit/blob/master/sand_poster2019.pdf

Here's the notebook where I used EnsemblePursuit: https://github.com/mariakesa/DayanAbbottExplorations/blob/master/EnsemblePursuitSynapseExploration.ipynb I inserted the time course of the neurons in the ensemble into an artificial neuron and then extracted its receptive field.
Voila, even without plasticity it looks a bit like a Gabor!

In fact, when I add Hebbian plasticity it scrambles the receptive field.
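The receptive-field step above can be sketched with reverse correlation on synthetic data standing in for the recordings. Here a fake "ensemble" responds through a planted Gabor-like filter, and the receptive field is recovered by weighting the white-noise stimulus frames by the artificial neuron's response; sizes and names are placeholders:

```python
import numpy as np

rng = np.random.default_rng(1)
n_frames, h, width = 2000, 12, 12
stims = rng.normal(size=(n_frames, h * width))   # white-noise frames, flattened

# Planted Gabor-like filter: the "true" receptive field of the fake ensemble.
yy, xx = np.mgrid[0:h, 0:width]
gabor = (np.exp(-((xx - 6)**2 + (yy - 6)**2) / 8) * np.cos(0.8 * xx)).ravel()

# Ensemble time course = filter response plus noise (stand-in for the
# EnsemblePursuit component inserted into the artificial neuron).
ensemble_rates = stims @ gabor + 0.5 * rng.normal(size=n_frames)

# Reverse correlation: average stimulus weighted by the response.
rf = stims.T @ ensemble_rates / n_frames

# The estimate should closely match the planted filter.
print(np.corrcoef(rf, gabor)[0, 1])
```

With white-noise stimuli this response-weighted average is an unbiased estimate of the linear filter, which is why a Gabor-like structure can emerge even without any plasticity in the model.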

It's fun that I got to use the algorithm we developed in an actual analysis. The aim was really to make something useful for the community, and I'm currently integrating the algorithm into the calcium imaging library suite2p.

Anyway, here it is: a blog post, a long time coming. I've been intimidated to write about my work because it's not finished. But wouldn't it be lovely if I could persevere and document my entire PhD research in the form of a lab blog? It would be sweet to look back in old age and think, wow, what a life I have lived.
