I am a computational/theoretical neuroscientist, currently working as a Swartz postdoctoral fellow with Haim Sompolinsky at the Center for Brain Science, Harvard University. Before that, I was a research fellow with Peter Latham at the Gatsby Computational Neuroscience Unit, UCL, studying the development and evolution of olfactory systems as well as perturbation-based credit assignment mechanisms. I did my Ph.D. at the Laboratory for Neural Circuit Theory (Fukai Lab) at the RIKEN Brain Science Institute, where I worked on normative and phenomenological models of dendritic synaptic plasticity and on spiking neural networks.

Biography

Nov 2020 - present: Swartz postdoctoral fellow, Center for Brain Science, Harvard University

Dec 2016 - Oct 2020: Research Associate/Fellow, Gatsby Unit, University College London

Apr 2016 - Nov 2016: Research Scientist, RIKEN Brain Science Institute

Mar 2016: PhD, The University of Tokyo

Major works

Bayesian synaptic plasticity

Learning in the brain is driven by diverse synaptic and neural plasticity mechanisms. To gain insight into, and guide future experiments on, this complex machinery, it is crucial to develop a normative theory of how plasticity should work. The Bayesian brain hypothesis, which has guided our understanding of animal behavior and the underlying neural coding and computation, is a candidate normative framework, but it has rarely been applied to plasticity mechanisms.
In Hiratani and Fukai (2018), we discovered a duality between the plasticity and rewiring of multi-synaptic connections and a sample-based Bayesian filtering algorithm known as particle filtering (a toy sketch of this analogy is given below the references). This work suggests that the multi-synaptic contacts between nearby neurons commonly observed in the cortex are not mere redundancy, but may instead be a substrate for a sample-based implementation of Bayesian optimization. Moreover, the model suggests that the dendritic position of a synaptic contact is the primary determinant of its strength, while the spine size indicates the importance of the connection, providing a novel view of synaptic function.
In Hiratani and Latham (2020), we extended previous work on the variational implementation of Bayesian optimization, which was limited to supervised learning in a single neuron, to unsupervised learning in mammalian olfactory circuits. The derived model of near-optimal learning is consistent with olfactory bulb circuitry and its learning dynamics, and also explains the rapid odor-reward association observed in experimental studies. This work thus suggests that, even at the circuit level, biological learning mechanisms are well characterized by Bayesian synaptic plasticity.
Hiratani N, Fukai T, Proc. Natl. Acad. Sci. U.S.A. 115 (29), E6871-E6879 (2018).
Hiratani N, Latham PE, Nature Communications 11, 3845 (2020).
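To make the particle-filtering analogy concrete, below is a minimal toy sketch (not the model from the paper; all variable names and parameter values are hypothetical). In it, K synaptic contacts play the role of K particles tracking a slowly drifting target weight, the importance weights play the role of spine sizes, and resampling plays the role of rewiring.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: K synaptic contacts act as K particles tracking a
# latent "true" weight that drifts slowly over time.
K = 8                      # number of synaptic contacts (particles)
sigma_drift = 0.05         # drift of the latent weight
sigma_obs = 0.2            # observation noise on the teaching signal
w_true = 1.0

particles = rng.normal(0.0, 1.0, size=K)   # per-contact weight estimates
weights = np.ones(K) / K                    # importance weights ("spine sizes")

for t in range(200):
    # Latent weight drifts (the quantity the synapse must track).
    w_true += sigma_drift * rng.normal()

    # Noisy observation, e.g. a teaching / error signal.
    y = w_true + sigma_obs * rng.normal()

    # Prediction step: each contact's estimate diffuses.
    particles += sigma_drift * rng.normal(size=K)

    # Update step: reweight contacts by the likelihood of the observation.
    log_lik = -0.5 * ((y - particles) / sigma_obs) ** 2
    weights *= np.exp(log_lik - log_lik.max())
    weights /= weights.sum()

    # Resampling step ~ synaptic rewiring: contacts with negligible
    # importance are eliminated and redrawn near useful ones.
    if 1.0 / np.sum(weights ** 2) < K / 2:
        idx = rng.choice(K, size=K, p=weights)
        particles = particles[idx] + 0.01 * rng.normal(size=K)
        weights = np.ones(K) / K

# The importance-weighted mean over contacts approximates the optimal estimate.
print(w_true, np.sum(weights * particles))
```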

Neural architecture selection

Across species, neural circuit architecture shows remarkable regularity, indicating the presence of underlying optimality principles. In Hiratani and Latham (2022), we hypothesized that circuits have been optimized for efficient learning, and investigated whether the optimized circuitry derived from model selection theory is consistent with the actual circuitry. We addressed this question in the context of the olfactory circuit, which shows clear scaling laws: in mammals, the number of layer 2 neurons in the piriform cortex is proportional to the number of glomeruli to the 3/2 power (Srinivasan & Stevens, 2019), while in insects, the number of Kenyon cells is roughly cubic in the number of glomeruli (based on our literature survey).
We modeled the olfactory system as a three-layered nonlinear neural network and analytically derived scaling laws by estimating the network size that optimizes, over the lifetime of the animal, its ability to predict the rewards associated with odors. We also extended the framework to the case in which a fraction of the olfactory circuit is genetically specified rather than developmentally learned, and numerically demonstrated that, when the number of glomeruli is small, this makes the scaling steeper, as is observed among insects. This suggests that it may be possible to understand neural circuits from an optimality point of view, with an optimization that occurs over evolutionary timescales (the empirical scaling relations are restated compactly below the reference).
Hiratani N, Latham PE, PNAS, 119 (11), e2100600119 (2022)
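Stated compactly (with notation introduced here for illustration only: N_glom is the number of glomeruli and M is the number of neurons in the expansion layer), the two empirical scaling laws discussed above read

\[
M_{\mathrm{piriform}} \propto N_{\mathrm{glom}}^{3/2} \quad (\text{mammals}),
\qquad
M_{\mathrm{KC}} \propto N_{\mathrm{glom}}^{3} \quad (\text{insects}),
\]

and the theory derives such exponents as the network sizes that maximize, over the animal's lifetime, the accuracy of odor-reward prediction.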

Publications

PREPRINTS.

REFEREED JOURNAL ARTICLES.

Evolution of neural activity in circuits bridging sensory and abstract knowledge 

Mastrogiuseppe F, Hiratani N, Latham PE

eLife (in press)

[Preprint] 

12. Optimal quadratic binding for relational reasoning in vector symbolic neural architectures

Hiratani N, Sompolinsky H

Neural Computation 35 (2): 105–155 (2023).

[Paper] [Preprint]

11. On the stability and scalability of node perturbation learning

Hiratani N, Mehta Y, Lillicrap TP, Latham PE 

NeurIPS 2022

[OpenReview]

10. Developmental and evolutionary constraints on olfactory circuit selection

Hiratani N, Latham PE

Proc. Natl. Acad. Sci. U.S.A., 119 (11), e2100600119 (2022)

[Open Access]

9. The Olfactory Bulb Facilitates Use of Category Bounds for Classification of Odorants in Different Intensity Groups

Losacco J, George NM, Hiratani N, Restrepo D

Frontiers in Cellular Neuroscience 14, 430 (2020) 

[Open Access]

8. Rapid Bayesian learning in the mammalian olfactory system

Hiratani N, Latham PE

Nature Communications 11, 3845 (2020)

[Open Access]

7. Interactive reservoir computing for chunking information streams.

Asabuki T, Hiratani N, Fukai T

PLOS Computational Biology, 14(10), e1006400 (2018). 

[Open Access]

6. Redundancy in synaptic connections enables neurons to learn optimally. 

Hiratani N, Fukai T

Proc. Natl. Acad. Sci. U.S.A. 115 (29) E6871-E6879 (2018).

[Full Text]

5. Detailed dendritic excitatory/inhibitory balance through heterosynaptic spike-timing-dependent plasticity. 

Hiratani N, Fukai T

Journal of Neuroscience, 37 (50) 12106-12122 (2017).

[Full Text]

4. Hebbian Wiring Plasticity Generates Efficient Network Structures for Robust Inference with Synaptic Weight Plasticity.

Hiratani N, Fukai T

Frontiers in Neural Circuits 10:41 (2016).

doi: 10.3389/fncir.2016.00041 [Open Access]

3. Mixed Signal Learning by Spike Correlation Propagation in Feedback Inhibitory Circuits.

Hiratani N, Fukai T

PLOS Computational Biology 11(4): e1004227 (2015).

doi:10.1371/journal.pcbi.1004227 [Open Access]

2. Interplay between short- and long-term plasticity in cell-assembly formation.

Hiratani N, Fukai T

PLOS ONE 9(7): e101535 (2014).

doi: 10.1371/journal.pone.0101535. [Open Access]

1. Associative memory model with long-tail-distributed Hebbian synaptic connections.

Hiratani N, Teramae J-N, Fukai T 

Frontiers in Computational Neuroscience 6:102 (2013).

doi: 10.3389/fncom.2012.00102 [Open Access]

REFEREED CONFERENCES.

10. Hiratani, N. Hebbian learning of a multi-layered cerebellar network with quadratic capacity. Mar. 9-12 (2023), Computational and Systems Neuroscience (Cosyne) 2023, Montreal, Canada.

9. Mastrogiuseppe, F., Hiratani, N., & Latham, P. E. Evolution of neural activity in circuits bridging sensory and abstract knowledge. Mar. 17 (2022), Computational and Systems Neuroscience (Cosyne) 2022, Lisbon, Portugal.

8. Hiratani N, Latham PE, "Developmental and evolutionary principles of olfactory circuit designs" Feb. 29 (2020), Computational and Systems Neuroscience (Cosyne) 2020, Denver, USA, Feb. 27- Mar. 3 (2020). 

7. Mastrogiuseppe F, Hiratani N, Latham PE, "Neural circuitry supporting approximate inference captures structure of correlations in hippocampus" Feb. 29 (2020), Computational and Systems Neuroscience (Cosyne) 2020, Denver, USA, Feb. 27- Mar. 3 (2020). 

6. Hiratani N, Latham PE, "Developmental and evolutionary principles of olfactory circuit designs" Sep. 18 (2019), Bernstein Conference 2019, Berlin, Germany, Sep. 17- Sep. 21 (2019). [Selected for contributed talks]

5. Hiratani N, Latham PE, "A Bayesian approach to unsupervised learning in the olfactory system" Feb. 28 (2019), Computational and Systems Neuroscience (Cosyne) 2019, Lisbon, Portugal, Feb. 28- Mar. 3 (2019). 

4. Hiratani N, Fukai T, "Detailed dendritic excitatory/inhibitory balance through heterosynaptic STDP" Feb. 24 (2017), Computational and Systems Neuroscience (Cosyne) 2017, Salt Lake City, USA, Feb. 23- Feb. 26 (2017). 

3. Hiratani N, Fukai T, "Optimal learning with redundant synaptic connections" Feb. 27 (2016), Computational and Systems Neuroscience (Cosyne) 2016, Salt Lake City, USA, Feb. 25- Feb. 28 (2016). 

2. Hiratani N, Fukai T, "Structural plasticity generates efficient network structure for synaptic plasticity" Mar. 7 (2015), Computational and Systems Neuroscience (Cosyne) 2015, Salt Lake City, USA, Mar. 5- Mar. 8 (2015)

1. Hiratani N, Fukai T, "Interplay between short- and long-term plasticity in cell-assembly formation" Feb. 27 (2014), Computational and Systems Neuroscience (Cosyne) 2014, Salt Lake City, USA, Feb. 27- Mar. 2 (2014)

BOOK CHAPTERS.

1. Hiratani N, Fukai T, "Selection of Synaptic Connections by Wiring Plasticity for Robust Learning by Synaptic Weight Plasticity" In: The Rewiring Brain: A Computational Approach to Structural Plasticity in the Adult Brain. (Van Ooyen, A. and Butz-Ostendorf, M., eds.), Academic Press, June 2017.

Contact Information

EMAIL.

n.hiratani(at_mark)gmail.com

ADDRESS.

Center for Brain Science

Harvard University

52 Oxford Street, Cambridge, Massachusetts, USA