Statement of Interests

Research Interests in Neuroscience

Neural computation and Information Theory:

Applying information-theoretic measures to experimental data to address neurophysiological questions about neural computation.

I am also interested in using theoretical tools from higher mathematics for this purpose, such as information geometry.

I divide my intended research interests into the following groups:

  1. Information theory in neural systems:

    • i. Ecology of information flow; causality and information-flow dynamics in Bayesian networks,

    • ii. Synergy in correlated neural responses,

    • iii. Entropy estimation and bias-correction methods (non-parametric and parametric),

    • iv. Statistical modelling of neural populations (including maximum entropy, sampling, and information geometry).

  2. Spike-train analysis: information-theoretic measures and the neural encoding of sensory information.

  3. So far, my research has concerned the relation between spiking activity and continuous neural signals (LFP, MEG), in particular phase-of-firing and other neural codes that involve continuous signals and neural oscillations.

  4. Related research interests: studying the temporal “correlation structure” of neural responses and its dynamics, dimension reduction for neuronal responses, and optimal decoding.

  5. How circuits in the limbic system build approximate models of the external world using Markov models, and the role of dopamine, serotonin, acetylcholine, norepinephrine, and neuroactive steroids in these internal models. I am interested in modelling work in this area with implications for models of reward, surprise, uncertainty, prediction error, attention, temporal-difference learning, value-based decision making, etc.

  6. Other interests in neuroscience:

    • Simplified single neuron models,

    • Generalized linear models (GLM),

    • Oscillatory representations and oscillatory neural codes,

    • Realistic and detailed modelling of single neurons,

    • Dopamine receptors,

    • Decision making, including perceptual decision making (especially Shadlen's random-walk model) and value-based decision making,

    • Motor control (especially using the Bayesian framework),

    • Stochastic modelling of neural/brain function (Bayesian/stochastic/sampling):

      • Markov networks and Markov chains for probabilistic modelling,

      • The sampling hypothesis; application of MCMC and various other sampling methods in modelling,

      • Stochastic optimal control,

      • Hamiltonian Monte Carlo,

      • Network coding theory,

    • Experimental work (I have no wet-lab experience, but I am willing to learn, as it will be useful in my career).
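The sampling-hypothesis and MCMC interests above can be illustrated with a minimal random-walk Metropolis sampler. This is only a sketch: the target density (a standard normal), the proposal step size, and the sample count are arbitrary choices for illustration.

```python
import math
import random

def metropolis(logp, x0, n_samples, step=0.5, seed=0):
    """Random-walk Metropolis: draw samples from an unnormalised log-density logp."""
    rng = random.Random(seed)
    x, lp = x0, logp(x0)
    samples = []
    for _ in range(n_samples):
        prop = x + rng.gauss(0.0, step)       # symmetric Gaussian proposal
        lp_prop = logp(prop)
        # Accept with probability min(1, p(prop) / p(x))
        if rng.random() < math.exp(min(0.0, lp_prop - lp)):
            x, lp = prop, lp_prop
        samples.append(x)
    return samples

# Target: standard normal, log-density up to an additive constant.
draws = metropolis(lambda x: -0.5 * x * x, 0.0, 20000)
mean = sum(draws) / len(draws)
```

Because only log-density differences enter the acceptance step, the normalising constant of the target never needs to be known, which is what makes this family of methods attractive for probabilistic models of neural function.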

Other Research Interests:

Machine Learning

  • Application of Statistical Mechanical methods in Machine Learning

  • Deep learning (Boltzmann machines and multi-layer neural networks)

  • Unsupervised learning

  • Dimensionality reduction

  • Spectral methods (spectral graph theory)

  • Information Geometry (see below)

  • Estimation methods for statistical parameters, distributions (e.g. via maximum entropy), and statistical quantities (e.g. entropy)

  • Statistical Models of Natural Stimuli (Vision, Sound, Language)

  • Latent Variables in Machine Learning (inferring latent variables)

  • Variational methods and optimal control
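The entropy-estimation interest above (estimating statistical quantities such as entropy from finite data) can be sketched minimally: the naive plug-in estimator is biased downward for small samples, and the Miller-Madow correction is one classical first-order fix. The toy data below are hypothetical.

```python
import math
from collections import Counter

def plugin_entropy(samples):
    """Naive (plug-in) entropy estimate in bits; biased downward for small n."""
    n = len(samples)
    counts = Counter(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def miller_madow_entropy(samples):
    """Plug-in estimate plus the first-order Miller-Madow bias correction,
    (K - 1) / (2 n ln 2) bits, where K is the number of observed symbols."""
    n = len(samples)
    k = len(set(samples))
    return plugin_entropy(samples) + (k - 1) / (2 * n * math.log(2))

toy = ['a', 'a', 'b', 'b']          # a fair "coin" observed four times
h_plugin = plugin_entropy(toy)      # 1.0 bit
h_mm = miller_madow_entropy(toy)    # slightly above the plug-in value
```

More refined non-parametric and parametric corrections exist; this sketch only shows why bias correction matters at all when sample sizes are small relative to the response alphabet.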

Complex systems:

  • Network Coding theory

  • Ecology of information flow; Causality and information flow dynamics

  • Information geometry: the use of differential geometry (Riemannian manifolds) and statistical mechanics in the statistical modelling of complex systems and in machine learning

  • Information-theory topics: non-Shannon information inequalities, various definitions of entropy (Kolmogorov-Sinai, topological entropy, etc.), and measures of uncertainty.

  • Probabilistic Models of the Brain and Neural networks
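The information-geometry interest above can be made concrete with a small numerical check, here on an assumed Bernoulli family chosen purely for illustration: locally, the KL divergence between nearby distributions behaves like half the Fisher information times the squared parameter step, which is exactly the Riemannian (Fisher) metric on the statistical manifold.

```python
import math

def bernoulli_kl(p, q):
    """KL divergence (in nats) between Bernoulli(p) and Bernoulli(q)."""
    return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

def fisher_info_bernoulli(p):
    """Closed-form Fisher information of the Bernoulli family: 1 / (p (1 - p))."""
    return 1.0 / (p * (1.0 - p))

# Local expansion: KL(p || p + dp) ≈ 0.5 * I(p) * dp^2, so the rescaled
# divergence recovers the Fisher metric as dp -> 0.
p, dp = 0.3, 1e-4
numeric = 2.0 * bernoulli_kl(p, p + dp) / dp**2
```

The same construction generalises to any smooth parametric family, which is the starting point for information-geometric treatments of maximum-entropy models and neural-population statistics.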

Research Interests in Computer Science

  • Reversible computing for low-energy computation (as a programming paradigm, for low-energy hardware, for low-energy cloud platforms):

    • New software and hardware paradigms for reversible computing

  • Distributed and Cloud systems:

    • Scalability and scale-free patterns in distributed systems; programming languages for distributed and cloud systems

    • Memory management for low-latency and cloud computing applications

    • Resource-oriented architecture and resource-oriented computing (ROA/ROC)

    • Cloud performance tuning based on run-time collection of control-flow statistics and automatic selection of optimal trade-offs: memory management (virtual memory; garbage-collection parameters and multiple GC layers driven by real-time and other performance requirements), synchronisation, and automatic, predictive allocation of process containers

    • Tier-less languages (which generate both client and server code from the same source)

    • Software Architecture inspired by brain architecture and its design patterns (especially regarding neural information flow and control)

  • Quantum Computing and Quantum Information theory

  • Bio-inspired computing (especially based on neuroscience and brain science):

    • Bio-inspired distributed systems

    • Bio-inspired computational models

    • Bio-inspired Machine learning (see Machine Learning)

  • New hardware based on new building blocks (e.g. memristors) and related software

  • Neuromorphic hardware and efficient implementation of very large neural networks

  • Programming-language design for the above items

Sohail Siadatnejad, PhD