Spring 2019

Quick info:

  • To sign up for the mailing list (you do not need to be enrolled in the seminar), visit our Google Groups website.
  • The seminar meets Tuesdays, 3:30 to 4:30 PM in the Newton Lab (ECCR 257) in the Classroom Wing of the Engineering Center on the CU Boulder main campus.


Announcements:


List of Talks

  • Jan 15, Lior Horesh (IBM Research), "Don't go with the flow – A new tensor algebra for Neural Networks"
  • Feb 5
    • Ann-Casey Hughes presents "Learning the Grammar of Dance" (ICML 1998, by Joshua Stuart and Elizabeth Bradley)
    • Osman Malik, "Fast Randomized Matrix and Tensor Interpolative Decomposition Using CountSketch"
  • Feb 12, Svenja Knappe (CU Assoc. Research Prof. in Mech. Eng.), "Magnetic field imaging with optically-pumped magnetometers"
  • Feb 19, Sam Paskewitz presents "Bayesian Online Changepoint Detection" (Adams and MacKay, 2007)
  • Feb 26, Phil Kragel (CU ICS)
  • Mar 5, Shuang Li
  • Mar 12, Sidney D'Mello (CU Assoc. Prof. in ICS)
  • Mar 19, Sophie Giffard-Roisin
  • Mar 26, No talk (spring break)


Abstracts

Jan 15

Title: Don't go with the flow – A new tensor algebra for Neural Networks

Speaker: Lior Horesh (IBM Research)

Abstract: Multi-dimensional information often involves multi-dimensional correlations that may remain latent under traditional matrix-based learning algorithms. In this study, we propose a tensor neural network framework that offers an exciting new paradigm for supervised machine learning. The tensor neural network structure is based upon the t-product (Kilmer and Martin, 2011), an algebraic formulation for multiplying tensors via circulant convolution that inherits mimetic matrix properties. We demonstrate that our tensor neural network architecture is a natural high-dimensional extension of conventional neural networks. Then, we expand upon Haber and Ruthotto's (2017) interpretation of deep neural networks as discretizations of nonlinear differential equations to construct intrinsically stable tensor neural network architectures. We illustrate the advantages of stability and demonstrate the potential of tensor neural networks with numerical experiments on the MNIST dataset.
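As a rough illustration (not the speaker's code), the t-product of Kilmer and Martin multiplies two third-order tensors by taking an FFT along the third mode, forming ordinary matrix products of the frontal slices, and inverting the FFT. A minimal NumPy sketch:

```python
import numpy as np

def t_product(A, B):
    """t-product of A (l x p x n) and B (p x m x n), per Kilmer & Martin (2011):
    FFT along the third mode, slice-wise matrix products, then inverse FFT."""
    n = A.shape[2]
    Ah = np.fft.fft(A, axis=2)
    Bh = np.fft.fft(B, axis=2)
    Ch = np.stack([Ah[:, :, k] @ Bh[:, :, k] for k in range(n)], axis=2)
    return np.fft.ifft(Ch, axis=2).real  # inputs are real, so the result is too

A = np.random.rand(4, 3, 5)
B = np.random.rand(3, 2, 5)
C = t_product(A, B)
print(C.shape)  # (4, 2, 5)
```

The FFT route is equivalent to circulant convolution along the third mode, which is why the t-product inherits the "mimetic" matrix properties the abstract mentions (e.g., an identity tensor whose first frontal slice is the identity matrix).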

Speaker Bio: Lior Horesh is the Manager of the Mathematics of AI group at the IBM TJ Watson Research Center as well as an IBM Master Inventor. Dr. Horesh also holds an Adjunct Associate Professor position in the Computer Science Department of Columbia University, teaching graduate-level courses on Advanced Machine Learning and on Quantum Computing Theory and Practice. His expertise lies in large-scale modeling, inverse problems, tensor algebra, experimental design, and quantum computing. His recent research focuses on the interplay between first-principles and data-driven methods.

Feb 5

Title: Discussion of the paper “Learning the Grammar of Dance” by Joshua M. Stuart and Elizabeth Bradley, Dept. of Computer Science at CU

Speaker: Ann-Casey Hughes

Paper Abstract: A common task required of a dancer or athlete is to move from one prescribed body posture to another in a manner that is consistent with a specific style. One can automate this task, for the purpose of computer animations, using simple machine-learning and search techniques. In particular, we find kinesiologically and stylistically consistent interpolation sequences between pairs of body postures using graph-theoretic methods to learn the “grammar” of joint movement in a given corpus and then applying memory-bounded A* search to the resulting transition graphs—using an influence diagram that captures the topology of the human body in order to reduce the search space. Paper link: https://www.cs.colorado.edu/~lizb/papers/icml98.pdf
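To make the graph-theoretic idea concrete: from a corpus of movement sequences one can record which postures have been observed to follow which, then search that transition graph for a path between two prescribed postures. A minimal sketch with hypothetical posture labels (the paper uses full joint-angle postures, a body-topology influence diagram, and memory-bounded A* rather than the uniform-cost search shown here):

```python
from collections import defaultdict
from heapq import heappush, heappop

def learn_transitions(corpus):
    """Build the 'grammar': map each posture to the set of postures
    observed to follow it anywhere in the corpus."""
    graph = defaultdict(set)
    for sequence in corpus:
        for a, b in zip(sequence, sequence[1:]):
            graph[a].add(b)
    return graph

def interpolate(graph, start, goal):
    """Shortest corpus-consistent path from start to goal
    (uniform-cost search standing in for memory-bounded A*)."""
    frontier = [(0, start, [start])]
    seen = {start}
    while frontier:
        cost, node, path = heappop(frontier)
        if node == goal:
            return path
        for nxt in graph[node]:
            if nxt not in seen:
                seen.add(nxt)
                heappush(frontier, (cost + 1, nxt, path + [nxt]))
    return None  # no corpus-consistent interpolation exists

# Hypothetical toy corpus of posture sequences
corpus = [["plie", "releve", "saute", "land"], ["releve", "pirouette", "land"]]
print(interpolate(learn_transitions(corpus), "plie", "land"))
```

Every step of the returned path is a transition seen in the corpus, which is what makes the interpolation stylistically consistent with the training data.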

Title: Fast Randomized Matrix and Tensor Interpolative Decomposition Using CountSketch

Speaker: Osman Malik

Abstract: In this talk I will present our recently developed fast randomized algorithm for matrix interpolative decomposition. If time permits, I will also say a few words about how our method can be applied to the tensor interpolative decomposition problem. Our preprint is available at https://arxiv.org/abs/1901.10559

Feb 12

Title: Magnetic field imaging with optically-pumped magnetometers

Speaker: Svenja Knappe

Abstract: I present our ongoing effort in developing imaging systems with microfabricated optically-pumped magnetometers (mOPMs). By using microfabrication technologies and simplifying the optical setups, we aim to develop manufacturable sensors of small size and low power. Our zero-field mOPMs require a shielded environment but reach high sensitivities of less than 10 fT/√Hz. One target application lies in the field of biomagnetic brain imaging, specifically magnetoencephalography (MEG). The attraction of using these sensors for non-invasive brain imaging comes from the possibility of placing them directly on the scalp of the patient, very close to the brain sources. We have built several multi-channel test systems to validate the prediction of very high signal-to-noise ratios in standard MEG paradigms.

Bio: Svenja Knappe received her Ph.D. in physics from the University of Bonn, Germany, in 2001 with a thesis on miniature atomic magnetometers and atomic clocks based on coherent population trapping. For 16 years, she worked at the National Institute of Standards and Technology (NIST) in Boulder, CO, developing chip-scale atomic sensors. She is now an Associate Research Professor at the University of Colorado, and her research interests include microfabricated atomic sensors. She is also a co-founder of FieldLine Inc.