2018-2019

Sep 25 Tamas Horvath (Oakland University) From Continuous Galerkin to Space-Time Hybridizable Discontinuous Galerkin - Variations on a Theme

The finite element method (FEM) is a family of numerical methods for solving partial differential equations. In this talk, we will present different FEMs, such as continuous Galerkin, discontinuous Galerkin (DG), and hybridizable DG methods for steady-state problems. For unsteady problems, we will introduce space-time DG methods, which have become a popular way to discretize time-dependent problems on deforming domains. These methods use DG for the temporal discretization as well, allowing an arbitrarily high-order approximation in time. We will compare these methods through numerical examples.
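
As a point of reference (a standard formulation, not taken from the talk itself), all of these methods fit the abstract Galerkin framework: given a bilinear form a(·,·) and a load functional f, one seeks u_h in a finite-dimensional space V_h such that

\[ a(u_h, v_h) = f(v_h) \quad \text{for all } v_h \in V_h, \]

with the continuous, discontinuous, and hybridizable variants differing in the choice of V_h and in how continuity between elements is enforced.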

Oct 2 Dylan Rupel (Michigan State University) Cell Decompositions for Rank Two Quiver Grassmannians

A quiver Grassmannian is a variety parametrizing subrepresentations of a given quiver representation. Reineke has shown that every projective variety can be realized as a quiver Grassmannian. In this talk, I will study a class of smooth projective varieties arising as quiver Grassmannians for (truncated) preprojective representations of an n-Kronecker quiver, i.e., a quiver with two vertices and n parallel arrows between them. The main result I will present is a recursive construction of cell decompositions for these quiver Grassmannians. If time permits, I will discuss a combinatorial labeling of the cells from which their dimensions may conjecturally be computed directly. This is a report on joint work with Thorsten Weist.
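
For orientation (the standard definition, not specific to this talk): given a representation M of a quiver and a dimension vector e, the quiver Grassmannian is

\[ \mathrm{Gr}_e(M) = \{\, N \subseteq M : N \text{ is a subrepresentation with } \underline{\dim}\, N = e \,\}, \]

a closed subvariety of a product of ordinary Grassmannians.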

Oct 23 Zili Zhang (University of Michigan) P=W: A Strange Identity for Dynkin Diagrams

Start with a compact Riemann surface X with marked points and a complex reductive group G. According to Hitchin-Simpson nonabelian Hodge theory, the pair (X, G) gives rise to two new complex varieties: the character variety M_B and the Higgs moduli space M_D. I will present some aspects of this story and discuss a new identity P=W, indexed by affine Dynkin diagrams, which occurs in the singular cohomology groups of M_D and M_B, where P and W dwell. I will focus on some concrete examples constructed over elliptic curves and elliptic fibrations. This talk is not aimed at specialists.

Nov 6 Moongyu Park (Oakland University) Fractional Calculus and Application

Fractional calculus has a long history, but it attracted wide attention from researchers only a few decades ago, when it was realized that integer-order differential equations cannot describe certain experimental and field data, such as anomalous diffusion and nonlocal problems. Fractional calculus is now one of the hot topics in mathematics, physics, biology, and engineering. In this colloquium I will talk about mathematical issues, numerical computation, and applications in science and engineering.
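
As a standard example of the objects involved (not drawn from the abstract), the Riemann-Liouville fractional integral of order \alpha > 0 is

\[ (I^{\alpha} f)(t) = \frac{1}{\Gamma(\alpha)} \int_0^t (t-s)^{\alpha-1} f(s)\, ds, \]

and the Caputo fractional derivative of order 0 < \alpha < 1 is \( {}^{C}\!D^{\alpha} f(t) = \frac{1}{\Gamma(1-\alpha)} \int_0^t (t-s)^{-\alpha} f'(s)\, ds \); models of anomalous diffusion typically replace integer-order time or space derivatives with such operators.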

Nov 28 James Albert (Bowling Green State University) Catcher Framing: A New Measure of Performance

In baseball it is relatively easy to measure the performance of hitters and pitchers, but other aspects of performance, such as fielding and speed, are more difficult to measure. Thanks to advances in technology such as PITCHf/x and Statcast, new types of data are being collected, and these data provide opportunities to capture subtle aspects of performance. We focus on called pitches and explore the catcher's ability to affect the chance of a called strike. Good framing catchers can be valued in terms of the number of runs they contribute to their teams. We describe a Bayesian model for estimating catcher framing effects.
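
A minimal sketch of the kind of model involved (an illustration under standard assumptions, not necessarily the speaker's exact specification): each called pitch i is a Bernoulli trial whose called-strike probability depends on a smooth function of pitch location and a catcher-specific effect,

\[ \operatorname{logit} P(\text{strike}_i) = s(x_i, z_i) + \gamma_{c(i)}, \]

where (x_i, z_i) is the pitch location, c(i) indexes the catcher, and the \gamma's share a common prior; a catcher's framing value in runs then comes from multiplying the extra called strikes attributable to \gamma by the run value of turning a ball into a strike.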

Jan 15 Peter Gerdes (Oakland University) Computability Theory: Introduction To Current Results

In this talk I will give a short introduction to computability theory for mathematicians who aren't familiar with the subject. I'll start with a brief overview of the motivating questions and the concepts the field uses to answer them, such as Turing reductions, computable and computably enumerable sets, and the jump operator. I'll then introduce the central tool of computability theory, the priority argument, by presenting the classical Friedberg-Muchnik theorem establishing the existence of computably enumerable sets of intermediate degree. Time permitting, I'll try to give an example of approachable current research by presenting a cute recent result of mine (in collaboration with Steffen Lempp, Uri Andrews, Joseph Miller, and Noah Schweber) analyzing the degree-theoretic properties of the symmetric difference operator in the computably enumerable degrees. No prior understanding of the notions mentioned here will be necessary to follow the talk.
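
For reference, the Friedberg-Muchnik theorem asserts that there exist computably enumerable sets A and B with

\[ A \not\leq_T B \quad \text{and} \quad B \not\leq_T A, \]

so neither set is computable and neither computes the halting problem; this resolved Post's problem by exhibiting c.e. degrees strictly between 0 and 0'.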

Feb 12 Sharon Berry (Oakland University) Mathematical Knowledge and the Access Problem

Human beings seem to know many things about abstract mathematical objects like numbers and sets. But it can seem mysterious what could explain the match between human belief and objective mathematical facts, given that (for example) we cannot see, hear, touch, or otherwise causally interact with mathematical objects.

This puzzle is called “the Access Problem,” and it has driven much work in philosophy from Plato’s era to the present day. In this talk I’ll try to get listeners worried about the Access Problem, then contrast several approaches to solving it and present my preferred approach.

Feb 26 Ethan Kubatko (The Ohio State University) Discontinuous Galerkin Methods and Supporting Computational Tools for Environmental Fluid Dynamics

Discontinuous Galerkin (DG) methods are a family of finite element methods that exhibit a number of favorable properties for modeling environmental fluid dynamics problems, including their ability to handle advection-dominated flow scenarios, their local conservation properties and the relative ease with which both h (mesh) and p (polynomial) refinement can be implemented. This talk will highlight the development and application of a suite of DG models for one-, two- and three-dimensional shallow water flow, overland flow due to rainfall and spectral wave modeling. Supporting computational tools that are used within the context of these models include an advanced unstructured mesh generator that we have developed called ADMESH+ and new (in many cases optimal) sets of numerical integration rules and time stepping methods that have been specifically designed for efficient calculation when used with high-order DG spatial discretizations. A number of applications that demonstrate the accuracy, efficiency and robustness of the developed modeling framework will be highlighted.

Mar 15 Gengxin Li (Wright State University) Empirical Bayes Risk Prediction Models for Sequence Data

The rapidly developing sequencing technologies have led to improved disease risk prediction through the identification of many novel genes. Many prediction methods have been proposed to use rich genomic information to predict binary disease outcomes. It is intuitive that these methods can be further improved by making efficient use of the rich information in measured quantitative traits that are correlated with binary outcomes. In this study, we generalize Efron's method to allow for some of the peculiarities of sequencing data. In particular, we introduce two ways of extending Efron's model and propose a novel Empirical Bayes prediction model that uses information from both quantitative traits and binary disease status to improve risk prediction. Our method is built on a new statistic that better infers the gene effect on multiple traits, and it enjoys good theoretical properties. We then consider using sequencing data by combining information from multiple rare variants in individual genes to strengthen the signals of causal genetic effects. In simulation studies, we find that our proposed Empirical Bayes approach is superior to other existing methods in terms of feature selection and risk prediction. We further evaluate the effectiveness of the proposed method through its application to real whole-genome sequencing data.

Mar 18 Jun Hu (University of Vermont) A Class of Purely Sequential Minimum Risk Point Estimation (MRPE) Methodologies


Mar 20 Wenjie Wang (University of Connecticut) Integrative Survival Analysis with Uncertain Event Times in Application to a Suicide Risk Study

The concept of integrating data from disparate sources to accelerate scientific discovery has generated tremendous excitement in many fields. The potential benefits from data integration, however, may be compromised by the uncertainty due to incomplete or imperfect record linkage. Motivated by a suicide risk study, we propose an approach for analyzing survival data with uncertain event times arising from data integration. Specifically, in our problem the deaths identified from hospital discharge records, together with reported suicidal deaths determined by the Office of the Medical Examiner, may still not include all the death events of patients, and the missing deaths can be recovered from a complete database of death records. Since the hospital discharge data can only be linked to the death record data by matching basic patient characteristics, a patient with a censored death time from the first dataset could be linked to multiple potential event records in the second dataset. We develop an integrative Cox proportional hazards regression in which the uncertainty in the matched event times is modeled probabilistically. The estimation procedure combines the ideas of profile likelihood and the expectation conditional maximization (ECM) algorithm. Simulation studies demonstrate that under realistic settings of imperfect data linkage, the proposed method outperforms several competing approaches, including multiple imputation. A marginal screening analysis using the proposed integrative Cox model is performed to identify risk factors associated with death following suicide-related hospitalization in Connecticut. The identified diagnostic codes are consistent with the existing literature and provide several new insights into suicide risk prediction and prevention.
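
For context (the textbook model, not the authors' extension), the Cox proportional hazards model specifies the hazard at time t for a subject with covariates Z as

\[ \lambda(t \mid Z) = \lambda_0(t) \exp(\beta^{\top} Z), \]

with the baseline hazard \lambda_0 left unspecified; the integrative version described above additionally treats the identity of the matched death record, and hence the event time, as uncertain and handles that uncertainty within the ECM iterations.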

Mar 22 Tamas Horvath (Oakland University) Time-Dependent Problems on Evolving Domains

In this talk, we will discuss numerical methods for solving unsteady partial differential equations on deforming domains. Deforming domains show up in many real-life scenarios, such as wind turbines, helicopter rotors, car wheels, and free-surface flows. I will focus on the space-time finite element method, which is an excellent approach for discretizing problems on evolving domains. This method uses discontinuous Galerkin to discretize in both the spatial and temporal directions, allowing for arbitrarily high-order approximation in space and time. Furthermore, the method automatically satisfies the geometric conservation law, which is essential for accurate solutions on time-dependent domains. Its biggest drawback is that the space-time discretization increases the computational complexity significantly. To overcome this, I will present high-order accurate Hybridizable and Embedded Discontinuous Galerkin methods. Numerical results will be presented to illustrate the methods.

Mar 25 Shixu Meng (University of Michigan) Qualitative Imaging Methods and Wave Motion in Complex Media

The mathematical theory of wave scattering describes the interaction of waves with natural or manufactured perturbations of the medium through which they propagate. The goal of inverse scattering, or imaging, is to estimate the medium from observations of the wave field. It has applications in a broad spectrum of scientific and engineering disciplines, including seismic imaging, radar, astronomy, medical imaging, and non-destructive material testing. Qualitative imaging methods have been the focus of much activity in the mathematics community; examples include the linear sampling method, the factorization method, and the use of transmission eigenvalues and Stekloff eigenvalues. Reverse time migration methods and the closely related matched field or matched filtering array data processing techniques are related to such qualitative approaches.

In this talk I shall first discuss qualitative imaging methods in an acoustic waveguide with sound-hard walls. The waveguide terminates at one end and contains an unknown obstacle of compact support, or has deformed walls, to be determined from data gathered by an array of sensors that probe the obstacle with waves and measure the scattered field. To further investigate qualitative imaging methods in complex media, I shall discuss the computation of the Bloch variety and higher-order wave homogenization in periodic media; such media have been used with success to manipulate waves toward achieving super-focusing, sub-wavelength imaging, cloaking, and topological insulation.

Mar 26 Elise Brown (Oakland University) Exercise and Prevention of Cardiometabolic Disease

Exercise is one of the most effective ways to prevent cardiometabolic diseases such as type 2 diabetes and heart disease. While most research to date has focused on the effects of aerobic exercise, growing evidence supports the utility of resistance training exercise for improving risk factors. Data collected to examine these exercise intervention effects across time and conditions include biomarkers (blood pressure, blood sugar, and blood lipids), body composition (fat mass, lean mass, bone mineral content, waist circumference), physical performance (aerobic endurance, muscular strength), and psychosocial variables (exercise self-efficacy). General linear models with repeated measures and analysis of covariance are used to examine these intervention effects. Various preventative efforts will be discussed across different populations and settings.

Mar 28 Laetitia Paoli (University of Saint-Etienne) Discrete dynamical systems with frictionless unilateral constraints

This talk gives an overview of mathematical issues for discrete dynamical systems with frictionless unilateral constraints. Starting from the basic description of the dynamics, a mathematical formulation of the problem as a measure differential inclusion will be derived, and existence results based on the convergence of a time-stepping approximation will be presented. An example of an implementation will conclude the talk.

Mar 29 Yongjin Lu (Virginia State University) On the Pullback Dynamics of 3D Navier-Stokes Equations with Nonlinear Viscosity

We study the pullback dynamics of the 3D Navier-Stokes equations with nonlinear viscosity subject to a time-dependent external force. Using a decomposition method, we establish the existence of a finite-dimensional pullback attractor in a general setting involving a tempered universe. We give estimates on the upper bound of the finite fractal dimension of the pullback attractor. We also investigate the upper semi-continuity of the pullback attractor as the non-autonomous perturbation vanishes.

Apr 2 David Banks (Duke University) Adversarial Risk Analysis

Adversarial Risk Analysis (ARA) is a Bayesian alternative to classical game theory. Rooted in decision theory, ARA builds a model for the decision-making of one's opponent, placing subjective distributions over all unknown quantities; one then chooses the action that maximizes expected utility. This approach aligns with some perspectives in modern behavioral economics, and it enables principled analysis of novel problems, such as a multiparty auction in which there is no common knowledge and different bidders have different opinions about each other.
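
In symbols (a schematic of the ARA prescription, not taken from the talk): if u(a, \theta) is the decision maker's utility for action a when the opponent acts (or the unknowns resolve) as \theta, and p(\theta) is the subjective predictive distribution obtained from the model of the opponent, then one chooses

\[ a^{*} = \arg\max_{a} \; \mathbb{E}_{\theta \sim p}\big[ u(a, \theta) \big] = \arg\max_{a} \int u(a, \theta)\, p(\theta)\, d\theta, \]

where p(\theta) is typically built by mirroring: modeling the opponent as an expected-utility maximizer whose own utilities and beliefs are uncertain to the analyst.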

Apr 9 Tapabrata Maiti (Michigan State University) High Dimensional Discriminant Analysis for Structurally Dependent Data

Linear discriminant analysis (LDA) is one of the most classical and popular classification techniques. However, it performs poorly in high-dimensional classification. Many sparse discriminant methods have been proposed to make LDA applicable in the high-dimensional case. One issue with those methods is that the covariance structure among the features is ignored. We propose a new procedure for high-dimensional discriminant analysis for structurally correlated data; specifically, we will discuss spatially structured data. Penalized maximum likelihood estimation (PMLE) is developed for feature selection and parameter estimation, and a tapering technique is applied to reduce the computational load. The theory shows that the proposed method achieves consistent parameter estimation, consistent feature selection, and an asymptotically optimal misclassification rate. An extensive simulation study shows a significant improvement in classification performance under spatial dependence.
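
For reference (textbook LDA, not the proposed procedure), with class means \mu_k, common covariance \Sigma, and class priors \pi_k, LDA assigns an observation x to the class maximizing the discriminant score

\[ \delta_k(x) = x^{\top} \Sigma^{-1} \mu_k - \tfrac{1}{2}\, \mu_k^{\top} \Sigma^{-1} \mu_k + \log \pi_k; \]

in high dimensions \Sigma^{-1} cannot be estimated reliably without additional structure, which is where the penalized likelihood and the tapering of the spatial covariance enter.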