The ACM seminar provides a regular venue for presenting and discussing topics such as numerical analysis, scientific computing, modeling, simulation and visualization. The seminar also hosts regular group meetings and special presentations associated with the NSF-funded RTG program Computation- and Data-Enabled Science (CADES).
Faculty, students and researchers are welcome to attend and present.
Meetings will be held in FMH 462, the large conference room in our department, unless otherwise indicated. Starting in Fall 2025, the seminar meets on Fridays at 10:30am.
Notification of upcoming presentations, including Zoom links when relevant, will be sent as calendar invitations to those on our mailing list. To subscribe, click here.
Organizer: Prof. Jeffrey Ovall (jovall@pdx.edu)
Current Seminar Schedule: Winter 2026
January 9 (11am--note time change): Paige Kinsley, Argonne Leadership Computing Facility (ALCF)
Title: Opportunities in Computing at Argonne National Laboratory and the Department of Energy
Abstract: Do you know what high-performance computing is? What could it do for your interests and research as a student? Come and find out! (No experience necessary!)
Want to learn more about how some of the most powerful supercomputers in the world are enabling and accelerating science? Interested in learning about internships to use these machines at a Department of Energy National Lab? Paige Kinsley, Education Outreach Lead at Argonne National Laboratory, will share information on the work and science being done in high-performance computing (HPC) at DOE national laboratories, along with student internship opportunities at Argonne and other DOE labs.
Students can join us in-person or via Zoom: https://pdx.zoom.us/j/83630941484
Paige Kinsley is the Education Outreach Lead for the Argonne Leadership Computing Facility (ALCF) at the Department of Energy's Argonne National Laboratory. She assists ALCF staff in developing and facilitating training and outreach activities across the facility, and serves as the central point of contact for all education, outreach, and training activities at the ALCF. Paige received her Ph.D. in Chemistry from the University of Wisconsin-Madison and holds a B.S. in Chemistry and a B.A. in Chinese Language from the University of North Carolina at Chapel Hill.
January 16 (10:30am): RTG Group Meeting
Research updates:
January 23 (10:30am): Nick Fisher, Portland State University
Title: Novel Applications of Kernel Methods in Data Science and Approximation Theory
Abstract: Kernel methods have seen use in a wide range of academic disciplines such as approximation theory, numerical analysis, spatial statistics, statistical learning, and beyond. In this talk, we highlight some recent research on the applications of kernel methods across a diverse set of scientific domains, including: i) the recovery of vector fields from trajectory data, ii) image classification, and iii) fractional order spline interpolation. Throughout the presentation we will demonstrate how kernel-based techniques can be exploited in a variety of ways. For instance, by carefully selecting our kernel we can guarantee that any learned vector field analytically satisfies a divergence-free constraint, and by making use of statistical techniques such as the maximum mean discrepancy we can compare the distributions of learned features in an image classification problem. Additionally, we will move beyond the familiar world of symmetric kernels to establish novel fractional order spline approximations in reproducing kernel Banach spaces. At the end of the talk, we will take a moment to explore various avenues through which students and researchers can get involved in these new and exciting topics.
Bio: Nick Fisher is a Postdoctoral Scholar at Portland State University. After receiving his PhD in Computational and Applied Mathematics from the Colorado School of Mines in 2019, he held a postdoctoral position (also at Mines) and was an Instructor at Minnesota State University, Mankato, before arriving at Portland State. In recent years, his primary research interests have centered on the application of kernel-based methods to data science and approximation theory, with particular focus on the recovery of vector fields from data and the numerical solution of fractional differential equations. In his spare time, Nick enjoys taking his dog on hikes through the forest and watching cooking shows with his spouse.
January 30 (10:30am): No meeting
February 6 (10:30am): Jay Gopalakrishnan, Portland State University
Title: Computing Curvature
Abstract: I will describe recent joint work on computing curvature of manifolds and how our initial utilitarian considerations of numerics led us to grapple with foundational questions on generalization of the concept of curvature itself. I will start from simple and intuitive motivations of extrinsic curvature of embedded manifolds and eventually build up to a generalization of the intrinsic Riemann curvature for isometrically glued nonsmooth manifolds.
February 13 (10:30am): RTG Group Meeting (Cancelled)
February 20 (10:30am): TBA (Cancelled)
February 27 (11:00am, FMH 417--note time change and room change): Ashesh Chattopadhyay, University of California, Santa Cruz
Title: A theoretical eigenanalysis framework for neural autoregressive models of multi-scale chaotic dynamics
Abstract: Recent years have seen a growing popularity of neural autoregressive models in science and engineering, especially for multi-scale chaotic dynamical systems such as turbulent flows, weather and climate modeling, and ocean modeling. These autoregressive models are typically implemented as deep neural networks trained to predict the state of a system at the next time step given its current state. Although they are often very successful at short-term prediction, they frequently become unstable over longer time scales. Depending on the architecture, the choice of loss functions, and other design decisions, a model may exhibit a longer stability horizon, a shorter one, or even become unstable relatively quickly. While recent research has explored ideas inspired by numerical methods, such as hard-constraining architectures with higher-order integrators or incorporating physics-inspired loss functions to improve long-term stability, there is still no clear a priori diagnostic to evaluate a model’s quality in terms of both performance and stability, and more broadly, a rigorous or semi-empirical theory describing inference-time stability in these models remains lacking. In this work, we draw on linear stability analysis from classical numerical methods to develop and demonstrate a semi-empirical theory of stability for neural autoregressive models that is agnostic to architecture, integration schemes used to constrain the model, and loss functions. Building on this theory, we introduce a novel stability-promoting loss function that improves both predictive performance and long-term stability in neural autoregressive models of dynamical systems.
Speaker bio: Ashesh is an Alfred P. Sloan Fellow and assistant professor in the Department of Applied Mathematics at the University of California, Santa Cruz. His interests lie at the intersection of theoretical deep learning, dynamical systems, and computational physics. Ashesh received his PhD from Rice University in Houston and spent a year at Xerox PARC, and then at SRI as a staff research scientist, before moving to UCSC.
March 6 (10:30am): Agnieszka Truszkowska, University of Alabama in Huntsville
Title:
Abstract:
March 13 (10:30am): RTG Group Meeting
Research updates:
Dates for Spring 2026
April 3:
April 10: Sergei Pilyugin, University of Florida
April 17:
April 24:
May 1:
May 8:
May 15:
May 22:
May 29:
June 5: