Contributed Talks Session 1


Organizers:

Paul Cazeaux, University of Kansas

Agnieszka Miedlar, University of Kansas

 




Sunday, October 3, 2021 at 11:00 am – 12:20 pm (CST)

 

Ke Chen

University of Texas at Austin

11:00 am

Structured Sketching for Constrained Least Squares Problems



Constrained least squares problems arise in many optimization and data science applications, including linearized inverse problems, tensor decomposition, and sparse recovery. The memory and computation costs are usually intractable due to the large scale and high dimensionality of the data. We use the so-called “sketching” strategy to project a constrained least squares problem onto a much lower-dimensional counterpart via random sketching matrices. The key idea of this strategy is to reduce the dimension of the problem as much as possible while maintaining accuracy.

In this work, we show that a general class of tensorized sub-Gaussian matrices can serve as sketching matrices, and we obtain high-probability theoretical guarantees on the sketching dimension. A state-of-the-art estimate is obtained when the constraint set is Euclidean space; for general convex constraint sets, we show that the sketching dimension depends on a statistical dimension that characterizes the geometry of the underlying problem. The accuracy and efficiency of our method are demonstrated in several concrete examples, including unconstrained linear regression and sparse recovery problems.
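The basic sketch-and-solve idea can be illustrated in a minimal unconstrained setting. The sketch below uses a plain Gaussian matrix in place of the talk's tensorized sub-Gaussian constructions, and all problem sizes are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: m >> d, so sketching pays off.
m, d, s = 5000, 20, 200  # rows, columns, sketch dimension

A = rng.standard_normal((m, d))
b = A @ rng.standard_normal(d) + 0.01 * rng.standard_normal(m)

# Sub-Gaussian (here: Gaussian) sketch S of size s x m, scaled so that
# E[S^T S] = I; tensorized sub-Gaussian sketches play the same role.
S = rng.standard_normal((s, m)) / np.sqrt(s)

x_full, *_ = np.linalg.lstsq(A, b, rcond=None)            # min ||Ax - b||
x_sketch, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)  # sketched problem

print(np.linalg.norm(x_full - x_sketch))  # small when s is large enough
```

The sketched problem has only `s` rows instead of `m`, yet its solution is close to the full least squares solution with high probability; the talk's results quantify how small `s` can be taken for a given constraint set.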

Jun Liu

Southern Illinois University-Edwardsville

11:20 am

A Well-Conditioned Direct Parallel-in-Time (PinT) Algorithm for Evolutionary Differential Equations



In this talk, we introduce a direct parallel-in-time (PinT) algorithm for evolutionary differential equations with first- or second-order derivatives in time. We use a second-order boundary value method as the time integrator and explicitly diagonalize the time discretization matrix, which yields a direct parallel implementation across all $n$ time levels. A crucial issue for this methodology is how the condition number of the eigenvector matrix $V$ of the time discretization matrix behaves as $n$ grows. Based on a novel connection between the characteristic equation and the Chebyshev polynomials, we present explicit formulas for $V$ and $V^{-1}$, by which we prove that $\mathrm{cond}_2(V) = O(n^2)$. This implies that the diagonalization process is well-conditioned and that the round-off error only increases moderately as $n$ grows. Numerical results on a parallel machine are given to support our findings, where a speedup of over 60 times is achieved with 256 cores.
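The algebra behind diagonalization-based PinT can be sketched in a toy setting. The matrix `B` below is a generic diagonalizable stand-in, not the boundary value method matrix from the talk, and `K` is a hypothetical spatial operator; the point is only how diagonalizing the time matrix decouples the all-at-once system into independent solves, one per time eigenvalue:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 16, 8  # time levels, spatial unknowns (toy sizes)

# Stand-in time-discretization matrix B (shifted to keep the decoupled
# systems well away from singular) and a toy diagonal spatial operator K.
B = rng.standard_normal((n, n)) + 5 * np.eye(n)
K = np.diag(np.arange(1.0, m + 1))
F = rng.standard_normal((n, m))  # all-at-once right-hand side

# Direct solve of the Kronecker system (B (x) I + I (x) K) u = f,
# i.e. the matrix equation B U + U K = F.
A = np.kron(B, np.eye(m)) + np.kron(np.eye(n), K)
u_direct = np.linalg.solve(A, F.ravel()).reshape(n, m)

# Diagonalization-based solve: B = V D V^{-1} decouples the system into
# n independent spatial solves, one per eigenvalue (parallel in time).
D, V = np.linalg.eig(B)
G = np.linalg.solve(V, F.astype(complex))   # step 1: transform the rhs
U = np.empty_like(G)
for j in range(n):                          # step 2: n independent solves
    U[j] = np.linalg.solve(D[j] * np.eye(m) + K, G[j])
u_pint = (V @ U).real                       # step 3: transform back

print(np.max(np.abs(u_direct - u_pint)))   # agreement up to round-off
```

The loop over `j` is embarrassingly parallel, which is the source of the speedup; the talk's contribution is showing that for the boundary value method matrix this decomposition stays well-conditioned, with $\mathrm{cond}_2(V) = O(n^2)$, so round-off in steps 1 and 3 stays under control.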

Jay Mayfield

Iowa State University

11:40 am

An Asymptotic Green's Function Method for the Wave Equation



We present an effective method, namely the asymptotic Green's function method, for propagating waves governed by the linear scalar wave equation. The wave is first split into its forward-propagating and backward-propagating parts. The method, which combines Huygens' principle with geometrical optics approximations, then propagates each part: an integral with Green's functions, based on Huygens' principle, propagates the waves, and the Green's functions are approximated by geometrical optics. Upon obtaining analytic approximations for the phase and amplitude in the geometrical optics approximations through short-time Taylor series expansions, a short-time propagator for the waves is derived, and the resulting integral can be evaluated efficiently by the fast Fourier transform after appropriate low-rank approximations. The short-time propagator can be applied repeatedly to propagate the waves over long times. To restrict the computation to a bounded domain of interest, the perfectly matched layer technique with complex coordinate stretching is incorporated. The method is first-order accurate and has complexity $O(t_\epsilon N\log N)$ per time step, where $N$ is the number of spatial points and $t_\epsilon$ is the low rank for a prescribed accuracy requirement $\epsilon>0$. Numerical experiments are presented to demonstrate the proposed method.
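The "apply a short-time propagator repeatedly, evaluated by FFT" structure can be illustrated in its simplest degenerate case. For constant wave speed the forward-propagating part satisfies the one-way equation $u_t + c\,u_x = 0$, whose propagator is an exact Fourier multiplier; the sketch below shows only this skeleton and none of the talk's geometric optics machinery for variable speed, PML, or low-rank approximation:

```python
import numpy as np

# Constant-speed 1D sketch: the forward-going wave obeys u_t + c u_x = 0,
# so the short-time propagator in Fourier space is exp(-i c k dt).
c, L, N, dt, steps = 1.0, 2 * np.pi, 256, 0.01, 100
x = np.linspace(0, L, N, endpoint=False)
k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)

u = np.exp(-20 * (x - np.pi) ** 2)   # initial forward-going pulse
prop = np.exp(-1j * c * k * dt)      # short-time propagator (one step)
for _ in range(steps):               # apply it repeatedly for long times
    u = np.real(np.fft.ifft(prop * np.fft.fft(u)))

# After time t = steps * dt the pulse has translated by c * t.
exact = np.exp(-20 * (x - np.pi - c * dt * steps) ** 2)
print(np.max(np.abs(u - exact)))     # essentially machine precision here
```

In the variable-speed setting the multiplier is replaced by the oscillatory Green's function integral of the talk, and the low-rank approximation is what keeps each step at $O(t_\epsilon N \log N)$ cost.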

Molena Nguyen

North Carolina State University

12:00 pm

Take-away Impartial Combinatorial Game on Different Geometric and Discrete Structures



Building on the winning strategy for a Take-Away impartial combinatorial game on only oddly uniform or only evenly uniform hypergraphs in the Ph.D. dissertation of Dr. Kristen Barnard (currently an Assistant Professor of Mathematics at Berea College), Molena Nguyen found, during her undergraduate independent research, a new winning strategy for a Take-Away game on hypergraphs that are neither oddly nor evenly uniform, provided they satisfy certain special requirements. In a Take-Away game on hypergraphs, two players take turns removing the vertices and hyperedges of a hypergraph. On each turn, a player removes either a single vertex or a single hyperedge. When a vertex is removed, all hyperedges containing it are also removed; when a hyperedge is removed, only that hyperedge is removed. Whoever removes the last vertex wins the game. All of the new theorems in this research are in agreement with the previous theorems in Dr. Barnard's dissertation.

KEYWORDS: Take-Away, combinatorial game, hypergraph, geometric structures, discrete structures
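The game rules above are simple enough to explore by brute force. The sketch below decides who wins a given position by exhaustive search with memoization; this is exploratory code for small hypergraphs, not the closed-form winning strategy of the talk:

```python
from functools import lru_cache

def winning(vertices, hyperedges):
    """True if the player to move wins the Take-Away game on this
    hypergraph: each turn removes either a single vertex (together with
    all hyperedges containing it) or a single hyperedge; whoever
    removes the last vertex wins."""
    vertices = frozenset(vertices)
    hyperedges = frozenset(frozenset(e) for e in hyperedges)

    @lru_cache(maxsize=None)
    def win(vs, es):
        if not vs:  # no vertices left: the previous player just won
            return False
        moves = []
        for v in vs:  # remove a vertex and its incident hyperedges
            moves.append((vs - {v}, frozenset(e for e in es if v not in e)))
        for e in es:  # remove a single hyperedge only
            moves.append((vs, es - {e}))
        # A position is winning iff some move reaches a losing position.
        return any(not win(nvs, nes) for nvs, nes in moves)

    return win(vertices, hyperedges)

# Explore a small example: a triangle as a 2-uniform hypergraph.
print(winning({1, 2, 3}, [{1, 2}, {2, 3}, {1, 3}]))
```

For instance, a single isolated vertex is a first-player win (take it), while two isolated vertices form a first-player loss, matching the usual normal-play intuition; such searches are useful for checking conjectured strategies on small cases.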