Session O-3I
Exotic Data Sets and Analysis Methods
3:30 PM to 5:00 PM | MGH 287
- Presenters
  - Scott Hai Wynn, Senior, Applied Mathematics, Computer Science, Mathematics
  - Sarah Grace Mathison, Senior, Mathematics
- Mentors
  - Be'eri Greenfeld, Mathematics
  - Eric Zhang, Mathematics
Nilpotency degrees of finite-dimensional quadratic algebras carry essential information for their combinatorial and homological applications. It is known that the maximal nilpotency degree a finite-dimensional quadratic algebra with n generators can attain is at least n+1 for all n > 2. However, the optimality of this bound is still unknown. I propose a geometric visualization of the algebraic varieties of all quadratic algebras with n generators in degree d to find the true optimal bound. I then use this visualization to construct a linear program that deterministically determines whether a finite-dimensional quadratic algebra with n generators exists whose nilpotency degree is at least d. Thus far, I have verified that this algorithm gives the desired optimal bounds and have completed an implementation in Sage. I expect to find the true optimal bound on the maximal nilpotency degree of a finite-dimensional quadratic algebra with three generators shortly. However, the algorithm will require revisions for higher values of n due to scalability issues caused by its computational complexity. Knowing this optimal bound would solve several open problems in ring theory, including bounding the computational complexity of computing the global dimension of Koszul algebras. A bound that extends to all algebras, including non-quadratic ones, would also bound the computational complexity of determining whether a finitely presented graded algebra is finite-dimensional.
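The linear-algebra computation underlying such a search can be sketched in a few lines. This is an illustrative reconstruction, not the author's Sage implementation: for a quadratic algebra T(V)/(R), the dimension of each graded piece follows from a rank computation over the spans of shifted relations, and the nilpotency degree is the first degree where the graded piece vanishes. All function names are hypothetical.

```python
import itertools
import numpy as np

def graded_dim(n, relations, d):
    """Dimension of the degree-d piece of T(V)/(R) for dim V = n.
    Each relation is a length-n^2 coefficient vector over the
    monomial basis {x_a x_b}, indexed by a*n + b."""
    if d == 0:
        return 1
    if d == 1:
        return n
    rows = []
    # Degree-d relation space: sum over i of V^{(i)} (x) R (x) V^{(d-2-i)}.
    for i in range(d - 1):
        for w1 in itertools.product(range(n), repeat=i):
            for w2 in itertools.product(range(n), repeat=d - 2 - i):
                for r in relations:
                    vec = np.zeros(n ** d)
                    for a in range(n):
                        for b in range(n):
                            if r[a * n + b]:
                                # Index of the word w1.(a,b).w2 in base n.
                                idx = 0
                                for ch in w1 + (a, b) + w2:
                                    idx = idx * n + ch
                                vec[idx] += r[a * n + b]
                    rows.append(vec)
    rank = np.linalg.matrix_rank(np.array(rows)) if rows else 0
    return n ** d - rank

def nilpotency_degree(n, relations, max_d=10):
    """Smallest d with A_d = 0 (so the augmentation ideal satisfies
    m^d = 0); since A is generated in degree 1, all higher pieces then
    vanish too. Returns None if nothing vanishes up to max_d."""
    for d in range(2, max_d + 1):
        if graded_dim(n, relations, d) == 0:
            return d
    return None
```

For example, with n = 2 and monomial relations x², y², xy, only the monomial yx survives in degree 2 and everything vanishes in degree 3, so the nilpotency degree is 3.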
- Presenter
  - Annika Alice Jorgenson, Senior, Earth and Space Sciences: Geology
- Mentor
  - Katharine Huntington, Earth & Space Sciences
The mid-Miocene climatic optimum (17-14 million years ago) and subsequent cooling is an important global climate event that affected the North American continental interior's temperatures, ecosystems, and hydrology. These effects can be studied using isotopic records of surface temperature and hydrologic conditions from 16 to 6 million-year-old lake minerals (carbonate) on the southern Colorado Plateau, the Bidahochi Formation. Lake carbonate bulk isotopic values measure the ratio of heavy to light carbon and oxygen isotopes (δ13C and δ18O), and clumped analysis (∆47, ∆48) examines the bonding of heavy isotopes, which reflects carbonate growth temperatures. However, the temperature estimates derived from clumped isotope data (T∆47 and T∆48) assume mineral growth under isotopic equilibrium, which may not be true for all lake samples. Disagreement between the apparent temperatures T∆47 and T∆48, and covariance with δ13C and δ18O values, can indicate non-equilibrium growth. In cases where carbonates precipitated out of equilibrium, applying both ∆47 and ∆48 data (dual-clumped isotope thermometry) and using models to understand the mechanism for disequilibrium (e.g., kinetic isotope effects) allows for the reconstruction of true growth temperatures. Here we analyze different types of lake carbonates using sample textures and δ13C, δ18O, ∆47, and ∆48 covariance to determine whether they grew in or out of kinetic equilibrium and to reconstruct depositional temperatures. Preliminary results support the hypothesis that lake carbonates that formed at the surface and settled into deeper water (marls) precipitated in equilibrium and recorded accurate temperatures, while carbonates formed at lake margins affected by groundwater (tufas) grew quickly and may have experienced kinetic effects that require dual-clumped analysis to reconstruct their growth temperatures.
The temperature record that we build using ∆47 and ∆48 data will give a data set of surface temperatures in the North American continental interior throughout the mid-Miocene climatic optimum, further constraining paleoclimate records.
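Clumped-isotope temperatures are typically obtained by inverting a calibration of the general form ∆47 = a/T² + b. A minimal sketch of that inversion follows; the coefficients here are placeholders for illustration only, not a published calibration, and the function name is hypothetical.

```python
import math

def temp_from_d47(d47, a=0.0391e6, b=0.154):
    """Invert the generic clumped-isotope calibration D47 = a / T^2 + b.
    T is in kelvin internally; the result is returned in degrees C.
    The default a, b are placeholder values, not a real calibration."""
    t_kelvin = math.sqrt(a / (d47 - b))
    return t_kelvin - 273.15
```

Because ∆47 falls as temperature rises, a small analytical shift in ∆47 translates into a substantial temperature shift, which is why equilibrium assumptions matter.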
- Presenters
  - Javier Garcia, Senior, Mathematics
  - Rico Qi, Senior, Computer Science, Mathematics
  - Vlad (Vladimir) Radostev, Junior, Applied & Computational Mathematical Sciences (Discrete Mathematics & Algorithms)
  - Mathieu J (Mathieu) Chabaud, Senior, Mathematics, UW Honors Program, NASA Space Grant Scholar
  - Linda Yuan, Senior, Mathematics
- Mentors
  - Silvia Ghinassi, Mathematics
  - Garrett Mulcahy, Mathematics
Fractal dimension, a measure of geometric complexity, finds applications in image analysis, biology and medicine, neuroscience, geology, and various other fields, yet existing methods often lack adaptability to finite data sets. Using ideas rooted in geometric measure theory, such as Hausdorff measure and Frostman's Lemma, this research introduces a novel approach to computing fractal dimensions of finite sets, addressing limitations of traditional methods. Using Python, we developed and tested an algorithm, validating it on known sets such as the unit interval, square, and cube, and on fractal objects including the Cantor set and Sierpinski triangle. Comparative analysis against established methods, including box-counting and correlation-integral algorithms, demonstrates the algorithm's accuracy in determining fractal dimensions. Pivoting toward data sets, we expect to use the computed fractal dimension of real data as a tool for assessing data and optimizing data compression. Our methods offer an improvement, as most existing techniques use statistical methods that are limited to integer dimensions. In addition, recent studies have shown that fractal dimension values can be useful as features in machine learning. We also improve upon the calculation of the local dimension of regions in a data set, allowing for additional insights into complex data sets. This includes identifying regions of high complexity, and we expect to show that this allows for more effective use of algorithms such as principal component analysis. All of these are increasingly important due to the abundance of high-dimensional data sets in both the physical and social sciences. Overall, the benefits of studying novel ways of calculating the dimension of large data sets include efficient representation of data, improved interpretability, and decreased computational burden, as well as the detection of features such as regions of high complexity.
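The box-counting baseline mentioned above can be sketched in a few lines (an illustrative version, not the group's algorithm): count the occupied grid boxes N(s) at several scales s, then fit log N(s) against log s; the negative of the fitted slope estimates the dimension.

```python
import numpy as np

def box_counting_dimension(points, scales):
    """Estimate the fractal dimension of a finite point set:
    count occupied grid boxes N(s) at each scale s, then fit
    log N(s) ~ -dim * log s by least squares."""
    points = np.asarray(points, dtype=float)
    counts = []
    for s in scales:
        boxes = np.floor(points / s)          # grid-cell index per point
        counts.append(len(np.unique(boxes, axis=0)))
    slope, _ = np.polyfit(np.log(scales), np.log(counts), 1)
    return -slope

def cantor_points(depth):
    """Left endpoints of the depth-level middle-thirds Cantor construction,
    as a column of 1-D coordinates."""
    pts = [0.0]
    length = 1.0
    for _ in range(depth):
        length /= 3.0
        pts = [p for q in pts for p in (q, q + 2 * length)]
    return np.array(pts).reshape(-1, 1)
```

On the depth-10 Cantor sample, the estimate lands near log 2 / log 3 ≈ 0.631, illustrating a non-integer dimension that purely statistical integer-dimension methods cannot report.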
- Presenter
  - Marc Sailer, Senior, Mathematics
- Mentor
  - T.J. Fudge, Earth & Space Sciences
The Mid-Pleistocene Transition (MPT) was a major climatic shift in Earth’s history occurring between 1.2 and 0.7 million years ago. During the MPT, Earth’s glacial cycles shifted from a high-frequency (~40 kyr), low-amplitude cadence to a low-frequency (~100 kyr), high-amplitude cadence which has dominated since. While we can observe the MPT in benthic δ18O records, our current ice core record extends back only 800 kyr and does not preserve the entire MPT. COLDEX, a multi-institution collaboration, is seeking a region in Antarctica where a continuous deep ice core may preserve the MPT, in order to better understand the underlying mechanisms that caused it. Ice at this depth, however, is subject to very different conditions than the ice cores that comprise our current record. My study analyzes how atmospheric gases, namely CO2 and the δO2/N2 ratio (used to identify precessional cycles for dating ice cores), diffuse in Antarctic ice 1 to 1.5 million years old. I focused on the COLDEX survey region between the South Pole and Dome A. I employed two models: 1) a one-dimensional steady-state model, which calculates the temperature and age of the ice with respect to depth, and 2) a gas-diffusion model, which uses the temperature- and age-depth relations to calculate the amplitude of the gas signals in the ice through time. The input parameters for these models are measured using aerial radar, provided by COLDEX, or interpolated accordingly. So far, I have found that CO2 is relatively well preserved in the region, while the δO2/N2 ratio is much less well preserved. This suggests that finding an ideal region for a deep ice core drill site, on the basis of gas diffusion, may be difficult.
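The signal attenuation such a gas-diffusion model tracks can be illustrated with the textbook solution of the diffusion equation for a sinusoidal signal: the amplitude decays as exp(−Dk²t) with wavenumber k = 2π/λ. The toy sketch below assumes a constant effective diffusivity, whereas the study's model lets it vary with temperature and depth; the function name and any numbers are hypothetical.

```python
import math

def surviving_fraction(wavelength_m, diffusivity_m2_per_yr, age_yr):
    """Fraction of a sinusoidal gas signal's amplitude surviving Fickian
    diffusion: A/A0 = exp(-D * k^2 * t), with k = 2*pi/wavelength.
    Assumes constant diffusivity; real models vary D with depth/temperature."""
    k = 2.0 * math.pi / wavelength_m
    return math.exp(-diffusivity_m2_per_yr * k ** 2 * age_yr)
```

The exponential dependence on D means a species with a larger effective diffusivity (as found here for δO2/N2 relative to CO2) loses its signal far faster at any fixed wavelength and age.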
- Presenter
  - Bradley James Taylor, Senior, Astronomy, Physics: Comprehensive Physics, Mathematics
- Mentors
  - David Hertzog, Physics
  - Omar Beesley, Physics
PIONEER is a rare-pion-decay experiment that aims to test Lepton Flavor Universality (LFU), a consequence of the Standard Model (SM) of particle physics. The SM is very successful but is known to be incomplete, as it cannot describe gravity, dark matter, and other observed phenomena. PIONEER will test LFU by measuring the relative frequency of the two primary decays of a subatomic particle known as a pion. The ratio of the rates of pion decay to muon and pion decay to electron is predicted extremely precisely by the SM and is sensitive to physics beyond the SM, making it an extremely important quantity to measure. Muons quickly decay to electrons, so the final product of both decays is an electron, but their energies can distinguish the decay path. Thus, this measurement requires an extremely sensitive calorimeter to measure the energies of the resulting electrons. One candidate for this calorimeter is a large array of LYSO crystals. LYSO is a fast, dense, high-light-yield scintillator whose intrinsic properties make it a natural candidate for the experiment. Despite its advantages, a large LYSO-based calorimeter has never been developed. We wish to measure certain properties of large LYSO crystals, such as energy resolution and uniformity, to determine if they meet the requirements for use in the PIONEER calorimeter. Bench tests conducted thus far have displayed impressive single-crystal resolution and uniformity at low energies when crystals are wrapped in a well-fitted specular reflector. Energy resolution tests were conducted on an array of 10 LYSO crystals using 17.6 MeV gamma rays produced by the Van de Graaff accelerator at the Center for Experimental Nuclear Physics and Astrophysics (CENPA) here at UW. LYSO crystal performance and energy resolution have been shown in preliminary tests to be within the specifications for the PIONEER calorimeter.
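As a toy illustration of the counting statistics behind such a ratio measurement (not PIONEER's actual analysis chain), the relative Poisson uncertainty on a ratio of two decay counts adds the two 1/√N terms in quadrature; the function name and any numbers are hypothetical.

```python
import math

def decay_ratio(n_e, n_mu):
    """Ratio of observed pi->e to pi->mu decay counts, with the
    statistical (sqrt-N Poisson) uncertainty on each count
    propagated in quadrature to the ratio."""
    r = n_e / n_mu
    sigma = r * math.sqrt(1.0 / n_e + 1.0 / n_mu)
    return r, sigma
```

Because pion decay to an electron is far rarer than decay to a muon, the 1/√N term of the electron channel dominates the statistical error, which is one reason high statistics and precise energy separation of the two channels matter.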
The University of Washington is committed to providing access and accommodation in its services, programs, and activities. To make a request connected to a disability or health condition contact the Office of Undergraduate Research at undergradresearch@uw.edu or the Disability Services Office at least ten days in advance.