Session T-6D
Physical Sciences - Physics, Astronomy, Geophysical 1
2:15 PM to 3:05 PM | Moderated by William Brightly
- Presenter: Alexander G (Alex) Chkodrov, Senior, Physics: Applied Physics, Mary Gates Scholar
- Mentor: Shih-Chieh Hsu, Physics
- Session: 2:15 PM to 3:05 PM
The ATLAS detector is the largest general-purpose particle detector at the Large Hadron Collider, surrounding a site where protons collide at near-light speed and recording the resulting spray of particles and energy with an array of sub-detectors. Particles traveling outward from the collision deposit charge in clusters of cells (‘cluster images’) along the electromagnetic and hadronic calorimeters of the ATLAS detector. A convolutional neural network is used to analyze cluster images generated by incident pions and classify whether the incident pions are neutral or charged. For each type of pion, a dense neural network is used to analyze cluster images and predict the energy of the incident pions. In this project, I implemented a mixture density network in place of the dense neural network to analyze cluster images and predict both the energy of incident pions and the associated uncertainty of that energy for each cluster. The energy resolution of each cluster contains important information for tracking particles’ trajectories throughout the detector, especially as collisions become more energetic and particles with overlapping tracks become more numerous; propagating the uncertainty from each cluster to the particle tracks would yield more accurate measurements by the ATLAS detector, allowing the Standard Model of particle physics to be studied under greater scrutiny.
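For readers unfamiliar with mixture density networks, the following is a minimal sketch of the general idea, in PyTorch, of a head that outputs a Gaussian mixture over energy instead of a point estimate. All layer sizes, names, and the loss wiring are illustrative assumptions, not the project's actual architecture.

```python
# Hypothetical sketch of a mixture-density-network (MDN) head that predicts
# an energy estimate plus its uncertainty from a cluster-image embedding.
# Sizes and names are illustrative assumptions, not the project's code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MDNHead(nn.Module):
    def __init__(self, in_features: int, n_components: int = 3):
        super().__init__()
        self.pi = nn.Linear(in_features, n_components)         # mixture weights
        self.mu = nn.Linear(in_features, n_components)         # component means
        self.log_sigma = nn.Linear(in_features, n_components)  # component widths

    def forward(self, h):
        log_pi = F.log_softmax(self.pi(h), dim=-1)
        mu = self.mu(h)
        sigma = torch.exp(self.log_sigma(h))  # positivity via exp
        return log_pi, mu, sigma

def mdn_nll(log_pi, mu, sigma, target):
    """Negative log-likelihood of `target` under the predicted Gaussian mixture."""
    comp = torch.distributions.Normal(mu, sigma)
    log_prob = comp.log_prob(target.unsqueeze(-1))  # (batch, n_components)
    return -torch.logsumexp(log_pi + log_prob, dim=-1).mean()
```

At inference time, the mixture mean can serve as the energy estimate and the mixture's standard deviation as the per-cluster uncertainty that the abstract describes propagating to particle tracks.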
- Presenter: Samuel Cornwall, Recent Graduate, Mathematics, Astronomy, Physics: Comprehensive Physics
- Mentor: Siegfried Eggl, Astronomy
- Session: 2:15 PM to 3:05 PM
The Vera C. Rubin Observatory's proposed Legacy Survey of Space and Time (LSST) is expected to vastly expand our knowledge of moving objects in the Solar System. Roughly 60 petabytes of astronomical data will be produced over the 10-year campaign, from which several million Solar System Objects are expected to be discovered. The new data analysis pipelines developed to handle the massive throughput of the LSST require testing on simulated datasets to ensure robustness and evaluate performance. I present a survey simulator that I have co-developed with colleagues at the NASA Jet Propulsion Laboratory. The software can ingest real or modeled orbits of Solar System Objects and produce a corresponding catalog containing billions of detections. By running the survey simulator on currently known objects as well as synthetic models of the Solar System, I have created a dataset that will play a vital role in testing the LSST pipelines and in producing more extensive discovery-rate estimates for the Solar System's various minor planet populations, in order to confirm current expectations.
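The core logic of a survey simulator of this kind can be sketched in a few lines: an object counts as detected in a visit if it lands inside the field of view and is brighter than that visit's limiting magnitude. The field names, flat-sky separation, and field-of-view radius below are illustrative assumptions, not the actual software.

```python
# Hypothetical core of a survey simulator: flag detections where an object
# falls inside a visit's field of view and exceeds its limiting magnitude.
import numpy as np

def simulate_detections(ephem, visits, fov_radius_deg=1.75):
    """ephem  -- dict of arrays ('ra', 'dec', 'mag'), object position and
                 apparent magnitude at each visit epoch (degrees, mag)
       visits -- dict of arrays ('ra', 'dec', 'limiting_mag'), the survey
                 pointings aligned row-for-row with `ephem`
       Returns a boolean detection mask over the visits."""
    # Small-angle (flat-sky) separation between object and boresight.
    dra = (ephem['ra'] - visits['ra']) * np.cos(np.radians(ephem['dec']))
    ddec = ephem['dec'] - visits['dec']
    sep = np.hypot(dra, ddec)
    in_fov = sep <= fov_radius_deg
    bright_enough = ephem['mag'] <= visits['limiting_mag']
    return in_fov & bright_enough
```

A production simulator would add trailing losses, detection efficiency curves, and a proper spherical geometry, but the detect/no-detect decision above is the step repeated billions of times to build the catalog.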
- Presenter: Ishan Francesco (Ishan) Ghosh-Coutinho, Sophomore, Pre-Sciences
- Mentors: Trevor Dorn-Wallenstein, Astronomy; Emily Levesque, Astronomy
- Session: 2:15 PM to 3:05 PM
This project is a follow-up study to the research conducted by my mentors, Trevor Dorn-Wallenstein and Dr. Emily Levesque, on the use of a Support Vector Machine (SVM) classifier to classify massive stars (Dorn-Wallenstein et al. 2021). My project is to verify that the SVM classifier sorted all the stars correctly by analyzing high-resolution spectroscopic observations of the stars visible from the Apache Point Observatory, and possibly other telescopes in the future. A support vector machine is a supervised learning model used in many fields for classification, regression, and outlier detection. In the original project, a support vector machine took a table with ‘features’ for each star (here a feature is a color, a magnitude, or an estimate of the star’s variability) and found the hyperplane in the feature space that best separates each class from all the other classes. Simply put, if you had a set of red and blue stars with color and brightness/magnitude measurements, that hyperplane would be a vertical line in the Hertzsprung-Russell diagram with all the hotter blue stars to the left and all the cooler red stars to the right. The SVM algorithm’s job was to figure out the parameters that best described each category, or, in other words, to produce a general description of each class. There are many ways to accomplish this; an SVM is just one particular way to calculate the mathematically “best” hyperplane in the feature space for separating classes. My project is to go through the catalog generated by the SVM algorithm from the paper and verify whether the stars were sorted correctly. Many stars in the catalog have a pre-existing classification that can be verified, but many are not classified, and the scope of my project is to identify, observe, and classify them.
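The kind of classification the abstract describes can be illustrated with a few lines of scikit-learn. The toy features (a color index, a magnitude, a variability estimate) and random data below are illustrative assumptions standing in for the paper's actual catalog.

```python
# Hypothetical sketch of SVM classification on per-star features: the SVM
# learns the boundary (here with an RBF kernel) that best separates classes.
# The features and toy labels are illustrative, not the published catalog.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Toy features per star: [color index, magnitude, variability estimate].
X = rng.normal(size=(200, 3))
y = (X[:, 0] > 0).astype(int)  # pretend "red" vs "blue" split on color

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X, y)
print(clf.predict(X[:5]))  # predicted classes for the first five stars
```

With only two well-separated features, the learned boundary is essentially the vertical line in the Hertzsprung-Russell diagram described above; the value of the SVM is that the same machinery generalizes to many features and many classes.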
- Presenter: Han Slade Hiller, Senior, Philosophy, Physics: Comprehensive Physics
- Mentor: Arthur Barnard, Materials Science & Engineering, Physics
- Session: 2:15 PM to 3:05 PM
In this project, we measure electron flow in graphene, a 2-D lattice of carbon atoms, and compare the results to simulations that we run. As current is passed through typical electrical devices, electron transport is dominated by momentum-relaxing electron-phonon scattering, i.e. electrons colliding with the impurities and vibrations of the crystal's lattice structure. This is typical of the ohmic regime. However, other modes of electron transport are possible. In clean graphene, for example, electrons are weakly coupled to lattice sites and electron-electron scattering dominates. In these interactions, momentum is transferred between electrons but conserved within the electron system. Measuring over a range of temperatures, we find dips in the resistance that result from these hydrodynamic electrons’ tendency to “pull” one another along with the bulk. Analogous to honey, these electrons have viscosity, which, unlike resistivity, is a property of the fluid. This research will further elucidate properties of this electron fluid. To complete this project, we will fabricate graphene devices and study them in a table-top cryostat, measuring the current output from 4 K to room temperature. We are particularly interested in how this viscous fluid behaves as it encounters a boundary within the device, an open question in the field of solid-state physics. We use a low-voltage probe tip which can be positioned anywhere within the device. By blocking a portion of the drain with the probe tip and measuring the current output along segments of the drain, we may gain insight into the boundary conditions of the electron fluid. This research will directly benefit the electronics industry: the next generation of computer chips will utilize 2-D materials such as graphene, potentially enabling the useful properties of hydrodynamic flow to be exploited.
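For context, the steady-state momentum balance commonly written for a viscous electron fluid in the hydrodynamics literature (a textbook-style form assumed here, not taken from this abstract) pits viscous shear against momentum-relaxing scattering and the driving field:

```latex
% Common steady-state momentum balance for viscous electron flow
% (sign conventions vary; this is an illustrative textbook-style form):
%   \eta               -- kinematic viscosity of the electron fluid
%   \tau_{\mathrm{mr}} -- momentum-relaxing time (impurities, phonons)
%   \phi               -- electrostatic potential driving the flow
\eta \nabla^{2}\mathbf{v} \;-\; \frac{\mathbf{v}}{\tau_{\mathrm{mr}}}
  \;=\; \frac{e}{m}\,\nabla\phi
```

In the limit \(\eta \to 0\) the viscous term drops out and \(\mathbf{v} \propto \nabla\phi\), recovering the ohmic regime described above; the boundary conditions on \(\mathbf{v}\) at the device edges are exactly the open question the probe-tip measurements target.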
- Presenter: Haley Margaret Staudmyer, Senior, Atmospheric Sciences: Climate, UW Honors Program
- Mentor: Thomas Ackerman, Atmospheric Sciences, University of Washington
- Session: 2:15 PM to 3:05 PM
A simple but important measure of a model's ability to simulate cloud properties is whether the vertical structure of cloud occurrence in the model is consistent with that from observations. Observed cloud occurrence profiles in the tropical western Pacific typically exhibit three peaks: one near the top of the boundary layer, one near the freezing level, and a broad peak in the upper troposphere. There is considerable variation in the probability of occurrence and in the strength of these peaks. Here, we investigate the ability of a new generation of high-resolution models to simulate these profiles. Our study uses Global Storm Resolving Models (GSRMs) from the DYAMOND project. Nine models were run globally for 40 days starting from initial conditions on August 1, 2016. We use two data sources: ground-based data from the Atmospheric Radiation Measurement (ARM) program sites on Manus Island, Papua New Guinea, and Nauru, as well as data from National Aeronautics and Space Administration (NASA) satellite products (CCCM). Our study begins with a determination of local variability in cloud occurrence profiles. We use the ARM data to construct profiles for each August in the data series. The ARM data are available at high frequency at a single location, but the monthly average profiles are influenced by local weather variation. The CCCM data are sampled over a broader spatial region but at lower spatial and temporal resolution. Together, these two data sets provide an accurate assessment of cloud occurrence and a measure of internal variability. We then compare the observed profiles with those from the models. Our results suggest that the models simulate the rough structure of cloud occurrence, but that there are large differences among the models in the relative strengths of the peaks and in the overall probability of occurrence.
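The basic quantity being compared, the cloud occurrence frequency at each height, reduces to a simple average over a time-height cloud mask. The array names and shapes below are illustrative assumptions, not the study's processing code.

```python
# Hypothetical sketch of a cloud occurrence profile: the fraction of time a
# cloud is observed at each height level, from a (time, height) cloud mask.
import numpy as np

def occurrence_profile(cloud_mask: np.ndarray) -> np.ndarray:
    """cloud_mask: boolean array (n_times, n_heights), True where cloud seen.

    Returns the cloud occurrence frequency at each height level."""
    return cloud_mask.mean(axis=0)

# Stacking one such profile per August gives a distribution of monthly
# profiles, whose spread estimates the internal variability used to judge
# whether model-observation differences are meaningful.
```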
The University of Washington is committed to providing access and accommodation in its services, programs, and activities. To make a request connected to a disability or health condition, contact the Office of Undergraduate Research at undergradresearch@uw.edu or the Disability Services Office at least ten days in advance.