Session O-2I
Optics, Bosons, ML and More...
1:00 PM to 2:30 PM | Moderated by Gerald Seidler
- Presenter
-
- Yifei Bai, Senior, Physics: Comprehensive Physics, Mathematics Mary Gates Scholar, UW Honors Program
- Mentor
-
- Subhadeep Gupta, Physics
- Session
-
- 1:00 PM to 2:30 PM
One of the Ultracold Atoms Group’s themes is the study of interactions between trapped ultracold atom mixtures. Certain experiments, such as the study of spin-dependent Feshbach resonances, require us to select one specific nuclear spin state of the atom from the mixture. This selection is achieved with the optical Stern-Gerlach technique, in which a laser produces the magnetic field gradient. However, this technique requires us to use an imaging path with relatively poor imaging quality due to, for example, vibrations of the optics. When we normalize these atom images, the vibrations introduce misalignment between images and thus unwanted noise. My project therefore focuses on stabilizing the imaging process through a software implementation of an image-alignment scheme. The code I developed can be seen as an analog of the inner product of two vectors, which characterizes the extent of their misalignment. This scheme has increased the efficiency of the experimental procedure and at least doubled the signal-to-noise ratio. Its easy implementation provides another route to reducing noise in the data of similar experiments.
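The inner-product alignment metric described above can be sketched in a few lines of numpy. This is a minimal illustration under stated assumptions, not the group's actual code: the function names, the use of a normalized inner product as the score, and the brute-force search over integer pixel shifts are all assumptions for the sketch.

```python
import numpy as np

def alignment_score(ref, img):
    """Normalized inner product of two images flattened to vectors.
    A score of 1.0 means the images are perfectly aligned (proportional);
    smaller values indicate increasing misalignment."""
    a = ref.ravel().astype(float)
    b = img.ravel().astype(float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def best_shift(ref, img, max_shift=5):
    """Scan integer pixel shifts of `img` and keep the shift that
    maximizes the inner-product score against the reference image."""
    best, best_score = (0, 0), -np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
            s = alignment_score(ref, shifted)
            if s > best_score:
                best_score, best = s, (dy, dx)
    return best, best_score
```

Applying the recovered shift before normalizing the atom images removes the vibration-induced misalignment that the abstract identifies as the dominant noise source.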
- Presenter
-
- Jakub Filipek, Senior, Computer Science (Data Science) Mary Gates Scholar
- Mentor
-
- Shih-Chieh Hsu, Physics
- Session
-
- 1:00 PM to 2:30 PM
Recent developments in machine learning have led to a number of applications across a variety of fields. This rapid progress has been fueled by the increased performance of Graphics Processing Units (GPUs). Similarly rapid developments are happening in quantum computing hardware. While still years behind classical computing, certain statistical models indicate that quantum computers will be able to outperform classical computers within years. However, due to the lack of high-memory systems, all of the distributions have to be represented in a low-dimensional space. Our work focuses on using classical computing to automatically find efficient feature maps that allow users to scale down real-world or established problems into a low-dimensional space, which can then be loaded into quantum computers. Additionally, by creating a simple, modular design, we want to give other researchers a simple interface for comparing classical and quantum versions of algorithms, to investigate whether there are any benefits to using quantum computing over classical systems. We expect quantum computers to perform similarly to, if not better than, similarly sized classical models, but still to be outperformed by larger, more complex classical systems.
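One classical way to build the kind of low-dimensional feature map described above is plain principal component analysis. The sketch below is a toy stand-in, not the project's learned feature maps; the function name and the choice of PCA are assumptions for illustration.

```python
import numpy as np

def pca_reduce(X, n_components):
    """Project X (samples x features) onto its top principal components,
    one simple classical feature map for squeezing data into the few
    dimensions a small quantum register can hold."""
    Xc = X - X.mean(axis=0)                      # center the data
    # SVD of the centered data; rows of Vt are principal axes,
    # ordered by decreasing explained variance
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T
```

A learned feature map would replace the fixed SVD projection with a trainable one, but the interface — high-dimensional samples in, a handful of quantum-loadable features out — is the same.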
- Presenter
-
- Ajay R. Rawat, Sophomore, Engineering Undeclared
- Mentor
-
- Shih-Chieh Hsu, Physics
- Session
-
- 1:00 PM to 2:30 PM
Machine learning (ML) is an important tool for analyzing huge data sets. There are various machine learning models in the realm of physics that do everything from identifying subatomic particles to predicting the energy of particle jets. Our project is focused on creating a benchmark that would be used to test different models and compare them with each other. Our goal is to host a service that would evaluate different metrics for a user-provided model and display the results. We have created a Yadage workflow that analyzes different top taggers (i.e., ML models that identify top quarks). To evaluate the top taggers, we plotted their ROC (receiver operating characteristic) curves and then compared the AUC (area under the curve) for each model. Our current goal is to run our workflows on REANA (Reproducible research data analysis platform) servers. We believe this project is not restricted to the world of physics and can be extended to benchmark models from other disciplines as well, such as the health sciences, natural language processing, and computer vision. Similar benchmarks could be created for different types of models, which can be compared using a common dataset for a fairer comparison.
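The AUC comparison at the heart of the benchmark can be computed without plotting, via the rank-sum (Mann-Whitney) formulation: the AUC is the probability that a randomly chosen signal event scores higher than a randomly chosen background event. The sketch below assumes untied continuous scores and is not the project's Yadage workflow code.

```python
import numpy as np

def roc_auc(labels, scores):
    """AUC via the Mann-Whitney rank statistic.
    labels: 0 (background) or 1 (signal); scores: tagger outputs.
    Assumes no tied scores (average ranks would handle ties)."""
    labels = np.asarray(labels)
    scores = np.asarray(scores, dtype=float)
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)  # rank 1 = lowest score
    n_pos = labels.sum()
    n_neg = len(labels) - n_pos
    # sum of signal ranks, minus the minimum possible rank sum,
    # normalized by the number of signal/background pairs
    return (ranks[labels == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)
```

Ranking every tagger by this single number is what makes a common-dataset benchmark possible: each model reduces to one comparable score.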
- Presenters
-
- Htet Aung Myin, Senior, Physics: Applied Physics
- Evan Robert (Evan) Saraivanov, Senior, Physics: Comprehensive Physics, Mathematics
- Mentors
-
- Shih-Chieh Hsu, Physics
- Wanyun Su (moony2628@stju.edu.cn)
- Session
-
- 1:00 PM to 2:30 PM
In the ATLAS detector at the Large Hadron Collider (LHC), high-energy quarks and gluons can be produced in proton-proton collisions. Individual quarks and gluons cannot be directly observed; however, when they enter the detector, their interactions with it create a number of secondary particles called hadrons, bound states of quarks and gluons, which can be directly observed. A tagger is used to measure the secondary particles and classify them as coming from a quark or a gluon. The tagger uses several variables derived from detector data and machine learning algorithms. We analyzed data from the detector and from Monte Carlo simulation and compared them using the derived variables, which involves calculating ratios (the Monte Carlo closure and the scale factor) between the simulated and measured distributions for each variable. The Monte Carlo closure and scale factor between extracted detector samples and extracted Monte Carlo samples are expected to be close to 1, with an uncertainty of less than 10%, indicating that the simulation models the data. The results of this study provide an analysis of how well these variables can classify the initial particle, and allow better calibration of the tagger parameters. Better classification allows for more precise measurements of physics processes at the LHC.
- Presenter
-
- Helen Chen, Junior, Mathematics, Physics: Comprehensive Physics Mary Gates Scholar
- Mentor
-
- Shih-Chieh Hsu, Physics
- Session
-
- 1:00 PM to 2:30 PM
After CERN’s discovery of the Higgs boson at the Large Hadron Collider in 2012, particle physicists shifted their aim to using the Higgs as a new tool to search for particles beyond the Standard Model. This is made possible by the theoretical prediction that new heavy unknown particles produce the Higgs anomalously in their decay processes; hence, a good model that simulates the production and decay chain of the Higgs is crucial for any new discovery. One way of testing existing models involves a computational process called Higgs-to-bottom-quark (Hbb) tagging, which identifies bottom quarks, the usual product of Higgs decays, to determine whether any Higgs bosons have been produced in a proton-proton collision. This study investigates the performance of the tagger in question and calculates its precision from the uncertainties of many experimental factors. Gluon-to-bottom-quark (gbb) decay events were used to test the tagger, and a computer algorithm calculated its efficiency. We expect good agreement between the results from Hbb decays and those from gbb decays, in which case the gbb results could be used as accepted values for future Hbb studies.
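The efficiency calculation mentioned above reduces, in its simplest form, to a pass fraction with a statistical uncertainty. The sketch below is a toy version with a plain binomial error, not the study's algorithm, which folds in many experimental systematics.

```python
import numpy as np

def tagging_efficiency(n_tagged, n_total):
    """Fraction of true b-quark events the tagger accepts, with the
    simple binomial statistical uncertainty sqrt(eff*(1-eff)/N).
    Systematic uncertainties would be added on top of this."""
    eff = n_tagged / n_total
    err = np.sqrt(eff * (1.0 - eff) / n_total)
    return eff, err
```

Comparing this number between Hbb and gbb event samples, within the combined uncertainties, is exactly the agreement test the abstract describes.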
- Presenters
-
- Carter N. Merrill, Senior, Physics: Comprehensive Physics, Astronomy
- Andrew Wu, Freshman, Center for Study of Capable Youth
- Mentor
-
- Shih-Chieh Hsu, Physics
- Session
-
- 1:00 PM to 2:30 PM
In 2024 the Large Hadron Collider will undergo upgrades that will dramatically increase the number of collisions occurring. To accommodate the increased bandwidth, the innermost detector, as well as the readout hardware and data acquisition software, will be upgraded. The new readout chip, called the RD53, already has a preliminary version, the RD53a, for which a software emulator exists. The final design of the chip, the RD53b, has recently been released, and the RD53a software emulator needs to be updated to reflect the new specifications. Our research seeks to create a software emulator for the RD53b readout chip. To do so, we are updating the existing RD53a emulator and comparing the emulator's output with that of the physical hardware chip. We are currently working to create robust tests of the old software emulator by running scans from the data acquisition software against it. Thus far, the analog, digital and threshold scans for the RD53a emulator have been implemented. In this talk we will give an overview of the emulator's design, deployment progress and further development plans, and show how the software emulator can enable faster development of new data acquisition software in preparation for the upgrades to the Large Hadron Collider.
- Presenter
-
- Kuan-Wei Lee, Senior, Physics: Comprehensive Physics Mary Gates Scholar
- Mentor
-
- Aaron Hossack, Aeronautics & Astronautics
- Session
-
- 1:00 PM to 2:30 PM
HIT-SI3 is a plasma physics experiment built to study magnetic confinement of a fusion plasma for eventual clean energy production. HIT-SI3 uses steady inductive helicity injection to form and sustain spheromak equilibria, a stable arrangement of plasma. A tomography system has been installed to assess the symmetry of the plasma density in HIT-SI3 spheromak plasmas. The tomography diagnostic consists of four toroidal chord fans and three sets of three poloidal fans that together provide 3D plasma emission information. Each fan spreads from a wide-angle lens with a 130-degree field of view coupled to bundles of fiber optics. The light collected by the fiber optics is split into two paths, filtered at the 668 nm and 728 nm HeI emission lines, and imaged by a high-speed camera. Since the 668/728 nm emission ratio depends strongly on plasma density within the range of typical HIT-SI3 plasma parameters, the 3D emissivity profiles constructed by inverting the line-averaged emissivity along each chord can be related to the plasma density profiles. The reconstruction of emissivity profiles is a highly underdetermined, ill-posed inversion problem, and the maximum entropy method was chosen to find the most physically informed solution. My work on this project is to develop an algorithm that performs plasma density reconstruction accurately on HIT-SI3, to facilitate the study of plasma dynamics and comparisons with HIT-SI3 simulations. According to NIMROD simulations, the HIT-SI3 plasma has lower density at the center of the flux conserver and higher density near the edge of the wall. We expect the plasma density profiles reconstructed by the tomography system to capture this "hollow" feature.
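The maximum entropy inversion named above can be sketched as a small optimization: fit the chord-integrated measurements while penalizing departures from a flat default profile via the entropy functional. This is a toy sketch, not the HIT-SI3 algorithm; the geometry matrix shape, the entropy form S = Σ(e − m − e ln(e/m)), the default model m, and the regularization weight are all assumptions for illustration.

```python
import numpy as np
from scipy.optimize import minimize

def maxent_invert(G, b, alpha=1e-2, m=None):
    """Toy maximum-entropy inversion of G @ e = b, where G maps pixel
    emissivities e to chord-integrated measurements b. The system is
    underdetermined (fewer chords than pixels); minimizing
    chi^2 - alpha * entropy picks the least-structured solution
    consistent with the data. Positivity is enforced by optimizing
    x = log(e)."""
    n = G.shape[1]
    if m is None:
        # flat default model at a rough scale set by the data
        m = np.full(n, b.mean() / max(G.sum(axis=1).mean(), 1e-12))
    def objective(x):
        e = np.exp(x)                                  # e > 0 by construction
        chi2 = np.sum((G @ e - b) ** 2)                # data mismatch
        entropy = np.sum(e - m - e * np.log(e / m))    # 0 at e = m, < 0 away
        return chi2 - alpha * entropy
    res = minimize(objective, np.log(m), method="L-BFGS-B")
    return np.exp(res.x)
```

On the real diagnostic, G would encode the chord geometry of the toroidal and poloidal fans, and the recovered profile would be checked for the "hollow" shape predicted by NIMROD.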
- Presenter
-
- Murtaza A. (Murtaza) Jafry, Junior, Extended Pre-Major Mary Gates Scholar, UW Honors Program
- Mentor
-
- Silas R. Beane, Physics
- Session
-
- 1:00 PM to 2:30 PM
In this project, we consider two-particle scattering with an arbitrary finite-range potential interaction, confined to a volume by a harmonic trap. The properties of this scattering system are then studied through effective field theory. From this analysis, we obtain general relationships among the effective range parameters in various dimensions. This project extends previous work on confinement induced through periodic boundary conditions. From this work, calculations can then be made to deduce properties of multi-body condensates. We have already shown the general relationships between effective range parameters in toroidally compactified spaces.
The University of Washington is committed to providing access and accommodation in its services, programs, and activities. To make a request connected to a disability or health condition contact the Office of Undergraduate Research at undergradresearch@uw.edu or the Disability Services Office at least ten days in advance.