Experimental Flavor Physics

Master Thesis Topics


Belle II Grid Computing

Because of the large amount of data that the Belle II experiment will collect, computing resources distributed around the world are required to store and analyze it. The Belle II distributed computing system is based on DIRAC, which was originally developed for the LHCb experiment. Grid technologies have evolved in recent years, and to profit from them they should be integrated into the Belle II system. In particular, the Rucio data management system looks promising. Another challenge is to support modern analysis workflows such as machine learning.
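For orientation, the sketch below shows how a user job can be described and submitted with the plain DIRAC Python API. It is a minimal illustration that assumes a configured DIRAC client; the script name and sandbox contents are placeholders, and a real Belle II analysis would typically go through the dedicated gbasf2 tools instead.

```python
# Minimal sketch of a DIRAC job submission (placeholder job, not Belle II production code).
from DIRAC.Core.Base import Script
Script.parseCommandLine(ignoreErrors=True)

from DIRAC.Interfaces.API.Dirac import Dirac
from DIRAC.Interfaces.API.Job import Job

# Describe the job: a placeholder steering script shipped in the input sandbox.
job = Job()
job.setName("belle2_test_job")
job.setExecutable("run_analysis.sh")       # hypothetical steering script
job.setInputSandbox(["run_analysis.sh"])   # files sent along with the job
job.setOutputSandbox(["std.out", "std.err"])

# Submit to the Grid and print the result, which contains the job ID on success.
dirac = Dirac()
print(dirac.submitJob(job))
```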

Measurement of anti-deuteron production with improved particle identification

Antimatter can be an important signal for the detection of dark matter annihilation. This requires a good understanding of backgrounds from standard model processes. Belle II has the potential to significantly improve the knowledge of anti-deuteron production in e+e- collisions. A key element of the analysis is reliable and highly selective particle identification. The information from multiple detectors has to be combined in an optimal way, for example with machine learning techniques.
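The following toy sketch illustrates the idea of combining several detector inputs into a single discriminant with a machine-learning classifier. The features and labels are randomly generated placeholders standing in for per-detector particle identification variables, not actual Belle II quantities.

```python
# Toy example: combine several PID-like inputs into one discriminant with a BDT.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 20000

# Toy per-track features standing in for detector information
# (e.g. dE/dx and Cherenkov likelihoods plus the track momentum);
# labels mark "signal" candidates versus background in this toy.
X = rng.normal(size=(n, 4))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.8, size=n) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=1)

# A gradient-boosted classifier combines the inputs into one discriminant.
clf = GradientBoostingClassifier()
clf.fit(X_train, y_train)
score = clf.predict_proba(X_test)[:, 1]
print("ROC AUC on toy data:", round(roc_auc_score(y_test, score), 3))
```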

Fast simulation with machine learning

Measurements at particle physics experiments often rely on simulations to calculate efficiencies and estimate backgrounds. In particular for processes with low selection efficiency, such as the background in the search for the rare B -> K nu nu decay, huge computing resources are invested to produce large samples of which most events are rejected after reconstruction. For a more resource-efficient production, the decision whether an event is kept should be made before the expensive detector simulation. Deep neural networks may be able to make this decision using only generator-level information.
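A minimal sketch of this idea is shown below: a small neural network is trained on generator-level features to predict whether an event would survive the later selection, so that unpromising events can be skipped before detector simulation. The features, labels, and keep-threshold are invented placeholders.

```python
# Toy sketch: train a small network on generator-level features and
# use its output to decide which events are sent to full simulation.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy generator-level features (e.g. multiplicities, missing momentum)
# and labels that say whether the fully simulated event passed the selection.
X = torch.randn(5000, 6)
y = ((X[:, 0] + 0.7 * X[:, 3]) > 0.5).float().unsqueeze(1)

model = nn.Sequential(
    nn.Linear(6, 32), nn.ReLU(),
    nn.Linear(32, 32), nn.ReLU(),
    nn.Linear(32, 1), nn.Sigmoid(),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCELoss()

for epoch in range(200):
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()

# Keep an event for detector simulation only if the predicted pass
# probability exceeds a (placeholder) threshold.
keep = model(X).detach() > 0.1
print(f"fraction sent to full simulation: {keep.float().mean().item():.2f}")
```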

Optimization of simulation and reconstruction algorithms

Dedicated algorithms are required to analyze the data of modern particle physics experiments, and the demands on physics performance and resource efficiency are high. Progress in software development technologies should be examined with respect to this goal. In particular, the implementation of the magnetic field at Belle II should be reviewed.
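As a simple illustration of the kind of trade-off involved, the sketch below looks up a tabulated magnetic field via trilinear interpolation, one common way to balance accuracy, memory, and speed in simulation and track fitting. The grid and field values are invented placeholders, not the Belle II field map.

```python
# Toy sketch: interpolated lookup of a tabulated magnetic-field component.
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Coarse 3D grid in cm covering a toy detector volume.
x = np.linspace(-200, 200, 41)
y = np.linspace(-200, 200, 41)
z = np.linspace(-300, 300, 61)
X, Y, Z = np.meshgrid(x, y, z, indexing="ij")

# Placeholder field: roughly uniform 1.5 T along z with a mild radial falloff.
Bz = 1.5 * np.exp(-(X**2 + Y**2) / (4 * 200.0**2))
field = RegularGridInterpolator((x, y, z), Bz)

# Fast vectorized lookup at many track points at once.
points = np.array([[0.0, 0.0, 0.0], [50.0, 20.0, 100.0]])
print(field(points))  # Bz in tesla at the requested points
```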

Improved measurement of CP violation with a combination of tagging and vertex fit

The measurement of time-dependent CP asymmetries in B0 meson decays is one of the main tasks of B-factory experiments. Essential components for such measurements are the determination of the decay vertex and the flavor of the second B meson in the event. So far both tasks have been addressed by independent algorithms, but the information on whether a particle comes directly from the B decay or from the decay of a B daughter particle is relevant for both. A new algorithm which combines the determination of both quantities of interest may provide a significant improvement in CP violation measurements. A measurement of CP violation in the golden mode B0 -> J/psi KS will serve as a benchmark.
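The toy sketch below illustrates the observable that tagging and vertexing feed into: the decay-time dependent asymmetry between the two tag flavours, of the form S*sin(dm*dt) - C*cos(dm*dt). The parameter values, sample sizes, and the simplified treatment of the decay time are illustrative only, not the Belle II analysis.

```python
# Toy sketch of a time-dependent CP asymmetry between the two tag flavours.
import numpy as np

tau_B, dm = 1.52, 0.51   # B0 lifetime [ps] and mixing frequency [1/ps]
S, C = 0.70, 0.0         # illustrative CP-violation parameters

rng = np.random.default_rng(1)

def sample_dt(q, n):
    """Accept-reject sampling of decay times for tag flavour q = +1 or -1."""
    out = []
    while len(out) < n:
        dt = rng.exponential(tau_B, n)
        pdf = 1.0 + q * (S * np.sin(dm * dt) - C * np.cos(dm * dt))
        keep = rng.uniform(0.0, 2.0, n) < pdf
        out.extend(dt[keep])
    return np.array(out[:n])

dt_plus, dt_minus = sample_dt(+1, 50000), sample_dt(-1, 50000)

# Binned raw asymmetry between the two tag flavours as a function of dt.
bins = np.linspace(0, 8, 17)
n_plus, _ = np.histogram(dt_plus, bins)
n_minus, _ = np.histogram(dt_minus, bins)
asym = (n_plus - n_minus) / (n_plus + n_minus)
print(np.round(asym, 3))
```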

Search for lepton-flavor violating B -> tau l decays

The baryon asymmetry in the universe tells us that there has to be physics beyond the standard model, so-called new physics. An observation of the charged-lepton-flavor violating decay B -> tau l would be clear evidence for new physics. The reconstruction of this decay is challenging because the two or three neutrinos from the tau decay are not detected. Nevertheless, the B -> tau l decay can be searched for by exploiting the fact that the B mesons are produced in e+e- -> Y(4S) -> B anti-B reactions.
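A short kinematics sketch illustrates the main handle: in the two-body decay B -> tau l, the light lepton is monoenergetic in the B rest frame, which becomes accessible once the second B from the Y(4S) decay is fully reconstructed. The masses below are the PDG values in GeV.

```python
# Two-body decay kinematics: lepton momentum in the B rest frame for B -> tau l.
import math

m_B, m_tau = 5.279, 1.777
m_l = {"e": 0.000511, "mu": 0.1057}

def two_body_momentum(M, m1, m2):
    """Momentum of either daughter in the rest frame of a particle of mass M."""
    return math.sqrt((M**2 - (m1 + m2)**2) * (M**2 - (m1 - m2)**2)) / (2 * M)

for lepton, mass in m_l.items():
    p = two_body_momentum(m_B, m_tau, mass)
    print(f"B -> tau {lepton}: lepton momentum in B rest frame = {p:.3f} GeV")
```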

CP violation in D0 -> K0S K0S decays

So far CP violation has been observed in decays of kaons and B mesons, but not in charm meson decays. In D0 -> K0S K0S decays the CP violation can be of the order of a percent and thus measurable at Belle II. At Belle a sensitivity of 1.5% was reached, dominated by the statistical uncertainty.
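As a back-of-the-envelope illustration of what a statistics-dominated sensitivity implies, the sketch below uses sigma_A = sqrt((1 - A**2) / N) for the statistical uncertainty of a raw asymmetry measured with N events. It ignores background dilution and tagging effects, and the factor 50 stands for the assumption of roughly 50 times the Belle data set, the Belle II design goal.

```python
# Rough scaling of the statistical uncertainty of an asymmetry with event count.
import math

def asym_uncertainty(n_events, asym=0.0):
    return math.sqrt((1.0 - asym**2) / n_events)

# Events needed for a 1.5% statistical uncertainty on a small asymmetry.
n_needed = 1.0 / 0.015**2
print(f"~{n_needed:.0f} signal events for a 1.5% statistical uncertainty")

# With ~50x the events (Belle II design data set), the uncertainty shrinks by ~1/sqrt(50).
print(f"with 50x the events: {asym_uncertainty(50 * n_needed):.4f}")
```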