High Energy Physics Seminar
The high-luminosity upgrade of the Large Hadron Collider will deliver denser collisions at an increased rate, allowing the collection of a data set ten times larger. As a result, however, each collision will produce more particles, and the detector will be exposed to higher radiation levels. The current Compact Muon Solenoid (CMS) detector would not be able to cope with these worsening conditions. To address this issue, the CMS Collaboration has designed a new sub-detector to be installed inside CMS: a radiation-hard high-granularity calorimeter (HGCal). The HGCal will provide measurements of the energy deposits from particles produced in each collision, finely segmented in both space and time, but clustering these energy deposits to reconstruct particle properties will be an exceptional challenge. At the same time, the increased detector complexity, paired with the denser environment, means the experiment will produce an unprecedented volume of data per second. This avalanche of data must be filtered by the CMS real-time filtering system, known as the trigger. In this talk I will discuss approaches to making the best use of the full information provided by the HGCal using machine learning (ML), both offline and online. I will describe specific examples of ML applications that perform advanced pattern recognition for the offline reconstruction of high-level physics properties of particles in the HGCal, as well as the online selection and compression of HGCal information in the on-detector electronics for transmission to the trigger. Finally, I will present an open problem in the clustering of HGCal energy deposits in the trigger and hint at potential solutions.
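To give a flavor of the clustering problem mentioned above, the sketch below shows a toy seed-based clustering of 2D energy deposits. This is purely illustrative: the function name, parameters, and greedy algorithm are my own simplification and are not the actual HGCal or CMS trigger reconstruction.

```python
def cluster_deposits(deposits, radius=1.5, seed_threshold=5.0):
    """Toy greedy clustering of energy deposits (not the real HGCal algorithm).

    Repeatedly take the highest-energy unassigned deposit above
    seed_threshold as a seed, then absorb every unassigned deposit
    within `radius` of that seed.

    deposits: list of (x, y, energy) tuples.
    Returns a list of clusters, each a list of deposit tuples.
    """
    # Sort by energy, highest first, so the front of the list is always a seed candidate.
    remaining = sorted(deposits, key=lambda d: d[2], reverse=True)
    clusters = []
    while remaining and remaining[0][2] >= seed_threshold:
        seed = remaining[0]
        # Collect all deposits within `radius` of the seed (compare squared distances).
        members = [d for d in remaining
                   if (d[0] - seed[0]) ** 2 + (d[1] - seed[1]) ** 2 <= radius ** 2]
        clusters.append(members)
        remaining = [d for d in remaining if d not in members]
    return clusters

# Four toy deposits forming two well-separated groups.
hits = [(0.0, 0.0, 10.0), (0.5, 0.2, 2.0), (5.0, 5.0, 8.0), (5.3, 4.8, 1.0)]
clusters = cluster_deposits(hits)
# -> two clusters: one seeded at (0.0, 0.0), one seeded at (5.0, 5.0)
```

The hard part in practice, which the talk addresses, is doing this in a dense environment with overlapping showers and under the tight latency budget of the trigger.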
The talk is in 469 Lauritsen.
Contact [email protected] for the Zoom link.