Maria Girone
Alma mater: University of Bari (PhD)
Fields: Particle physics, supercomputers, computational science, cloud computing
Awards: Marie Curie Fellowship
Workplaces: CERN, Imperial College London
Known for: CERN openlab
Maria Girone is the head of CERN openlab. She leads the development of high-performance computing (HPC) technologies for particle physics experiments.
Girone studied physics at the University of Bari.[1] She earned her doctoral degree in particle physics in 1994 and soon afterwards became a research fellow on the ALEPH experiment, supporting analysis and acting as liaison to the accelerator.[2] She was awarded a Marie Curie Fellowship and joined Imperial College London, where she worked on hardware development for both the LHCb and ALEPH experiments.[3][4]
CERN openlab, established in 2001, is a public-private partnership that supports collaboration between researchers at CERN and technology companies. Girone moved into scientific computing in 2002, working on the Worldwide LHC Computing Grid (WLCG), where she developed a persistence framework. The WLCG stores, shares and assists in the analysis of data from the Large Hadron Collider (LHC), and is the largest assembly of computing resources ever collected for a scientific endeavour.[5] In the LHC experiment detectors there are around one billion beam collisions per second.[6] The WLCG analyses billions of beam crossings and tries to predict the detector response.[7]
In 2009, whilst at the WLCG, Girone founded and led the Operations Coordination team. She was appointed coordinator of software and computing for the Compact Muon Solenoid (CMS) experiment in 2014, in which capacity she was responsible for the operation of seventy computing centres across five continents. She joined CERN openlab as chief technology officer (CTO) in 2016 and has headed it since 2023.[8]
She has worked on the upgrade of the Large Hadron Collider, the High-Luminosity Large Hadron Collider, which will require up to one hundred times more computing capacity than the original machine. This increase in capacity will come through access to commercial cloud computing platforms, data analytics, deep learning and new computing architectures.[9]