Particle physicists team up with AI to solve toughest science problems
The Large Hadron Collider generates more data per second than Facebook collects in an entire year – even after compression, it's far too much to store. Through clever use of AI and machine learning, applied in real time, the data can be analysed almost as it's generated, and the system decides for itself what data to keep and what to chuck. More recently, the teams have started deploying deep learning, with neural networks many layers deep. But there's more to come.
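The idea of a learned, real-time keep-or-discard decision can be illustrated with a toy sketch. This is not CERN's actual trigger software – the feature vectors, weights, and threshold below are all invented for illustration – but it shows the basic shape: score each incoming event with a trained model and keep only those that clear a threshold.

```python
# Hypothetical sketch (not the LHC's real trigger code): a tiny learned
# filter that scores incoming collision "events" and keeps only those
# above a threshold, discarding the rest as they stream in.
import random

random.seed(0)

def score_event(features, weights, bias):
    """A minimal linear classifier standing in for a trained model."""
    return sum(f * w for f, w in zip(features, weights)) + bias

def trigger_filter(events, weights, bias, threshold=0.5):
    """Keep only events whose score clears the threshold."""
    return [e for e in events if score_event(e, weights, bias) > threshold]

# Toy data: each event is a small feature vector (think energy deposits).
events = [[random.random() for _ in range(3)] for _ in range(1000)]
weights, bias = [0.8, -0.2, 0.5], -0.1  # invented values for the demo

kept = trigger_filter(events, weights, bias)
print(f"kept {len(kept)} of {len(events)} events")
```

In the real experiments the "model" is far more sophisticated and runs under hard latency constraints, but the principle is the same: most events are thrown away, and only the interesting fraction is stored.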
Experiments at the Large Hadron Collider (LHC), the world’s largest particle accelerator at the European particle physics lab CERN, produce about a million gigabytes of data every second. Even after…