Haven't kept up, but 20 years ago when I shared a building with these guys the computers were the bottleneck.
The original design for ATLAS had a sampling system where only random blocks of data could be analysed, simply because the computers (and buses and memory) couldn't keep up - so the data analysis became a sort of Monte Carlo process as well. And this was in spite of huge and very impressive arrays of custom FPGAs and Transputers (in the 90s).
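To make that concrete, here's a toy sketch in Python (invented names, nothing to do with the actual ATLAS trigger/DAQ design) of what random-block sampling means for the analysis: any count you measure on the sampled blocks has to be scaled up by the inverse sampling fraction, so every result is a statistical estimate rather than an exact tally.

```python
import random

def sample_blocks(blocks, fraction, seed=None):
    """Keep each data block with probability `fraction` (a random prescale).

    Mimics a DAQ that can only afford to read out a random subset of the
    data; everything else is discarded before analysis ever sees it.
    """
    rng = random.Random(seed)
    return [b for b in blocks if rng.random() < fraction]

def estimate_total(selected_count, fraction):
    """Scale a count measured on the sampled blocks back up to an estimate
    of the true total -- a Monte Carlo estimate, not an exact number."""
    return selected_count / fraction

# Toy run: 1,000,000 "blocks", 0.1% of which contain an interesting event.
blocks = [random.random() < 0.001 for _ in range(1_000_000)]
kept = sample_blocks(blocks, fraction=0.05, seed=42)
interesting = sum(kept)  # interesting events actually seen in the sample
print("seen:", interesting, "estimated total:", estimate_total(interesting, 0.05))
```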
Perhaps, but we were discussing software. Specifically, "Linux" vs "some other operating system".
I agree that advances in computing must surely have a lot to do with the discovery of the Higgs, but such advances are about hardware, not operating systems. Since the 1990s, billions of dollars, both public and private, have been spent to keep Moore's Law rolling forward, and that's most of what makes modern computers more capable.
Anyway, it occurs to me that this discussion is a classic bikeshed. Rather than debate tricky and obscure issues like the quantum mechanics behind the Higgs phenomenon, the engineering of supercolliders, the politics of getting supercolliders funded, the ins and outs of data processing algorithms at CERN, or the techniques of manufacturing transistors with a 20nm gate size, we have fallen back to debating whether or not the license used for the operating system was vitally important.
If CERN had had to implement a Unix entirely from scratch in order to do their job, they would have done so. It would have been a minor side issue. Indeed, from a certain point of view, that may be exactly what they did. Why discuss how Linux was vital for CERN's scientists, but not the other way around? Perhaps the reason why Linux was so good for high-energy physics is that high-energy physicists built it that way?
Read the Wikipedia page on the Compact Muon Solenoid. There were some significant computational challenges involved, including how to (pre)process and store terabytes of data per second. These are challenges that push the envelope at every layer of computing, from the hardware to the OS and application levels. It's not an informed position to say that the OS was some replaceable commodity with an unimportant role -- that's like saying Linux is unimportant at Google or Amazon. The ability to tune the kernel was definitely crucial.
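As a purely hypothetical illustration of the kind of tuning I mean (not CERN's actual configuration), here's the sort of knob-turning you do when a user-space reader can't keep up with a fast data stream: raising the kernel's socket buffer limits and backlog queue. These are standard Linux sysctls exposed under /proc/sys, so even a short script can set them (root required).

```python
# Hypothetical illustration only -- not CERN's actual settings.
# Raises the kernel's maximum socket buffer sizes and the per-device
# backlog, typical first steps when a high-rate reader drops packets.
TUNABLES = {
    "/proc/sys/net/core/rmem_max": "134217728",          # max receive buffer: 128 MiB
    "/proc/sys/net/core/wmem_max": "134217728",          # max send buffer: 128 MiB
    "/proc/sys/net/core/netdev_max_backlog": "250000",   # packets queued before the kernel drops
}

def apply_tunables(tunables=TUNABLES):
    for path, value in tunables.items():
        with open(path, "w") as f:   # same effect as `sysctl -w`
            f.write(value)
        print(f"{path} = {value}")

if __name__ == "__main__":
    apply_tunables()
```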
I would also question whether 20+ years of development on a private UNIX clone would have produced a better result than Linux. There are plenty of private UNIXes, but scarcely a convincing argument that a single one of them is better than Linux in some meaningful way.
Personally (my PhD is in experimental physics), the interesting bits are the vacuum pumps, magnet control, beam dumps, and control systems, as well as the detectors.
Whether some semi-imaginary particle fits the parameters of some semi-imaginary theory is pretty uninteresting ;-)
I found that once you get past the technicalities and return to a bird's-eye view, interactions among elementary particles follow almost naturally and look deceptively simple [0], and postulating the existence of the Higgs boson seems like one† of the obvious solutions for explaining mass.
Now, when you get back down into the trenches and have to define it properly in theoretical terms, I'm positively convinced it is another matter entirely in terms of complexity.
† Then again, we previously tried to explain how light could propagate through a postulated medium called the luminiferous aether. Hopefully the LHC experiments are not going the way the aether experiments did.