How much longer would it have taken if they'd had to use BSD?
No longer, because software was not the bottleneck.
The hard part about finding a Higgs boson is designing the gigantic particle accelerator and the detectors, then getting the funding to build them, then building them, then running them for N hours and analyzing the data. These things took decades. They started in the 1980s, years before Linux.
I'd contend that manpower is the bottleneck, and time spent writing and debugging software is time taken away from designing and performing experiments. How much time, exactly, depends on the precise software.
I'm not saying it's on the scale of years, but that's what I'm opening up for discussion. It's certainly plausible, given the sheer scale of work that's gone into the development of Linux, that there may be features that researchers would have had to recreate at great effort and expense had Linux not existed.
You're opening it up for discussion with the implication that somehow Linux provides superior tools. However, despite the strong implication, you're unsure of what those could actually be.
This is a leading line of discourse -- you're strongly implying that Linux permitted more rapid development, but you don't appear to have any supportable justification for why that would be the case.
It has been noted by others (in the article, for example) that Linux is the undisputed king of high-performance computing, in the public sector at least. My only assumption is that that is not random, that there are reasons for it.
I'm not trying to lead the discourse anywhere -- I'm certainly not a Linux fanatic, if that's what you're implying. I just think it's utterly plausible that Linux, and the open source community in general, saved these people a whole bunch of time.
I would imagine having an Open Source kernel is an advantage for very niche applications like this because you can add/change features that would be useless or detrimental to 99.9999% of users but would be useful in this instance.
I have no idea whether that was part of the reasoning for this or not though.
> My only assumption is that that is not random, that there are reasons for it.
It's complicated, more so than can be adequately addressed here, and far beyond a simple question of Linux specifically "saving [people] a whole bunch of time".
Haven't kept up, but 20 years ago, when I shared a building with these guys, the computers were the bottleneck.
The original design for ATLAS had a sampling system where only random blocks of data could be analysed, simply because the computers (and buses and memory) couldn't keep up -- so the data analysis became a sort of Monte Carlo process as well. And this was in spite of huge and very impressive arrays of custom FPGAs and Transputers (in the 90s).
Perhaps, but we were discussing software. Specifically, "Linux" vs "some other operating system".
I agree that advances in computing must surely have a lot to do with the discovery of the Higgs, but such advances are about hardware, not operating systems. Since the 1990s, billions of dollars, both public and private, have been spent to keep Moore's Law rolling forward, and that's most of what makes modern computers more capable.
Anyway, it occurs to me that this discussion is a classic bikeshed. Rather than debate tricky and obscure issues like the quantum mechanics behind the Higgs phenomenon, the engineering of supercolliders, the politics of getting supercolliders funded, the ins and outs of data processing algorithms at CERN, or the techniques of manufacturing transistors with a 20nm gate size, we have fallen back to debating whether or not the license used for the operating system was vitally important.
If CERN had had to implement a Unix entirely from scratch in order to do their job, they would have done so. It would have been a minor side issue. Indeed, from a certain point of view, that may be exactly what they did. Why discuss how Linux was vital for CERN's scientists, but not the other way around? Perhaps the reason why Linux was so good for high-energy physics is that high-energy physicists built it that way?
Read the Wikipedia page on the Compact Muon Solenoid. There were some significant computational challenges involved, including how to (pre)process and store terabytes of data per second. These are challenges that push the envelope at every layer of computing, from hardware to the OS and application levels. It's not an informed proposition to say that the OS was some replaceable commodity with an unimportant role -- that's like saying Linux is unimportant at Google or Amazon. The ability to tune the kernel was definitely crucial.
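To give a concrete sense of what "tuning the kernel" means at this level (these values are purely illustrative, not CERN's actual configuration), even before you patch any kernel code, Linux exposes sysctl knobs that matter for high-rate data acquisition:

```shell
# Hypothetical sysctl tuning for a high-rate data-acquisition node.
# Values are illustrative examples, not CERN's real settings.

# Enlarge the maximum socket buffer sizes so bursts of data from the
# detector front-ends are absorbed rather than dropped.
sysctl -w net.core.rmem_max=268435456
sysctl -w net.core.wmem_max=268435456

# Lengthen the per-NIC input backlog queue to survive sustained
# multi-gigabit packet rates.
sysctl -w net.core.netdev_max_backlog=250000

# Start disk writeback early so buffered data streams out steadily
# instead of stalling in large bursts.
sysctl -w vm.dirty_background_ratio=5
sysctl -w vm.dirty_ratio=15
```

And when the knobs aren't enough, you can modify the kernel itself -- which is exactly the option a closed-source OS wouldn't give you without vendor cooperation.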
I would also question whether 20+ years of development on a private UNIX clone would have produced a better result than Linux. There are plenty of private UNIXes, but scarcely a convincing argument that a single one is better than Linux in some meaningful way.
Personally (my PhD is in experimental physics) the interesting bit is all the vacuum pumps, magnet control, beam dumps, control systems as well as the detectors.
Whether some semi-imaginary particle fits the parameters of some semi-imaginary theory is pretty uninteresting ;-)
I found that once you get past the technicalities and return to a bird's-eye view, interactions among elementary particles follow almost naturally and look deceptively simple [0], and the postulate of the Higgs boson's existence seems like one† of the obvious solutions explaining mass.
Once you get back down into the trenches and have to define it properly in theory, though, I'm positively convinced it is another matter entirely in terms of complexity.
† Then again, we previously tried to explain how light could propagate through a postulated medium called the luminiferous aether. Hopefully the LHC experiments are not going the way the aether experiments did.