I don't understand this argument at all. Of course it isn't making money yet -- it's an early technology that is still being researched. Sure, it might never mature, but it seems crazy to call it a "bubble" or to analyze it based on current sales figures.
That wasn't the argument made in the article. There are reasons to believe that the technology is fundamentally unsound, and will never be able to scale or make money.
"The simple reason for this is that despite years of effort nobody has yet come close to building a quantum machine that is actually capable of solving practical problems. The current devices are so error-prone that any information one tries to process with them will almost instantly degenerate into noise. The problem only grows worse if the computer is scaled up (ie, the number of “qubits” increased)."
The current mainstream view is that QC is a very hard but tractable engineering problem. There is no known fundamental reason why quantum error correction can't work, and a demonstration that it cannot would be a major and surprising breakthrough in fundamental physics. That has been our state of understanding for a couple of decades now, and the progress we've been seeing is consistent with it.
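For anyone curious why error correction is believed to work in principle, here's a minimal sketch using the simplest classical analogue: a 3-copy repetition code with majority vote. The quantum codes actually used (e.g. surface codes) are far more involved, but the below-threshold scaling logic is similar.

    # Sketch of the threshold idea, using a classical 3-copy repetition
    # code with majority vote as a stand-in for real quantum codes.
    def logical_error(p):
        # Majority vote fails only if 2 or 3 of the 3 copies flip.
        return 3 * p**2 * (1 - p) + p**3

    p = 1e-2  # assumed physical error rate, illustrative only
    for level in range(4):  # concatenate the code on itself
        print("level", level, "error rate:", p)
        p = logical_error(p)
    # Below threshold, each level of redundancy roughly squares the
    # error rate away; above it, encoding makes things worse. Getting
    # hardware below threshold is the hard engineering part.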
To be clear, there are people making sophisticated arguments for fundamental barriers to quantum computation. For instance, the quantum-skeptical mathematician Gil Kalai writes about his thoughts on recent progress in QC at (https://gilkalai.wordpress.com/2022/05/26/waging-war-on-quan...). His view is considerably more nuanced than this FT article's, and much more conducive to learning and discussion. I hope you take the time to read it, and I think I'll submit this to HN main as well.
As someone working on QC control systems, this matches my experience. We're fundamentally dealing with tough engineering problems, primarily on the hardware side. To be clear, these systems are very complex and rely on a multitude of software and hardware all working, and working well. I'm of course biased, but I lean towards a positive outlook on QC.
It's not clear or obvious that this and the following 5 paragraphs mean the technology is "fundamentally unsound" either. They just say there are big problems we haven't yet figured out how to work around.
Though it's true that we don't know how successful we'll be at developing this technology, and that there are fundamental properties we'll need to contend with (for better and for worse), this is entirely consistent with how "early technology" develops.
AFAIK, even "commercially oriented" quantum computing projects are better understood as being in a research stage at this time. And research, in general, feels daunting while you're doing it; it's not at all obvious that things are going to work. (My field is biochem.)
The problem is that none of this scales beyond toy systems with a handful of qubits. As soon as you try making it bigger, everything starts falling apart. I feel like this is a fundamental difference from digital logic, which is extremely easy to scale.
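A crude comparison under assumed error rates (illustrative numbers: modern logic gates fail somewhere below ~1e-15 per operation, while today's physical qubit gates sit around 1e-3):

    # How many operations before a 50% chance of at least one error,
    # under assumed per-operation error rates (illustrative numbers).
    import math

    for name, p in [("CMOS gate", 1e-15), ("qubit gate", 1e-3)]:
        max_ops = math.log(0.5) / math.log1p(-p)
        print(name, "~%.1e ops before coin-flip failure" % max_ops)
    # ~7e14 classical ops vs ~7e2 quantum ops: uncorrected quantum
    # circuits hit the noise floor almost immediately as they grow.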
It's not actually an early technology; it's been in development since the '80s. And if the underlying theories are unsound - if it doesn't even work in theory - then putting more money in won't make it magically viable.
1. Seems a bit unfair to say it has been in development since the '80s. In the '80s a couple of people (e.g. Feynman) noticed that if you have a quantum simulator, you can simulate chemistry in a way that a classical computer is seemingly incapable of. But the transmon (one of the first possibly viable implementations of a qubit) was not developed until the '00s, and complete control of some of these systems (e.g. a transmon coupled to an oscillator, in order to make a memory) was not demonstrated until the 2010s. Lifetimes of quantum memories have also been growing exponentially for more than a decade (a trend that started in the '00s).
2. It is worth mentioning that by the standards of your comment, the time between conceiving of a classical computer (Babbage) and a scalable electronic computer (ENIAC and family) was about a century.
3. While there might ultimately be a "quantum winter" in the next few years because we (I work in the field) overpromised, this would not be the first time a tech that ultimately works gets disregarded for a decade or two because of mismanaged expectations (e.g. liquid crystal displays or neural networks, both of which were developed for decades before becoming commercially viable).
EDIT: And yes, there are some startups with misleadingly general pitches.
Liquid crystal displays were commercially viable 10 years before they became commercially available.
Source: my dad worked for a company that made display tubes for CRT manufacturers all over the world. He told me back in the seventies that they knew how to make TVs you could hang on the wall like a picture, but weren't making them because they were sitting on a lucrative market.
If the output side is saturated but the input side keeps ballooning without significantly changing the output, or at worst affects it negatively (quantum blockchain buzzword bingo), it may be fair to speak of a bubble.