
Using the largest number factored as a benchmark for progress in quantum computing is like evaluating floor(f(0)) = 0 and floor(f(1)) = 0 and concluding f(x) = 0. You can't distinguish f(x) = 0 from f(x) = x/2 from f(x) = (e^x)/3 when your test is too coarse.
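
To make that coarseness concrete (reading the last function as (e^x)/3), here's a toy check along those lines, sketched in Python:

    import math

    # Three very different growth curves...
    candidates = {
        "zero":        lambda x: 0,
        "linear":      lambda x: x / 2,
        "exponential": lambda x: math.exp(x) / 3,
    }

    # ...that a too-coarse benchmark cannot tell apart.
    for name, f in candidates.items():
        print(name, [math.floor(f(x)) for x in (0, 1)])

    # zero [0, 0]
    # linear [0, 0]
    # exponential [0, 0]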

If you want to see the progress in quantum computing today, pay attention to component reliability. For example, in 2014, the best rep code run on a quantum computer had a 1% error rate per round [1]. In 2024, it was 0.00000001%, and it had become possible to run full quantum codes with a 0.2% error rate per round [2]. If it takes another decade for that 0.2% to go down by a factor of a thousand, then you've got lots of time. Maybe you'll be dead before progress exceeds the coarseness of factoring. If it takes a year instead of a decade (because the cost of error correction is frontloaded) then, well, pay appropriate attention to that.
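
To put numbers on that pace (taking the figures above at face value, and assuming purely for illustration that the improvement were smooth):

    import math

    # Rep-code error rate per round, as fractions: 1% in 2014, 0.00000001% in 2024.
    rate_2014, rate_2024 = 1e-2, 1e-10
    years = 2024 - 2014

    # Implied average improvement factor per year, assuming a smooth trend
    # (an assumption: the comment notes the cost of error correction is frontloaded).
    per_year = (rate_2014 / rate_2024) ** (1 / years)
    print(f"~{per_year:.1f}x per year")              # ~6.3x

    # At that pace, a further 1000x reduction of the 0.2% full-code rate takes:
    print(f"~{math.log(1000, per_year):.0f} years")  # ~4 years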

[1]: https://arxiv.org/abs/1411.7403

[2]: https://arxiv.org/abs/2408.13687



I think the author was just poking fun at the idea that the 15 and 21 factoring results can be used to indicate some sort of progress. I think the graph makes that obvious.

Last I heard we were 1-2 orders of magnitude away from a physical qubit error rate low enough for error correction to have a chance of creating a threat against cryptography. So things sound pretty unlikely at the moment. We need a fundamental breakthrough.


Is 1–2 orders of magnitude a lot or a little here? With exponential progress that could be a few years; with stalled progress that’s maybe never.


Even if that's a few years, that's the progress until we get 1 (useful) qubit. We're then another 3 orders of magnitude from being able to factor a number that can't be factored on a classical computer. I don't think it's impossible, but I do think it's very much in the ~2 decades away stage.
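
As a back-of-the-envelope check on that estimate (taking the thread's figures at face value and assuming a purely hypothetical sustained pace of one order of magnitude every 3 years):

    # 1-2 orders of magnitude to a useful error-corrected qubit (upper end taken),
    # then ~3 more orders of magnitude of scale to threaten classically hard factoring.
    oom_to_useful_qubit = 2
    oom_to_crypto_scale = 3
    years_per_oom = 3  # assumption, not a measured trend

    print((oom_to_useful_qubit + oom_to_crypto_scale) * years_per_oom, "years")
    # 15 years, the same ballpark as "~2 decades away"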


Not into the field, but how far are we from factoring 35?


Not an expert, but IIUC current quantum computers probably could "factor" numbers in the ~10k range (though doing so wouldn't be that interesting, since they could only do it without the error correction that will be essential to making Shor's algorithm work in practice)
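
For context on what the machine actually has to do, here's a minimal sketch of Shor's algorithm for N = 35, with the quantum period-finding step replaced by classical brute force (that subroutine is exactly the part that needs error correction to work at scale):

    from math import gcd
    from random import randrange

    def order(a, n):
        # Classical stand-in for the quantum subroutine: the multiplicative
        # order of a mod n, found by brute force.
        r, x = 1, a % n
        while x != 1:
            x = (x * a) % n
            r += 1
        return r

    def shor_classical_part(n):
        while True:
            a = randrange(2, n)
            d = gcd(a, n)
            if d > 1:           # lucky guess: a already shares a factor with n
                return d, n // d
            r = order(a, n)
            if r % 2:           # need an even period
                continue
            y = pow(a, r // 2, n)
            if y == n - 1:      # trivial square root of 1, try another a
                continue
            return gcd(y - 1, n), gcd(y + 1, n)

    print(shor_classical_part(35))  # (7, 5) or (5, 7)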


Maybe Shor’s algorithm is just a dead end.



