Sudden Progress on Prime Number Problem Has Mathematicians Buzzing (wired.com)
201 points by lelf on Nov 22, 2013 | 74 comments


When I ordered a set of three foot-long chicken parmesan sandwiches from Subway a few weeks ago, the guy behind the counter said, "Because prime numbers are fundamentally connected with multiplication, understanding their additive properties can be tricky."

I was confused at the time, but it makes sense now.


Two points:

- In math, an innovative "attack" (a new technique for approaching a problem) can spark years of subsequent research that moves a field forward. Zhang deserves a lot of credit for his accomplishment!

- He shouldn't have had to work at Subway. If only our society held intelligent people in the same regard as athletes. What we have right now (hoops to get tenure, publish-or-perish culture, etc.) is a joke. I review papers for some notable conferences in my subfield of CS. This week I read about a dozen papers. The ones that pained me the most were written by very smart people solving made-up, artificial problems. But hey ... you need to publish this crap to graduate/get tenure/get the next grant/whatever. How about a basic income for everybody with a PhD?


I don't know if athletes are really a good model to follow here.

If intelligent people were treated like athletes, some would indeed benefit enormously. But it's not all roses. That "rat race" is still there. Intelligent people would be using their smarts to get money for education (just like now!) but the relationship would become much more exploitative, with the schools potentially making millions off of them without the student seeing any of that.

Afterwards, a small number of the most intelligent would have a shot at making millions, but most would never make it: many would have to find careers elsewhere, and the luckier ones would lead inglorious lives in small towns making little money, always hoping to make it big but seeing that chance fade year after year.

Even those who are good enough to make millions have it thrust upon them with almost no preparation, as they get massive contracts based on their potential. Most of them, despite being intelligent, are not good with money and end up squandering their wealth.

And of course none of them are able to keep their positions for very long, since it's a young man's game. They get ejected, used and abused while still fairly young, and if they haven't managed their money well then they are also broke.

Not that I'd at all mind being in Michael Jordan's position, but my point is that those are the exception rather than the rule, and despite the overall high regard for them, trying to be a professional athlete is not generally a particularly great proposition. Compared to academia, where you generally expect to reach a position where you'll have a good income until you die, or if you fail in academia you at least have useful skills that people will pay you for in the outside world, the world of sports seems downright cruel.


"How about a basic income for everybody with a PhD?" How about a basic income for everybody?

I am always amazed at how unsuccessful PhDs are in life. Myself included.


Why? Existential crisis?


A doctorate in most fields is proof of a degree of general smartness and an ability to work rigorously in the field. It doesn't necessarily have anything to do with good judgement, luck, diplomacy, social skills, practical skills, or any of the thousand other things that contribute to success in life.


To do a PhD you have to be smart enough to be able to do one, and dumb enough to actually do one.


In my opinion it's rather: smart enough, and desperate enough, to actually do one.


Yes, something like that. At least I am not alone. PhDs are a dime a dozen. Even friends with programming skills can't find jobs...


1. He was a math professor before his proof.

2. He wasn't a cashier, he worked on the business side of a franchisee.


He was a lecturer, not a professor. In universities, lecturers are considered the bottom of the totem pole, probably even lower than desktop support staff, as the latter are often unionized.


I will hazard a guess that making his living at Subway helped his math endeavours. He did not have to endure the stress of a corporate or academic rat race, and nobody claimed to own his intellectual products.


As a PhD, I get the thought. But then I have to wonder: why stop there?

There are a lot of lazy and functionally useless PhDs out there.


> If only our society held intelligent people in the same regard as athletes.

I suspect you're comparing top-paid athletes with low-end PhDs, which might not be a fair comparison - who gets paid more, Michael Jordan or the CEO of Genentech? It might be fairer to compare a low-end PhD with some college high-jump athlete. So keep that in mind when I use the labels "intelligent" and "athlete".

Why? Intelligence is an inward quality. What good does it do to have 100,000 smart people? Athletes, however, are valued for their outward quality - they entertain us with their skill. They provide marginal utility for a huge chunk of society; intelligence, in and of itself, only provides marginal utility for the self.

So, unless you're some sort of Ayn-Rand individualist, really the athlete does do more good (for the collective) than the intelligent, and therefore is compensated more.

Note this is from a qualitative standpoint. Athletes are somewhat absurdly highly compensated, and there are reasons for that, but it shouldn't be surprising that athletes are compensated more than the intelligent.

Obligatory David Wong quote:

Let's say that the person you love the most has just been shot. He or she is lying in the street, bleeding and screaming. A guy rushes up and says, "Step aside." He looks over your loved one's bullet wound and pulls out a pocket knife -- he's going to operate right there in the street.

You ask, "Are you a doctor?"

The guy says, "No."

You say, "But you know what you're doing, right? You're an old Army medic, or ..."

At this point the guy becomes annoyed. He tells you that he is a nice guy, he is honest, he is always on time. He tells you that he is a great son to his mother and has a rich life full of fulfilling hobbies, and he boasts that he never uses foul language.

Confused, you say, "How does any of that fucking matter when my (wife/husband/best friend/parent) is lying here bleeding! I need somebody who knows how to operate on bullet wounds! Can you do that or not?!?"

Now the man becomes agitated -- why are you being shallow and selfish? Do you not care about any of his other good qualities? Didn't you just hear him say that he always remembers his girlfriend's birthday? In light of all of the good things he does, does it really matter if he knows how to perform surgery?


> There are a lot of chances in your career, but the important thing is to keep thinking

I hope this quote gets picked up and cited throughout future history


reminds me of the Feynman Algorithm [1]:

    1. Write down the problem.
    2. Think real hard.
    3. Write down the solution.
[1] http://c2.com/cgi/wiki?FeynmanAlgorithm


It's very easy to forget the first step :-)



And the second...


That's actually the tricky bit, because often you're just thinking really hard about the fact that you're thinking really hard about the problem.


Or thinking about the wrong attack, and giving up. Or just thinking hard about how to define something... Or... Research is hard.


And the third can be a real strain on the wrist.


Yes, and rewriting is the real pain, actually.


What is it about this quote that makes you feel it should be cited throughout future history? Just curious.


It seems some people think being reminded to think is profound.


Because there is always the danger of forgetting to think and getting lost in a thoughtless state.


Previous discussion here: https://hackertimes.com/item?id=6766097


    Previous discussion here
272 points, 93 comments, 3 days ago.


I'm confused by the "Admissible Combs" sidebar. Can someone help me understand?

"Roughly speaking, a comb is admissible if there is no obvious reason why its teeth couldn’t point entirely to primes infinitely often as you move it along the number line"

Then: "A much more audacious conjecture called the prime k-tuple conjecture ... posits that any admissible comb will point entirely to primes infinitely often."

Isn't this just saying the prime-tuple conjecture states that admissible combs are admissible?


Admissibility doesn't say that a comb's teeth will point entirely to primes infinitely often, it merely says (as you quoted) that there's no obvious reason its teeth won't point entirely to primes infinitely often.

I suspect that "obvious reason" is being used as shorthand for some technical definition the article doesn't try to state. Any mathematician here that knows more?


Sure. {n, n + 2, n + 4} is a 3-tuple that is not admissible: one of n, n + 2, and n + 4 will always be divisible by three, so the tuple can never consist of three primes for any single value of n. (Except for n = 3, which gives {3, 5, 7}.)

Conversely, {n, n + 2, n + 6} is an admissible 3-tuple. There's no fixed integer that will always divide at least one of n, n + 2, and n + 6, so we believe that there are infinitely many n for which n, n + 2, and n + 6 are all prime.
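
Here's a quick Python sketch of that test (my own illustration, not from the article). A comb is inadmissible exactly when, for some prime p, its offsets cover every residue class mod p, because then one tooth is always divisible by p; only primes p <= len(offsets) can possibly be covered:

    def small_primes(limit):
        # Primes up to limit by trial division (fine for tiny limits).
        return [n for n in range(2, limit + 1)
                if all(n % d for d in range(2, int(n ** 0.5) + 1))]

    def is_admissible(offsets):
        for p in small_primes(len(offsets)):
            if len({o % p for o in offsets}) == p:
                return False  # every residue class mod p is hit
        return True

    print(is_admissible([0, 2, 4]))  # False: residues mod 3 are {0, 2, 1}
    print(is_admissible([0, 2, 6]))  # True: residue 1 mod 3 is never hit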


And so the claim is that there is no less obvious reason why a comb might point entirely to primes only finitely often - that the obvious obstruction is the only one. That is an audacious claim :-).


Not being able to understand the intricacies of this proof, I can take away but one message. Genius can come from anywhere, even from an academic who was only able to find work at a Subway sandwich shop. So be open to good ideas - they can come from anybody, anywhere.


And to top it all off, this Mathematician was in his late fifties when he published his breakthrough proof!


It's not the age, it's the marital status that affects productivity the most.


It certainly can't hurt to not be distracted -- Erdos was famously asexual, and it seems to have turned out well for him! [1][2]

[1] http://www.jrhenry.net/johnrhenry/erdos.htm [2] http://www.nndb.com/people/401/000032305/


Zhang is married.


I find myself downright inspired as a family man.


You should feel depressed instead. In men, creativity is destroyed by marriage: http://www.abc.net.au/science/articles/2003/07/11/900147.htm


I'm very appreciative of the statistical concept of outliers.

As Hume would point out, is is not ought.


Were you being sarcastic? I've heard this before, and the usual explanation is that once someone gets married/has a family of their own, they simply have to devote much less time to their profession/'hobbies' (like math) because of time constraints, and then they just start degrading due to lack of practice, etc.

One of those times it was brought up specifically in reference to a well-known mathematician. The takeaway was that getting into a relationship, or even married, automatically disqualifies you from achieving anything of great importance in the future... it made me pretty sad that it's such a common pattern of thought.


I'm not being sarcastic. Marriage and family gave me reason and impetus to get my crap together and focus. I am easily distracted by sophomoric pursuits; now I choose hobbies and whatnot that I can share with the kids. One we really enjoy is kite building.

But then, I started my family early. I didn't have much of an established pattern of productivity before. So my baseline comparison may be way off.


Are you a mathematician?


Economist.


Does this mean as much as I think it does? Aren't most of our current cryptography schemes based on prime numbers taking time to calculate?


No.

Finding random primes is easy, especially if you don't mind using methods which allow an extremely tiny probability of error [1] -- this is standard practice in cryptography.

Finding random big primes p and q is easy. Multiplying them together into a product r = pq is easy. The RSA cryptosystem picks p and q when generating the key, multiplies them, then throws away the original p and q values. Only r is stored in the public key.

Cracking the RSA algorithm itself [2] is basically equivalent to turning r back into p and q.

RSA is based on the hardness of factoring an integer into a product of primes. I don't think this conjecture will lead to practical improvements in cryptography / cryptanalysis, at least not with respect to cryptosystems that are currently widely used.
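
To make the key-generation step concrete, here's a toy Python sketch (my own illustration, with toy sizes and a hand-rolled Miller-Rabin test; real implementations use vetted libraries, bigger primes, and padding):

    import random

    def is_probable_prime(n, rounds=40):
        # Miller-Rabin: never wrong about composites it rejects; a composite
        # survives all rounds with probability at most 4**-rounds.
        if n < 2:
            return False
        for p in (2, 3, 5, 7, 11, 13):
            if n % p == 0:
                return n == p
        d, s = n - 1, 0
        while d % 2 == 0:
            d, s = d // 2, s + 1
        for _ in range(rounds):
            a = random.randrange(2, n - 1)
            x = pow(a, d, n)
            if x in (1, n - 1):
                continue
            for _ in range(s - 1):
                x = pow(x, 2, n)
                if x == n - 1:
                    break
            else:
                return False  # a proves n composite
        return True

    def random_prime(bits):
        # Set the top bit (right size) and the low bit (odd), then test.
        while True:
            n = random.getrandbits(bits) | (1 << (bits - 1)) | 1
            if is_probable_prime(n):
                return n

    p, q = random_prime(512), random_prime(512)
    r = p * q  # the public modulus; p and q are discarded after key generation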

[1] http://en.wikipedia.org/wiki/Primality_testing#Probabilistic...

[2] Of course, there are many more attack vectors on specific implementations of the RSA algorithm, like trying to gain information on the secret key by measuring time taken to encrypt a message, measuring heat/power consumption/electronic noise, taking advantage of implementations that use a poor source of randomness, hacking into the system where the private key is stored, or bribing/threatening a human with legitimate access to the private key (as covered in http://xkcd.com/538/).


> Cracking the RSA algorithm itself is basically equivalent to turning r back into p and q.

This has not been proven. RSA encrypts by taking C = m^e (mod N). The normal way of decrypting is to take C^d (mod N), where d can be efficiently computed if you know the factors of N. However, all you need to do to decrypt C is to take the e-th root (mod N). It has not been shown that the ability to do so allows you to factor N.

In fact, it appears (after a few minutes of research, so someone please correct me if I'm wrong) that there is evidence that breaking RSA is easier than factoring [1].

[1] http://crypto.stanford.edu/~dabo/pubs/papers/no_rsa_red.pdf


"Finding random big primes p and q is easy"

For those wondering why, then, it is news when a new largest prime is found: there are gradations of 'big'. The big primes used in cryptography have fewer than 1000 bits, or about 300 digits. That is not what one (nowadays) calls 'large' in the field of finding the largest prime. There, one laughs at numbers of a million digits (the largest known prime has 17,425,170 digits).


Also, for a practical engineering problem, like writing an RSA implementation which will be used in an SSL implementation which will encrypt credit card numbers on an e-commerce website, it is okay to use probabilistic tests which have a tiny chance of a false positive (the test claims a number is prime when it is actually composite).

You can generally repeat a probabilistic test with different parameters to make the probability of a "false positive" prime number very tiny -- like .000 000 000 001 -- while still staying within the computational budget manageable on an ordinary computer (obviously, a practical Internet security scheme should be able to complete its computations in a second or two on a typical desktop, laptop, server, and smartphone).
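
For a sense of the arithmetic (assuming a Miller-Rabin-style test, where each independent round lets a composite slip through with probability at most 1/4):

    # Worst-case chance that a composite survives k independent rounds:
    for k in (1, 5, 10, 20):
        print(k, 0.25 ** k)
    # 20 rounds already gives ~9.1e-13, below the .000 000 000 001 above.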

So probabilistic tests are good enough for practical purposes, but you need a deterministic test to get the false-positive probability all the way down to zero, and that's what's required to claim a world record. The EFF-administered monetary prize for a record-breaking prime number, for example, requires that "the primality proof must be a deterministic proof..." [1] So you can't use probabilistic methods.

[1] https://www.eff.org/awards/coop/rules


I know (or think) that most record primes these days are Mersenne primes due to the relative ease of testing their primality. What I wonder is whether probabilistic tests could be used to find candidates for very large non-Mersenne primes before testing them using AKS.

Or is Õ(log^7.5(n)) just infeasible for those sizes of numbers?
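
Back-of-the-envelope (my own rough estimate, ignoring the constants and the log factors hidden in the Õ): for the 17,425,170-digit record prime, log(n) alone is so large that AKS looks hopeless; Mersenne records use the Lucas-Lehmer test instead.

    import math

    digits = 17_425_170               # the record prime mentioned upthread
    bits = digits * math.log2(10)     # log2(n), about 5.8e7
    print(f"{bits ** 7.5:.1e}")       # ~1.7e58 "operations"
    # Even at 1e9 ops/sec, that's ~5e41 years, give or take.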


You run into problems, though, if you randomly select one of the same prime factors as someone else: https://www.hnsearch.com/search#request/all&q=gcd+public+key


The fact that mathematicians have refined the theoretical distance between pairs of primes has precisely no effect on the difficulty of factoring large composites. Those are completely unrelated problems.


I would imagine that proving they are completely unrelated would be a significant result in and of itself.


I think this is a completely fair point. Things in mathematics are astoundingly interconnected and you never know whether a seemingly benign proof will turn out to be the key to unlocking something much more pivotal.


Proving the absence of something lies in the nether regions of science and mathematics. Consider the problem of proving the absence of Bigfoot -- it can't be done. This is a well-understood logical problem with a storied past:

http://en.wikipedia.org/wiki/Russell's_teapot


Some of our current cryptography schemes are based on it taking time to factor large numbers, but this is not the same thing as finding prime numbers.

The basic principle is that if I take two large prime numbers, P1 and P2, and give you their product, Q, then it will take you a very long time to learn P1 and P2 unless I give you one of them.

It is actually very important that the hardness of that problem does not come from the difficulty of finding prime numbers. Which numbers are prime is a question that is independent of breaking any particular key, which means that if that were the source of hardness, you could reuse the work from breaking one key to break future keys, by storing a list of the prime numbers you calculated. This would be a Bad Thing.

There are a lot of misconceptions, in fact, about hardness, prime numbers, and factorization. People often assume that primality testing and factorization are equivalent, and that both are NP-complete. In fact, PRIMES is known to be in P, and (the decision version of) integer factorization is probably not NP-complete (it is in NP by way of PRIMES being in P), although it is not known whether it is in P.


What it means is that, although on the whole consecutive prime numbers tend to have larger separations as the magnitude increases, you will still be able to find consecutive primes with smaller separation sizes. The current proof puts that limit at 600.

The twin prime conjecture posits that the limit is 2, i.e. you can never exhaust pairs of consecutive primes with the smallest possible separation size.
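
To see it concretely, here's a sketch (assuming sympy is available). Near 10^12 the average gap is only about ln(10^12) ≈ 28, so this finds a pair almost immediately; the theorem's content is that such pairs never run out no matter how high you start:

    from sympy import nextprime

    n = 10 ** 12              # pick any starting height you like
    p = nextprime(n)
    while True:
        q = nextprime(p)
        if q - p <= 600:      # the proven bound; twin primes would mean 2
            print(p, q, q - p)
            break
        p = q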


No.

If someone discovers a shortcut in factoring very large numbers, then it will have the attention of cryptographers.


Since counting numbers are so regular, I've always wondered why finding primes is so hard. Then again, I have no hair, so using combs is a mystery.


Related question: what are the open problems with prime numbers? Not that I'd ever solve them, but I was having fun using machine learning algorithms to try to predict the next prime number in a sequence and it was kind of interesting.


- Goldbach's conjecture (http://en.wikipedia.org/wiki/Goldbach%27s_conjecture)

- Goldbach's weak conjecture (http://en.wikipedia.org/wiki/Goldbach%27s_weak_conjecture) - Harald Helfgott claims that he proved it

- Is factoring in P? (It is known that primality testing is in P; thus factoring is in NP ∩ coNP.) If it were, RSA would be broken immediately (and an expert I talked to told me he believes such a factoring algorithm could probably be extended to break ECC, too)

- Riemann hypothesis (http://en.wikipedia.org/wiki/Riemann_hypothesis)

- Does every even number n arise as a difference of two primes, i.e. do there exist primes p > q with n = p - q? (Quick empirical check below.)
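
A sketch of that check (mine, assuming sympy; a finite search proves nothing about the conjecture, of course):

    from sympy import primerange

    LIMIT = 10 ** 5
    primes = list(primerange(2, LIMIT))
    pset = set(primes)

    # For each even n, look for a prime q with q + n also prime;
    # then p = q + n gives p > q and n = p - q.
    for n in range(2, 1000, 2):
        assert any(q + n in pset for q in primes), f"no pair for {n}"
    print("every even n < 1000 is a difference of two primes below 1e5")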


What I find surprising is that there's an online tool for math collaboration (github for maths?), and that it turns out to be rather useful.


Are they still at 600?



No. That abstract says that the bound is 12 if the Elliott-Halberstam conjecture holds. Under that assumption, the bound has been 16 since 2000. The trick that brings the number down from 16 to 12 under that assumption is the same one that brings the proven number down to 600. This is actually described in the linked Wired article.


Ah. Thanks for the clarification.


Holy cow. You could almost say (naively I suppose), "They're almost there!" Of course, who knows if the distance between 600 and 12 is infinitely easier than the distance between 12 and 11. Still ... Progress!


Also from the article: "Conceivably, Maynard said, someone with a clever sieve idea could push this limit as low as 6. But it’s unlikely, he said, that anyone could use these ideas to get all the way down to a prime gap of 2 to prove the twin primes conjecture." So seems like something completely different is needed to actually prove the twin primes conjecture.



Could someone explain this in layman's terms? What does it mean to have infinitely many pairs of primes separated by 600?


You have a good reply, but I'd like to give another answer with a slightly different approach.

Pick a number. Make it as large as you like. A billion to the power of a billion to the power of two billion? Just pick something.

Now whatever you pick, I can then proceed to find two prime numbers beyond it which are at most 600 apart. Maybe they're just slightly larger than the number you picked. Maybe they're vastly larger. But the point is, they exist.

We can repeat the procedure with you picking larger and larger numbers and I can always find a suitable pair that's bigger than what you pick.

(In theory. In practice, we may get to a point where it takes more time to locate the actual pair than we have time before the universe burns out or collapses. But we can prove that they exist, at least.)


If the original claim is true, it means that the maximum numerical separation between primes is 600, meaning primes may sometimes be separated by a smaller gap, but no two primes will be separated by a larger gap, a numerical distance greater than 600. What this means is, if there are no gaps between primes a and b greater than 600 (i.e. |a-b| > 600), if all prime pairs have a smaller gap, then given the size of the set under consideration, it follows that there are an infinite number of pairings {a,b} in which their separation is 600 or less (|a-b| <= 600).

Also, in mathematics, finding an example of a property isn't very difficult. The difficulty lies in proving a theorem that makes a universal statement about that property -- that's the real challenge.

If that answer wasn't clear, don't hesitate to ask a clearer question.


I don't think this is right.

There are certainly prime gaps that exceed 600. In fact, the higher up you go (on the number line), the larger these prime gaps tend to get.

http://en.wikipedia.org/wiki/Prime_gap

What's being said here is that despite that, there will always be prime gaps of at most 600. No matter how high you go, you can always find a pair of primes that are separated by at most 600. In other words, pick the biggest N you can ever dream of, and there will exist primes p1 and p2 such that they are both bigger than N and their difference is at most 600.


Yes, you're right, and it's too late for me to either delete or edit my original post. :(

In light of that, I think the discussion and work revolves around discovering the smallest gap as the numbers themselves become larger and tend toward infinity. Obviously the very smallest prime gap is that between 2 and 3, i.e. 1, and there are a great number of primes separated by 2, and the Twin Prime Conjecture asserts that there will always be occasional pairs of primes separated by 2 no matter how large the numbers themselves become. The present work is in part meant to put that conjecture on a more analytical footing.

I would love to delete my original post.



