Rather poorly, though. It takes several numbers for which we have no idea of a reasonable guess and multiplies them together. The best we can say about the resulting product (pretty much) is that it lies in the interval (0,1], which is just what the definition of a probability already guarantees. We then multiply this by several other (somewhat) arbitrary constants to get a final number.
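The point about the product can be sketched with a quick simulation (a hypothetical helper, not from the original; the factor count and guess ranges are arbitrary assumptions):

```python
import random

def product_of_guesses(n_factors=5, trials=10_000):
    """Multiply n_factors random 'guessed' probabilities per trial.

    Whatever values we pick, each product lands somewhere in (0, 1] --
    the same bound any probability satisfies by definition.
    """
    products = []
    for _ in range(trials):
        p = 1.0
        for _ in range(n_factors):
            # arbitrary guesses, bounded away from 0 so the product stays positive
            p *= random.uniform(0.001, 1.0)
        products.append(p)
    return products

products = product_of_guesses()
# The only guarantee: every product lies in (0, 1].
assert all(0.0 < p <= 1.0 for p in products)
# The spread, though, spans many orders of magnitude,
# which is the sense in which the estimate is uninformative.
print(min(products), max(products))
```

Scaling each product by further arbitrary constants just rescales this spread; it does not narrow it.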