It IS semantically correct. The number of functions from a set of 0 elements to a set of 0 elements is exactly 1. No parallel argument exists for division by zero. The two situations are not analogous at all.
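In case it helps to see that count spelled out, here is a quick sketch (the names and encoding are mine) that represents a function from an n-element set to an m-element set as an n-tuple of chosen outputs and simply enumerates them:

```javascript
// Count functions from an n-element set Y to an m-element set X by encoding
// each function as an n-tuple of values drawn from {0, ..., m-1}.
function countFunctions(n, m) {
  let tuples = [[]]; // the single empty assignment, before any choices are made
  for (let i = 0; i < n; i++) {
    const extended = [];
    for (const t of tuples) {
      for (let v = 0; v < m; v++) extended.push([...t, v]);
    }
    tuples = extended; // every tuple now has i + 1 coordinates
  }
  return tuples.length; // equals m ** n
}

console.log(countFunctions(3, 2)); // 8, i.e. 2 ** 3
console.log(countFunctions(2, 0)); // 0 -- no outputs available for the two inputs
console.log(countFunctions(0, 0)); // 1 -- the empty function, and nothing else
console.log(0 ** 0);               // 1 in JavaScript (and IEEE 754 pow) as well
```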
Mathematics is full of cases where the same notation means different things. You’ve kind of dodged the question of what 0^0 is by talking about set theory, because it is (in some sense) arbitrary that X^Y means Y -> X or its size (cardinal exponentiation), and in analysis may mean exp(y log x). We accept that for natural X,Y except (0,0) these must agree, but that does not imply that they should agree at (0,0), because the definitions are simply not the same.
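To make the “must agree” step explicit (my gloss): for a positive base the analytic definition collapses to repeated multiplication,

$$\exp(n \log x) = \bigl(\exp(\log x)\bigr)^{n} = \underbrace{x \cdot x \cdots x}_{n} \quad (x > 0,\ n \in \mathbb{N}).$$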
If you are serious about math, which it seems you are, I would be a bit more careful about “transporting” notation from one field to another and arguing about correctness.
The usual way to transport a definition from a discrete domain to a continuous one is a technique called analytic continuation. I am curious if there is an analytic continuation of the discrete X^Y which contains 0^0=1, and what that would look like, but at that point you’re definitely not talking about exponentiation any more.
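For what it’s worth, a jointly continuous (let alone analytic) extension to (0,0) is ruled out by the usual path computation:

$$\lim_{x \to 0^{+}} x^{0} = 1, \qquad \lim_{y \to 0^{+}} 0^{y} = 0, \qquad \lim_{t \to 0^{+}} \bigl(e^{-1/t}\bigr)^{t} = e^{-1},$$

so any single value you pick at the origin breaks continuity along some path.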
> it is (in some sense) arbitrary that X^Y means Y -> X or its size (cardinal exponentiation)
Are you referring to the notation? I don’t think notation is what’s being contested.
> in analysis may mean exp(y log x)
Doesn’t work with a base of 0.
> We accept that for natural X,Y except (0,0)
Why “except”?
> that does not imply that they should agree at (0,0), because the definitions are simply not the same.
The definition of exponentiation of reals typically starts with exponentiation of naturals as a given (see Baby Rudin).
> The usual way to transport a definition from a discrete domain to a continuous one is a technique called analytic continuation. I am curious if there is an analytic continuation of the discrete X^Y which contains 0^0=1
Analytic continuation refers to something else, but I get what you’re trying to say.
The answer is simple: Real exponentiation is an extension of natural exponentiation. Hence, it should have the same value as the latter wherever the latter is defined.
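Concretely, under the recursive definition I have in mind, the base case is an empty product and does not depend on the base at all:

$$b^{0} := 1, \qquad b^{n+1} := b^{n} \cdot b \quad (n \in \mathbb{N}),$$

which already gives 0^0 = 1 before limits or continuity ever enter the picture.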
Yes, I’ve read that SE question before. It’s good that you brought it up. I recommend reading the comments under the accepted answer and the other answers as well.
Mathematicians not working in type theory generally agree that 1/0 is undefined (or, if anything, they would set it to infinity, but even saying that would probably be too cavalier for most mathematicians). They do not necessarily agree that 0^0 is undefined. There are good algebraic and combinatorial reasons for 0^0=1, while analysts will complain about non-continuity.
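Two textbook instances of those algebraic and combinatorial reasons, just for concreteness: the binomial theorem

$$(x+y)^{n} = \sum_{k=0}^{n} \binom{n}{k}\, x^{k} y^{n-k}$$

only returns y^n when evaluated at x = 0 if the k = 0 term contributes 0^0 · y^n = y^n, and the power series e^x = \sum_{n \ge 0} x^n / n! only gives e^0 = 1 if 0^0 = 1.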
I think that’s missing the point. What I’m saying is “this is a convention, and because it is a convention, it may be reasonable to follow a different convention at times”. Different conventions are convenient for mathematicians working in different fields. Whether a convention is “semantically correct” is not a question that even makes sense, because when you define something, your definition is true by definition. The only real question here is whether you can reasonably expect people to understand you if you use different definitions.
This is a view of mathematics and programming that just completely ignores intuition. Of course, what you say is technically correct, yet mathematicians debate definitions all the time, precisely because it matters to them to get them “right” (of course, there’s rarely one true answer).
This is even more true in programming. Yes, you can also learn all the type conversion rules of JavaScript, but many people agree that they are insane because they violate the principle of least surprise. Expectations matter, especially when you're fixing a bug at 3am.
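A few of the classic offenders, in case anyone hasn’t run into them; these are genuine behaviours of `==` and `+`, easy to check in any console:

```javascript
// Loose equality is not even transitive once coercion kicks in.
console.log(0 == "0");   // true  ("0" is converted to the number 0)
console.log(0 == "");    // true  ("" is converted to the number 0)
console.log("0" == "");  // false (two strings, so no coercion, and they differ)

// The + operator falls back to string concatenation in surprising places.
console.log([] + []);    // ""                (both arrays become empty strings)
console.log([] + {});    // "[object Object]"
console.log(1 + "2");    // "12"              (the number is stringified)
```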
Intuition is mostly an internalized notion of convention, and I don’t think that we can appeal to intuition to resolve much.
For students learning division, the fact that 1/0 is undefined is not an intuitive result and must be learned. This is their first time encountering something which is “not defined”.
We agree that intuition is important, I think we disagree about where it comes from. Intuition comes from experience. A programming language is “intuitive” because it conforms to your previous experiences with programming languages.