
Undefined behavior isn't a feature. A spec isn't an implementation, either.

All behavior in an implementation can be teased out if given sufficient time.

> if you were an alien trying to reconstruct it, for example - you'd hit serious problems.

I can't speak to alien minds. Considering the feats of reverse-engineering I've seen in the IT world (software security, semiconductor reverse-engineering) or in cryptography (the breaking of the Japanese Purple cipher in WWII, for example), I think it's safe to say humans are really, really good at reverse-engineering other human-created systems from close to nothing. Starting with documentation would be a step up.



Yes, humans are incredible at reverse engineering. My point was about specification, and what happens if you have only a specification and no implementation. Because that's more closely analogous to the mathematical situation, where you're manipulating the specification.

You said:

> because programs can't work without being specified.

… what I think you may have meant was "can't work without being implemented", because your subsequent comments are all about implementation.

> Undefined behavior isn't a feature

Yes it is, it's a feature of the C specification.

This is where a whole load of pain and insecurity comes from: as you say, implementations must do something when they encounter undefined behavior, so people learn what usually happens in practice; then an improvement is made to the optimizer, and the implementation's behavior changes.


> All behavior in an implementation can be teased-out if given sufficient time.

Can it? Given what? You would need to understand how the CPU is supposed to execute the compiled code to do that. In order to understand the CPU you would need to read the manual for its instruction set, which is written in human language and hence not any better defined than math. At best you get the same level of strictness as math.

If you assume you already have a perfect knowledge of the CPU workings, then I can just assume that you already have perfect knowledge of the relevant math topic and hence don't even need to read the paper to understand the paper. Human knowledge needs to come from somewhere. If you can read a programming language manual then you can read math. Every math paper is its own DSL in this context with its own small explanations for how it does things.


> Every math paper is its own DSL in this context with its own small explanations for how it does things.

That's really the point though: not every piece of software defines its own DSL, nor does it necessarily incorporate a DSL from some library or framework (which in turn may or may not borrow from other DSLs, etc.). It is also impossible to incorporate something from other software without actually referencing it explicitly.

Math, though, is more like prose in this respect. Any given novel probably has a lot of structure, terminology, and notation in common with other works in its genre, but unless it is extremely derivative it almost certainly has a few quirks and innovations specific to the author, or even unique to that particular work, that you can absorb while reading or puzzle out from context, provided you accept that the "context" is quite a lot of other works in the genre (this is more true of some genres/subfields than others). Unlike novels, math papers (but not necessarily books) do give you explicit references to the other works the author considered most relevant, but those references are not usually sufficient on their own, nor necessarily complete, so you have to do more spelunking or happen to have done it already.

Finally, like prose, with math you have to rely on other (subsequent) sources to point out deficiencies in the work, or figure them out on your own. Math papers, once published, don't usually get bug fixes and new releases; you're expected to be aware, from the context that has grown around the paper post-publication, of what the problems are. Which means reading citations forward in time as well as backward for each referenced paper. The combinatorial explosion is ridiculous.

It would be great if there were something like tour guides published that just marked out the branching garden paths of concepts and notation borrowed and adapted between publications, but textbooks tend to focus on teaching one particular garden path.


> It is also impossible to incorporate something from other software without actually referencing it explicitly.

No, some programming languages just inject symbols based on context. You'd have to compile the code with the right dependencies for it to work, so without them it is impossible to know what a given symbol is supposed to be.

And even if they reference some other file, that file might not even be present in the codebase; instead some framework says "fetch this file from some remote repository at this URL on the internet", and then it fetches some file from the npm registry, which could be a different file tomorrow for all we know. This sort of time variance is non-existent in math, so to me math is way more readable than most code.

And you have probably seen a programming tutorial or similar that uses library functions that no longer exist in modern versions, tells you to call a function that lives in a library the tutorial forgot to mention, or hits many of the other things that can go wrong.


> some programming languages just inject symbols based on context

Well, okay, yes, not all software projects deliver reproducible builds of their software. Some software is, in fact, complete garbage.

I'm also not using Gene Ray's TimeCube theory[0] as an example of a mathematical paper.

> This sort of time variance is non-existent in math

Not... entirely. You could cite a preprint that then changes in the final version.

> And you have probably seen a programming tutorial or similar that uses library functions that no longer exist in modern versions

Sure. And cited papers can be retracted entirely.

[0] https://web.archive.org/web/20070718050305/http://www.timecu...


> Well, okay, yes, not all software projects deliver reproducible builds of their software. Some software is, in fact, complete garbage.

And not all math papers are properly documented either. Some math papers are, in fact, complete garbage. Why are you complaining about an entire field just because some of it is garbage?


Why are you ignoring the fact that I specifically said I wasn't basing my criticism on the worst examples I could find?


All meaning of math notation can be teased out if given sufficient time.



