
Because mathematics is a very dense language; building in redundancy does not scale when you're reading mathematics papers.


Would it make sense to say "building in redundancy does not scale when you're reading source code"? Not to me it doesn't.


That's a false equivalence. Mathematics, in practice, is more of a domain-specific language that encapsulates more and more precise information as you increase the level of abstraction. The levels of abstraction layer upon one another to be able to handle more complex concepts while retaining all of the underlying precision.


> Mathematics, in practice, is more of a domain specific language that encapsulates more and more precise information as you increase the level of abstraction

Isn't that the point of any DSL? Also, wasn't the advent of DSLs meant to make it easier for non-programmers to understand the flow of control in a system?

I think generally the idea was to make things easier to understand by factoring out the complicated bits (implementation) and leveraging abstraction to provide clarity on top of an implementation.

If this is the goal of every DSL that I and every other CS major design, why should it not be the goal of every math major? Sure, to you and many other math-centric people these symbols are "how I learned it so it must be the best," but to everyone else they're difficult to grasp and hard to understand.

The lingo is half the battle for most things but for math it seems like the lingo is the entire battle.


> The lingo is half the battle for most things but for math it seems like the lingo is the entire battle.

There's a kernel of truth there.

To abuse the DSL metaphor further, even mildly complicated mathematics concepts are only expressible using layer upon layer of ever-more-abstracted DSLs. Lower-level expansions of the building blocks to improve clarity for reading would be useless in practice, since mathematics is intended to be written and understood by humans and such an expansion would exceed our ability to keep the entire intended concept in our working memory.

Proficiency in mathematics is proficiency with understanding and extending the lingo itself. Mathematics /is/ the lingo, which is why it can't be unraveled from it.

>"how I learned it so it must be the best"

Not at all. It is, however, a very practical approach to getting someone up to speed to be able to do actual mathematics (as opposed to the basic symbol manipulations that most people confuse with actual mathematics.)


You don't need to read or understand (most of) your source code because it is 90% libraries. Mathematics is built based on understanding all the concepts behind a subject, and it's written to express to another human clear rigorous thought.

You cannot just black box everything away and choose to ignore it the way programmers do with source code and libraries.

The metric for success with a program is "does the machine run it properly". The metric for success with a piece of mathematics is "do people who've done some reading in my field understand my new result". You can cheat in one but not in the other.


> You don't need to read or understand (most of) your source code because it is 90% libraries

That's just plain incorrect. I expect all code to be written to standards that would allow anyone to read it.

> Mathematics is built based on understanding all the concepts behind a subject, and it's written to express to another human clear rigorous thought.

Isn't abstraction designed to avoid this? To remove a solution from a problem domain? To work on it without integral knowledge of all its applications? If not, then all of Computer Science has been doing it wrong.

> You cannot just black box everything away and choose to ignore it the way programmers do with source code and libraries.

We don't do that. We black box things that aren't related to the problem at hand by referring to them in a terse and understandable way. For instance, if I want to use a fork join scheduler for parallel thread scheduling, I don't need to write out all the code for a fork join scheduler. I just need to say "You probably know what this is; if not, it's called a fork join scheduler and I'll interact with it via sending messages of run, runAll, etc." and be done with it. I won't use an obscure Greek letter to notate a Fork-Join, as that's archaic and insane.
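To make the black-box idea concrete, here is a minimal sketch in Python. The class name `ForkJoinScheduler` and the `run`/`run_all` messages are hypothetical, chosen only to mirror the comment above; a real implementation would lean on the standard library's `concurrent.futures`, as this sketch does internally.

```python
# Hypothetical fork-join interface, sketched for illustration only.
# ForkJoinScheduler, run, and run_all mirror the names used above;
# the actual work is delegated to the stdlib's ThreadPoolExecutor.
from concurrent.futures import ThreadPoolExecutor


class ForkJoinScheduler:
    def __init__(self, workers: int = 4) -> None:
        self._pool = ThreadPoolExecutor(max_workers=workers)

    def run(self, task, *args):
        # Fork a single task; the caller joins by calling .result().
        return self._pool.submit(task, *args)

    def run_all(self, tasks):
        # Fork every task, then join them all, preserving order.
        futures = [self._pool.submit(t) for t in tasks]
        return [f.result() for f in futures]


scheduler = ForkJoinScheduler()
squares = scheduler.run_all([lambda i=i: i * i for i in range(5)])
# squares == [0, 1, 4, 9, 16]
```

The point is the same as in the comment: a consumer only needs the message names and their meanings, not the scheduler's internals.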

> The metric for success with a program is "does the machine run it properly". The metric for success with a piece of mathematics is "do people who've done some reading in my field understand my new result". You can cheat in one but not in the other.

The KPIs for software are as follows: functionality, maintainability, simplicity, in that order. There is no single goal of software but instead a multifaceted set of goals constantly working against each other, yet CS people manage to strike an even balance.

I don't see how this relates to deriving new results from existing ones. I don't see why math needs 40 non-terminal characters to type out a proof. I don't see why it can't be written in a way that is more approachable while not detrimental to the productivity of current professionals. I also find it unfathomable that someone would think that the current way of doing things is, all together, the best way of doing things, because that turns a blind eye to the other two things I mentioned: "maintainability, simplicity".

I wouldn't call any of the advanced math world "simple" or "maintainable".


>That's just plain incorrect. I expect all code to be written to standards that would allow anyone to read it.

So you only work on open source hardware, with open source operating systems and open source drivers and open source software? And you've read the code for the libraries you use of course? Please. Unless you're Stallman you're not really assessing how much black boxing you are actually doing.

>We don't do that. We black box things that aren't related to the problem at hand by referring to them in a terse and understandable way. For instance, if I want to use a fork join scheduler for parallel thread scheduling, I don't need to write out all the code for a fork join scheduler. I just need to say "You probably know what this is; if not, it's called a fork join scheduler and I'll interact with it via sending messages of run, runAll, etc." and be done with it. I won't use an obscure Greek letter to notate a Fork-Join, as that's archaic and insane.

You do do that. And have you ever really read any mathematics at all? It is literally exactly what you describe: "I am going to use a fiber bundle here, I'll call it (A,B,pi,F), you probably know what this is..." I guess you just hate Greek? The notation is terse and completely obvious to anyone who has the faintest idea about the mathematics they are reading. Letters are not used to describe abstract ideas (those have English names); they are used to express specific variables, just like you do in CS. It's hard to type α on an English keyboard, or programmers would call things alpha as well. The letter notation is essential to express your ideas clearly. The subject material in advanced mathematics is usually much more difficult, abstract, and complex than writing code, so the notation looks more difficult. That's all.

It isn't simple. Because what it's describing isn't simple. It is maintainable because it's easy to read once you understand it. Understanding is the hard part.

Programmers always think they can "fix math" by swapping notation around. They don't realize that it's as good as it can be until they actually start doing some real mathematics. Notation is really not a problem. It's a tiny minimal bump along the road that massively simplifies the entire endeavor and tries to get out of the way of actually understanding the mathematics, which is the difficult part.

The only objection you're raising that has merit is that a lot of the symbols aren't on a keyboard. Which is why we have TeX. Which is the digital language of mathematics. \pi is really not hard to type.


> So you only work on open source hardware, with open source operating systems and open source drivers and open source software? And you've read the code for the libraries you use of course? Please. Unless you're Stallman you're not really assessing how much black boxing you are actually doing.

An operating system is an operating system. A database is a database. A text editor is a text editor.

I don't need to know exactly how someone's solution is implemented to understand the concepts, and I've written at least one piece of toy software for everything I do on a day-to-day basis. I've done CPU graphics rendering, GPU graphics rendering, web parsing, low-level kernel/OS stuff, a lot of things. I'm not an expert. I'm still a student, and as such I am and always will be learning. That doesn't mean I'm not a Computer Scientist, and it doesn't mean I can't open up documentation on any part of my entire system and get a working facsimile from the ground up. Not to say it will be as performant, of course (now that I'm in a computer organization course I realize just how horrible my scribbles for a CPU I wanted to build are: no pipelining, and everything was going to be hooked up to a huge bus).

Basically, yes anything computer related is something I can sit down and make work. Hardware or software I can design something that will probably work to a degree. Will it work well? No. Will it be the exact same implementation as someone else? Of course not. Will my implementation function the same as whatever I'm replacing? Absolutely.

The black boxing is made in such a way that with basic documentation you can construct in your mind how said box may function.

> I guess you just hate Greek?

No I just like standard keyboard keys.

> The notation is terse and completely obvious to anyone who has the faintest idea about the mathematics they are reading

The key here is "who has the faintest idea about the mathematics they are reading". It's not meant for outsiders. It's meant for an in-group, not for anyone who wants to learn math.

> Just like you do in CS. It's hard to type 'alpha' on an English keyboard or programmers would call things alpha as well

Not anyone who would ever work for, or with, me. I'd replace that code in a hot minute. Anything past a toy you're never going to use again NEEDS more than "alpha" or "a". Naming is the most important thing in CS; the runner-up is abstraction.

No one worth their salt would ever seriously use single letter variable names outside of maybe the following list:

   i, j: Loop variables. Holdovers from FORTRAN.
   x, y: Talking about a position in a matrix or in space. 
Those are the only 4 I'd ever allow in a professional code base I was getting paid for.

> The subject material in advanced mathematics is usually much more difficult, abstract, and complex than writing code, so the notation looks more difficult. That's all.

Oh so it's just that mathematics is hard and computer science is easy. That makes sense.

This elitism is part of the issue. I think mathematics is hard, but I've worked on many complicated projects and none of them really seem complicated. We work as a field to tame complexity by understanding a problem domain and abstracting away the unimportant implementation details.

It all just turns into operations on data that are (sometimes) sequential, no matter if it is data processing and rendering for an oscilloscope-like system, optimizing networking protocols and writing your own stack over UDP to control the ordering of non-essential packets, or even writing huge orchestration systems that operate on generated statistical models to predict future actions that need to be taken. The simplicity comes from an understanding of how to create interfaces for the complex that are simpler than what they are representing.

> It isn't simple. Because what it's describing isn't simple. It is maintainable because it's easy to read once you understand it. Understanding is the hard part.

It seems like most of the time, when a colleague explains something to me, I can completely understand the concept, but I have no time for the fodder of inefficient notations that are adored by mathematicians.

> Programmers always think they can "fix math" by swapping notation around.

I hardly think I can "fix math". I want someone to fix math so I can learn it and then join in on the party. I want to learn it. It's just overly complicated.

> They don't realize that it's as good as it can be until they actually start doing some real mathematics

You're crazy if you think the current way of doing things can't be improved. Improvements are always happening in my field, and I learn and adapt as they are brought to me. Mathematicians have been stuck with an attitude of "it's different and I don't like it".

> It's a tiny minimal bump along the road that massively simplifies the entire endeavor and tries to get out of the way of actually understanding the mathematics, which is the difficult part

For you it's a tiny bump. For someone who is attempting to learn outside the classroom for their own interests, it's the only difficult part. I can go to Khan Academy and learn about pretty much any college-level topic. When I go to take a class at my college (I need to take 3 math courses) I can't learn anything. So much archaic nonsense and a lack of good explanations for newcomers.

> The only objection you're raising that has merit is that a lot of the symbols aren't on a keyboard. Which is why we have TeX. Which is the digital language of mathematics. \pi is really not hard to type.

\pi tells you it's a variable; \pi does not tell you why it's a variable. I have no idea how to convey that meaning, as I am not in the "cool kids club" that you and your colleagues have built for yourselves.


To clarify: mathematics talks about objects with increasing amounts of density; as you proceed up the abstraction stack into increasingly abstract topics, you need to be able to efficiently compress the topics you're already supposed to have mastered.

Mathematics is for experts that have mastered the lower levels of the abstraction layer. You cannot study any subfield of math without having understood the basic definitions of the field. No amount of syntax rewriting is going to change that; those definitions are not going anywhere.


"To clarify: software talks about objects with increasing amounts of complexity; as you proceed up the abstraction stack into increasingly abstract topics, you need to be able to efficiently compress the topics you're already supposed to have implemented."

What I'd expect to follow here would be a statement about how your abstractions should be terse and verbose in the right ways. Terse in the implementation specific handling and verbose in the explanation of your abstract solution.

"Software is for experts that have mastered the lower levels of the abstraction layer. You cannot study any subfield of systems programming without having understood the basic implementation of the underlying software. No amount of syntax rewriting is going to change that; those definitions are not going anywhere."

I don't see how that statement would make any sense. I expect to be able to give anyone with the most basic understanding of computers a piece of any code (from assembly to high-level Lisp code), from widely used software like the Linux kernel all the way to stock-trading applications, and I expect them to understand it by looking at it for a few seconds, reading the comments, and reading the names of the variables and functions.

If the average person cannot do that, then it's bad code. Understandability is the only point of programming: to be understood by the computer and by a maintainer.

I'd never say that obtuse notations for an abstraction should stay around. I'd replace the obtuse notation with a better abstraction.


> I expect them to understand it by looking at it for a few seconds, reading the comments, and reading the names of the variables and functions.

Mathematics also comes with comments; many papers essentially consist of one giant comment for the proof in the appendix. Granted, many variable names could be a bit less terse, but it's just easier to write "let f be an arbitrary function A -> B" once instead of using arbitraryFunctionMappingTheSetOf{DefinitionOfElementsOfA}toTheSetOf{DefinitionOfElementsOfB} every time.
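The trade-off described here maps cleanly onto typed code: you state the signature of f once, then use the short name freely. A small Python sketch (the names `compose`, `A`, `B`, and `f` are illustrative, not anything from the thread):

```python
# "Let f be an arbitrary function A -> B": in typed code the signature
# is declared once, after which the one-letter name is unambiguous.
from typing import Callable, TypeVar

A = TypeVar("A")
B = TypeVar("B")
C = TypeVar("C")


def compose(f: Callable[[A], B], g: Callable[[B], C]) -> Callable[[A], C]:
    # Having stated what f and g are, the short names stay readable.
    return lambda x: g(f(x))


double_then_inc = compose(lambda x: x * 2, lambda y: y + 1)
# double_then_inc(3) == 7
```

The signature plays the role of the "let f be ..." sentence; after it, single letters carry no ambiguity.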


Or there could be a standardized notation for expressing functions, and the concept of mapping could be derived from taking a function and applying it to a set of elements.

Function and Map are different concepts that should not be described together.

In CS I'd just say

   B = map(some_function, A)
B is the Mapping of `some_function` over the elements in A.


>Function and Map are different concepts that should not be described together

Not in this context. Assuming you are using `map` in the CS sense, B is what is called the "image" of some_function (over the domain A).

In math, a function is a mapping.

Consider the type of `some_function`. In C-like notation, it would be `foo some_function(goo x)`, which indicates that some_function takes an argument of type `goo` and returns a value of type `foo`. In other words, some_function maps Goo to Foo (where Goo and Foo represent the sets of all values of type goo and foo).
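A short Python sketch of the distinction (`some_function` and `A` are the placeholder names from the comments above): the image of A under some_function is exactly what map computes, element by element.

```python
# The "image" of a set A under a function: apply some_function to
# every element of the domain and collect the results.
def some_function(x: int) -> int:   # a mapping from int to int
    return x * x

A = {-2, -1, 0, 1, 2}               # the domain
B = {some_function(x) for x in A}   # the image of A under some_function

# map() in the CS sense computes the same values element by element:
assert B == set(map(some_function, A))
```

Note that the image collapses duplicates: squaring the five elements of A yields only the three values {0, 1, 4}, which is why "image" and "map" are related but not identical concepts.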


That's all fine, but surely there is a middle ground between 'single letter variables or bust' and 'parody of enterprise Java'. It wouldn't kill anyone to use short words instead of re-using phi.


It makes sense to me, although perhaps at a different level --- IMHO redundancy is bad in source code in general, and if you base your development methodologies and practices on assuming that it's not, you end up with things like Enterprise Java and C#: huge, eye-bleedingly verbose monstrosities to perform the simplest of tasks.


Just as with everything there is a sweet spot. I personally feel some things need to stay verbose while some things are implementation and not inherently needed. Before college, I didn't do any python. After touching the language because of a class I like some of the things they do. For instance:

    for i in range(10):
        print(i)
This is a great way to show a way to print 0-9 to stdout. This is great. Contrary to this, you have a more c-like method of writing this.

    for (int i = 0; i < 10; i++) printf("%d\n", i);
This is the same code, but we have a lot more stuff here that conveys a lot of meaning. This is fine by me. The construct is dense, but not overly dense. It is simple because once you see it the first time, you will understand it, and it will always make sense to you no matter how many years pass. I'm fine with this extra verbosity.

On the other side of the spectrum you have this:

    list(map(print, range(10)))
This is far fewer characters (the list() call is needed in Python 3, where map is lazy and would otherwise print nothing), but it's more difficult to understand. There is more to take in if you've never done any functional programming. This sort of dense solution is frowned upon in many cases, especially since it rarely scales efficiently in the sense of cognitive load. I did all of my CS labs using map/reduce formulations for CS100, and it did nothing but make things harder and drive a large appreciation for expanding statements into my heart.


That map is doing an impure operation. The order of application matters.

Otherwise, for people who learned functional programming from the start, it's much easier to understand.


I find the map solution simpler than the for loop as there is no loop variable.


It's simpler, much simpler, in this context. It's not simpler when you scale up the implementation being used. After you scale it up it stops being simpler and starts being `clever`.

   "Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it." − Brian W. Kernighan


Dense and high-level. Like programming languages. You take a group of machine operations and represent those with an operator in a programming language. You take a group of programming statements, expressions, etc and represent them with a procedure. Group some procedures together into a category (or perhaps a 'class' of items.)

Take a few lines of COBOL, compile it, and how much actual machine work does that represent? Lots. Very dense.



