Hacker News | new | past | comments | ask | show | jobs | submit | login

Hmm, one might think there's a trend in how almost none of the 'programmable programming languages' catch on. Lithp, Rebol, Forth, Nemerle, this one. Perhaps it has something to do with readability being more important than writeability? Me, I find macros and implicits repulsive. Give me Zig over LX any day (Zig purposefully contains no macro facility).


They are less popular for the same reason that Go is popular: if every project is its own DSL, it becomes more difficult for newcomers to ramp up on a new project. That’s exactly what Go is good for.


> Hmm, one might think there's a trend in how almost none of the 'programmable programming languages' catch on. Lithp, Rebol, Forth, Nemerle, this one.

Uh, Lisp caught on quite well. Rebol was held back by coming out at a time when being open source (or at least having an open spec) was expected of a language, and it wasn't opened up until most of the interest had passed. Forth, I gather, was pretty popular, but fell off its peak even before Lisp did.

Of course, while not via Lisp-style macros, pervasive metaprogramming was a key distinguishing feature of both Python and Ruby, which both reached fairly intense levels of popularity. And of course Perl...

> Me, I find macros and implicits repulsive.

And it's fine that that is your taste, but it's clearly not as universal as you would like to pretend.


None catch on? Depends on how you define "programmable".

Lisps are still here (think Clojure). Rust has macros. New-style C++ is mostly about metaprogramming.


Now I'm curious, what is it about macros you find repulsive?

And what's an 'implicit'?

I would trade real macros for almost any other language feature, as having them means being able to patch over whatever is missing.

The more powerful a language is, the less popular it will be. It couldn't be any other way, since only people with enough experience appreciate such languages and feel comfortable dealing with them.


What I find repulsive is that macros are a micro-optimization for the author but a huge destroyer of readability. Consider a simple macro like "if-not" - just an inverted if. In a language without macros, the maintainer can be sure that there is only one construct like if. The eyes can skim over code, easily understanding its semantics without reading into it. Now consider the mere possibility that some ifs are if-nots: suddenly the mere ifs become a minefield for the reader. And if LX claims the ability to e.g. modify the semantics of X.Y - it's going to be an unreadable mess.
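For concreteness, here is roughly what such an inverted-if could look like as a Rust `macro_rules!` macro (Rust is just a stand-in here; the `if_not!` name and shape are hypothetical, not something from LX):

```rust
// Hypothetical `if_not!` macro: runs the block only when the condition
// is false. A two-line convenience for the author, but every reader now
// has one more control construct to keep in mind.
macro_rules! if_not {
    ($cond:expr, $body:block) => {
        if !($cond) $body
    };
}

fn main() {
    let logged_in = false;
    if_not!(logged_in, {
        println!("please log in");
    });
}
```

The expansion is trivial, which is exactly the point being argued: the construct buys the author almost nothing while adding one more thing every reader must recognize.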

My point is, the difficulty of reading code grows not just with the complexity and size of the code, but also with the size of the syntax. In a language with no macros, the latter is fixed and you get a linear dependency. In extensible PLs it's a quadratic one with an unknown second factor, jeesh! I think the approach of a more powerful but almost fixed language (e.g. Haskell, where macros are hard and mostly used to generate boilerplate definitions) is better than a small but extensible language with every programmer in a contest of who makes a wittier and sleeker new control structure that is like the builtin ones but SO much cooler...


I don't know if you are aware of that, but at the time Pascal started emerging, the exact same arguments were presented by folks used to BASIC: that having the ability to add your own "words" to the language would inevitably lead to a mess.

Why didn't it happen? Because programmers overall are reasonable folks. So, yes, in theory, you can define a factorial function and call it "Not_a_Cat" (technically, it is true), but nobody does that.

Similarly, giving the ability to define `X in Y..Z` makes code more readable if you use it right, not less.
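Even without user-extensible syntax you can sketch what such a construct desugars to; here is a Rust approximation (the `within!` name and comma-separated syntax are made up for illustration - `macro_rules!` cannot place a literal `in` after an expression fragment):

```rust
// Hypothetical `within!` macro approximating an `X in Y..Z` membership
// test; it desugars to the standard inclusive-range `contains` check.
macro_rules! within {
    ($x:expr, $lo:expr, $hi:expr) => {
        ($lo..=$hi).contains(&$x)
    };
}

fn main() {
    let day = 15;
    // Reads almost like the prose version: "day in 1..31".
    println!("{}", within!(day, 1, 31));
}
```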


There are languages like Go intended to write large systems by large teams and languages like Forth that are better suited for small software written by a single programmer.

There's no need to choose between Zig and LX. Use the one that better fits the problem. If for your case that's Zig, good for you.


To be honest, the real issue with LX / XL / Tao3D is that I never ever really "finished" any implementation. By the time things were working, I had new ideas and broke things again. So except for Tao3D, I never really delivered a version of the language that was solid enough.

I'm trying to work on that now, which is why I'm presently spending so much time on documenting where I want to go. The idea is that if I first write a "spec", then a community can form to implement it, and it won't depend on just my own (limited) time.

So if you are interested in helping right now, I'd say: read the doc (in progress) and send comments / doc fixes.


Zig has CTFE which takes the same place as macros. It doesn't need macros because it has macros.
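Zig itself isn't shown here, but the idea can be sketched with Rust's compile-time evaluation (`const fn`) - a rough analogue, not a claim about Zig's actual comptime syntax:

```rust
// Compile-time function evaluation: `factorial` is an ordinary function,
// but the constant below forces the compiler to run it at build time,
// much as a macro-generated table would be baked into the binary.
const fn factorial(n: u64) -> u64 {
    if n <= 1 { 1 } else { n * factorial(n - 1) }
}

const FACT_10: u64 = factorial(10); // evaluated by the compiler

fn main() {
    println!("{}", FACT_10);
}
```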


Is there an example of a Zig CTFE function that, e.g., parses your whole Zig library and generates new code from it?


Java has compiler plugins, annotations, reflection and aspect oriented programming as metaprogramming tools.

.NET has T4 templates, attributes, reflection, expression trees, and F# adds code quotations to the mix.


T4 barely counts, as it's string- and not AST-based. For metaprogramming to be useful, IMHO it really needs to be part of the language and base toolchain. I certainly don't want to write code in Java, f'rinstance, that needs a random compiler plug-in to weave in AOP calls, etc. But in Lisp, Prolog (and now Elixir, etc.), macros are part of "how stuff works".


By that logic, languages like D don't have metaprogramming capabilities either, if being AST-based is considered a requirement.


F# quotations are nice, but it's really too bad that .Compile() doesn't use the actual compiler tool-chain and creates super-slow code. I understand due to history why it doesn't, but it certainly would be easier if F# used the Roslyn tool-chain.


The other important consideration is that most language users are not good language designers!

This strategy is kind of "punting" a lot of design work onto the user, which has a lot of downsides.


Ruby is still one of the most popular languages (especially for beginners, for whom readability is extremely important), and it was intended as a simple Lisp.

Metaprogramming doesn't make code less readable.


I am not convinced by your conclusion that metaprogramming does not make code less readable. My personal experience speaks otherwise.

Metaprogramming has a lot of "magic" associated with it which makes code hard to debug and understand sometimes.

You cite the success/popularity of Ruby as an example of why metaprogramming does not make code less readable. The popularity of a programming language does not mean that the language does not have significant warts. There are cultural and historical factors that make a language popular. Availability of web frameworks (e.g. Rails) is another powerful influence.

For the most part, Ruby code _is_ understandable/readable if you avoid metaprogramming. Once you have a large codebase and advanced frameworks that make heavy use of metaprogramming, you can quickly get lost. There are a lot of "effects at a distance", i.e. one library changes something, and that breaks something far away and potentially not even connected. Would you still say your code is readable? No, because it runs in a totally unexpected manner.

My theory is that beginners decided to learn Ruby because it was easy to start with and had a lot of powerful frameworks (e.g. Rails). The learning curve is gentle. But pretty soon, when you start writing advanced Ruby or using advanced frameworks, things can get complex.



