Anyway, I'd argue competition is tougher not because there are more competitors (there might be, I don't know) but because they have advanced.
Compared to 15 years ago, Python has gained a lot of stuff, JavaScript has become an OK language with very fast implementations, there are a dozen functional scripting languages targeting browsers, optional typing has made a comeback, Ruby has native threads, a decent VM, and a large ecosystem, Go is a valid solution for things where you might have used a scripting language, Lua owns the embedded space and LuaJIT is the state of the art, etc.
In 1998 it was enough to have closures and objects to be a potential perl replacement.
In 1998 you would definitely have needed closures and objects to replace Perl, because Perl had already had both since 1994.
I was using closures to great effect in Perl before I even knew what a closure was. (Are closures supposed to be a difficult or extraordinary feature?)
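Closures aren't exotic at all once you've seen one; the thread's point holds in any language that has them. A minimal sketch in Python (not Perl, but the idea is identical): the inner function captures a variable that outlives the call that created it.

```python
def make_counter():
    # 'count' lives on after make_counter() returns, because
    # the inner function closes over it.
    count = 0

    def increment():
        nonlocal count
        count += 1
        return count

    return increment

counter = make_counter()
print(counter())  # 1
print(counter())  # 2
```

Perl closures work the same way: a sub that references a lexical (`my`) variable keeps that variable alive, which is exactly the kind of thing you can use "to great effect" without ever learning the word for it.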
I really find it funny when other languages do things that Perl has already been doing for years.
Go2 is basically copying the way Perl5 has been advancing without breaking backward compatibility. (They don't seem to realize this.)
JavaScript copied use strict.
PowerShell copied $_.
Everybody copied Perl extensions to regular expressions.
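Python's `re` module is one concrete example: its documentation explicitly describes its syntax as Perl-style. A quick sketch of two Perl-popularized extensions, non-greedy quantifiers and lookahead assertions:

```python
import re

# Non-greedy '*?'/'+?' quantifiers: stop at the first '>'
# instead of gobbling up to the last one.
m = re.search(r'<(.+?)>', '<a><b>')
print(m.group(1))  # a

# Lookahead '(?=...)': match 'foo' only when followed by 'bar',
# without consuming 'bar'.
assert re.search(r'foo(?=bar)', 'foobar')
assert not re.search(r'foo(?=bar)', 'foobaz')
```

Non-capturing groups `(?:...)` and inline modifiers came along the same path into nearly every modern regex engine.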
(I can't wait for them to copy the cleaned-up, easier and more powerful Perl6 regular expression syntax.)
Every feature that I've heard that Python3 changes directly correlates with a similar change that has happened in Perl5. (Only Perl5 has done them in a way to reduce breaking your old code.)
Those are just a few of the ones I know of.
Perl has been a far more influential language than anyone gives it credit for, though that is probably because people just use the features without knowing their history.
Eventually Perl6 will be just as influential.
---
I agree that optional typing is good, I use it every day in Perl6. (And have been for nearly ten years.)
You misunderstood me: I mentioned closures and objects as features Perl had that were also among the selling points of Ruby and Python at the time, languages that would be considered competitors to Perl.
The example was meant to show the "minimum" needed to be an interesting scripting language.
Edit: I love perl. I love perl6. I hope languages get inspired from it.
Mmh, depends on when you start to count. I still consider 1.5.2 the gold standard for Python -- but I am weird. :) Anyway, since then it has gained new-style classes (which gave explicit methods to things like strings, tuples and numbers), list comprehensions, generators (which also gained some changes over time like 'yield from'), generator comprehensions, built-in sets with new syntax, dictionary comprehensions, decorators, descriptors, properties, context managers and the 'with' keyword, absolute vs relative imports (again with new syntax), import hooks, a boolean type, extended slices, a ternary if, ordered dictionaries, async/await, extended tuple/list unpacking, the @ operator, several new ways of string formatting, ... And of course 3.x brought major changes to the way Unicode was handled. Never mind those, um, interesting type hints and the upcoming := operator.
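A handful of the features listed above, in one place, all post-1.5.2 additions (leaving out the `:=` operator, which the comment treats as still upcoming):

```python
# List comprehension
squares = [n * n for n in range(5)]

# Set comprehension (built-in sets with literal-style syntax)
evens = {n for n in squares if n % 2 == 0}

# Extended unpacking
first, *rest = squares

# Ternary if
parity = "even" if first % 2 == 0 else "odd"

# Generator function
def countdown(n):
    while n > 0:
        yield n
        n -= 1

print(squares)             # [0, 1, 4, 9, 16]
print(list(countdown(3)))  # [3, 2, 1]
```

None of this existed in 1.5.2, which is the commenter's point: each piece is small, but together they make a substantially bigger language.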
Python as it is now is a very different language than it was in those days. Even ignoring the whole ecosystem, with libraries and package management etc., and standard library changes, it's just a much bigger language.
I have been thinking that there might be room nowadays for a new language that is unabashedly a "scripting language"... something you can easily write scripts with, but which doesn't necessarily aspire to scale real far beyond that, so people will not immediately reach for it to write, say, 250K-line web applications. :3 (Then again, if a language is good/useful/flexible enough, people will use it for everything.)
Anyway, I would like to see such a language. The old stalwarts will probably still work just fine for this purpose, but I am curious to see if there's any innovation left/possible in this particular space.
Scripting languages were to the 2000s what visual 4GLs were to the 1990s: a whole bunch of them came and went, and only a few gained mass adoption, perhaps Visual Basic and Delphi among the 4GLs, and JavaScript and Python among the scripting languages. The rest are dead, or slowly dying.
Maybe that's because scripting languages are a bad idea for anything performance critical anyway - and experience has shown that even if things are not performance critical initially, they might become eventually.
So why would you write anything in a scripting language only to rewrite it later in a compiled language, if you could have used an equally capable compiled language in the first place?
> Maybe that's because scripting languages are a bad idea for anything performance critical anyway
This is an example of how programming as a discipline and activity is consistently dismissed by the software community as an unnecessary contrivance. "What about absolute performance?" But the reality is you're making tons of those tradeoffs. You're using a compiler, on an instruction set designed for humans, leaning on network stacks that do work for you in generic but sub-optimal ways, using container systems that introduce huge performance penalties.
Those are socially acceptable to you. They're special in that you think they aren't special.
> So why would you write anything in a scripting language only to rewrite it later in a compiled language, if you could have used an equally capable compiled language in the first place?
This assumes they are in fact equally capable. Please don't retort with Turing completeness. That's a copout and we all know you aren't about to go shipping things in BF.
Performance just isn’t the only factor, or always the most important one, when choosing a language. Sometimes we choose a language and its ecosystem for factors we know we value right now, rather than ones we might value later on.
Or they go the other way, after an initial assumption that the task was performance-critical when in fact a scripting language was perfectly capable and took 1/10 the effort to write.
All of the popular interpreted languages we use now were dog slow when they originally came out, and we still had to deal with many of the quirks of low-level languages back then, but here we are. If you don't like it, just don't use it.
This really depends on the task in question and which languages you were comparing it to. Back in the day it was fairly common to have Perl, PHP, Python, etc. outperform languages like C in cases where the task complexity meant that the C programmer had been so busy debugging that they never got around to implementing a better algorithm which required more complex code, or it turned out that the C code behind the Perl regex engine, Python list/dict, etc. already had those optimizations. Java and C++ were compiled but back then the implementations were far less mature and it was easy to find cases where they underperformed in cases which should have been easy on paper. Not having a package manager really favored languages with one or a strong standard library since the alternative was often someone doing a quick naive implementation and never getting around to significant optimizations. The other thing to remember is that in the pre-SSD era it was much easier to be I/O-bound, masking most of the differences.
Someone once posted Perl6 and C/C++ code to #perl6 on freenode.net
According to them, Perl6 was faster.
(The Perl6 code was also a lot shorter, and I could argue it was easier to understand.)
---
My guess is that the reason was that the C/C++ code had to scan for null terminators often, and copy strings around. (Or perhaps more precisely the stdlib had to do that.)
MoarVM doesn't use null terminated strings, and it treats strings as immutable objects.
If you do something like create a substring, it creates a substring object that points into the original string. So rather than copying a bunch of data it basically just creates a pointer.
(Strings are technically immutable in Perl6, it is easy to assume otherwise though.)
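MoarVM's internals aside, the zero-copy idea can be sketched in Python with `memoryview`; this is an analogy, not MoarVM's actual implementation. A slice of a memoryview is a view into the original buffer, not a copy, which is the same structural-sharing trick, and it also shows why immutability matters for making that trick safe:

```python
data = bytearray(b"Hello, world")

# Slicing a memoryview creates a view into the original buffer
# rather than copying bytes -- similar in spirit to the
# substring-as-pointer approach described above.
view = memoryview(data)[7:12]
print(bytes(view))  # b'world'

# Mutating the underlying buffer is visible through the view.
# This is exactly why immutable strings make structural sharing
# safe: nobody can change the data out from under a substring.
data[7:12] = b"WORLD"
print(bytes(view))  # b'WORLD'
```

With immutable strings (as in Perl6 semantics), the second half of this sketch simply can't happen, so a substring object can share storage with its parent indefinitely.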