Hacker Times | new | past | comments | ask | show | jobs | submit | login
[flagged] Please do not use Python for tooling (borud.no)
20 points by bborud on Aug 16, 2022 | hide | past | favorite | 81 comments


What a bizarre take. It seems to boil down to: "Tooling should be compiled so it doesn't involve dependencies."

If we lived in a world in which most tooling (currently written as python scripts) were actually compiled, we'd be seeing better-reasoned essays begging us to write tooling as easily-editable scripts, rather than closed-off executables.

I have had scripts that I needed to edit just slightly almost every time I ran them. Having to involve a compile toolchain every time would have been terrible, and is exactly why I reached for Python in the first place.

I don't doubt the author has felt the pain of changing or missing Python dependencies. It seems likely they don't have a good understanding of pyenv or conda. The solution is not to choose something even worse: it's to write up a README on how to use conda!


The place where I run into Python is mostly embedded development, in the tooling for platforms like Zephyr and ESP-IDF. And those tools are notoriously brittle, which has a high cost in terms of lost productivity. So we aren't talking about scripts I maintain, but scripts that are part of the tooling for platforms I use.

The point is that it doesn't matter what a theoretical developer could do to lower the "cost of ownership" for their end users, but what is actually being done by actual projects. And in the embedded sphere, the situation is pretty sad. You will have to spend a few weeks every year trying to sort out the tooling.

And it isn't like you the user are supposed to hack the tooling. That's not part of the workflow. Or that it is vitally important that one can do quick changes to the tooling while working on it. It really isn't. It just has to work.

Why should the end user have to care about whether you use pyenv or conda? Or have to figure out how to make them play nicely with some code they didn't write and which they only want to run with a minimum of hassle?

It is kind of like demanding you understand and set the timing advance on your car's engine. Why on earth would you even want to know of it, much less fiddle with it?


I'm not sure about the problem of brittleness caused by buggy scripts you must rely on -- you'd get that in any language. But if the brittleness is caused by slightly different dependencies being installed for each developer or CI machine, I also recommend creating development environment setup and activation scripts that create a reproducible development environment (based on a requirements.txt generated via pip freeze, committed into the repo; or similar).

Even if the platforms you depend on don't do this, you can do it yourself. It won't fix their bugs, but it will make the development environment more deterministic.

I agree that not all embedded developers should have to deal with this setup daily or even monthly. Ideally, all they have to do is run two commands:

    ./scripts/setup-dev-env
    source ./scripts/activate-dev-env
However, embedded software development heavily relies on invoking build systems, compilers, analyzers and other tools -- calling project-specific programs and scripts. At least one person on each project should be familiar with setting these up for the first time, document their usage, and support others.
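Such a setup script doesn't have to be fancy. As a minimal sketch (the script name, the `.venv` location, and the assumption of a frozen requirements.txt at the repo root are all just conventions, not from any particular project):

```shell
#!/bin/sh
# scripts/setup-dev-env -- sketch of a reproducible environment bootstrap.
# Assumes only python3 on the base system; version pins come from a
# requirements.txt generated with `pip freeze` and committed to the repo.
set -eu

VENV=".venv"

# Create the virtualenv only on first run.
[ -d "$VENV" ] || python3 -m venv "$VENV"

# Install the exact pinned versions, if the project ships a freeze file.
if [ -f requirements.txt ]; then
    "$VENV/bin/pip" install --quiet -r requirements.txt
fi

echo "Environment ready. Activate with: . $VENV/bin/activate"
```

The activation script can then be little more than `. .venv/bin/activate` plus any project-specific environment variables.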


The end user of the software could do all that, yes. Or the tooling could have been done in a manner that didn't make it so brittle the user has to come up with mitigation strategies to keep things running.


As a professional python dev, I understand your pain. It took me a while to sort out the pyenv/virtualenv/conda stuff myself.

Could python be better in this regard? Yes, absolutely.

I wrote a comment a while back to help out someone else with python packaging. Once you get your mind around it, it's not too bad frankly.

https://hackertimes.com/item?id=32122191


The perspective I'm trying to shine a light on is what it is like to be the consumer of tooling written in Python - not the author.

I was a bit unclear in the title (which is what most people seem to have read, and then skimmed the blog posting), but the tooling I'm talking about is things like the `west` tool for Zephyr or the `idf.py` tool for ESP-IDF. Tools that are centrally maintained and distributed. Not the ad-hoc tools people use to do various chores in their codebases.

Both of the tools mentioned are notorious for being extremely brittle. Actually, as we speak, I'm trying to figure out why a project that worked before I left on vacation, and which has been untouched for several weeks, now suddenly doesn't build (because the tooling doesn't run). This is actually par for the course. You have to expect things to break for no obvious reason. My guess? I probably updated some other piece of software and it upgraded a common dependency.

I spend about 10-20% of my time dealing with this sort of nonsense.


It sounds like your main issue is that you don't isolate your dependencies into virtual environments. That involves jumping through a couple of hoops, sure (primarily just running `source ./venv/bin/activate` before you run the script), but for the most part it entirely resolves Python dependency problems.

Could Python's dependency management be less clunky? Absolutely. But ultimately, when it comes to the tools you mentioned, the vendor is the one responsible for not explaining proper Python dependency management in their documentation, or for not creating an ergonomic wrapper around it.


If you have issues like that, definitely look into virtualenvs and requirements.txt. It's meant to isolate python environments from breaking like that.


Tooling for embedded development? Please, back in my day you just didn't get any tooling; even the device programmer was something the manufacturer left you to make yourself.

> It is kind of like demanding you understand and set the timing advance on your car's engine. Why on earth would you even want to know of it, much less fiddle with it?

Developers are not end users; a technical understanding and capability is to be expected. You aren't talking about the tooling for a PC, you are talking about the tooling for a microcontroller.

A more accurate analogy would be a timing belt manufacturer expecting the car manufacturer to provide their own tooling to put the timing belt on their engine, in the same way a PIC manufacturer expects you to provide your own tooling.

The end user is whoever buys the device you build with the PIC, in the same way that the end user of a timing belt is the person who buys the car, not the manufacturer that builds the car.


It doesn't need to be compiled. The writer just says that a tool should be future-proof. And future does not mean 18 months, but 10 or 20 years. Python makes this rather hard, and has no real culture around this.

And it's true, Python code has a relatively high maintenance cost, and deployment is still one of the dominant complaints. But I would say this is just the cost that comes with the low barrier to entry and fast development.

There are other ways than compiling stuff, some are used today, some others are developing slowly. But at the moment Python is really not at a point where long-term availability is a big target.


Python is definitely not as easy to distribute as it should be by 2022, but I'm not ready to throw it out yet.

I think tooling authors need to look into packaging. Python actually has something similar to JARs, zipapps [0], although I have yet to see them in the wild. PyInstaller is a more well-known option.

[0]: https://docs.python.org/3/library/zipapp.html#creating-stand...
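For anyone who hasn't seen one, building a zipapp is a stdlib one-liner over a directory containing a `__main__.py` (the directory and file names below are made up for the demo):

```shell
# Lay out a trivial tool: a directory whose __main__.py is the entry point.
mkdir -p hello-tool
printf 'print("hello from a zipapp")\n' > hello-tool/__main__.py

# Real tools would vendor their dependencies in first, e.g.:
#   pip install -r requirements.txt --target hello-tool

# Bundle the directory into a single .pyz file with a shebang line.
python3 -m zipapp hello-tool -o hello.pyz -p "/usr/bin/env python3"

# The result is one file to ship; it runs on any python3.
python3 hello.pyz   # -> hello from a zipapp
```

The obvious caveat is that compiled extension modules are platform-specific, so this works best for pure-Python tools.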


I never knew about zipapps! After years of using Python I still discover new stuff!


What a bizarrely arrogant (condescending) way to formulate arguments.

"It is okay to feel provoked by this statement. As pointed out previously: you have probably invested a lot of time in Python. You will be inclined to justify and defend that investment. I would urge you to take some time to think about this and try to calm your urge to come up with counter-arguments. Let it sink in and try to be open to the possibility that this is how many users experience software written in Python."


Negging your readers might not be the best way to get your point across...


And yet, if you look at the reactions, it precisely anticipated how people would react and asked them to try to be open to the possibility that the blog posting had a point.

Did you consider I might have a point and that it might be helpful for people to think about it a bit before just going with their gut reaction?


If your point was strong, people would have listened. But it wasn't, so you had to make a meta point which came off as condescending.

Instead of reading that sentence above and having the gut reaction of not feeling understood, try and be open to the possibility that you didn't make a very strong argument or perhaps didn't communicate it very well.


I'm certainly open to not having communicated my point well. But it was never going to be the kind of point that would be easy to get across since people want to see any criticism of language choice as a language shitpost.


You took a very specific complaint about toolkits in the embedded systems space and generalized it into an expanded argument about any and all tooling, much of which has requirements that run directly counter to yours.

So yes, I don't think you communicated your point well. An essay about the brittleness of current embedded systems toolkits which suggested that much of the brittleness could be due to the use of Python in those toolkits, that might have been better-received--but wouldn't have gotten the attention a splashy "Please do not use Python for tooling" did.

Perhaps if you'd spent less time predicting what your essay would provoke, and more time thinking about why your essay might provoke that, you'd have written a better essay.


Actually, I haven't seen anyone make a compelling argument that counters any of the things I point out. I've mostly seen people either suggest workarounds or being upset that someone would suggest Python isn't the best thing since sliced bread.

You keep harping on about the last point. Which makes me think that you really felt that it hit home with you.


> You keep harping on about the last point. Which makes me think that you really felt that it hit home with you.

Yeah... your communication skill is pretty bad on both ends, it turns out: out and in.

I'm a Go developer who has used Python for scripting, and for teaching my daughter about programming. I don't care to defend Python as a language, but I do care about clear communication, which you continue to lack.


I wasn't criticizing Python as a language. I think you let your gut reaction run away with you a bit.


The article is kind of hyperbolic, but I feel like it echoes how I am beginning to feel about Python.

Just last week, I spent a whole afternoon getting a particular repository of Python code running on my laptop, even with the help of virtualenv and pyenv. requirements.txt doesn't tell me which version of Python the code was developed with, and several libraries are only available on certain versions, so I have to play the guessing game first of all. Then, some of the modules don't have binaries available for M1, and I can't build them from source because I don't have x, y, and z tools installed. Then there's always some issue with PYTHONPATH. I ended up having to build the whole Ubuntu docker container and develop inside that.
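For what it's worth, there are mechanisms for declaring the interpreter version, even though requirements.txt itself can't: pyenv reads a `.python-version` file in the project root, and PEP 621 project metadata has a `requires-python` field that pip enforces at install time. A fragment (the project name is illustrative):

```toml
# pyproject.toml
[project]
name = "example-project"
version = "0.1.0"
# pip refuses to install on interpreters outside this range
requires-python = ">=3.9"
```

But none of this helps when a project simply doesn't declare it.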

I love Python, it's not always like this, and it's certainly not only Python, but that experience is something I dread anyway coming into every new Python project. It feels like DLL hell all over again. I have had a much better experience personally with C# and Rust, but admittedly I had much more solo control over those projects.


you write: "requirements.txt doesn't tell me which version of Python the code was developed with". Do you expect to have less pain with any other language in this situation?


I'm not saying other languages don't have the same problems, but this just hasn't been as much of an issue in my experience when working with other languages.

C# for example - I had some compatibility issues between versions 2 and 3.1 of .NET Core, but at least the .csproj tells me what version it's supposed to be built with, and the LangVersion property indicates the version of C#.


Rust has a way to specify the minimum rustc version.

https://doc.rust-lang.org/cargo/reference/manifest.html#the-...
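Concretely, it's a single field in the manifest (stabilized in Rust 1.56); `cargo` refuses to build the crate with an older toolchain. The package name here is made up:

```toml
# Cargo.toml
[package]
name = "mytool"
version = "0.1.0"
edition = "2021"
rust-version = "1.56"   # minimum supported rustc
```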


Rust is so cool. In python's requirements.txt you can put something like:

    tensorflow >=1.12,!=2.1.*


“Here is a binary” sounds a lot less painful.


"In a pinch, even Java provides a better alternative as you have the ability to build all-in-one-jar files that contain all the dependencies. Not fashionable, but objectively a far less brittle option."

Really? In a pinch, use a language that's 90% boilerplate and relies on a mastery of the IDE? It's hard to take this seriously.


You don't like people being hard on Python? Fine. Don't do the same for Java. Modern Java doesn't have nearly as much boilerplate and is nearly as concise as Python.

Relying on IDE < Benefiting from IDE more than other languages.

Python can be great and so can Java. Different strokes. Different use cases.


Modern Java relies on Lombok (code generation) and DI frameworks (seriously, fuck DI) to avoid boilerplate. The base language has just as much of it as it always has. And, as I just found out this morning, DI frameworks blow up at runtime despite successful compilation and runs through unit testing.


> The base language has just as much of it as it always has

Which is roughly equal to that of C, C++, and many other languages, being a tiny constant percentage larger than equivalent code in some “modern” language.


Nope. Records, lambdas, var, multiline strings, switch pattern matching, improved instanceof, etc.

I suggest reading up on Java 18 and some of the upcoming features.


Those are improvements, no doubt, but they don't even begin to touch the quantity of boilerplate removed by Lombok.


Records do. You need Lombok for very specific things. JPA, etc. If you use JPA then the benefits and capabilities far outweigh any boilerplate or dynamic runtime overhead.


I don't mind people being hard on Python, but the idea of Java over Python "in a pinch" is lost on me, but what do I know...


As a consumer of a program, you don't really care about what language it was written in. But you do care about to what degree you have to involve yourself in the process of getting the software to run.

Java was mentioned because it offers a way to package all dependencies in a single file. From a user perspective this is preferable to "here is a program, now you have to gather all the pieces to make it run without breaking stuff". Whether one loves/hates Java doesn't really enter into it.

The language itself is somewhat irrelevant for the discussion - the way in which software is distributed and installed is the real point. Python isn't particularly nice for tooling since it offloads a lot of work on the consumer. You download a Python program, then you have to care about how you should run it, which language version it is, make sure the dependencies are downloaded etc. And even if you have gotten it to run today, tomorrow it may not work. For myriad reasons.


In my domain and maybe this is not the norm, the consumers of "tooling" are usually other engineers.


It’s hard to take your comment seriously either, based on such a hand-wavy and false description of Java. Like, other than writing `implements` instead of `<:` or whatnot, how on earth does Java have any serious amount of boilerplate?


Also don't use C or C++, i wasted so much time trying to compile stuff (no offence if you're a C dev).

Also no Java please, how do I even install JVM on my system?

Node and JavaScript, Ruby, Perl, Bash are out too of course. Bash sometimes works but what if I'm using a different shell?


Perl and Bash don't really change too much anymore. One of the upsides of Perl 6 never happening is that Perl is relatively static.

I think a lot of the issue is that these languages love to install dot-directories in your home directory where all your libraries live, and it works seamlessly for you... but no one else.

We use python, perl and bash. Generally people who are more comfortable in one than the other just choose that language.


Perl has yearly releases, but is generally good at backwards compatibility.


Write Bourne shell (/bin/sh); it's POSIX, so it's on every useful machine.


I just don't think python is a "fashionable" language. I think it's a nice utility or "glue" language. It can do almost anything (machine learning, web dev, etc) in a way perl can't, and so I don't see it disappearing like perl.

As for the section on python being antisocial, It makes sense to me. Dependency management is a tough thing for python and javascript. However I would not switch to writing Rust or Go for tooling scripts either.

Though Python has been used for devops scripts at my job for a long time, and has been very, very stable, so I'm not sure why it's causing so much pain for the author.


I rejected Python definitively in 1999 but got dragged back into Python programming a little less than 10 years ago because there was so much work.

It's easy to get into a place where Python is a nightmare. I worked at a place full of data scientists who couldn't get anything to work reliably because they installed things with

    pip install --user
which contaminates all the python installations on your machine including Conda distributions. We also found many Pythons were misconfigured, for instance the defaults for

https://docs.python.org/3/using/cmdline.html#envvar-PYTHONIO...

depend on your Python, and if they are set wrong and a bit of text gets ingested by the system and spit out by a `print`, your Python will crash. Since there are plenty of `print`s that come in with packages you install with pip, the answer "don't print dirty text" isn't an answer.
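The failure mode is easy to reproduce by forcing the encoding, which simulates a misconfigured machine:

```shell
# With a wrongly configured I/O encoding, printing non-ASCII text crashes.
# (the || branch keeps the demo going; the print itself raises
#  UnicodeEncodeError and exits non-zero)
PYTHONIOENCODING=ascii python3 -c 'print("café")' \
    || echo "crashed, as expected"

# Forcing UTF-8 explicitly is the usual mitigation:
PYTHONIOENCODING=utf-8 python3 -c 'print("café")'   # -> café
```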

On top of that there is the fact that pip's resolving algorithm is incorrect. It can solve simple cases but if you add enough packages it will break down.


What I read here is a poor craftsman blaming their tools.


It's more the opposite. They are blaming the poor results of other craftsmen, and base their reasoning on Python making it easy to create poor tools.


I don't think the type system particularly matters. You can make something work without assigning a type like "int" to variables. Python doesn't let you do things like "foo" + 3 (while Node is happy to), so even though you're not naming your types, the types are still static.

One unexpected downside of static languages are dealing with web services written in dynamic languages. As an example, I was writing a Slack app in Go a couple years ago. Some response gets unmarshalled into an easy-to-use struct, but contrary to the documentation, the server has no interest in returning data that can always be unmarshaled into that struct. Sometimes instead of a list of 1 element, the value will just be that one element. There is no type "[]Foo|Foo" in Go, so now you have to write a custom unmarshaling function (or say "fuck it all!" and use map[string]any, at which point you're just writing Javascript). This doesn't cause problems for people using Python or Javascript because neither of those care what's in a dictionary, but statically typed languages do, and you'll have to write extra code to work around that.

I agree that it's annoying to require the users and tool author to have the same runtime and packages installed, though. Python is an incomprehensibly large can of worms here. A python package's dependencies are architecture/platform dependent. To install C packages, you need the exact same C compiler that was used to build Python. Some popular packages have an indirect dependency on a Fortran compiler. It's a pretty big nightmare. You could bundle the runtime with the application like Go does, but the runtime is pretty big and the language is too dynamic to remove the parts you aren't actually going to use in advance, so it's not as easy a sell.

An option there is that if you're using Python for internal tooling, you bless a particular version once a year, install it on every workstation, and say "sorry, you can't use a different version of Python". That will alleviate a lot of the author's problems, but obviously it's exceedingly politically unpopular.

Anyway, there probably isn't an objective truth here, just personal and organizational preferences. I agree with the author in that whenever I write some tooling in Python, I regret it almost immediately. But it's working for people.


> the types are still static.

Python is a dynamically, but strongly typed language. JS is dynamic, but weakly typed. Java would be an example of a statically and strongly typed language, and C may be a statically and weakly typed language, but I’m not sure this latter is an apt description (you can cast anything in C and it may work, while this will fail in Java if the types are not compatible)


> To install C packages, you need the exact same C compiler that was used to build Python.

Is this true? I have never encountered this problem.

My experience is that, since CPython is written in C, you only need to have any C compiler for your platform, because the C compilers will be ABI-compatible. I am not sure about Windows, because Windows has some funny issues with C runtime compatibility. But I have upgraded my Python installation and C compiler separately before without issue.


How big would an application have to be in order to be too big? In fact, if I were to guess how large the binaries of the top 5 custom tools I use from the command line are, I probably wouldn't be able to tell you. Is 5 MB big? 10 MB? 100?

I was mostly talking about the tooling you find in environments like Zephyr and ESP-IDF. I'd happily accept 500 MB binaries if it meant I never have to spend any time making the tooling work ever again.


> You could bundle the runtime with the application like Go does, but the runtime is pretty big and the language is too dynamic to remove the parts you aren't actually going to use in advance, so it's not as easy as a sell.

It's not so big as to be prohibitive though, especially in a dev environment... if this became the norm, I would be a lot happier with using python-based tooling than I am today.


> even though you're not naming your types, the types are still static.

Python is strongly typed, but definitely not statically.

    >>> a = "foo"
    >>> type(a)
    <class 'str'>
    >>> a = 3
    >>> type(a)
    <class 'int'>
The variable `a` has different types at different parts of the program. That's not static.


Your example doesn't show static typing (declaration/re-declaration is the same syntax as assignment in Python), but I think OP was thinking of Strong/Weak?

Python is Dynamically typed because:

    >>> a = "foo"
    >>> a / 3
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
    TypeError: unsupported operand type(s) for /: 'str' and 'int'
This isn't a compile error.

Edit: Also for the re-declaration thing, you can see this using id(). Python seems to do some fancy value stuff behind the scenes, but you can see the variable changing its identifier in the following example:

    >>> a = "foo"
    >>> id(a)
    140105790230896
    >>> a = "foo"
    >>> id(a)
    140105790230896
    >>> a = "bar"
    >>> id(a)
    140105790230512
    >>> a = 1
    >>> id(a)
    140105793782000
    >>> a = "foo"
    >>> id(a)
    140105790230512
As for strong/weak, I think it's a bit more fluid because I can't seem to find a set definition that everyone agrees on. Some people consider weak typing to be when the language implicitly casts or converts types for you, which Python does not do:

    >>> a = "1"
    >>> a / 3
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
    TypeError: unsupported operand type(s) for /: 'str' and 'int'
    >>> type(a)
    <class 'str'>
Except sometimes it kind of does? The divide operator converts int into float "implicitly" even when both inputs are int. So type conversion is happening behind the scenes (I don't know if you would class this as "implicit" type conversion, maybe it depends on where it happens?):

    >>> a = 10
    >>> type(a)
    <class 'int'>
    >>> b = 1
    >>> type(b)
    <class 'int'>
    >>> type(a/b)
    <class 'float'>


> As for strong/weak, I think it's a bit more fluid because I can't seem to find a set definition that everyone agrees on.

Yes, I've been looking into this lately, and the terms are messy. People tend to use strong to mean "strict in ways I like" and weak to mean "permissive in ways I don't like."

Division is a good example of this ambiguity. It always results in a float, so if you divide two integers, even ones that are evenly divisible, you get a float, so that's kind of a conversion, right? But on the other hand, I don't think a function that was defined as taking two integers and returning a float would be considered an implicit conversion, even if it were overloaded to also accept various combinations of float and integer.

Gary Bernhardt has a good discussion: https://www.destroyallsoftware.com/compendium/types?share_ke...


Thanks for the correction and the comprehensive examples!


Containers work particularly well for python environments.

You can get them working locally, but it's easiest when the developers stick with the standard library. Needless to say, virtually nobody does.


But are containers portable? Can you take them, send them by email or put them on a USB stick? And I'm not talking about the recipe for creating them, but the actual container itself.


Yes.

The actual image is a tar file, and container runtimes include commands for both exporting and importing. Exceptionally portable. More portable than JAR files and Go binaries, at least.


Wait, which container are we talking about? Docker? AppImage? Kubernetes? What about the runtime?


OCI Images. You can run them in Docker, Podman, Kubernetes, systemd-nspawn, etc.

Nobody is going to use your tool if they have to fetch it from a container registry and have to run it isolated from their filesystem and network, though. OCI images are fine if you have a pet app and you have to ship it to a production server. Not that convenient for interactive tools, though. Consider:

  $ docker run -v /tmp/foobar:/tmp/foobar debian:stable ls /tmp/foobar
  a.txt
You really just want to type "ls /tmp/foobar".


That's something that aliases excel at.
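For example (the image and command names are invented; a shell function is used rather than an alias, since aliases are off by default in non-interactive bash):

```shell
# Wrap the container invocation so the tool feels like a native command.
# Mounts the current directory so the tool can see the user's files.
mytool() {
    docker run --rm -it \
        -v "$PWD:$PWD" -w "$PWD" \
        mytool-image mytool "$@"
}
```

With something like this in a shell profile, `mytool build` reads and writes the current directory as if the tool were installed locally, assuming Docker and a pulled image are available.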


I find it hypocritical that the author writes this: "...If we remove all the pretense, it usually comes down to people defending their personal choices. We have a tendency to try to justify and protect our investment in whatever languages and tools we know and use. This makes sense. But it isn’t always rational behavior."

Yet his entire article is a justification for his defence of his code preferences.

Moreover his article is vague and has no technical depth, despite his controversial statements/advocacy. E.g. not a single line in the article refers to virtual environments of any kind, which are basic to how Python works and to dependency management as part of the (arguably Pythonic) tooling process.


> Whoa. The response to this blog posting seems to be extremely polarized.

Well, duuuuh.

I kinda get the argument. Nobody wants to spend time maintaining the tools they use to make other stuff, but the tools written in Python by Python developers are likely written that way to be useful, maintainable and in a language they're productive in. Immediate time is an important expense when you're writing tools.

You talk about sunk costs, but sinking time into becoming as proficient in (eg) Rust as Python seems ridiculous. Especially for smaller tools.

And why are we picking on Python? Javascript toolchains eat themselves every 18 months. Yeah, Python versions upgrade, but are we going to pretend that Node doesn't?

If you don't want a fight, don't pick one. Let developers use their own expertise and experience to decide what they use.


> Python is an anti-social language. It focuses primarily on the needs of developers and not as much on the user of the software

That's why I love it for tools. It's MY job to help my users. For them, I write Java. For ME, I write Python. Because it's python's job to help me.


> If you use a language that can produce binaries the job of ensuring you have all the dependencies in all the right versions is a one-time job: it happens at build time.

This is not true for most language toolchains by default, as far as I know. Most languages don't produce fully self-contained binaries. The developer has to do extra work to create self-contained binaries; otherwise the users still have to work to get the necessary dependencies.

What makes tooling different from system software in my view is that you're using the software on a per-project basis, updating it as the project evolves. The developer of the tooling and the user of the tool (another developer) both have responsibilities. They have to agree on a common platform that the tools can target. The tool author is responsible for documenting explicitly any prerequisites and setup steps, and they must make sure the tool doesn't implicitly depend on anything more. The user must make sure they've set up an environment for the project that meets the tool's needs.

I've personally found that Python works reasonably well for tooling if I as a tool author follow a few guidelines. I require the users only to have a shell, python3, and python3-venv or miniconda installed on their base system. I provide a setup/activation script that creates/activates a virtualenv or a Conda env in the project directory and makes sure that the packages in the also-included requirements.txt or environment.yaml are installed before the tooling is run.

Since the scripts are provided as part of the tools, the tool author becomes responsible for automating the creation of a working environment for the user of the tool. This process can be automated and reproducible based on a frozen requirements.txt or similar, so most of the brittleness can be eliminated by the tool author.

I don't think any other tool implementation language would provide huge benefits to the users. They would usually still need to install some system-wide prerequisites and use some kind of per-project activation script.

The reason I like Python as a tool author is because it's better than writing shell scripts, and it's still easy to include as source with any kind of project. The standard library -- with the argparse, subprocess, urllib, shutil, etc. modules -- is good enough that for simpler tools no external dependencies (nor any activation script nor requirements.txt) are needed, and it is familiar to many developers.


Wow what a terrible article. Python is a "fashionable" and "antisocial" language? whaaat?


One word. Virtualenv.


They are a solution, but not a good one for this problem. Are Virtualenvs portable today? And are they used that way? And I don't mean the script/recipe for creating the virtualenv.


Could you provide an example of how you solve the problems with Zephyr or ESP-IDF using virtualenv?


It appears ESP-IDF can already be installed with a virtualenv: https://github.com/espressif/esp-idf/blob/master/tools/idf_t...

Perhaps your problems are due to external dependencies (outside of Python)?


So how do you usually install ESP-IDF to ensure smooth sailing? I usually follow a (somewhat more terse) version of the instructions that the Espressif team have on their site. I put that in my blog a while back: https://borud.no/dev/2022/esp-idf-setup/. How should I modify this procedure to help avoid breakages in the future?

UPDATE: interestingly, I decided to check out and re-build an ESP-IDF project I haven't built since before the summer and now the build breaks because of wrong version dependencies.

My point was: why on earth should I have to manage all this stuff? I just want tools that work. This tooling doesn't really provide that; it provides me with more work.


It's a common story that companies like Espressif couldn't care less about developer experience. Part of why Arduino is a success story is that their dev tools, limited as they are, are easy to get going. Microchip publishes free tools for working with Atmel parts that have a better feature set, but they are a hassle to get up and running.

I think there's an expectation that top management decides which board gets used and developers just cope with whatever tools come with the board.


What Arduino did right was to understand that you have to make sure the tools run regardless of what is installed on the target machine. You make a software package that has everything it needs to do the job without external dependencies.

Now look at the discussion here. A surprising number of people actually expect users to have to understand the runtime requirements and environment of the tooling and seem to think it is the user's job to learn Python and manage this.

Not a lot of people here "get it". And oh boy do people get offended if you suggest that one may have something to learn from the kind of thinking that went into Arduino.


In that case, one word: Bazel.

One day I wish someone would train something like Copilot to automatically Bazel-ify all your dependencies. There's plenty of training data out there, and you can validate the result by making sure the build flags are the same as the cmake or ./configure …


"Tooling" can mean so much, and different things to different people.

Python definitely has its strengths, and dominates certain domains -- but no one language is all-purpose, so in my book a warning against using one thing for all the things is decent advice.


I have to admit that when I need to install a tool, I always prioritize the Go, Rust, or ANSI C alternative if it exists. It's just easier to deal with and usually more performant (Python utilities always hang for a sluggish second before starting).


Python is fine as a scripting language, but people seem to forget that it's just that. It amazes me that some people think it's suitable for something like embedded development; in fact, there's more than one implementation.


Just recently, I had to recompile a (singleplayer) save game editor. So basically a GUI that does some clever hex editing.

It was written in C++ using Qt.

Have you ever tried compiling a Qt program on Windows? It involves signing up for an official Qt developer account to even install qmake.

To the point I had to use an unofficial Qt installer CLI app (aqtinstall) [0] to even install the toolchain to build this little shitty app... which still relied on having several Qt .dll files in the same directory as the .exe to work.

Have you clicked on [0] yet? Well, then guess what programming language aqtinstall uses.

[0] https://github.com/miurahr/aqtinstall


How do you feel about containers, OP?


As in building containers to run tooling? I've gone down that route quite a few times and it usually isn't ... terrible. I've had some issues with access to USB and hardware from containers, but I can't remember the details anymore (I ended up ditching the project because of alternatives that were easier to work with).
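For reference, the usual workaround for the USB issue is to map the host device node into the container. A sketch of how a wrapper might assemble that invocation (the docker flags are real; the image name and device path are illustrative):

```python
import os

def container_cmd(image, device="/dev/ttyUSB0", workdir="/work", args=()):
    """Build a `docker run` command line that passes a host device node
    (e.g. a USB serial adapter) through to the containerized tooling."""
    return [
        "docker", "run", "--rm",
        "--device", device,                # host device passthrough
        "-v", f"{os.getcwd()}:{workdir}",  # mount the project into the container
        "-w", workdir,
        image, *args,
    ]
```

This still falls over, of course, on machines where the container runtime itself is forbidden or the device node isn't visible to the host.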

One challenge is that when you work on a team and you know there will be people who have to maintain this stuff further down the road, you have to both ensure that they can make use of the technology (are allowed to by their employer) and that it is documented. I've worked with companies that won't allow Docker for instance (for reasons I was never made aware of).

(A lot of people seem to have gotten really upset about this posting. Most of the comments seem to indicate that a lot of people didn't actually read it and/or understand the context. It wasn't a language shitpost. It was a plea for people to please stop making tooling that just creates a lot of work for other people).


>I've had some issues with access to USB and hardware from containers

Can confirm, good point.

I'm not upset about your post, I was genuinely curious. To state the obvious, Java JARs have fallen out of favor and containers are in fashion. I sincerely think it's a change for the better, but your post raises an interesting point about the parallels.



