This sort of thing really bugs me! Marketing departments appropriate an existing term and use it in some new, often deceptive way. This goes all the way back to when IBM released “The IBM Personal Computer”, at a time when “personal computer” was a category name. Then Microsoft released Windows, when “windows” was a generic term for windowing systems. Intel did it with their “Core” architecture. The list goes on.
A counterargument is that Mauchly was actually interested in using computers for weather modeling, and I’m sure that influenced the design of ENIAC. He could only get ENIAC funded if it was valuable to the war effort. I’ve read quite a lot about that machine, and I’m not aware of any architectural features that were specific to ballistics calculations. This is unlike the British Colossus, another early computer, which was specifically designed for code breaking and wasn’t general purpose.
As for the objection that it wasn’t a stored-program machine, I was interested to learn that it was converted to stored-program operation after only two years or so, using the constant-table switches as the program store. But the Manchester Baby, which used the same memory for code and data, was more significant in the history of stored-program machines.
On the general question of “first computer”, I think the answer is whatever machine you want it to be if you heap enough conditional adjectives on it.
> Mauchly was actually interested in using computers for weather modeling and I’m sure that influenced the design of ENIAC
True. Mauchly was a physics professor interested in meteorology, and he knew that predicting the weather and calculating an artillery shell's flight are mathematically the same type of problem, which was important for getting funding. In the fifties, ENIAC was even used to calculate weather forecasts (see https://ams.confex.com/ams/2020Annual/webprogram/Manuscript/...). So these were just two related special problems, and it would be a stretch to interpret this as an intention to build a general-purpose computer. The latter had to wait until the sixties.
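The mathematical kinship is worth spelling out: both problems come down to step-wise numerical integration of differential equations, which is exactly the grind ENIAC mechanized. A minimal illustrative sketch in Python (the Euler stepping and all the constants here are my own toy choices, not anything from ENIAC's actual firing-table routines):

    # Illustrative only: integrate a shell's flight with simple air drag.
    # Method (Euler stepping) and all parameter values are toy choices.
    import math

    def shell_trajectory(v0=500.0, angle_deg=45.0, drag=0.0005, dt=0.01, g=9.81):
        """Return range (m) and time of flight (s) for a drag-slowed shell."""
        vx = v0 * math.cos(math.radians(angle_deg))
        vy = v0 * math.sin(math.radians(angle_deg))
        x = y = t = 0.0
        while y >= 0.0:
            speed = math.hypot(vx, vy)
            ax = -drag * speed * vx          # drag opposes velocity
            ay = -g - drag * speed * vy      # gravity plus drag
            x, y = x + vx * dt, y + vy * dt
            vx, vy = vx + ax * dt, vy + ay * dt
            t += dt
        return x, t

    print("range %.0f m, flight time %.1f s" % shell_trajectory())

Swap the state from a shell's position and velocity to pressure and wind fields on a grid, and you have, in spirit, the 1950 ENIAC weather forecast.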
I am a boomer and I absolutely give a "flying fink". Stop stereotyping my generation. The group I worked in at NASA Goddard did visualizations of climate data. I heard directly from climate scientists what was going on in the world and it terrified me. When I heard about what's being done to NCAR I nearly cried. I have no children but I have told all my friends' kids how sorry I am that we're leaving them a mess to clean up. How's that for a "flying fink"?
It's fun to play around with, but unless I'm missing something, there's no way to specify the screen size in rows and columns, such as 24x80. It's an odd omission.
I remember those monitors, but I forget what resolution they were. For what it's worth, Toy Story was rendered at 1536 x 922. I believe they re-rendered the whole thing from the RIB files for the Blu-ray release.
Yes, there have been a couple of efforts to re-render the whole thing. There was a good write-up somewhere, which I can't find now, discussing whether or not to keep RenderMan bug-compatible with the original. They also upped the shading rate and a few other quality knobs.
Film weave is also the bane of the VFX world. If a shot is going to have, say, a matte painting added in post, then a pin-registered camera must be used. These cameras have a precisely machined pin that centers the film stock in the gate after the pull-down claw retracts. Later post-processing stages also use pin-registered movements, so each frame is in exactly the same place every time it's used. Otherwise, the separate elements would weave against each other and give away the effect.
The thing I took away from reading Niven was transfer booths. The world homogenized because information and people could be transmitted instantly from one corner of the globe to another.
I loved the conservation of momentum "hack" for those teleportation booths. Go on, everyone who hasn't read it, see if you can guess how he dealt with that.
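(For anyone who wants a feel for why it's a problem at all, here's a back-of-the-envelope sketch; it's plain rotational kinematics, nothing from the books, so no spoilers.)

    import math

    # Earth's surface moves eastward with the planet's rotation; the speed
    # falls off with the cosine of latitude. Sidereal day = 86,164 s.
    EQUATORIAL_SPEED = 2 * math.pi * 6_378_000 / 86_164  # ~465 m/s

    def surface_speed(latitude_deg):
        """Eastward ground speed at a given latitude, in m/s."""
        return EQUATORIAL_SPEED * math.cos(math.radians(latitude_deg))

    # Step from the equator into a booth at 60 degrees north and you
    # arrive moving ~233 m/s faster than the ground under you.
    print(f"mismatch: {surface_speed(0) - surface_speed(60):.0f} m/s")

That's roughly 230 m/s of leftover eastward velocity, about Mach 0.7 at sea level, so any teleportation scheme that conserves momentum has to dump it somewhere.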