12-minute Mandelbrot: fractals on a 50-year-old IBM 1401 mainframe (righto.com)
74 points by winkywooster on Dec 25, 2020 | hide | past | favorite | 18 comments


Only somewhat related: in 1986 or so the computer labs at the University of Pittsburgh had giant printers, maybe 8 feet long with a maze-like paper path. They were rather fast, though, something like 60-90 pages a minute, and the sound of pages being spit out was constant in the labs. They were overwhelmingly used for ASCII; the first page out would be your username spelled out large in ASCII art.

I got interested in the Mandelbrot set and wrote it in Pascal from the original Scientific American article. The trick was that I had no way to view it. File quotas on the campus VAX were tight, so I wrote a simple RLE compression for the output. I looked into the PostScript standard so I could write it myself, but was somewhat horrified at the complexity (I had written PCL before). But I did find a nice Fortran library, so I wrote some Fortran to read the RLE picture file and output PostScript.
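The two pieces described above, the escape-time iteration from the Scientific American article and a simple run-length encoding of the result, can be sketched in a few lines. This is my own minimal Python illustration, not the commenter's Pascal; the iteration cap and scanline coordinates are arbitrary choices:

```python
def mandel_iters(c, max_iter=30):
    """Escape-time count: iterate z = z^2 + c until |z| > 2 or we give up."""
    z = 0j
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > 2.0:
            return n
    return max_iter  # treated as "inside the set"

def rle_encode(row):
    """Run-length encode a row of values as (value, run_length) pairs."""
    runs = []
    for v in row:
        if runs and runs[-1][0] == v:
            runs[-1][1] += 1
        else:
            runs.append([v, 1])
    return [tuple(r) for r in runs]

# One 64-pixel scanline along the real axis, from -2.0 to +0.5
row = [mandel_iters(complex(-2.0 + 2.5 * x / 64, 0.0)) for x in range(64)]
compressed = rle_encode(row)
```

Long runs of identical iteration counts (big regions of the same "color") are exactly why RLE works well on Mandelbrot images and keeps the file under a tight disk quota.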

One afternoon near Xmas, I tried it out. I had no idea if it would work since I couldn't debug it; there were no PostScript viewers (or graphical terminals) available. I printed the PostScript file and the printer's steady page-per-second rhythm stopped. After a few seconds the operator jumped up and was about to reset the printer. I pleaded my case for a few minutes, the operator agreed, and a 300 DPI Mandelbrot was spit out. The operator went "Woah!", and a few more people in the lab saw it and were impressed. I explained what a fractal was and that you could zoom in infinitely. Several people wanted copies, so I printed them. The entire process took many minutes (Pascal to generate, Fortran to convert, and the printer to print). These days you can zoom way deeper and get at least 60 FPS.

Later I wrote an EGA driver in Turbo Pascal, primarily for Mandelbrot viewing and a few simple games I wrote. I was able to do so from an article in PC Tech Journal which printed the entire EGA card spec. Later I wrote the ASM for an 8087, then for an HP-730 (PA-RISC 1.0), with a small tweak for the HP-735 (PA-RISC 1.1).


One of the first programs that I wrote that went commercial was written in Turbo Pascal as well. It also had a GUI system with EGA/VGA drivers, where most of the performance-critical code was done using built-in assembly. Turbo Pascal was a mighty good and productive environment. Its built-in assembly was also used to implement preemptive multithreading in the same program.


The physics department I did my undergrad at had a rule: you could do your computing practicals in whatever language you liked, but they only officially "supported" a few (C, Fortran, etc.) -- if you wanted help in others, tough.

Naturally, this led to some people writing answers in less common languages, like, say, 4-bit assembly for a custom CPU, Metafont, and a particularly memorable Ising model in PostScript (just print it, and the number of pages is the number of iterations!). It wasn't long after the last one that a nice little laminated notice appeared saying "please do not send fractals to this printer".


Heh, did they mean don't send fractal images to the printer? Or don't send a PostScript file that generates a fractal to the printer?


> additional instructions were available for a rental fee

The wet dream of chipmakers today, no doubt. The most recent example that occurs to me is the VideoCore bullshit on the Raspberry Pi; afaik the effect was that few paid for a license, and that hardware feature mostly went unused before it became obsolete.


I should point out that the additional instructions in the IBM 1401 required additional hardware, so you weren't paying for IBM to just move a jumper.

For instance, the multiply/divide feature cost $333 per month (in 1960s dollars), but it required the installation of 183 circuit boards (each the size of a playing card), so you were getting your money's worth.

More curious was the Sterling currency option, which provided arithmetic in pounds/shillings/pence (i.e. 12 pence to the shilling, 20 shillings to the pound). This required 508 new circuit boards, but made it much easier to do currency math in England. Keep in mind this is transistor circuitry to do math in hardware.
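The arithmetic the Sterling option implemented in circuitry is just mixed-radix carrying: base 12 for pence into shillings, base 20 for shillings into pounds. A small Python sketch of the idea (my own illustration; the 1401 of course did this on decimal digits in hardware, not like this):

```python
def to_pence(pounds, shillings, pence):
    """Flatten a pounds/shillings/pence amount to total pence (240d = 1 pound)."""
    return (pounds * 20 + shillings) * 12 + pence

def from_pence(total):
    """Split total pence back into (pounds, shillings, pence)."""
    pence = total % 12
    shillings = (total // 12) % 20
    pounds = total // 240
    return pounds, shillings, pence

def add_lsd(a, b):
    """Add two (pounds, shillings, pence) amounts with proper carries."""
    return from_pence(to_pence(*a) + to_pence(*b))

# Example: 1 pound 19s 11d plus 2d carries twice, giving 2 pounds 0s 1d
result = add_lsd((1, 19, 11), (0, 0, 2))
```

The double non-decimal carry is what made £sd math tedious on a plain decimal machine, and why dedicated hardware for it was worth paying for.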


One-cut upgrades do date back to that era. The change from an IBM 1130 with a 6 µs cycle time to a 3 µs cycle time was minimal. Indeed, the machine had to run at 3 µs to drive the printer; if you never cleared the printer interrupt, you got a free upgrade.


Interesting that IBM thought there was enough of a market for the Sterling option to make it worth the design and marketing effort.


Intel did this 10 years ago. People didn't like it. https://en.wikipedia.org/wiki/Intel_Upgrade_Service

"The program was introduced in September 2010 for the Clarkdale-based Pentium G6951 desktop processor [...] For a $50 fee, this processor could have one additional megabyte of cache enabled, as well hyper-threading"


After a substantial hardware upgrade, though. Those of us who are old enough remember when IBM PCs came with an empty chip socket for the Intel 8087.


Heh, as a poor student I scrounged enough for a 386SX, then later bought a non-Intel coprocessor, a Cyrix I think. It had 90 bits of intermediate precision (instead of Intel's 80) if you kept values on the stack. I think I managed to keep almost the entire Mandelbrot calc on the stack.


I rocked a 286 with a Weitek (was it?) FPU that was almost, but not quite, an 80287, and yet did some cool stuff nothing else did. That counted as worth it then, too. In the OP's situation, all the more so.


Perhaps a better analogy to something contemporary, in this case specifically, is adding a GPU to your system, which gives you an additional instruction set with massively parallel execution -- at the cost of some quite substantial hardware, as Ken explained.


You mean the hardware license keys for the MPEG-2 decoder? That was very stupid, but I bet it was something the Raspberry Pi people had to do to make the board cheaper (probably through cheaper SoC). So rather than paying (fake numbers here) $5 per processor IC, they paid $3 with the decoder option available.

It sucked at the time, but thankfully the later Raspberry Pi 3B was powerful enough that the license wasn't needed anymore (it was still available on the 3B, but IDK about the 4B).


...and there is a crack available for that, not surprisingly.


(2015); discussed at the time: https://news.ycombinator.com/item?id=9243163


This is awesome and cute. Mandelbrot output on slow machines (vintage, retro in this case) always seemed like something of a deus ex machina to me.

I always admired Mandelbrot pictures. The first one I ever saw came from a program I started on a C64 before bedtime; the next morning the magic had come to light.


Side note: Mandelbrot's autobiography is fantastic, relevant, and worth a read.

Here’s a review if anyone wants to explore it and get the full book:

https://fredlybrand.com/2019/10/01/benoit-mandelbrots-the-fr...



