> It is uncertain when it went from purple to red, but there's likely a practical reason: when going outside you don't wear slippers but leather boots, which were very easy to get in red; purple, not so much. You can easily find purple used elsewhere in clothing, but not in shoes.
Wasn't the point of regal colours that they were scarce/expensive?
Scarce is good, for regal, but the dyes do have to actually exist. Were there dyes at that time that would work on leather, yield the desired color, not wash off in three seconds, and not look like crap?
> Is someone actually mad prettier is changing their single quotes to double quotes? Are they mad some line is breaking at some word?
Yes, both of these.
Obviously there are huge benefits to auto-formatting in large teams and popular open source projects, but some people also find benefit in having control of alignment, line breaks, indentation etc.
> The author of DOOM for SNES, Randy Linden, did not have access to any documentation about the GSU chip or even DOOM source code. He reverse engineered all of it
Technically this is impressive, but why was it necessary?
Randy Linden, the port's sole programmer, began the port of Doom for the Super NES on his own, as he was fascinated by the game.
Since Doom's source code was not yet released at the time, Linden referred to the Unofficial Doom Specs as a means of understanding the game's lump layout in detail. The resources were extracted from the IWAD, with some (notably sprites such as the player's sprites and the original status bar face sprites) unused due to technical limitations.
According to an interview, due to the lack of development systems for the Super FX, Linden wrote a set of tools, consisting of an assembler, linker, and debugger, dubbed ACCESS, on his own Amiga before beginning development of the port proper. For the hardware kit, he used a hacked Star Fox cartridge and a pair of modified Super NES controllers plugged into the console and connected to the Amiga's parallel port. A serial protocol was used to further link the two devices.
After developing a full prototype, he later showcased it to his employer, Sculptured Software, which helped him finish the development. In the interview, Linden expressed a wish that he could have added the missing levels; however, the game, already the largest possible size for a Super FX 2 game at 16 megabits (approximately 2 megabytes), only has roughly 16 bytes of free space. Linden also added support for the Super Scope light gun device, the Super NES mouse, and the XBAND modem for multiplayer. Fellow programmer John Coffey, himself a fan of the Doom series, made modifications to the levels, but some of those modifications were rejected by id Software.
I was lucky enough to work with Randy for quite a few years at Microsoft.
Incredible developer. He also made the Bleem! PlayStation emulator.
Funnily enough, none of his coworkers ever bothered to look him up online until after we all stopped working together, at which point we all learned we'd been working with programming royalty.
I know that Wolf3D on the SNES uses Mode 7. Not for the walls or sprites, but for the entire screen. The graphics are rendered into background tiles at a resolution of something like 175x100, then scaled up with Mode 7 to fill the 224x192 screen. (Those aren't the exact numbers, but you get the idea.)
The "mosaic trick" is a way to perform horizontal pixel doubling in hardware rather than software. And to do this trick, you turn on the SNES's Mosaic feature, scroll 1 pixel to the left every other scanline, and scroll upward one pixel after each two scanlines have been drawn.
Normally the SNES mosaic feature copies just the top-left pixel of a 2x2 square into that entire square. But the trick makes a different set of pixels get doubled horizontally on the next scanline.
It requires a different arrangement of pixels than the normal way of drawing tiles. A tile containing these pixels:
01234567
becomes this when viewed on two scanlines:
00224466
11335577
Actually performing these scroll writes does not require any CPU intervention because you use the SNES's HDMA feature to do those scroll writes.
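A minimal Python sketch of what the hardware effectively computes (not SNES code; the `mosaic_line` function and the 2-pixel block width are illustrative assumptions). It shows how the per-scanline scroll offset changes which pixels the mosaic doubles, producing exactly the two rows above:

```python
def mosaic_line(row, scroll):
    """Simulate a 2-pixel-wide horizontal mosaic on one scanline.

    The hardware replicates the leftmost pixel of each 2-pixel block.
    `scroll` is the horizontal scroll offset applied before the mosaic
    samples the line (the trick writes this per scanline via HDMA).
    """
    return "".join(row[(x // 2) * 2 + scroll] for x in range(len(row)))

tile_row = "01234567"
print(mosaic_line(tile_row, 0))  # even scanline, no scroll
print(mosaic_line(tile_row, 1))  # odd scanline, scrolled 1 pixel left
```

With `scroll=0` the even-indexed pixels are doubled; scrolling by one pixel on the next scanline shifts the block boundary so the odd-indexed pixels get doubled instead, which is why the tile needs its pixels pre-arranged as described.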
A similar thing happened with the Wolfenstein 3D port as well, where John Carmack gave Rebecca Heineman kudos for learning Japanese to read the patents and get the technical documentation. There's always cool history around these things; some more in my post about it here:
https://eludevisibility.org/super-noahs-ark-3d-source-code
I can't speak to this case, but dev kits and SDK/documentation are often two separate SKUs, and the latter has a higher price. If I remember correctly, the Crash Bandicoot guys found a hardware bug in memory card saving because they rolled their own code rather than using the SDK they didn't have.
Git is for (1) helping with conflict resolution when multiple people work on a piece of code, and (2) tracking changes to software over time, to aid in bug fixing, patching older versions, and understanding why the code looks the way it does.
If you're intentionally committing code that doesn't work then you are ruining both the ability for others to work with the code, and your own ability to use the history for anything.
You _can_ use it as a poor man's backup solution (I do this myself), but the key then is to have a clear separation between a "work in progress" branch that only you work in, and the collaborative or long-lived branches. Before you merge a work-in-progress branch (or make a pull request), you need to make sure that each commit that remains passes the unit tests. You can use an interactive rebase for this.
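A sketch of that cleanup step under stated assumptions (git >= 2.28; the repo, file names, and commit messages are made up). It makes two messy work-in-progress commits in a throwaway repo, then squashes them into one clean commit. `GIT_SEQUENCE_EDITOR` edits the rebase todo list non-interactively here; in real use you would run `git rebase -i` against your main branch and make the same edit in your editor:

```shell
#!/bin/sh
set -e
# Throwaway repo with two "work in progress" commits.
repo=$(mktemp -d)
cd "$repo"
git init -q -b main
git config user.email dev@example.com
git config user.name dev
echo 'draft' > notes.txt
git add notes.txt
git commit -qm 'wip: first attempt'
echo 'final' > notes.txt
git commit -qam 'wip: actually make it work'
# Turn the second commit into a "fixup" of the first, squashing them
# into one commit that (ideally) builds and passes the tests.
GIT_SEQUENCE_EDITOR='sed -i 2s/^pick/fixup/' git rebase -i --root
git log --oneline  # now shows a single squashed commit
```

Adding `--exec 'make test'` (or your test command) to the rebase makes git run the tests after each surviving commit, which is an easy way to enforce the "each remaining commit passes the unit tests" rule.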
The problem with Git is that it is actually a rich man's backup solution. It lets you do very detailed snapshotting and manipulation across versions. So when working on something it can be very helpful to have many commits during the work.
But I agree that it is usually best to then "refine" those into nice, working, logical chunks before sharing with others.
Are you recommending Next.js for its longevity? My experience over the last 18 months is that every new version introduces breaking changes. I wouldn’t want to come back to a Next.js project in 5 years and have to deal with migrating to Node.js v32 and whatever Next.js has broken in the meantime.
Don’t get me wrong, I’m not a Next.js fan either. Could’ve widened my list of examples to any SPA libraries and server-side JS frameworks. I just wanted to highlight the issue with the analogy: unlike some criminal a police officer is using force against, the nature of issues to be tackled in most software projects will keep changing.
That is going to be your experience no matter what you use, if what you want is a horrible bloated frontend framework, or in this case, a horrible bloated framework for a horrible bloated frontend framework.
will we get a next.js framework to make it easier to use next.js to make it easier to use react to make it easier to use javascript in the near future?
please for the love of god stop enabling these people.