> So I tried something unconventional: draw each character once, cache it as a texture, and then just copy those textures around.
That’s more like the most conventional way to draw characters ever. Nobody goes around rendering filled Béziers any more than absolutely necessary. And in the most conventional era of all, fonts were bitmaps in the first place!
Yeah, I confess I would have assumed that this was more common than otherwise.
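The whole trick fits in a few lines. A minimal sketch in C, where rasterize_glyph() and blit() are hypothetical stand-ins for the expensive vector rasterization and the cheap texture copy:

```c
#include <stdint.h>

/* Hypothetical handle to a texture; the real type depends on your API. */
typedef uint32_t TextureHandle;

/* Assumed to exist: rasterizes one glyph (filled Beziers and all) into a
   texture. This is the expensive call we want to do at most once per glyph. */
extern TextureHandle rasterize_glyph(uint32_t codepoint);
/* Assumed to exist: copies a cached texture to the screen at (x, y). Cheap. */
extern void blit(TextureHandle tex, int x, int y);

#define CACHE_SLOTS 1024

typedef struct {
    uint32_t codepoint;   /* 0 (NUL, never drawn) marks an empty slot */
    TextureHandle tex;
} CacheSlot;

static CacheSlot cache[CACHE_SLOTS];

/* Direct-mapped cache: hash the codepoint, rasterize on miss, reuse on hit.
   On a collision this simply re-rasterizes; a real implementation would also
   free the texture it evicts. */
static TextureHandle get_glyph_texture(uint32_t codepoint)
{
    uint32_t slot = (codepoint * 2654435761u) % CACHE_SLOTS; /* Knuth hash */
    if (cache[slot].codepoint != codepoint) {
        cache[slot].codepoint = codepoint;
        cache[slot].tex = rasterize_glyph(codepoint);  /* slow path, once */
    }
    return cache[slot].tex;                            /* fast path: lookup */
}

void draw_text(const uint32_t *codepoints, int n, int x, int y, int advance)
{
    for (int i = 0; i < n; i++)
        blit(get_glyph_texture(codepoints[i]), x + i * advance, y);
}
```

The slow path runs at most once per distinct character (modulo collisions); everything after that is a table lookup and a copy.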
Makes me curious how much of computing is losing perceived speed because we have moved to calculating everything on the fly. An easy example is website layout. Yes, some dynamic sites need to calculate a ton on the fly. Most sites, though, probably don't need to reflow everything nearly as often as they do. And fitting everything into the mechanism that is the document flow remains baffling to me.
SDF seems to be one of the better solutions for text rendering.
Valve has had this problem solved since 2007. I'd argue this technique is a big part of what gave TF2 its impressive visual style. That game ran at 100+ fps on hardware like the 8800GT at the time.
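The heart of the technique, from Valve's 2007 SIGGRAPH paper, is a single thresholded texture sample. Here's a rough CPU-side sketch in C of what the fragment shader does, with sample_sdf() standing in for a bilinear texture fetch (names are illustrative):

```c
/* Assumed helper: bilinearly sample the SDF texture at (u, v), returning the
   stored distance remapped to [0, 1], where 0.5 lies on the glyph outline. */
extern float sample_sdf(float u, float v);

static float clampf(float x, float lo, float hi)
{
    return x < lo ? lo : (x > hi ? hi : x);
}

/* smoothstep, as in GLSL: 0 below e0, 1 above e1, smooth in between. */
static float smoothstepf(float e0, float e1, float x)
{
    float t = clampf((x - e0) / (e1 - e0), 0.0f, 1.0f);
    return t * t * (3.0f - 2.0f * t);
}

/* Per-pixel coverage: threshold the distance at 0.5 with a small smooth band
   for antialiasing. 'smoothing' is tuned to roughly one screen pixel. */
float sdf_coverage(float u, float v, float smoothing)
{
    float d = sample_sdf(u, v);
    return smoothstepf(0.5f - smoothing, 0.5f + smoothing, d);
}
```

Because the hardware's bilinear filtering interpolates distances, even a tiny SDF texture yields smooth edges under heavy magnification.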
I don't know how common it is in fonts, but for generic 2D vector graphics, problems arise in handling self-intersections, i.e., the pixels where those intersections fall. With an SDF rasterizer, how do you handle the pixel where two Bézier curves intersect in a fish-shaped path?
For this reason, more conventional rasterizers with multisampling are often used, or rasterizers that compute pixel coverage analytically, which requires finding the intersections explicitly (sweepline, Bentley-Ottmann).
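To make the ambiguity concrete, here's a toy point-classification sketch in C for flattened (polyline) paths; on a self-intersecting path like the fish shape, the two standard fill rules give different answers for the overlap region:

```c
typedef struct { float x, y; } Pt;

/* Casts a ray from p toward +infinity in x and inspects each edge crossing.
 * Crossing parity -> even-odd rule; signed winding sum -> nonzero rule.
 * On a self-intersecting path (figure eight, fish shape) the two disagree
 * exactly in the doubly-covered region. */
void classify_point(const Pt *poly, int n, Pt p, int *even_odd, int *nonzero)
{
    int crossings = 0, winding = 0;
    for (int i = 0; i < n; i++) {
        Pt a = poly[i], b = poly[(i + 1) % n];
        if ((a.y <= p.y) != (b.y <= p.y)) {          /* edge spans the ray */
            float t = (p.y - a.y) / (b.y - a.y);
            float x = a.x + t * (b.x - a.x);
            if (x > p.x) {
                crossings++;
                winding += (b.y > a.y) ? 1 : -1;     /* edge direction */
            }
        }
    }
    *even_odd = crossings & 1;  /* inside if an odd number of crossings */
    *nonzero  = winding != 0;   /* inside if the winding number is nonzero */
}
```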
Hmm I'm not sure I quite understand your question. The renderer uses predefined textures of signed distance fields. Essentially you're building a gradient out from your source vector. You can tweak how that texture is built and it doesn't need to be strictly the true distance.
In theory you could handle it however you want. You could make an SDF texture of a figure eight where the inside of the eight is purely zero and only the outside has a gradient. If you built a texture like this, then your stroke width would only ever grow out of the shape.
This is an example of how SDF is very powerful.
If instead you're asking about precision with regard to the infinitely small point where two vectors intersect, it's actually not an issue at all. The technique uses the usual texture sampling pipeline, so the render will be as sharp as your SDF texture and as alias-free as your mipmapping allows.
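A sketch of that figure-eight baking idea, with true_signed_distance() as a hypothetical helper returning negative values inside the shape:

```c
/* Assumed to exist: true signed distance from texel (x, y) to the outline,
   negative inside the shape, positive outside. */
extern float true_signed_distance(int x, int y);

/* When baking the SDF texture, flatten the interior to zero. The outside
   keeps its gradient, so a stroke widened from this field can only ever
   grow outward from the shape, never inward. */
float baked_distance(int x, int y)
{
    float d = true_signed_distance(x, y);
    return d < 0.0f ? 0.0f : d;
}
```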
That really was just the most absurd argument for Microsoft developers to engage in. It felt like a parody of the "optimisation is unnecessary because us developers are such Prima Donnas and simply toooooo expensive to lower ourselves to such levels" attitude that some people have.
He used a cache. A simple hashtable. That's it. He got an absurd speedup: something like hundreds of times faster.
What are developers smoking these days that they can't even envision ever doing something like this without undertaking a research program?
To this day people will debate this, as if there's a valid debate to be had!
"No, no, no, it's premature to optimise software that is... being released to a billion users in production."
"Casey is adding unnecessary complexity that will be hard to maintain... by using a fraction of the code Microsoft did to solve the same problem."
"It must be full of errors... well... other than the superior Unicode compliance."
"It's so much longer to develop high-performance code... the evidence is that it took Casey two weekends to write a nearly complete terminal emulator!"
Etc...
Look where we are today. Microsoft still steadfastly refuses to even look at Casey's solution, let alone adopt it wholesale. Years later there are still blog articles being written about the performance issues of the Windows Terminal.
PS: Notepad and Calculator got the same "treatment" and now struggle to keep up with key presses.
No, actually the biggest difference is removing the `filter` property from the `.wave2` class which is used for rendering the background. With that removed the page is responsive even with the backdrop-filter - and it makes no visual difference AFAICS.
It's also perfectly responsive if you disable JavaScript. Maybe something related to the --positionX and --positionY CSS variables that are updated on every mouse move?
What's worse is that scrolling doesn't actually work unless you have the text area focused/under the wheel... very weird. And it's even worse for PgUp/PgDn, as who would think to focus the area first just to scroll?
Caching glyphs to a texture atlas is not an unusual idea. https://github.com/memononen/fontstash is a well-known example of this. Odin has a native port meant to work in conjunction with its bindings to NanoVG. The Odin code is coupled to stb_truetype.
I wonder how the GPU version is implemented. One quad and one texture draw per glyph doesn't sound scalable, but one quad per terminal, one texture atlas, and one shader to draw glyphs from the atlas already sounds much better.
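Sketching that "one quad per terminal" variant in C (sizes and helpers are illustrative), the per-pixel work the shader would do is just a cell lookup plus an atlas sample:

```c
#include <stdint.h>

#define CELL_W 8          /* cell size in pixels (illustrative) */
#define CELL_H 16
#define COLS   240
#define ROWS   67

/* Per-cell glyph index, updated by the terminal; in the real thing this
 * lives in a GPU buffer. Maps each cell to its glyph's tile in the atlas. */
extern uint32_t cell_glyph[ROWS][COLS];

/* Assumed helper: sample the atlas texture inside tile 'glyph' at the given
 * pixel offset within the tile. */
extern float atlas_sample(uint32_t glyph, int dx, int dy);

/* Per-pixel work for the single fullscreen quad: find the cell, find its
 * glyph, sample the atlas. No per-glyph geometry at all. */
float shade_pixel(int px, int py)
{
    int col = px / CELL_W, row = py / CELL_H;
    uint32_t glyph = cell_glyph[row][col];
    return atlas_sample(glyph, px % CELL_W, py % CELL_H);
}
```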
That's nothing for a modern GPU. For example, this benchmark[1] says to expect on the order of 10-800 million tri/s. A full screen of glyph quads (say ~17,000 glyphs, i.e. ~34,000 triangles) at the low end of that range gives you a frame time of 3.427ms -- 292 fps.
The original PlayStation could do 180,000 textured polygons per second[2], so it could've managed ~5 fps. Of course, you wouldn't render that many chars at its available output resolutions anyway. :)
One quad per glyph is very scalable, especially if you use instanced rendering. Each quad is just reading from a single texture atlas. GPUs are beastly blitting machines.
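A sketch of that instanced path with OpenGL 3.3+ (assumes a loader such as glad, a bound VAO and shader, and a pre-made unit-quad VBO; all names are illustrative, not anyone's actual renderer):

```c
#include <glad/glad.h>

typedef struct {
    float x, y;             /* screen position of the glyph */
    float u0, v0, u1, v1;   /* its rectangle in the atlas */
} GlyphInstance;

/* unit_quad_vbo holds 4 corners (0,0),(1,0),(0,1),(1,1) as a triangle
 * strip; the vertex shader scales and offsets it per instance. A real
 * renderer would keep the instance buffer around instead of recreating it. */
void draw_glyphs(GLuint unit_quad_vbo, const GlyphInstance *glyphs, int count)
{
    GLuint ibo;
    glGenBuffers(1, &ibo);
    glBindBuffer(GL_ARRAY_BUFFER, ibo);
    glBufferData(GL_ARRAY_BUFFER, count * sizeof(GlyphInstance),
                 glyphs, GL_STREAM_DRAW);

    /* Attribute 0: the shared unit-quad corners (divisor 0, per vertex). */
    glBindBuffer(GL_ARRAY_BUFFER, unit_quad_vbo);
    glEnableVertexAttribArray(0);
    glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, 0, (void *)0);

    /* Attributes 1-2: per-instance data, advanced once per glyph. */
    glBindBuffer(GL_ARRAY_BUFFER, ibo);
    glEnableVertexAttribArray(1);
    glVertexAttribPointer(1, 2, GL_FLOAT, GL_FALSE,
                          sizeof(GlyphInstance), (void *)0);
    glVertexAttribDivisor(1, 1);
    glEnableVertexAttribArray(2);
    glVertexAttribPointer(2, 4, GL_FLOAT, GL_FALSE,
                          sizeof(GlyphInstance), (void *)(2 * sizeof(float)));
    glVertexAttribDivisor(2, 1);

    /* 4 vertices as a triangle strip, repeated once per glyph. */
    glDrawArraysInstanced(GL_TRIANGLE_STRIP, 0, 4, count);
    glDeleteBuffers(1, &ibo);
}
```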
Runs at ~1500 FPS with a 6K screen of full text on my machine, even when text is being updated at about the pace of a 150 WPM typist, as long as you only update quads that strictly need to change and store font metrics in a GPU buffer. A full-screen refresh, where you send quad data for 750 Lorem ipsum paragraphs every frame, runs at 300 FPS on my hardware.
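In OpenGL terms at least, "only update quads that strictly need to change" boils down to overwriting just the dirty slice of the instance buffer, e.g. with glBufferSubData. A hypothetical sketch:

```c
#include <glad/glad.h>

typedef struct {
    float x, y;             /* screen position of the glyph */
    float u0, v0, u1, v1;   /* its rectangle in the atlas */
} GlyphInstance;

/* Re-upload only the contiguous dirty slice of the instance buffer,
 * instead of streaming every quad on screen each frame. */
void update_dirty_glyphs(GLuint instance_vbo, const GlyphInstance *glyphs,
                         int first_dirty, int dirty_count)
{
    glBindBuffer(GL_ARRAY_BUFFER, instance_vbo);
    glBufferSubData(GL_ARRAY_BUFFER,
                    (GLintptr)(first_dirty * sizeof(GlyphInstance)),
                    (GLsizeiptr)(dirty_count * sizeof(GlyphInstance)),
                    glyphs + first_dirty);
}
```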