
Old games like Quake always look pretty janky when upscaled (extreme example of course). The low-poly models seem to become worse the closer you can look at them.

I wonder if there’s a level of detail that fundamentally can be upscaled indefinitely and still end up vaguely “looking right”, or if it’s just that something like Apex Legends was released recently enough that the lift isn’t so huge.



Maybe the "upscaler" just needs to be a really good CRT emulation.


Yea, was gonna say that CRTs basically gave hardware-level anti-aliasing to every game for free. Playing the same games on an LCD TV makes them look strange and jagged edges stick out a lot more.


CRT also gave extremely high smoothness for free. 60Hz there looks a lot better than on LCD etc.


That's down to the longer persist. LCD goes on and off (almost) instantly, CRTs have a certain amount of "persist" where the phosphor glows for a moment after the beam has left it.

It's also (kind of) why interlacing worked, when black-and-white CRTs had a persist about as long as one field. This would make fast action a bit more "smeary" though, and you can think of its effect as similar to a wider "shutter angle" in a camera.


From online reading and my personal observations, CRTs have lower phosphor persistence than LCDs, resulting in both the flicker and clearer motion. I think interlacing works more due to persistence of vision, and because the beam is thick enough to nearly fill the scanline gaps. You can actually simulate interlacing on an LCD by showing simulated scanlines, alternating by half a line each field/frame. There's a video demonstration at https://youtu.be/tS0cFwvDWkE?t=480.
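
To make that concrete, here's a minimal numpy sketch of the idea (my own toy version, not what the linked video actually does): keep only the even or odd lines each "field" and leave the rest black, so the lit lines alternate position at 60 fields per second.

    import numpy as np

    def fake_interlace(frame, field):
        # Keep only even (field 0) or odd (field 1) source lines and
        # leave the rest black; alternating fields shift the lit lines
        # by one row, like the half-line offset of real interlacing.
        out = np.zeros_like(frame)
        out[field::2] = frame[field::2]
        return out

    # Usage: display fake_interlace(frame, n % 2) for field n at 60 fields/s.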


The visual properties of CRTs are a surprisingly complex topic.

Still images:

- 240p and 480i (15 KHz) console games were meant to be played on SDTV CRTs. Most consumer TVs have relatively coarse dot pitch (white lines are actually composed of color dots/stripes spaced relatively far apart). This adds "texture" to game visuals, masking aliasing and blur to a degree.

- - I think low-res LCDs actually have a similar horizontal texture to aperture grille (phosphor stripe) CRTs, but dot/slot mask (dot/slot-shaped phosphors) CRTs look quite different from LCDs when seen close up.

- - VGA CRT computer monitors have a much finer dot pitch (adjacent phosphors of the same color closer together) and higher TVL (phosphors of the same color per screen height). This makes them look closer to LCDs, since the color dots/stripes are so close together they can disappear at a normal viewing distance. This is not necessarily a good look for 2D games, or even pre-HD (PS2/GC/Wii) 3D games.

- Unlike LCDs, CRTs have visible scanlines (lines of light traced by the three electron beams, horizontal on wide monitors and vertical on rotated/tall arcade monitors). On a properly calibrated CRT, the three colors of scanline are vertically aligned (convergence errors cause the scanlines and/or horizontal changes of brightness to be misaligned between the three colors). On many (not all?) CRTs, each color's scanline gets taller (vertically wider) when the color is brighter, creating a "blooming" effect utilized by games. The scanlines are usually narrow enough on SDTVs that there's a visible black gap between adjacent lines of light; modern emulators' "scanline" filters add black gaps between rows of pixels.

- Most consoles output a horizontally soft image (or composite video blurs it horizontally). In 480i on an SDTV, adjacent scanlines overlap. Both effects blur the image and reduce high image frequencies, acting as "hardware-level anti-aliasing to every game for free". Additionally, horizontal transitions between colors and black cause the scanline to become narrower as it fades out (on many TVs), which is another characteristic of the CRT look: https://twitter.com/CRTpixels/status/1599513805717135360

- Unlike SDTVs, high-quality properly-focused VGA monitors have scanlines so narrow that drawing 240 scanlines across a 17 inch monitor produces taller black gaps than scanlines, even at full brightness (which produces maximum scanline height). This is very much not an attractive appearance for 240p games. One solution is to display each line twice in vertically-adjacent 480p scanlines (line doubling). Alternatively you can send a high-resolution (960p or above) video signal to the VGA CRT, and add emulated scanlines in a GPU shader or hardware scaler (ideally RetroTink-5X). (A toy sketch of the emulated-scanline idea follows at the end of this list.)

- - Note that visible scanlines boost high vertical frequencies up to (and to an extent, beyond) Nyquist, effectively acting as alias-boosting filters rather than anti-aliasing! *If you want phosphor dot texture and a smoothed on-screen image, which CRT TV/monitor you pick matters as much as playing on a CRT!*

- - Both 240p and 480i games were meant to be played on SDTVs. 240p games were meant to be seen with scanline gaps, while 480i games were not.

- - As an example of "alias-boosting", my 17-inch VX720 VGA monitor has visible scanlines when running at 480p. They're beautiful to look at, and work well in arcade-style games with large objects like Super Monkey Ball 2, or bloomy games like Mario Kart Wii. But they boost aliasing to the point that many GC/Wii games (especially Wind Waker with half-resolution DOF distance blur) look objectionably aliased (worse than a regular CRT or even an LCD). As a result, when playing 480p games through my GBS-Control, I often configure it to upscale to 960p with vertical bilinear filtering ("line filter").

- I'm not actually sure what CRT monitors (dot pitch, beam sharpness, maximum resolution) PC games from various eras (DOS through 2000) were meant to be seen on; PC monitor resolutions evolved substantially over those years, while pre-HD consoles universally targeted 15 KHz TVs. I'd appreciate input from someone who grew up in this era of PC gaming.
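
Since I mentioned emulated scanlines above, here's roughly what such a pass boils down to (a toy numpy sketch, not the RetroTINK's actual algorithm; the 4x factor and the [dim, bright, bright, dim] row profile are invented for illustration, and a real filter would also widen bright lines to mimic bloom):

    import numpy as np

    def emulated_scanlines(img, scale=4, profile=(0.15, 1.0, 1.0, 0.15)):
        # Repeat each source line `scale` times vertically (e.g. 240 ->
        # 960 lines), then modulate each group of output rows with a
        # brightness profile so every source line becomes a lit stripe
        # with dim gaps above and below it.  Assumes an 8-bit HxWx3 image.
        h = img.shape[0]
        out = np.repeat(img, scale, axis=0).astype(np.float32)
        weights = np.tile(np.asarray(profile[:scale]), h)
        return (out * weights[:, None, None]).clip(0, 255).astype(np.uint8)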

Interlacing:

- Console games since the PS2/GameCube era primarily output a 480i video signal, where every other frame (technically field)'s scanlines are vertically offset by half a line, causing solid colors to appear "filled" without gaps between scanlines. However, if your eyes are tracking an object moving up/down by half a line per field, then the scanlines line up again and gaps reappear. This is because each scanline is only illuminated for a small fraction of a field, and by the time the next field appears and the scanlines have moved half a line up/down, your eye has moved that distance as well.

- In 480i video mode, if even and odd scanlines are different average brightnesses in an area of the image, the image will flicker when displayed on a CRT. To address this issue, many video games enable "deflicker", which blends together (on Wii) 3 adjacent scanlines when outputting the video (technically when copying an image to XFB). This blurs the image vertically but eliminates flickering between fields. (A rough sketch of this blend follows after this list.)

- - Some GC/Wii games enable "deflicker" (vertical blur) even at 480p output (with no flickering), which softens the image vertically; Nintendont and Wii USB loaders come with optional hacks which hook games to prevent them from turning on vertical blur.

- - And for some reason, in certain translucent foggy/glowing/shadowed areas, some GC/Wii games actually output alternating brightness per scanline (like vertical dithering) and rely on the deflicker filter to smooth it out to a uniform color. Running without deflicker enabled results in horizontal stripes of alternating colors, which I'm (usually) fine with but some people find ugly.
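
Here's the general shape of that deflicker blend (a rough sketch only; the real GX copy filter has configurable taps, and these [1/4, 1/2, 1/4] weights are just a stand-in):

    import numpy as np

    def deflicker(frame, weights=(0.25, 0.5, 0.25)):
        # Blend each line with the lines directly above and below it.
        # Assumes an 8-bit HxWx3 image; edge rows wrap here, whereas
        # real hardware clamps at the picture edges.
        f = frame.astype(np.float32)
        up = np.roll(f, 1, axis=0)     # line above
        down = np.roll(f, -1, axis=0)  # line below
        out = weights[0] * up + weights[1] * f + weights[2] * down
        return out.clip(0, 255).astype(np.uint8)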

Motion and latency:

- Video signals, both analog VGA and digital HDMI (and probably DisplayPort as well, unsure if DP's MST alternates screens or interleaves them), always deliver the screen's contents from top to bottom over the duration of a frame. Displays (both CRTs and flat-panels) update from top to bottom as the image arrives into the screen. As a result, the bottom of a screen has nearly a full frame of extra latency compared to the top of the screen (unless you configure the game to turn off both vsync and triple buffering, resulting in tearing and the bottom of the screen receiving a newer video frame).

- - You can reduce the latency at the bottom of a screen by delivering pixels faster, and spending more time after each frame not delivering pixels (HDMI Quick Frame Transport, https://forums.blurbusters.com/viewtopic.php?f=7&t=4064). (Rough numbers in the sketch after this list.)

- LCDs and OLEDs update the state of each pixel once per frame (from top to bottom as the signal is transmitted from the video source), and pixels continue transmitting light for a full frame's duration until the next update arrives. (LCDs also have additional latency from an electrical signal to the image changing, because the liquid crystals respond slowly to electricity.) CRTs instead light up each point on the screen briefly once a frame, and the light output decays towards black (within <1ms on VGA monitors) until it's lit again a frame later. (Note that oscilloscopes and radar screens may use phosphors with far longer decay times: https://en.wikipedia.org/wiki/Phosphor#Standard_phosphor_typ...)

- - As a result, CRTs flicker (bad). But when your eye is tracking a moving object, the entire screen is not smeared by the distance of a single frame like in LCDs, resulting in a sharper image in motion (noticeable in video games and even web browser scrolling). There are techniques (black frame insertion, backlight strobing) to produce CRT-like motion properties (complete with flicker) in LCDs and OLED screens (more info at https://blurbusters.com/gtg-versus-mprt-frequently-asked-que...).

- - Sadly, 30fps console games on CRT effectively lose out on this beautiful fluid motion property, because they display the same video frame twice in a row. As a result, when your eye is tracking a moving object, the object (and screen) appears as a "double image", with the two copies 1/60 of a second apart (at your eye's rate of motion). The same "double image" appears (but with a shorter distance, I don't know if it's perceptible) if you display a 60fps video at 120fps to avoid CRT flicker (or to feed 240p video into a 31-KHz-only monitor).

- - One idea I have is "motion adaptive black-frame insertion" (similar to motion adaptive deinterlacing), where an (ideally) OLED display displays a static dim image for nonmoving sections of the image (to avoid flicker), and a bright strobed image for moving sections (to avoid eye tracking smearing). I'm not aware of any monitors which perform this trick, and I'm not sure if the 1ms or so of added latency to perform motion detection (my best guess, judging by the minimum latency of a RetroTINK-5X Pro or GBS-Control) would make this product unpalatable to gamers.
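
To put rough numbers on the scanout-latency and QFT points above (back-of-the-envelope only, with made-up but plausible timings, not any particular display's specs):

    refresh_hz = 60
    frame_time_ms = 1000 / refresh_hz       # ~16.7 ms between frame starts
    active_fraction = 0.95                  # rough share of the frame spent drawing lines

    bottom_lag_ms = frame_time_ms * active_fraction  # extra latency at the bottom vs. the top
    print(f"normal scanout: bottom lags top by ~{bottom_lag_ms:.1f} ms")

    # Quick Frame Transport: send the same frame with (say) a 2x faster
    # pixel clock, then idle until the next refresh; the last line
    # arrives in half the time.
    print(f"with 2x QFT:    ~{bottom_lag_ms / 2:.1f} ms")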


Were Wii games designed to be displayed on CRT or LCD? If I recall correctly, LCD screens were pretty widespread by the time the Wii came out — at least among middle-class Americans.


I'd say a mix of both. The Wii offers both 480i SDTV (for CRTs and LCDs, through the pack-in composite video cable) and 480p EDTV (better for LCDs, but you had to buy a higher-quality component cable) output. Unfortunately 480p-compatible CRT TVs were few and far between. Additionally 480p doesn't actually take advantage of the full resolution of 720p/1080p LCD TVs, resulting in a blurry image on-screen (but fortunately free of composite and interlacing artifacts, and deinterlacing latency on some TVs).

Did middle-class Americans commonly have 480p LCD EDTVs, or were they a rare transitional stage of television with little uptake? My family jumped straight from CRT (technically rear projection) to a 1080p HDTV.

Early Wii games were built to output in both resolutions, adjusting the camera width and HUD to match. I think System Menu actually looks better in 4:3, since the channel icons mostly add empty featureless left/right borders when stretched to 16:9. Some later games (NSMB Wii, Skyward Sword) only display in 16:9, adding a letterbox if the Wii is configured to play in 4:3. Interestingly, in NSMBW playing in 4:3 saves two seconds in one scrolling cutscene, because the game objects are actually loaded underneath the letter box, and appear "on screen" sooner, cutting the cutscene short (https://www.youtube.com/watch?v=gkt8L3t1GEU).


> Did middle-class Americans commonly have 480p LCD EDTVs, or were they a rare transitional stage of television with little uptake? My family jumped straight from CRT (technically rear projection) to a 1080p HDTV.

Scottish and very much not "middle-class", but I had a 720p-capable CRT TV in the early-2000s or so. I only got rid of it in about 2010, and still kind of regret it, but it was huge. Secondhand it cost about 400 quid in 2002 money! You can imagine what it must have cost new.

Around about the same time I fitted a VGA input board to a spectacularly expensive Loewe TV for a local high-end home cinema shop.


> Unlike LCDs, CRTs have visible scanlines (lines of light traced by the three electron beams, horizontal on wide monitors and vertical on rotated/tall arcade monitors).

Or in a triangle, on old delta-gun CRTs.

> On a properly calibrated CRT, the three colors of scanline are vertically aligned (convergence errors cause the scanlines and/or horizontal changes of brightness to be misaligned between the three colors).

Inline gun CRTs largely eliminated the need for convergence. I do not miss converging delta gun CRTs at all. Yes, I am quite old.


Sorry I meant that the scanlines are horizontal, not the guns.


Analog video:

- RGB (including VGA) and component (YCbCr, green/blue/red RCA jacks) video are theoretically lossless encodings. S-Video blurs color information (chroma) horizontally, but has theoretically unlimited brightness (luma) bandwidth horizontally.

- Composite video (single yellow RCA plug), as well as RF, encodes color as textured patterns (a 3.58 MHz sine wave in QAM with variable amplitude and phase) added to a black-and-white signal. TVs rely on various filters to separate color and brightness information, inevitably resulting in horizontal color blur, some degree of brightness blur, and (with some video sources and TVs) horizontal changes in brightness being interpreted as color, or changes in color being interpreted as brightness (dot crawl). (A toy sketch of this encoding follows after this list.)

- - Some games were actually built to rely on composite video to blur color horizontally (https://twitter.com/CRTpixels/status/1408451743214616587), or even smooth out dithering (https://twitter.com/CRTpixels/status/1454896073508413443).
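
For the curious, the chroma-on-luma scheme is simple to sketch (a toy NTSC-style encoder for one scanline; real composite also carries sync, the colorburst reference, gamma, and I/Q bandlimiting, and the hard part is the TV separating it all back out):

    import numpy as np

    FSC = 3.579545e6          # NTSC color subcarrier frequency, Hz
    SAMPLE_RATE = 4 * FSC     # a common sampling choice (4x the subcarrier)

    def encode_scanline(y, i, q):
        # y, i, q: equal-length arrays of luma and the two chroma
        # components for one line.  The chroma is QAM'd onto the 3.58 MHz
        # carrier (amplitude = saturation, phase = hue) and added to luma.
        t = np.arange(len(y)) / SAMPLE_RATE
        phase = 2 * np.pi * FSC * t
        return y + i * np.cos(phase) + q * np.sin(phase)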

As for my personal setup, I mostly play GC/Wii games (rather than earlier generations of consoles), and run them through a component cable (lossless) at 480p (no interlacing), through a GBS-Control transcoder/scaler, into a Gateway VX720 17-inch Diamondtron VGA monitor. This monitor is useful because it's relatively compact compared to large TVs, has excellent geometry and no audible whine (31 KHz is ultrasonic), and is sharp enough to be used as a computer monitor. The downside of using this monitor with console games is that the phosphors are too fine to add texture, the image (electron beam focusing/scanline size) is too sharp to act as antialiasing like a real TV (so I have to simulate it with bilinear scaling to 960p), and it cannot display 240p games properly unless they're upscaled as you would for an LCD.

Sometimes I also plug this monitor into my PC (RX570 -> https://plugable.com/products/dpm-vgaf) and use it as a second monitor. I bought this DP++-to-VGA dongle because I heard that DP-to-VGA dongles are less likely to have image quality problems than HDMI-to-VGA dongles, but unfortunately most of my devices don't have DP ports so I can't use this dongle with any of my laptops.

Additionally, my monitor doesn't broadcast EDID data when the front power switch is turned off (I power it down when not in use to reduce CRT wear). And my DP++-to-VGA adapter identifies itself as an "unrecognized but present output" when the VGA cable is unplugged or EDID is missing. So for my computer to properly recognize my monitor, I have to first power on the monitor and plug in the VGA cable to the dongle, then plug the dongle into the DP port (and if it's already plugged in, unplug it first, DP latch and all).

I used to have a 24-inch flat-screen Trinitron SDTV given away by another person. Unfortunately, the screen geometry linearity was poor due to high deflection angles (objects were wider at the left and right of the screen, causing scrolling 2D games to warp and distort unnaturally), with wide pincushion at the top of the screen. Additionally, the TV only ran at 15 KHz, did not support 480p from my Wii (which improves fine detail in fast motion), and had painful levels of high-pitched whine, requiring me to wear headphones and/or put layers of sound-muffling clothing around the TV's vents (and set up a forced-air cooling fan to replace the obstructed air circulation). I ended up giving it away for free (after two people on Facebook Marketplace ghosted me).

Sadly it's now common for eBay and Marketplace sellers to offer CRT TVs and monitors at absurdly inflated prices, waiting for a desperate buyer to pay hundreds (or thousands for BVMs and widescreen PC monitors) of dollars for a monitor, or, more likely, sitting on the listing for months on end with no takers. These listings tend to clog up search results if you're looking for a CRT TV. I'd advise you to look out for occasional "free TV" Craigslist/Marketplace listings (and mythical "free VGA monitor" offers), or see if any electronics recyclers will sell their CRTs to you at a reasonable price ($40 for a 17 inch VGA isn't a small amount of money to drop, but it's downright generous compared to the average eBay scalper).


> Composite video (single yellow RCA plug), as well as RF, encodes color as textured patterns (a 3.58 MHz sine wave in QAM with variable amplitude and phase) added to a black-and-white signal

In the 1970s the BBC transmitted colour TV programmes but archived them on black-and-white film, shooting a black-and-white monitor that actually had enough bandwidth to display the 4.43MHz PAL colour carrier. Someone wrote software to decode this and recolour footage based on what they recovered. It's not great, but it's at least as good as VHS colour.

Unfortunately the only really good example of footage captured both on film in mono and on tape in colour is an episode of Top of the Pops, presented by the infamous Jimmy Savile. In a happier example they were able to recover the colour from a couple of lost episodes of Morecambe and Wise.

https://stardot.org.uk/forums/viewtopic.php?t=16161


I think it’s the latter.

Because, as you point out with low poly models, there’s so much that goes into the fidelity of a game beyond just output resolution - there’s polygon count on the models, resolution of the textures applied to those models, shadow and light effect resolution - each operating independently.

When a game like Apex is played at high settings but output at a low resolution, at that point upscaling it isn’t much different than upscaling a standard def DVD of a movie like Frozen or Avatar. You’re not creating detail in things like individual set pieces that didn’t previously exist, you’re filling in the blanks of the entire image blown up at once per frame.



