
I disagree. There are some new (sub-) genres and great games since that period.

* Roguelites have proliferated: Hades is the most obvious example, but there are a variety of sub-genres at this point.

* Vampire Survivors (itself a roguelite) spawned survivors-likes. Megabonk is currently pretty popular.

* Slay the Spire kicked off a wave of strategy roguelites.

* There are "cozy" games like Unpacking.

* I don't recall survival games like Subnautica or Don't Starve being much of a thing in the PS2 era.

* There are automation games like Factorio and Satisfactory.

* Casual mobile games are _huge_.

* There are more experimental games, sometimes in established genres, like Inscryption, Undertale, or Baba Is You.

Not to mention that new games in existing genres can be great. Hollow Knight is a good example. Metroidvanias were established by the SNES and PS1 era, but Hollow Knight really raised the bar.

I'm sure I'm forgetting things and people will have some criticism, but I really don't believe games have stagnated in general.


"Roguelites have proliferated"

I know it's easy to feel that this is people chasing trends, but I've really come to appreciate roguelites over many PS2-era games: they give me real progression in a single play session, and that play session is also discardable.

As an adult this is a very compelling proposition.

In the PS2 era, while you can find some early roguelite-like things, you tended to have either games with no interesting progression (arcade-like), which you would just play, or very long games like JRPGs that slowly trickle out the progression over dozens of hours. Compressing the progression into something that happens in a small number of hours, while eliminating the "I'm 50 hours into this game that I stopped 2 years ago, do I want to pick it back up if I've forgotten everything?" problem, has been very useful to me.

This has been a fairly significant change in gaming for me. I still have some investment into the higher end JRPGs but the "roguelite" pattern across all sorts of genres has been wonderful overall. I don't even think of it as a genre anymore; it's a design tool, like 'turn based versus real time'.


Roguelites are the worst thing to happen to video games since microtransactions. It’s an extremely attractive option to the cash-strapped indie dev, as it promises infinite ‘content’ for little development effort, but what it’s really done is turned every game into a combination of cookie clicker and a slot machine.

The fact that you think arcade games have “no interesting progression” shows just how toxic the roguelike design pattern is. The progression in arcade games is you getting better at the game. If a game needs a “progress system” to communicate a sense of accomplishment to the player, that’s because the gameplay is shallow.


What is the advantage over blue noise? I've had very good results with a 64x64 blue noise texture and it's pretty fast on a modern GPU. Are quasirandom sequences faster or better quality?

(There's no TAA in my use case, so there's no advantage for interleaved gradient noise there.)

EDIT: Actually, I remember trying R2 sequences for dither. I didn't think it looked much better than interleaved gradient noise, but my bigger problem was figuring out how to add a temporal component. I tried generalizing it to 3 dimensions, but the result wasn't great. I also tried shifting it around, but I thought animated interleaved gradient noise still looked better. This was my shadertoy: https://www.shadertoy.com/view/33cXzM
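
For anyone curious, here is roughly what I was comparing, as a rough Python sketch rather than shader code. The IGN constants are the ones usually attributed to Jimenez's interleaved gradient noise, the R2 constants come from the plastic number, and the temporal shift is just the naive experiment I described, not a recommendation:

    def fract(v):
        return v % 1.0

    # Interleaved gradient noise (the commonly cited Jimenez constants).
    def ign(x, y):
        return fract(52.9829189 * fract(0.06711056 * x + 0.00583715 * y))

    # R2 low-discrepancy sequence used as a per-pixel dither value.
    # G is the "plastic number", the real root of g^3 = g + 1.
    G = 1.32471795724474602596
    A1, A2 = 1.0 / G, 1.0 / (G * G)

    def r2_dither(x, y):
        return fract(A1 * x + A2 * y)

    # The "shifting it around" variant: offset the whole pattern each frame.
    # The per-frame step here is an arbitrary choice for illustration.
    def r2_dither_temporal(x, y, frame):
        return fract(A1 * x + A2 * y + frame * 0.61803398875)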


At some point, most NTSC TVs had delay lines, too. A comb filter was commonly used for separating the chroma from the luma, taking advantage of the chroma phase being flipped each line. Sophisticated comb filters would have multiple delay lines and logic to adaptively decide which to use. Some even delayed a whole field or frame, so you could say that in this case one or more frames were stored in the TV.

https://www.extron.com/article/ntscdb3
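
To make the delay-line trick concrete, here is a rough numpy sketch of the simplest two-line comb, assuming the composite signal has already been digitized into one row per scanline; real sets did this in analog circuitry or dedicated silicon, and the adaptive/3D combs add more delay lines plus selection logic on top:

    import numpy as np

    def two_line_comb(lines):
        """lines: 2-D array with one sampled composite scanline per row.

        The NTSC chroma subcarrier is ~180 degrees out of phase on adjacent
        lines, so averaging a line with a one-line-delayed copy cancels the
        chroma (leaving luma), while differencing cancels the slowly varying
        luma (leaving chroma).
        """
        delayed = np.roll(lines, 1, axis=0)  # the 1H delay line (wraps at the top edge)
        luma = 0.5 * (lines + delayed)
        chroma = 0.5 * (lines - delayed)
        return luma, chroma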


If a motion adaptive 3d comb filter (which requires comparing successive frames) was present on a TV, you can bet that it would be plastered all over the marketing material for the TV.

I think they are more accessible now than when that article was written. My wife and I bought a mid-trim Hyundai Kona Electric for under $35,000. Besides, lots of people buy used cars, and there are crazy deals on used EVs. I've seen Bolts go for under $15,000. 2 year old ID.4s are selling for under $20,000 in my area. You may not find a $5,000 beater, but EVs are penetrating further into the middle of the market now.

There are also lower ongoing costs for maintenance and fuel.

There is still the secondary wealth filter of having a place to park and charge, of course.


> For example, 720 is tied to 13.5 Mhz because sampling the active picture area of an analog video scanline at 13.5 MHz generates 1440 samples (double per-Nyquist).

I don't think you need to be doubling here. Sampling at 13.5 MHz generates about 720 samples.

    13.5e6 Hz * 53.33...e-6 seconds = 720 samples
The sampling theorem just means that with that 13.5 MHz sampling rate (and 720 samples) signals up to 6.75 MHz can be represented without aliasing.
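
Spelling out the arithmetic in a quick Python snippet: 13.5 MHz gives an integer number of samples per total line for both 525- and 625-line systems (858 and 864), with 720 of them falling in the active picture, which is part of why that rate was chosen.

    fs = 13.5e6                          # Rec. 601 luma sampling rate, Hz

    line_rate_625 = 15625.0              # 625-line systems
    line_rate_525 = 525 * 30000 / 1001   # 525-line systems, ~15734.27 Hz

    print(fs / line_rate_625)            # 864.0 samples per total line
    print(fs / line_rate_525)            # 858.0 samples per total line

    active_line = 53.333e-6              # active picture duration, seconds
    print(fs * active_line)              # ~720 samples across the active picture

    print(fs / 2)                        # 6.75e6 Hz, the highest representable frequency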

There's some history on the standard here: https://tech.ebu.ch/docs/techreview/trev_304-rec601_wood.pdf


Non-square pixels come from the legacy of anamorphic film projection, which was developed out of the need to capture wide-aspect-ratio images on standard 35mm film.

This allows the aspect ratio captured on film to stay fixed while images of various aspect ratios are displayed.

https://en.wikipedia.org/wiki/Anamorphic_format


CRTs didn't have pixels at all. They had shadow masks (or aperture grilles) and phosphors, which could be a triad of rectangles, lines spanning basically the entire screen height, or dots. They did not line up with the signal, so it doesn't make sense to call them pixels.


Aerated concrete is an established building material in some parts of the world. In Europe, a big manufacturer is Ytong, and they even make precast panels in addition to blocks.

It's made differently from this, though. It is aerated through a chemical reaction rather than mechanically.


The industrial version is produced in an autoclave, which allows precise control of curing, density, and final mechanical resistance/insulation values. Hence the name the material is best known by: AAC (autoclaved aerated concrete).

On the other hand, the linked video attributes too much credit and complexity to the foam manufacturing method; it can certainly be done with very primitive technology. Here are some dudes doing it in a developing country. It's very, very basic: the foam generator is essentially a steel wool sponge where pressurized air combines with water containing the foaming agent. They give out the complete recipe and details of their tools:

https://www.youtube.com/watch?v=-h6zBbVkuQI


There was recently a crisis in the UK over older publicly constructed buildings that were built with it [0]. The aerated concrete had a limited lifespan, especially if it was damaged and had contact with water.

Lots of people looking for compensation and claiming mis-representation.

[0] https://www.bbc.co.uk/news/education-66669239


The UK crisis involved steel-reinforced AAC beams that were used (of all places) to support roofs of schools. The UK turned out to be a rainy place; the rain infused into the cellular structure and corroded the steel, with disastrous consequences.

It's a very particular use case of a very particular product, not relevant to the wide majority of AAC uses around the world, which are largely non-structural and not reinforced, or subjected only to moderate compressive loads, such as lateral walls for 1-2 story buildings in non-seismic areas.


The risks were understood (by engineers) and this usage was given a "shelf life". Unfortunately, those risks were put into the "Oh we'll forget about it" or "We'll wait until it looks a bit shifty" categories.

However as any fule (engineer) kno, reinforced and especially pre-stressed conc members will fail in quite a dramatic fashion. Unless you notice rust dribbling out then you can end up with anything from the roof failing to the roof exploding. I don't think anyone was daft enough to pre-stress these things.

I don't know how much money was saved, but it was a really stupid application that basically ended up punting far greater remediation costs down the road.


> and this usage was given a "shelf life"

While it might technically be true, that surely does not absolve the engineers who did this crap.

There is a general social expectation that new buildings should be structurally sound for a duration on the order of a century. So, if you deliver something that has a mean time before catastrophic failure around 30 years, you also need to account and set up the institutions that will handle the failure, the same way nuclear companies are required to set aside money for their decommissioning. You need to have periodic inspections for signs of early failure etc. and this whole circus needs to be disclosed and priced into your tender.

In reality, this entire fiasco was the dirtiest, cheapest way to satisfy the contract, ye olde "good enough for government work", as evidenced by the fact that no substantial number of private buildings of the same period are having this problem.

The maintenance provision was snuck into - or bribed into - some mountain of legalese, but the fuckers knew exactly that they were putting children in harm's way.


Is that you Molesworth???


Carry on, old boy.


Using porous concrete reinforced with steel in a rainy place is a real WTF decision. It’s a miracle they didn’t collapse earlier.


If you look at the Steam hardware survey, most users (as in, > 50%) are still using 1080p or below.

https://store.steampowered.com/hwsurvey/Steam-Hardware-Softw...


In part, though, that's not because all those users can't afford >1080p (some of them can); it's that esports players often run insanely high refresh rate monitors at 1080p and >300 Hz, and even those who don't still use 1080p, because driving up the frame rate drives down the input latency.

Whether it matters is a bigger question. From 30 to 60 Hz I notice a huge difference; from 60 to 144 Hz at 4K I can't tell, but I'm old and don't play esports games.
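
The latency side of that is just frame-time arithmetic:

    for hz in (30, 60, 144, 300):
        print(f"{hz:>3} Hz -> {1000.0 / hz:5.1f} ms per frame")
    # 30 Hz -> 33.3 ms, 60 Hz -> 16.7 ms, 144 Hz -> 6.9 ms, 300 Hz -> 3.3 ms.
    # Each step up saves less absolute time, which matches the diminishing
    # returns I notice.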


I don't think this is contrary to my original point. Nearly 50% of all users are running at greater-than-1080p resolutions, and presumably power users are overrepresented in that category (and certainly, it's not just the ~2.5% of Mac users pushing the average up).


FWIW, I didn't mean to reply to you in an argumentative way. Just proposing an answer to this:

> I'm honestly not sure where all these hackernews commenters with low-dpi displays are coming from

I still see 1080p fairly often on new setups/laptops, basically, although 1440p and 4K are becoming more common on higher-end desktops. Then again, 1440p at 27" or 32" isn't really high dpi.


The EPA estimates that radon[1] is the second leading cause of lung cancer after smoking. I have a RadonEye monitor in the basement. They aren't that expensive, and it's nice to have the peace of mind.

[1] https://www.epa.gov/radon/health-risk-radon


What led you to trust RadonEye?


A radon sensor is just an alpha radiation detector. There's little reason to distrust companies making them. You don't even really need very good calibration; you just want to know if radon levels exceed some "you should take action" threshold. Real-time radon detection is really a bit overkill.


I wonder how one could validate a radon detector. Can one buy a phial of known air/radon concentration online without ending up on some spooky list?

Naturally, you would have to account for what changes in the sample during shipping, due to the short half-life...
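
To put numbers on the shipping concern, here is a quick back-of-the-envelope decay calculation, assuming the reference sample is Rn-222 (half-life about 3.8 days), the isotope household detectors care about:

    RN222_HALF_LIFE_DAYS = 3.82

    def remaining_fraction(days_in_transit):
        """Fraction of the original radon activity left after shipping."""
        return 0.5 ** (days_in_transit / RN222_HALF_LIFE_DAYS)

    for d in (1, 3, 7):
        print(f"{d} day(s) in transit: {remaining_fraction(d):.0%} of the activity left")
    # Roughly 83% after one day, 58% after three days, 28% after a week, so the
    # reference would need a fill timestamp and a decay correction, or very
    # fast shipping.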


> how one could validate a radon detector

Get a professional measurement and compare. This Reddit thread gave me pause [1].

[1] https://www.reddit.com/r/radon/comments/ptz8tt/professional_...


CCCP was just a collection of existing codecs; they didn't develop their own. Most of the codecs in CCCP were patented, and using it without licenses was technically patent infringement in most places. It's just that nobody ever cared to enforce it on individual end users.

