
The retina has about 100 million photoreceptors per eye. We also need higher FPS. Maybe 120 is good enough?


Most of the need for high frame rate is simple lag reduction during head motion. Sensitivity to frame rate and flicker has been extensively studied by SMPTE and others over the past 100 years. The research that went into 24 fps playback for movies (projected in darkened rooms, where the eye is less able to detect framerate-induced flicker) and 50 or 60 fps playback of interlaced video for TV (in well-lit rooms, where visual sensitivity to flicker is higher) is quite good. The work that went into measuring human visual acuity for use in defining HD broadcast specs was similarly good, and the human eye hasn’t changed much in the past 100 years (except to end up with weaker muscles for shifting between near and far focus). Simply rattling off $BigNum or %maxInt% as resolution and framerate targets isn’t the way to set standards.


It's hard for me to go back to a 60hz display after using 144, and going from 240 to 144 feels like going from 144 to 60.

I played Breath of the Wild and the frame rate was almost bad enough to keep me from playing. I think 60fps is fine for this, but in the future I hope to see higher refresh rates as an option.

My main use case for VR has been escaping reality. Having multiple monitors floating in space allows me to avoid distractions. I fell asleep in VR once. I woke up looking at the stars; it was very confusing but interesting.


To add my anecdata: do you happen to be diagnosed with something like ADHD-PI/ADD?

Almost everyone I've met or talked with who can notice the difference between those kinds of framerates seems to either be diagnosed with it, or at the very least clearly show the associated traits.


I do, actually. Along with HFA. I also play games at a high competitive level, which can contribute to it (top 10 PUBG in a season, A+ ESEA CS, Diamond Siege, etc.). It's so common in these communities to have at least 120Hz, which makes the most sense if you play CS:GO on ESEA. I wish Siege would improve interp and go to 120Hz instead of 60. It's atrocious the things you die to.

I think most people I associate with are diagnosed ADD, but they're also high skill gamers.


Seriously, play a competitive shooter/FPS and you'll also start noticing the difference. 144Hz is buttery smooth compared to 60Hz.


What you say is correct for some, but not all, people. In FPS games many don't see a huge visual difference between 60Hz and, say, 140Hz, except for latency.

That's not true for everyone. For me and a few others, there's even a clear difference between 120 and 144Hz, and compared to 60Hz it's a gulf. A refresh rate of 60Hz in a high-paced, close-up fight becomes a bit of a Where's-Waldo situation. It's as if my brain stops doing motion estimation because the frames don't really fit together, so I kind of stitch together the individual frames and try to guess what the next 'slide' is going to show.

I know there are a few others with the same issue, and anecdotally it might be a trait that occurs with some forms of ADHD/ADHD-PI. But I don't have a good reference on that, only experience and people I've met.


Correct me if I'm wrong, but isn't the difference between 60Hz and 120Hz refresh more or less imperceptible to the average human? I'm sure there's a distribution, but I'd be hard pressed to find a person who could differentiate between 100Hz and 120Hz refresh. It seems like a waste to push rendering beyond the point at which we can even tell there's a difference.

Edit: Thanks for the feedback, I guess it is perceptible. Nevertheless, I think my argument becomes valid at some N. Sure, N !== 60, but N = 144 or 120 may be more reasonable. I'm not too concerned with what N is, more so with the fact that "doubling the refresh rate" eventually becomes an act of futility.


When discussing framerates or refresh rates, people tend to make the mistake of not differentiating between interactive and non-interactive mediums. In a video it's just a matter of smoothness. In a video game there is the more important aspect of input delay and overall motion-to-photon latency, which is greatly affected by frame times.
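To make the latency point concrete, here's a rough sketch of how frame time feeds into motion-to-photon latency (the pipeline stages and numbers below are my assumptions for illustration; real engines and displays differ):

    # Illustrative only: assume input is sampled once per frame, rendering
    # takes one frame, and scanout takes one more frame.
    def frame_time_ms(refresh_hz):
        return 1000.0 / refresh_hz

    def worst_case_latency_ms(refresh_hz, input_poll_ms=1.0,
                              render_frames=1, scanout_frames=1):
        ft = frame_time_ms(refresh_hz)
        # input can just miss a frame, then render + scanout each take a frame
        return input_poll_ms + ft * (1 + render_frames + scanout_frames)

    for hz in (60, 120, 144, 240):
        print(hz, round(frame_time_ms(hz), 2), round(worst_case_latency_ms(hz), 1))
    # 60Hz -> ~16.7ms frames, ~51ms worst case; 240Hz -> ~4.2ms frames, ~13.5ms

So even before any question of perceived smoothness, going from 60Hz to 240Hz cuts the frame-time contribution to latency by roughly a factor of four.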

There is also a misconception of treating visible motion details and temporal artifacts as the same thing ("what can a human see?"). There are diminishing returns around 100 FPS for motion details, but that's still far too low to eliminate artifacts like blurring or judder (in VR usually connected to the vestibulo-ocular reflex). This is why current VR headsets already have effectively 300+Hz-like persistence. We may need something like 1000 FPS, if not more, to achieve clean vision at full persistence, so obviously strobing tricks are necessary to get around it. And you don't need to be a fighter jet pilot to see it. Everyone can notice these problems.
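A back-of-the-envelope version of the persistence point (the tracking speed and the simple linear blur model are my assumptions, not measurements):

    # Perceived smear while the eye tracks a moving object, assuming
    # blur width ~ tracking speed * how long each frame stays lit.
    def blur_px(tracking_speed_px_per_s, persistence_ms):
        return tracking_speed_px_per_s * persistence_ms / 1000.0

    speed = 1000.0  # px/s, an ordinary eye-tracking speed in a game
    for hz in (60, 120, 240, 1000):
        full_persistence_ms = 1000.0 / hz  # sample-and-hold: lit the whole frame
        print(hz, "Hz ->", blur_px(speed, full_persistence_ms), "px of smear")
    # 60Hz -> ~16.7px, 240Hz -> ~4.2px, 1000Hz -> 1px
    # Strobing fakes the same effect at lower rates, e.g. ~2ms of light per frame:
    print("90Hz strobed at 2ms ->", blur_px(speed, 2.0), "px")

That's why a low-persistence (strobed) 90Hz headset can look cleaner in motion than a full-persistence display running at several times the frame rate.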


It is definitely perceptible, though there is a falloff in value per cost. In the most extreme case, your eyes can tell the difference between a solidly lit LED and a 1000Hz strobing LED in a dark room if you dart your gaze left and right.

The most important quality issue is for the image update to match the display strobe. That's why FreeSync/Gsync is such a big deal.


60 and 120 are worlds apart, especially in VR. I would be shocked if many people with a healthy mind and vision couldn't see the difference after having the basic idea explained to them. In VR some people might even physically feel the difference.

Doubtless there is some refresh rate where it ceases to matter, but the point here is that we're still struggling to push an acceptable frame rate in VR (and 4k, and other high resolution "formats") without making significant sacrifices in other aspects of the video quality. Assuming that image quality and available processing power both continue to increase, it will continue to be important to include framerate in the balance, i.e. we want to draw enough but not be wasteful.


I find it very perceptible. Get access to a high-refresh monitor and visit the UFO motion test [0].

Even on your current 60Hz monitor, you can see how the image is blurry in the 60Hz band. At 144Hz I could see the pixel-level details (at 1080p) almost as clearly as in a motionless image.

[0] https://www.testufo.com/


In regards specifically to the UFO test: the blurriness is generally not related to frame rate but to LCD persistence. I.e., on a theoretical zero-persistence 60Hz display you would not likely notice the individual frames getting blurred from frame to frame. In that case the only real difference between a 60Hz and a 120Hz display is that the 120Hz display would show additional frames of motion, so your visual system would have less interpolating to do between frames.

In reality though, all LCD displays have persistence issues. (You end up seeing bits of the previous frame suspended over the current frame).

Higher-quality monitors designed for higher display frequencies tend to also be tuned to have less persistence per frame. There are also tools/tricks some brands employ to remove or reduce the persistence (and blurring), which can be effective (or annoying, depending on how they do it) even at 60Hz. There are also "120Hz" and higher displays with such bad persistence issues that they fare far worse on the UFO test than a good low-persistence 60Hz display.

As an easy example, IPS panels tend to have a lot more trouble switching between frames quickly than VA or TN panels, so they also tend to exhibit more persistence issues per frame. This is directly noticeable on the UFO test between these kinds of panels. IPS panels have of course been getting a lot better in this regard in recent years!
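If it helps, here's a toy model of what I mean by persistence dominating (the first-order response and the time constants are just my illustration; real panels are messier and use overdrive):

    # Treat a pixel transition as a first-order exponential with time
    # constant tau; "leftover" is how much of the previous frame is still
    # visible when the next frame arrives.
    import math

    def leftover_fraction(tau_ms, refresh_hz):
        frame_ms = 1000.0 / refresh_hz
        return math.exp(-frame_ms / tau_ms)

    for name, tau_ms in (("fast TN-ish", 2.0), ("slow IPS-ish", 8.0)):
        for hz in (60, 144):
            ghost = 100 * leftover_fraction(tau_ms, hz)
            print(name, hz, "Hz ->", round(ghost, 1), "% of old frame remains")
    # A slow panel at 144Hz carries over far more of the old frame (~42%)
    # than a fast panel at 60Hz (~0%), which is why the Hz number alone
    # says little about how the UFO test will look.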


Depends on the context and how involved the person is in the experience.

There has been a bunch of research involving VR-induced nausea indicating that a framerate of 90 or higher reduces the incidence of nausea compared to 30 and 60.

For example: https://www.researchgate.net/publication/320742796_Measureme...


Buying a 144Hz display was both a great and horrible decision for me. Great because games at 144Hz feel extremely smooth. Horrible because now I can't enjoy 60Hz.


60-144 was insane the first time. Then I went to a 200Hz ultrawide, back down to 144, then to 240. 144 feels like 75 now, and 60 feels like 50.

It's painful to go back.


144-240? This sounds like a placebo effect. While I'm sure people out there can distinguish between 60 and 120 [0], it seems you must be incredibly attuned to visual sensory input (an outlier). It seems you aren't alone, either [1]. Part of me wants to think this is one big marketing gimmick, but surely people wouldn't buy 240Hz monitors if they couldn't tell a difference? Or is it just a pissing contest for self-satisfaction? If I had a few million in the bank, maybe I would also buy a 240Hz monitor -- because why not?

[0] https://www.pcgamer.com/how-many-frames-per-second-can-the-h...

[1] https://www.reddit.com/r/Competitiveoverwatch/comments/5mhqh...


Look at this [0]

It's an interpolation thing: it's much easier for your brain to track an object when it moves around smoothly, instead of having to interpolate/extrapolate, and this should demonstrate why.

Can I ask why you think it sounds like a placebo? I don't really see why there's any logic behind 144hz being the ceiling of how well your eyes can see, and 144hz -> 240hz is a big jump

[0]: https://www.youtube.com/watch?v=pUvx81C4bgs


You can tell a difference between 144-240. A lot of small differences, but there is a difference. I just wish there were a nice high-tickrate game to play other than CS.

144-200 was not very noticeable. 144-240 was absolutely noticeable. There are a couple of games that dual 980 Tis can't currently reach 240 in; hopefully Volta changes that. The monitor was also on sale, so I got it for a crisp $300 with no tax.


The 120Hz display on my iPad Pro gives me the same feeling, and I don't use it for anything important.

Using my iPhone immediately after makes the phone display feel cheap for a bit.


Is there any public research about the inverse correlation between framerate and nausea? If there is a perceptible one in that range, it would mean that higher framerate does make a difference, even without being aware of it. This would only be worthwhile for VR.


It is definitely perceptible. Compare a normal iPad (60 Hz) to an iPad Pro (120 Hz), the fluidity of movement is very apparent in just playing around with the home screens.


It may be perceptible, but the number of factors that could affect the different systems in that comparison means it's not exactly a good test. At the simplest level, there's no guarantee it's actually even rendering updated frames at the rates in question if it's limited by some other factor, and the differing hardware may change at what point that limit is hit.


You can drop frames at 60Hz as well.

But really, when you achieve 120Hz, it’s beautiful. It reminds me of when retina displays came out. We are a bit closer to realistic rendering.


> You can drop frames at 60Hz as well.

Yes, what I was trying to get at is that just because the hardware is capable of 60 frames in a second, that doesn't mean the software was delivering 60 frames a second. The iPad Pro has a different processor than the iPad (A10X Fusion vs A10 Fusion), and in a lot of tests it's significantly faster.[1]

The iPad Pro does have more pixels to push around, but that doesn't exactly negate the CPU difference; it just makes it more complicated to draw an actual comparison. And that's before we even get to the actual graphics processor, which itself could do a better job of offloading some processing to hardware (better OpenGL/Metal/whatever support). For all we know, you were seeing an average of 35 updated frames a second on the iPad, and you're now seeing an average of 55 updated frames on the iPad Pro. In that case, doubling the screen refresh might help a little (reducing noticeably laggy frames a bit, as it can update between what would be frames at 60Hz), but it wouldn't be earth-shattering. I doubt it's that bad, but as an example, this should show how the Hz rating a screen is capable of doesn't mean much on its own.

The real benefit of higher screen refresh rates is to better support content with different, lower native frame rates. Much video content is at 24 FPS. A 30Hz or 60Hz screen can't represent that faithfully, and will need to repeat some frames unevenly. A 120Hz screen can perfectly represent 24 FPS content[2], and that's the real reason screens (and TVs) ship with that refresh rate. Different media (television, internet video, DVDs, Blu-rays, video game systems, etc.) all have different frame rates they want to deliver.

1: https://www.notebookcheck.net/A10-Fusion-vs-A10X-Fusion_8178...

2: I'm ignoring that it's often actually 23.976 FPS or something.
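A quick illustration of the cadence point (the helper below is hypothetical; it just does the arithmetic):

    # How many refreshes each source frame gets at a given refresh rate.
    def cadence(content_fps, refresh_hz, frames=6):
        shown, emitted = [], 0
        for i in range(1, frames + 1):
            target = (i * refresh_hz) // content_fps
            shown.append(target - emitted)
            emitted = target
        return shown

    print(cadence(24, 60))   # [2, 3, 2, 3, 2, 3] -> uneven 3:2 pulldown, judder
    print(cadence(24, 120))  # [5, 5, 5, 5, 5, 5] -> every film frame held equally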


Yeah, it's night and day.


The human eye is practically capable of seeing around 8 megapixels, which is a similar number of pixels to what a 4K TV has (but obviously with a completely different distribution). There was a great episode about it on the Vsauce YouTube channel. We are pretty blind, but saccadic masking does the magic that makes our eyes super efficient.


That resolution is not evenly distributed throughout our FOV, but rather is highly concentrated in the center of our vision. This makes "similar number of pixels" pretty meaningless.



