this post was submitted on 21 May 2024
51 points (90.5% liked)

cross-posted from: https://sopuli.xyz/post/12872542

Does anyone really need a 1,000 Hz gaming display?

top 34 comments
[–] jordanlund@lemmy.world 18 points 5 months ago

Considering HDMI 2.1 and DisplayPort 1.4 can't run that fast... hmmm... no.
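
A rough back-of-the-envelope check bears this out (a sketch using nominal link rates, ignoring blanking intervals and Display Stream Compression, which would change the math):

```python
# Raw bandwidth needed for a 4K, 1000 Hz signal (pixel data only;
# real links also carry blanking and protocol overhead).
width, height = 3840, 2160
bits_per_pixel = 24  # 8-bit RGB, no HDR
refresh_hz = 1000

required_gbps = width * height * bits_per_pixel * refresh_hz / 1e9
print(f"Required: ~{required_gbps:.0f} Gbit/s")  # ~199 Gbit/s

# Nominal maximum link rates, before encoding overhead:
links = {"DisplayPort 1.4": 32.4, "HDMI 2.1": 48.0, "DisplayPort 2.1 UHBR20": 80.0}
for name, gbps in links.items():
    print(f"{name}: {gbps} Gbit/s -> {'enough' if gbps >= required_gbps else 'too slow'}")
```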

[–] jlh@lemmy.jlh.name 13 points 5 months ago (1 children)

1000 Hz seems to be close to the limit of human vision, since we stop seeing motion blur above 1000 Hz. Seems like a good endpoint for display technology.

[–] Tetsuo@jlai.lu 14 points 5 months ago (1 children)

On the other hand, I've heard the same argument so many times for 144 Hz, 165 Hz and then 240 Hz...

Now 1000 Hz is a much higher frequency, but in terms of frame time it's not that far off. I wouldn't be surprised if some people could successfully tell the difference in a "blind" test.

[–] jlh@lemmy.jlh.name 5 points 5 months ago (1 children)

I remember people talking about 1000 Hz being the holy grail for VR headsets, though, so it seems like there's a fair amount of consensus on 1000 Hz being a good limit. Frame time is just the inverse of the refresh rate.
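
To put numbers on that (a quick sketch of the frame-time arithmetic):

```python
# Frame time is the inverse of the refresh rate: t = 1 / Hz.
for hz in (60, 144, 165, 240, 480, 1000):
    print(f"{hz:>4} Hz -> {1000 / hz:6.2f} ms per frame")
# 60 -> 240 Hz shaves ~12.5 ms off the frame time;
# 240 -> 1000 Hz only shaves ~3.2 ms more.
```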

But yeah, I've personally only used 144 Hz. I think I could see a difference with 240 Hz, but I'm not sure if I'd be able to discern 480 or 1000 Hz outside of maybe VR.

[–] oxideseven@lemmy.ca 9 points 5 months ago (2 children)

There isn't really any definitive science pinning down a specific frame rate that the eye can perceive.

There are, however, studies showing ranges from 30 to 90 Hz, and studies showing that human perception can detect flicker at up to 500 Hz.

The issue is that nothing that happens in the real world is synchronized with what you perceive, so filling in with more Hz means there are more chances for you to actually perceive the thing.

To complicate matters further, our brains do a lot of filling in for us, and our eyes and brains can still register things you aren't consciously aware of perceiving. So again, more frames are always nice.

Here are some sources

Canadian Centre for Occupational Health and Safety. (2020). Lighting Ergonomics - Light Flicker.
https://www.ccohs.ca/oshanswers/ergonomics/lighting_flicker.html

Davis J, et al. (2015). Humans perceive flicker artifacts at 500 Hz.
https://doi.org/10.1038/srep07861

Mills M. (2020). How Many Frames per Second (FPS) the Human Eye Can See.
https://itigic.com/how-many-frames-per-second-fps-human-eye-can-see/

[–] jlh@lemmy.jlh.name 1 points 5 months ago* (last edited 5 months ago) (1 children)

Sure, eyes don't have a "global frame refresh" like computers do. That's why we can tell the difference between 24 Hz and 60 Hz video. Every eye cell is excited independently and continuously.

Still, there's a physical limit for frame time where 99% of humans wouldn't notice a full-screen flash 99.9% of the time. Being able to shake your head around with a 1000 Hz VR headset and not perceive any motion blur from sample-and-hold seems pretty close to that limit.

[–] Dultas@lemmy.world 1 points 5 months ago

That's still going to depend on the frame rate of the output source. If the source is only 200 fps, having 1000 Hz isn't going to matter a whole hell of a lot.
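
For what it's worth, the mapping of a lower-fps source onto a 1000 Hz panel is simple arithmetic (my illustration, not the commenter's):

```python
# On a fixed-refresh panel, each rendered frame is simply shown
# for several consecutive refreshes; a freshly finished frame
# waits at most one refresh interval (1 ms at 1000 Hz) to appear.
panel_hz = 1000
for source_fps in (60, 200, 500, 1000):
    print(f"{source_fps:>4} fps -> each frame shown for ~{panel_hz / source_fps:.0f} refreshes")
```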

[–] tal@lemmy.today 0 points 5 months ago* (last edited 5 months ago)

If you're using a game that renders each frame at an instant in time, and the aim is to get a better approximation of true motion blur to your eye, the theoretical ceiling for smoother motion is when the fastest-moving thing on screen shifts by about one pixel per frame, which can require rates far higher than the rate at which we can distinguish individual images. Well, okay, maybe a bit more, since you could hypothetically have sub-pixel resolution.

But the point is, more rendered frames does buy you something even past the point where they're not individually distinguishable, unless the game's rendering engine can render perfectly accurate motion blur itself.
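
To make that ceiling concrete (my numbers, purely illustrative):

```python
# Per-frame step size for an object crossing a 3840-px-wide
# screen in one second, at various refresh rates.
speed_px_per_s = 3840
for hz in (60, 240, 1000, 3840):
    print(f"{hz:>4} Hz -> {speed_px_per_s / hz:6.2f} px per frame")
# Only at ~3840 Hz does the motion advance one pixel per frame,
# the point past which extra sharp frames stop adding smoothness.
```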

[–] adam_y@lemmy.world 11 points 5 months ago

Best gaming of my life was on a SNES coupled to a janky Sanyo portable.

[–] Kolanaki@yiffit.net 10 points 5 months ago

Do I need to drink my own urine? No! But I do it anyway because it's sterile and I enjoy the taste!

[–] Nibodhika@lemmy.world 4 points 5 months ago (1 children)
[–] exscape@kbin.social 5 points 5 months ago (2 children)

Not obvious at all. At high movement speeds, motion blur makes things unreadable even at 540 Hz, which shows there's still plenty of motion blur the human eye can see at that rate.

https://www.youtube.com/watch?v=OV7EMnkTsYA&t=682s

As the video says: "Yes, your eyes really are capable of seeing this in real life"

[–] Nibodhika@lemmy.world 1 points 5 months ago

The title doesn't ask if it's useful, it asks if it's required. Considering that no one NEEDS a display to begin with, a 1000 Hz display by definition cannot be NEEDED.

I will likely get one eventually, just like I have a 165 Hz display now, but do I NEED it? Absolutely not.

[–] autotldr@lemmings.world 3 points 5 months ago

This is the best summary I could come up with:


Now, the high frame rate experts at Blur Busters bring word of a 4K, 1,000 Hz prototype screen being shown off by Chinese panel maker TCL CSOT at the manufacturer-focused DisplayWeek 2024 conference.

And while recent advancements in pixel-flipping times have enabled TCL's LCD prototype, Blur Busters estimates that 1,000 Hz OLED displays could be commercialized as soon as 2027.

The apparent, impending breaking of the four-digit refresh rate threshold got us thinking: Are we finally approaching a point of diminishing returns in monitor-makers' long-running battle of the Hz?

But even faster refresh rates could help with the apparent sharpness of extremely fast objects on Ultra HD displays—think of a mouse pointer or video game crosshair that can move across the roughly 4,000 horizontal pixels on a 4K display in a single second.

Nvidia data shows the RTX 4090 generating 600+ fps in aging-but-still-popular games like Rainbow Six Siege and Fortnite at "High" settings and full 1440p resolution.

Factor in a few more generations of graphics card upgrades—not to mention frame-generation technology that could offer 10 reprojected frames for every key frame—and waiting a single millisecond for your monitor to show a new image might not be totally ridiculous.


The original article contains 628 words, the summary contains 202 words. Saved 68%. I'm a bot and I'm open source!

[–] Okami_No_Rei@lemmy.world 2 points 5 months ago

Nope. I can't even tell the difference between 30 Hz and 60 Hz unless they're actually running side by side.

[–] mindbleach@sh.itjust.works 1 points 5 months ago

1000 Hz is going to be the end. Microsoft's experiments, umpteen years ago, said that's about the limit of human perception. Not "it looks smooth," but "it looks indistinguishable from reality because our wetware can't discern events much finer than that." Mmmaybe you go a little above that, to avoid two-millisecond events, but beyond that there is literally no point. Shutterglasses, I guess. Niche silliness.

More importantly, at a refresh like that, any framerate looks amazing. Same deal as G-Sync / FreeSync: frames appear onscreen the moment they're ready. There's no stutter at 29 fps, or at 239 fps.

[–] fushuan@lemm.ee 1 points 5 months ago* (last edited 5 months ago)

I don't! I barely notice 120 Hz, so I just run at 60; my GPU loves me for it.

Imagine running a 4K 1000 Hz screen and needing 66 times more computing power to render all those pixels than my wee 1080p 60 Hz screen, where I see stuff fucking fine.
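
That 66x figure checks out as raw pixel throughput:

```python
# Pixels per second: 4K @ 1000 Hz vs 1080p @ 60 Hz.
ratio = (3840 * 2160 * 1000) / (1920 * 1080 * 60)
print(f"{ratio:.1f}x")  # ~66.7x
```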

[–] HubertManne@kbin.social -2 points 5 months ago

If you're doing stereo 3D, you could use 120 Hz.

[–] bstix@feddit.dk -3 points 5 months ago (2 children)

I doubt any game logic is ever going to run that fast. It might look better, but it won't make much difference in how players, or even the game itself, can react to events in the game. Perhaps it'll make VR more comfortable or something.

[–] Zehzin@lemmy.world 5 points 5 months ago

Some sort of framegen-style technology could get us there in the future and deliver smooth, CRT-like motion.

[–] DarkThoughts@fedia.io 3 points 5 months ago (2 children)

Game logic runs independently from what your monitor can display, so it's really just a question of what effect it has on the player. Maybe for VR there's an argument to be made, although 1000 Hz still sounds like complete overkill even in that area. But I'm gonna call bullshit on people who claim to be able to tell the difference at such high rates.

[–] tal@lemmy.today 3 points 5 months ago* (last edited 5 months ago)

For rendered stuff, it typically does make for smoother motion, even at rates much higher than the eye can see, because of motion blur.

So, recorded video works fine at relatively low framerates... but the camera is also set up to record a relatively long exposure, something like a thirtieth of a second, and you see the scene averaged over that time. Your brain can see motion blur and interpret it usefully, to know that there is motion happening.

Rendered 3D game images typically do not work like that. You see a series of perfectly-sharp images at instants in time. So your brain doesn't get the nice smooth motion blur to work with.

But if your computer renders and displays the intermediate images, then your eye can work with that nice smooth blur.

It's probably possible to compute motion blur more efficiently than rendering a lot of intermediate frames and get at least some kind of approximation of true motion blur, and some games do that, but brute-force rendering of more frames is simple for a developer and accurate. Plus, any game that can support a high frame rate can do it, even if it doesn't have some kind of faux motion blur approximation.

I have a 165 Hz monitor. When moving my mouse cursor around, I can definitely see independent images of the cursor.

EDIT: That being said, you could probably get a pretty good approximation by rendering and combining multiple frames on the card and only pushing a lower frame rate out to the monitor -- that is, you only really need beefy rendering hardware, not a fancy monitor or cable, to get pretty close. In theory, a compositor could do that. I don't know if someone's already done it or not.
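
A minimal sketch of that idea, assuming nothing about any real compositor (dummy renderer, box-filter average; temporal supersampling, essentially):

```python
import numpy as np

H, W = 90, 160  # tiny frame buffer for illustration

def render_subframe(t):
    """Render a perfectly sharp frame at time t (seconds):
    a white square moving horizontally at 800 px/s."""
    frame = np.zeros((H, W))
    x = int(800 * t) % (W - 10)
    frame[40:50, x:x + 10] = 1.0
    return frame

def blurred_frame(t0, display_hz=240, subframes=4):
    """Average several sharp renders spread across one display
    refresh interval: a box-filtered approximation of motion blur.
    Internally this renders at display_hz * subframes (960 fps here),
    but only one combined frame per refresh goes to the monitor."""
    dt = 1.0 / (display_hz * subframes)
    return sum(render_subframe(t0 + i * dt) for i in range(subframes)) / subframes

out = blurred_frame(t0=0.1)  # the square ends up smeared over a few pixels
```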

[–] burgersc12@mander.xyz 3 points 5 months ago* (last edited 5 months ago) (2 children)

Game logic does not always run independently of the framerate. Look at Fallout 4: if you run it at more than 60 fps, the dialogue literally overlaps itself.

[–] Belgdore@lemm.ee 4 points 5 months ago (1 children)

That's because Bethesda is bad at making games, not because there is an intrinsic need for the game logic to be tied to frame rate.

[–] burgersc12@mander.xyz 1 points 5 months ago

It used to be very common, and even some Switch games today lock the framerate to 30/60 fps; otherwise they run at twice the speed they should.

[–] DarkThoughts@fedia.io 0 points 5 months ago (1 children)

I didn't say framerate, I said what your monitor can display. FPS and Hz are not synonymous.

[–] burgersc12@mander.xyz 1 points 5 months ago (1 children)

Why would you play a 60 fps game on a 1000 Hz screen?

[–] DarkThoughts@fedia.io 0 points 5 months ago (1 children)

You've got it the wrong way around. People play very high-FPS games on (comparatively) lower-Hz monitors. This has been common practice in competitive PvP shooters for decades.

[–] burgersc12@mander.xyz 1 points 5 months ago* (last edited 5 months ago) (1 children)

This is my point. A 1000 Hz screen would, most likely, be driven at as close to 1000 fps as possible. I am not sure why you think I have it the wrong way around when it is you who does.

[–] DarkThoughts@fedia.io 0 points 5 months ago (1 children)

Your point was that game logic doesn't run independently from your framerate, trying to refute my comment saying that game logic runs independently from your monitor. You're clearly severely confused about the topic at hand.

[–] burgersc12@mander.xyz 1 points 5 months ago (1 children)

I have not tried to refute anything. I just gave an example of game logic running slower than the screen, and asked why you wouldn't try to equalize fps and Hz. How am I confused again?

[–] DarkThoughts@fedia.io 0 points 5 months ago (1 children)

You gave an example of game logic being tied to framerate, which, again, is a completely different matter. And why would you ask me why you wouldn't equalize it, when you claim the reason I gave was the point you were making in the first place, even though it's a completely different type of example? You make no sense at all.