[–] _core@sh.itjust.works 13 points 23 hours ago* (last edited 23 hours ago) (2 children)

Ray tracing is cool, but the screenshots in the video just destroy the atmosphere by making everything daylight-bright.

[–] MoonMelon@lemmy.ml 2 points 14 hours ago

Yeah it's technically better (crisper, higher res, etc) but the visual language is totally different and IMO worse. That fire just doesn't look as hot, for example.

Lighting as an art form is tightly coupled to the underlying tech. Something as small as changing the shadow method can require artists to relight a whole game. My guess is they aren't doing this, or maybe they're pushing the stark lighting the same way depth of field, colored lights, lens flares, etc. got juiced in previous generations.

That's always wrong. The tech should serve the art, not the other way around.

[–] WolfLink@sh.itjust.works 4 points 17 hours ago (1 children)

Agreed. In the video a lot of the “RTX Off” scenes look better IMO.

[–] Tim_Bisley@piefed.social 1 points 10 hours ago

I feel this way about a lot of RTX implementations.

[–] grue@lemmy.world 8 points 1 day ago (2 children)

So this might be a stupid question considering how tied to Nvidia it seems to be, but... will it run on the new AMD 90-series cards that can actually do ray tracing?

[–] Trollception@sh.itjust.works 2 points 17 hours ago

Sure it will. It will probably run like a 2080 Ti or 3070, though. AMD's ray tracing performance isn't stellar. At least they got their shit together and shipped an AI upscaler that competes with DLSS.

[–] kayzeekayzee@lemmy.blahaj.zone 1 points 1 day ago* (last edited 19 hours ago) (1 children)

Probably not, no

Edit: Actually maybe. Read my comment below

[–] grue@lemmy.world 1 points 22 hours ago (1 children)

So I've been reading up on it a little bit, and it seems like ray tracing is part of Vulkan and DirectX 12, so the APIs shouldn't be proprietary the way CUDA and PhysX are... right?

[–] kayzeekayzee@lemmy.blahaj.zone 1 points 19 hours ago (2 children)

In theory, yeah, it should. But this is developed by Nvidia with their own graphics cards in mind. If this is similar to Portal RTX, then RDNA3 cards will be able to run it, just with severely crippled performance.
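
At the API level it really is vendor-neutral: hardware ray tracing in Vulkan lives in the standard VK_KHR_acceleration_structure and VK_KHR_ray_tracing_pipeline extensions, which both AMD's and Nvidia's drivers advertise. Here's a rough sketch of the kind of capability check an engine could do (instance and device setup elided; nothing here is specific to this mod):

```cpp
// Rough sketch: ask a Vulkan physical device whether it exposes the standard
// (vendor-neutral) ray tracing extensions. Instance/device selection elided.
#include <vulkan/vulkan.h>
#include <cstring>
#include <vector>

bool deviceSupportsRayTracing(VkPhysicalDevice gpu) {
    uint32_t count = 0;
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, nullptr);
    std::vector<VkExtensionProperties> exts(count);
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, exts.data());

    bool hasAccelStruct = false;  // VK_KHR_acceleration_structure (BVH build/update)
    bool hasRtPipeline  = false;  // VK_KHR_ray_tracing_pipeline (ray gen/hit/miss shaders)
    for (const VkExtensionProperties& e : exts) {
        if (std::strcmp(e.extensionName, VK_KHR_ACCELERATION_STRUCTURE_EXTENSION_NAME) == 0)
            hasAccelStruct = true;
        if (std::strcmp(e.extensionName, VK_KHR_RAY_TRACING_PIPELINE_EXTENSION_NAME) == 0)
            hasRtPipeline = true;
    }
    // RDNA2+ and RTX cards both report these; what differs is how fast they trace.
    return hasAccelStruct && hasRtPipeline;
}
```

The DirectX 12 equivalent is querying D3D12_FEATURE_D3D12_OPTIONS5 for the device's raytracing tier. Either way the query isn't vendor-specific; the difference on AMD is mostly how fast the hardware actually traverses the acceleration structures.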

[–] Trollception@sh.itjust.works 1 points 17 hours ago

Yeah, but isn't AMD's ray tracing performance severely crippled in general? I thought if you want to use ray tracing you're far better off buying from team green.

[–] grue@lemmy.world 1 points 17 hours ago (1 children)

I was under the impression that RDNA3 was still pretty bad at raytracing in general. What I'm really wondering though, having just bought a 9070 XT, is what about RDNA4?

[–] BeardedGingerWonder@feddit.uk 1 points 16 hours ago (1 children)

Out of curiosity, what are you upgrading from? I've a 3070 and I'm wondering if now's the time to jump back to team red.

[–] grue@lemmy.world 1 points 14 hours ago (1 children)

A Vega 56. If you think your 3070 might be obsolete we have vastly different perspectives, LOL!

[–] BeardedGingerWonder@feddit.uk 1 points 11 hours ago (1 children)

I don't think it's obsolete by any means, but it is driving an ultrawide monitor and the RAM upgrade might be nice longer term. This gen is slightly more reasonably priced for AMD, and Nvidia driver upgrades are painful on Linux, so I wanted to know whether there's a significant uplift, and how well it handles RT in comparison, to justify a switch for me.

[–] grue@lemmy.world 1 points 9 hours ago (1 children)

Well, all I know is that it's a significant uplift over a Vega. I'm pretty sure it's supposed to be faster than a 3070, but I'm not sure if it's enough faster to be worth it. I guess if it's on the bubble, AMD playing nicer with Linux might push it over the edge. You should read/watch reviews or something; you'll be able to get much better advice from e.g. Gamers Nexus than from me.

One thing I will say is that, in my experience, standing in line at Microcenter on launch day is the only way to have a decent chance at getting a video card at MSRP these days. Since you missed that for the 9070 XT, maybe wait until whatever AMD makes next (9070 XTX? 9060? 9080??) comes out and try to get one of those on launch day.

You should definitely not be desperate enough to pay higher-than-MSRP prices just to get more RAM and drivers that are easier to deal with.

[–] BeardedGingerWonder@feddit.uk 1 points 59 minutes ago

You make a good point, I'll watch reviews, though sometimes it's nice to get an average user's opinion too! I'm probably at least 6 months away from buying anything unless I see a complete bargain. Not being sure I should get it should probably be an indicator that I'm okay for now! Thanks.

[–] mossy_@lemmy.world 2 points 20 hours ago
[–] AMillionMonkeys@lemmy.world 3 points 23 hours ago (1 children)

So who's actually developing it? If it was Valve they would have said...
I already own HL2, but presumably I would have to buy this anew.

[–] Murvel@lemm.ee 6 points 23 hours ago

It's a community project with support from Valve and Nvidia

[–] hisao@ani.social 3 points 1 day ago (2 children)

This looks great. I wonder if it's possible for all this DLSS and similar tech to avoid introducing severe input lag. No matter how good it looks, I wouldn't play a shooter with input lag.

[–] Trollception@sh.itjust.works 2 points 17 hours ago

There shouldn't be any meaningful additional input lag. DLSS and frame gen add some, but to most people it's not noticeable since it's on the order of 10 ms.

Well, as long as you don't turn on frame gen, the input lag shouldn't be much higher than normal. There's also Nvidia Reflex, which can shave a few milliseconds off input lag.
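
Rough back-of-the-envelope numbers on why frame gen is the main latency add (the figures below are assumptions for illustration, not measurements from this mod): an interpolated frame can only be shown after the next real frame has been rendered, so you pay roughly one extra real frame plus the cost of the interpolation pass.

```cpp
// Toy estimate of the input lag added by frame interpolation ("frame gen").
// All numbers are assumptions for illustration, not measurements.
#include <cstdio>

// A generated frame can only be displayed once the *next* real frame exists,
// so latency grows by roughly one real frame time plus the interpolation pass.
double addedLatencyMs(double realFps, double interpCostMs) {
    return 1000.0 / realFps + interpCostMs;
}

int main() {
    const double interpCostMs = 2.0;                   // assumed cost of the interpolation pass
    const double realFpsCases[] = {120.0, 60.0, 30.0}; // assumed pre-frame-gen frame rates
    for (double fps : realFpsCases) {
        std::printf("%5.0f real fps -> roughly +%.1f ms of input lag\n",
                    fps, addedLatencyMs(fps, interpCostMs));
    }
    return 0;
}
```

At a high real frame rate that works out to roughly the 10 ms mentioned above, and it gets noticeably worse as the real frame rate drops. Reflex attacks a different part of the pipeline (the CPU-side render queue), which is why it helps whether or not frame gen is on.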

[–] Trollception@sh.itjust.works -1 points 17 hours ago

Waaahhh but I hate Ray TraCing.

If only this could release on consoles.