this post was submitted on 07 Feb 2024
255 points (96.4% liked)

[–] ocassionallyaduck@lemmy.world 107 points 9 months ago (4 children)

I love that the game is such a CPU-hogging mess that LTT used it to test overclocking a brand-new AMD Threadripper, and the game still ran like garbage even on one of the fastest and most multithreaded CPUs in existence.

I love Cities: Skylines, but whatever is happening in 2 is a three-alarm fire and needs to be fixed.

[–] HolyDuckTurtle@kbin.social 70 points 9 months ago* (last edited 9 months ago) (3 children)

I imagine LTT did that for meme purposes more than anything else. Threadrippers are not built for games. They're built for production workloads which don't translate to gaming performance.

That said, the point still stands: this game needs the most powerful gaming hardware (e.g. a Ryzen X3D-series CPU and an RTX 4090) at "recommended" settings and 1080p just to average above 60 fps, which is wild. There's a rather dedicated fellow on Reddit who does detailed performance tests after each patch.

[–] SaltySalamander@kbin.social 20 points 9 months ago (1 children)

So very fucking glad I haven't bought this game.

[–] Nythos@sh.itjust.works 3 points 9 months ago* (last edited 9 months ago) (2 children)

I bought it for my girlfriend’s birthday and had to go through and refund it because of just how poorly the game ran even with everything set to minimum.

[–] EncryptKeeper@lemmy.world 3 points 9 months ago

I got ok performance out of it on a 1660.

[–] XaraTheImpaled@sh.itjust.works 1 points 9 months ago

Are you on a potato?

My system is 8 years old and it plays this game just fine. Granted, I am not running 4K; I am still on 60 Hz monitors. I also haven't gotten very far into the game, so I haven't experienced any population over 30k.

[–] LordKitsuna@lemmy.world 6 points 9 months ago

They did it because the developers said the game will use however many cores you can give it. And I mean, yeah, it maxed out all cores. Likely doing nothing but struggling to keep them synchronized, but it was using 'em.

[–] yarr@feddit.nl 1 points 9 months ago

I imagine LTT did that for meme purposes more than anything else. Threadrippers are not built for games. They’re built for production workloads which don’t translate to gaming performance.

What are some characteristics of modern, multi-threaded games that don't match up to production workloads as far as the CPU is concerned? What do you consider a production workload? How does it differ from CS2's simulation system?

[–] 0110010001100010@lemmy.world 10 points 9 months ago (1 children)

lol got a link to the video? That sounds hilarious and worth a watch.

[–] ocassionallyaduck@lemmy.world 10 points 9 months ago (2 children)
[–] PipedLinkBot@feddit.rocks 5 points 9 months ago

Here is an alternative Piped link(s):

https://piped.video/R83W2XR3IC8?si=nTUMXFiFGFRcdtQa


[–] 0110010001100010@lemmy.world 3 points 9 months ago

Much appreciated!

[–] LodeMike@lemmy.today 8 points 9 months ago* (last edited 9 months ago)

The game when it saw that CPU:

It seems like we have more power than we know what to do with.

That means we’re not cutting it close enough!

Edit: I don’t remember the exact quote but y’all get it.

[–] EncryptKeeper@lemmy.world 2 points 9 months ago (1 children)

Not sure why LTT or anyone else would have thought that would even help considering simulation games like that rely heavily on single core performance.

[–] ocassionallyaduck@lemmy.world 5 points 9 months ago (1 children)

I mean... Watch the video? It uses 64 fucking cores when available. It's a heavily multithreaded game.

[–] EncryptKeeper@lemmy.world 4 points 9 months ago (1 children)

CS2 uses multiple cores for… something, but it's a Unity game and there's only so much you can do to avoid dependence on a main thread. Your single-core performance is still going to be a limiting factor.

[–] deur@feddit.nl 4 points 9 months ago (1 children)

CS2 uses a design paradigm called Entity Component System (ECS), which allows for aggressive multi-core utilization by splitting game logic into self-contained "systems" that each operate on a subset of "components" per "entity". This allows data dependencies to be statically analyzed and a scheduler to maximize CPU utilization thanks to the better-separated workflows.
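To make that concrete, here's a toy sketch of the ECS idea in Python rather than Unity's C# DOTS (all names here, like `movement_system`, are illustrative and not from CS2's actual code):

```python
from dataclasses import dataclass
from concurrent.futures import ThreadPoolExecutor

# Components are plain data, no behavior.
@dataclass
class Position:
    x: float = 0.0

@dataclass
class Velocity:
    dx: float = 1.0

# An "entity" is just an ID mapping to its components.
entities = {
    eid: {"pos": Position(), "vel": Velocity(dx=float(eid))}
    for eid in range(4)
}

def movement_system(entity):
    # A "system" touches only the (Position, Velocity) subset of
    # components, so independent entities have no shared state and
    # a scheduler can run them on separate cores.
    entity["pos"].x += entity["vel"].dx

# (Python's GIL means this won't truly parallelize CPU-bound work;
# it only illustrates the structure a real ECS scheduler exploits.)
with ThreadPoolExecutor() as pool:
    list(pool.map(movement_system, entities.values()))
```

The point is the data layout: because each system declares exactly which components it reads and writes, the engine can prove which systems are independent and fan them out across cores.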

It uses DOTS from Unity to accomplish this. There is a small bottleneck in communicating this work back to the game's renderer, but it is doing a lot of valuable work with all those cores. The communication with the renderer and their rendering implementation sucks right now, and that's where the performance tanks.

I am very aware that at some level there are fewer multi-core workloads involved, but a CPU core can do a metric shitload of work; it's the RAM and GPU transfers that kill performance. We don't need to blame Unity here; they are fucking this up 100% themselves.

There's a video that explains all this, but I can't find it and that's pretty annoying, so whatever.

[–] arin@lemmy.world 1 points 9 months ago

Wasn't most of the frame latency caused by shaders in graphics? There was a deep-dive video, but I forgot the title and the YouTuber.