this post was submitted on 02 Jan 2024
389 points (98.5% liked)

[–] Nommer@sh.itjust.works 146 points 10 months ago (1 children)

Single threaded performance was the only reason to go Intel.

[–] AdamEatsAss@lemmy.world 24 points 10 months ago (2 children)

Maybe this will push more game developers to develop games that use multiple cores? I know nothing about game development.

[–] anlumo@lemmy.world 49 points 10 months ago

That has been happening for the last decade, but it’s really hard.

[–] drfuzzyness@lemmy.world 30 points 10 months ago (1 children)

Most AAA game studios target consoles first. Their in-house or external porting teams will then adapt the game for Windows, but by then the major engine decisions will likely have already been made in service of the Ryzen/RDNA-based Xbox Series and PS5 consoles. Smaller studios might try to target all systems at once by aiming for the lowest common denominator (Vulkan, low hardware requirements). The Switch is a bit of its own beast when you're trying to get high-performance graphics.

Multithreading is mostly used for graphics, sound, and animation tasks, while game logic and scripting are almost always single threaded.
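
As a rough sketch of that split (every name here is made up, not from any real engine): self-contained subsystems like audio and rendering can live on their own threads, while game logic stays serial because scripts share mutable world state.

```cpp
#include <atomic>
#include <chrono>
#include <thread>

// Hypothetical stand-ins for real engine subsystems.
void mix_audio()         { std::this_thread::sleep_for(std::chrono::milliseconds(1)); }
void submit_draw_calls() { std::this_thread::sleep_for(std::chrono::milliseconds(1)); }
void step_game_logic()   { std::this_thread::sleep_for(std::chrono::milliseconds(1)); }

std::atomic<bool> running{true};

int main() {
    // Self-contained subsystems (audio, rendering) get dedicated threads.
    std::thread audio([] { while (running) mix_audio(); });
    std::thread render([] { while (running) submit_draw_calls(); });

    // Game logic/scripting stays on one thread: scripts read and write
    // shared world state, so running them in parallel is the hard part.
    for (int frame = 0; frame < 600; ++frame)
        step_game_logic();

    running = false;
    audio.join();
    render.join();
}
```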

[–] deleted@lemmy.world 10 points 10 months ago (2 children)

I bought a Ryzen 3950X: 16 cores, 32 threads.

The first thing I noticed is that some AAA games only utilize 8 cores. When you go multithreaded, it's a matter of adding more threads, which can be selected dynamically based on the host hardware. AAA game studios are going the bad-practice route.

I'd understand if they ported an algorithm optimized to run on specific hardware as-is. But a thread count?
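
For reference, picking the thread count from the host instead of hard-coding it is only a few lines; a minimal sketch using std::thread::hardware_concurrency (the work function is hypothetical):

```cpp
#include <algorithm>
#include <cstdio>
#include <thread>
#include <vector>

// Hypothetical per-thread workload.
void do_chunk_of_work(unsigned worker_id) { std::printf("worker %u\n", worker_id); }

int main() {
    // Ask the host how many hardware threads it has (0 means "unknown").
    unsigned n = std::thread::hardware_concurrency();
    n = std::max(n, 1u);  // fall back to a single worker if unknown

    std::vector<std::thread> pool;
    for (unsigned i = 0; i < n; ++i)
        pool.emplace_back(do_chunk_of_work, i);
    for (auto& t : pool) t.join();
}
```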

[–] Amir@lemmy.ml 6 points 10 months ago (2 children)

it’s a matter of adding more threads

You can't ask 300 people to build a chair and expect the chair to be finished 300x faster than if a single person had built it.
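
The limit behind the chair analogy is Amdahl's law (my gloss, not the commenter's): if only a fraction p of the job can be split among n workers, the best possible speedup is

```latex
\[
  S(n) = \frac{1}{(1 - p) + p/n},
  \qquad
  \lim_{n \to \infty} S(n) = \frac{1}{1 - p}
\]
% Example: with p = 0.9, even 300 workers give
% S(300) = 1/(0.1 + 0.9/300) \approx 9.7x, nowhere near 300x.
```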

[–] Buddahriffic@lemmy.world 5 points 10 months ago (1 children)

Also, to make it more accurate to what multithreading actually does: none of those 300 people can see what the others are doing. And the most reliable ways of sending messages to each other involve taking a nap (it might be brief, but you might wake up in an entirely different body and need to fetch your working memory from your old body, or worse, from RAM).

Or you can repeatedly write your message until you can be sure that no one else wrote over it since you started writing. And the more threads you have, the more likely another one wrote over your message, to the point where all the threads spend all of their time trying to coordinate and no time working.
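
That second, rewrite-until-nobody-clobbered-you strategy is a compare-and-swap retry loop; a minimal sketch with std::atomic (the shared counter is just a stand-in for the "message"):

```cpp
#include <atomic>
#include <cstdio>
#include <thread>
#include <vector>

std::atomic<int> shared{0};

void add_one() {
    int seen = shared.load();
    // Retry until no other thread has written since we read `seen`.
    // Under heavy contention, most of each thread's time goes into
    // these retries, which is the coordination cost described above.
    while (!shared.compare_exchange_weak(seen, seen + 1)) {
        // compare_exchange_weak reloads `seen` on failure; just loop.
    }
}

int main() {
    std::vector<std::thread> threads;
    for (int i = 0; i < 8; ++i)
        threads.emplace_back([] { for (int j = 0; j < 100000; ++j) add_one(); });
    for (auto& t : threads) t.join();
    std::printf("%d\n", shared.load());  // prints 800000
}
```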

[–] deleted@lemmy.world 2 points 10 months ago (1 children)
[–] Buddahriffic@lemmy.world 1 points 10 months ago

I'm not familiar with their implementation, but they'll likely have one of those mechanisms under the hood.

You can only avoid them in very simple cases, and those mostly don't scale up to a large number of threads. The one exception that does scale well is a large amount of data where each piece can be processed independently of the rest and the results are independent too. 3D rendering is one example, though some effects can create dependencies.
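
That independent-data case is exactly what C++17's parallel algorithms target; a minimal sketch (the per-element operation is just a placeholder, and some toolchains need TBB linked for std::execution):

```cpp
#include <algorithm>
#include <cmath>
#include <execution>
#include <vector>

int main() {
    std::vector<float> pixels(1 << 20, 0.5f);

    // Each element is read and written independently of all the others,
    // so the library is free to spread the work across every core.
    std::transform(std::execution::par_unseq,
                   pixels.begin(), pixels.end(), pixels.begin(),
                   [](float v) { return std::sqrt(v); });  // placeholder shading step
}
```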

[–] deleted@lemmy.world 3 points 10 months ago

So 8 cores is doable, but 16 isn't?

[–] Grass@sh.itjust.works 109 points 10 months ago (3 children)

I wish all the computer-parts companies would only release new products when they're definitively better, rather than shipping on a schedule no matter what. I don't want to buy this year's 1080p gaming CPU and GPU combo for more than I spent on the last one with the same capabilities; I want the next series of the same part to be capable of more, damn it.

[–] sugartits@lemmy.world 26 points 10 months ago (1 children)

Inflation has entered the chat

[–] Grass@sh.itjust.works 19 points 10 months ago (2 children)

Every *flation seems to exist solely to make me sad and miserable...

[–] Ultragramps@lemmy.blahaj.zone 11 points 10 months ago (1 children)

Lifeboat/life jacket inflation is pretty much always good. Airbags can cause harm if they go off early.
Then for deflation, a person's ego can be deflated for good reasons, maybe.

[–] TimeSquirrel@kbin.social 7 points 10 months ago* (last edited 10 months ago)

Unless you inflated it while still onboard the sinking aircraft.

[–] sugartits@lemmy.world 5 points 10 months ago

That's what happens when some in society are able to "print" as much money as they damn well please and the rest of us have to work for it ...

[–] downhomechunk@midwest.social 15 points 10 months ago

Think of the quarterly profits, won't someone please think of the shareholders?!?

/s

[–] simple@lemm.ee 41 points 10 months ago (1 children)

The article suggests the results are probably down to Intel's focus on AI, but it's more likely because Intel focused on making these chips use less power: laptops with the new generation get significantly better battery life.

[–] EddyBot@lemmy.world 11 points 10 months ago

Wasn't Intel the one that raised the bar on laptop CPU TDPs in the first place, just so they could win CPU benchmarks?

[–] Octagon9561@lemmy.ml 41 points 10 months ago (10 children)

How's the performance per watt?

Oh wait. Nevermind, Intel sucks anyway. If it's not performance issues, it's hardware exploits. Not to mention Intel's support for genocide in Gaza.

[–] NounsAndWords@lemmy.world 32 points 10 months ago (1 children)

On a technical level, it's hard to say why Meteor Lake has regressed in this test, but the CPU's performance characteristics elsewhere imply that Intel simply might not have cared as much about IPC. Meteor Lake is primarily designed to excel in AI applications and comes with the company's most powerful integrated graphics yet. It also features Foveros technology and multiple tiles manufactured on different processes. So while Intel doesn't beat AMD or Apple with Meteor Lake in IPC measurements, there's a lot more going on under the hood.

[–] sugartits@lemmy.world 18 points 10 months ago

comes with the company's most powerful integrated graphics yet.

Not a particularly high bar there...

[–] jlh@lemmy.jlh.name 20 points 10 months ago

I wonder if these have increased RAM latency due to the chiplet design. These are the first mobile chiplets I've seen, aside from desktop replacements using AM4/AM5 Ryzens.

Hopefully AnandTech will take a more detailed look whenever they get their hands on a sample.

[–] PanArab@lemmy.world 15 points 10 months ago (2 children)

Intel is making the transition to ARM (and eventually RISC-V) inevitable.

[–] hglman@lemmy.ml 8 points 10 months ago

Legacy compatibility has always had a cost; I guess it's finally showing up in a meaningful way.

[–] monkeyman512@lemmy.world 6 points 10 months ago

That's silly. But I'm pretty sure AMD is pretty happy with the situation.

[–] tedu@azorius.net 12 points 10 months ago (2 children)
[–] Municipal0379@lemmy.world 8 points 10 months ago

Only a TJ’s worth.

[–] swayevenly@lemm.ee 5 points 10 months ago

https://www.notebookcheck.net/Intel-Core-Ultra-7-155H-Processor-Benchmarks-and-Specs.783323.0.html

It has various tests and results. Looks like the TDP is 23 watts and the range during testing is 30-77 watts, with one reading at 90, but since that one was taken at idle, I don't know what to make of it.

[–] Haha@lemmy.world 5 points 10 months ago

Say it with me: For the shareholders!
