this post was submitted on 02 Apr 2024
52 points (90.6% liked)

Asklemmy


For anyone who knows.

Basically, it seems to me like the technology in mobile GPUs is more impressive than in desktop/laptop GPUs. Desktop GPUs can obviously do more graphically, but not by enough to justify being 100x bigger than a mobile GPU. And top-end mobile GPUs actually perform quite admirably in both graphics and power use.

So, considering that, why are desktop GPUs so huge and power hungry in comparison to mobile GPUs?

[–] d3Xt3r@lemmy.nz -1 points 7 months ago (1 children)

Also, AMD APUs use your main RAM, and some systems even let you change the allocation - so you could allocate, say, 16GB as VRAM if you've got 32GB of RAM. There are also tools you can run to change the allocation, in case your BIOS doesn't have the option.

This means you can run even LLMs that require a large amount of VRAM, which is crazy if you think about it.
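To get a feel for why a big shared-RAM allocation matters, here's a back-of-envelope sketch of how much memory an LLM's weights need. The formula (parameter count × bytes per parameter) is standard, but the model sizes and precisions below are illustrative assumptions, and real usage adds overhead for the KV cache and activations:

```python
# Rough VRAM estimate for holding an LLM's weights in memory.
# Ignores KV cache / activation overhead, so real needs are somewhat higher.

def model_vram_gb(n_params_billion: float, bytes_per_param: float) -> float:
    """Approximate gigabytes needed just for the weights."""
    return n_params_billion * 1e9 * bytes_per_param / 1e9

for params, label in [(7, "7B"), (13, "13B"), (70, "70B")]:
    fp16 = model_vram_gb(params, 2.0)   # 16-bit weights
    q4 = model_vram_gb(params, 0.5)     # 4-bit quantized weights
    print(f"{label}: ~{fp16:.0f} GB at fp16, ~{q4:.1f} GB at 4-bit")
```

So even a mid-sized model at fp16 overflows an 8GB card, but fits comfortably once you can carve 16GB of VRAM out of system RAM.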

[–] Blaster_M@lemmy.world 1 points 7 months ago* (last edited 7 months ago)

Problem is, system RAM doesn't have anywhere near the bandwidth that dedicated VRAM does. You can run an AI model, but performance will be roughly 10x worse due to the bandwidth limits.
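The 10x figure follows from simple arithmetic: generating each token requires streaming roughly the whole model through the memory bus once, so token rate is capped at bandwidth ÷ model size. The bandwidth numbers below are ballpark assumptions (dual-channel DDR5 vs. a high-end GDDR6X card), not measurements:

```python
# Upper bound on token generation rate for a memory-bandwidth-limited LLM:
# each token reads (roughly) all weights once, so
#   tokens/sec <= memory_bandwidth / model_size.
# Bandwidth figures are rough assumptions for illustration.

def max_tokens_per_sec(model_gb: float, bandwidth_gb_s: float) -> float:
    return bandwidth_gb_s / model_gb

MODEL_GB = 14.0  # e.g. a 7B model in fp16

for name, bw in [("dual-channel DDR5 (system RAM)", 80),
                 ("GDDR6X (dedicated VRAM)", 1000)]:
    print(f"{name}: ~{max_tokens_per_sec(MODEL_GB, bw):.0f} tokens/s ceiling")
```

With those assumed numbers the ceiling differs by about 12x, which lines up with the "roughly 10x worse" observation.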