This post was submitted on 13 Feb 2024
231 points (97.5% liked)

Technology

top 13 comments
[–] muntedcrocodile@lemmy.world 57 points 9 months ago (1 children)

Nvidia is not gonna be happy lol. They developed CUDA to corner the market and build a monopoly around their own framework. Get fucked, Nvidia lol.

[–] LoremIpsumGenerator@lemmy.world 34 points 9 months ago (1 children)

Legend says, Linus' middle finger still stands today.

[–] prex@aussie.zone 46 points 9 months ago* (last edited 9 months ago) (1 children)
[–] KpntAutismus@lemmy.world 9 points 9 months ago
[–] Just_Pizza_Crust@lemmy.world 52 points 9 months ago* (last edited 9 months ago)

Quote from the GitHub FAQ:

Why is this project suddenly back after 3 years? What happened to Intel GPU support?

In 2021 I was contacted by Intel about the development of ZLUDA. I was an Intel employee at the time. While we were building a case for ZLUDA internally, I was asked for far-reaching discretion: not to advertise the fact that Intel was evaluating ZLUDA and definitely not to make any commits to the public ZLUDA repo. After some deliberation, Intel decided that there is no business case for running CUDA applications on Intel GPUs.

Shortly thereafter I got in contact with AMD, and in early 2022 I left Intel and signed a ZLUDA development contract with AMD. Once again I was asked for far-reaching discretion: not to advertise the fact that AMD is evaluating ZLUDA and definitely not to make any commits to the public ZLUDA repo. After two years of development and some deliberation, AMD decided that there is no business case for running CUDA applications on AMD GPUs. One of the terms of my contract with AMD was that if AMD did not find it fit for further development, I could release it. Which brings us to today.

[–] Plopp@lemmy.world 16 points 9 months ago (1 children)

What I'm wondering is, when and how will this benefit an end consumer like me?

[–] bamboo@lemm.ee 15 points 9 months ago (1 children)

If you use CUDA, you can now use AMD GPUs instead of only being able to use Nvidia. Otherwise, it doesn’t mean much.

[–] Plopp@lemmy.world 5 points 9 months ago

Yes I understand that part. I'm just wondering how. Will AMD release drivers? Third party drivers? No drivers, only software? What about software that just looks for Nvidia GPUs to use CUDA? Etc etc.
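
To make the two comments above concrete: what ZLUDA targets is plain, unmodified CUDA. Below is a minimal sketch of such a program; nothing in it is ZLUDA-specific, and the names in it are purely illustrative. The idea, as I understand it from the project's README, is that ZLUDA provides replacement CUDA driver/runtime libraries on top of a ROCm install, so an already-compiled program like this one gets pointed at ZLUDA's libraries instead of NVIDIA's rather than being rebuilt; no new kernel driver from AMD is involved.

```cuda
// A minimal, self-contained CUDA program: the kind of unmodified code
// ZLUDA aims to run on Radeon hardware by standing in for the CUDA libraries.
// All names here (add, ha, da, ...) are illustrative.
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Element-wise vector addition kernel.
__global__ void add(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    // Host buffers.
    float* ha = (float*)malloc(bytes);
    float* hb = (float*)malloc(bytes);
    float* hc = (float*)malloc(bytes);
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // Device buffers and copies, all through the standard CUDA runtime API.
    float *da, *db, *dc;
    cudaMalloc(&da, bytes);
    cudaMalloc(&db, bytes);
    cudaMalloc(&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    add<<<(n + 255) / 256, 256>>>(da, db, dc, n);
    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);

    printf("c[0] = %.1f (expected 3.0)\n", hc[0]);

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}
```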

[–] Thcdenton@lemmy.world 6 points 9 months ago

Fuck yeah. Was planning for my next upgrade to be AMD.

[–] autotldr@lemmings.world 5 points 9 months ago

This is the best summary I could come up with:


While there have been efforts by AMD over the years to make it easier to port codebases targeting NVIDIA's CUDA API to run atop HIP/ROCm, it still requires work on the part of developers.

The tooling has improved, such as with HIPIFY to help auto-generate HIP code, but it isn't a simple, instant, and guaranteed solution -- especially if striving for optimal performance.

In practice, for many real-world workloads, ZLUDA is a solution for end-users to run CUDA-enabled software without any developer intervention.

Here is more information on this "skunkworks" project that is now available as open-source along with some of my own testing and performance benchmarks of this CUDA implementation built for Radeon GPUs.

For reasons unknown to me, AMD decided this year to discontinue funding the effort and not release it as any software product.

Andrzej Janik reached out and provided access to the new ZLUDA implementation for AMD ROCm to allow me to test it out and benchmark it in advance of today's planned public announcement.


The original article contains 617 words, the summary contains 167 words. Saved 73%. I'm a bot and I'm open source!
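
To illustrate the porting work the summary refers to: HIPIFY-style conversion is largely a mechanical renaming of CUDA runtime calls to their HIP counterparts, which helps, but the developer still owns the build changes, testing, and performance tuning. The sketch below is illustrative only, not actual hipify-clang/hipify-perl output; the comments show the HIP name each call maps to. ZLUDA's pitch is that none of this renaming happens at all and the original CUDA binary runs as-is.

```cuda
// Sketch of what HIPIFY-style porting amounts to: mostly mechanical renames.
// The comments show the HIP counterpart each CUDA call maps to; this is
// illustrative, not the literal output of the HIPIFY tools.
#include <cstdio>
#include <cuda_runtime.h>          // -> #include <hip/hip_runtime.h>

__global__ void scale(float* data, float factor, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;   // built-ins keep their names in HIP
    if (i < n) data[i] *= factor;
}

int main() {
    const int n = 1024;
    float host[1024];
    for (int i = 0; i < n; ++i) host[i] = 1.0f;

    float* dev = nullptr;
    cudaMalloc(&dev, n * sizeof(float));                              // -> hipMalloc
    cudaMemcpy(dev, host, n * sizeof(float), cudaMemcpyHostToDevice); // -> hipMemcpy
    scale<<<(n + 255) / 256, 256>>>(dev, 2.0f, n);                    // launch syntax is the same in HIP
    cudaMemcpy(host, dev, n * sizeof(float), cudaMemcpyDeviceToHost); // -> hipMemcpy
    cudaFree(dev);                                                    // -> hipFree

    printf("host[0] = %.1f (expected 2.0)\n", host[0]);
    return 0;
}
```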

[–] michael_palmer@lemmy.sdf.org 2 points 9 months ago
[–] rickyrigatoni@lemm.ee 0 points 9 months ago (1 children)

Is gfx1010 still unsupported by ROCm?

[–] wewbull@feddit.uk 2 points 9 months ago

In the context of the documentation, "supported" means AMD will dedicate engineering resources to solving your issues, and that only applies to Instinct products.

"Unsupported" doesn't mean "it doesn't run". It means your bug reports are at the same level as every other commoner and could be ignored.

Does ROCm run on gfx1010 (5700 XT)? I believe so. Will you get any traction on bug reports? Probably not.
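
A practical way to check what a given setup actually exposes, whether that's native ROCm/HIP or CUDA-on-Radeon through ZLUDA, is a simple device query. The sketch below uses only the standard CUDA runtime API, which is exactly the interface ZLUDA is supposed to provide; what it reports for a Radeon card depends on ZLUDA's emulation layer.

```cuda
// Minimal device query: prints whatever GPU(s) the CUDA runtime reports.
// On a Radeon card running through ZLUDA, the name and properties shown
// are whatever ZLUDA chooses to report, not a real NVIDIA device.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    cudaError_t err = cudaGetDeviceCount(&count);
    if (err != cudaSuccess) {
        printf("cudaGetDeviceCount failed: %s\n", cudaGetErrorString(err));
        return 1;
    }
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        printf("Device %d: %s, %zu MB, compute %d.%d\n",
               i, prop.name, prop.totalGlobalMem >> 20, prop.major, prop.minor);
    }
    return 0;
}
```

As for native ROCm on officially unsupported consumer cards: many gfx1010 (5700 XT) owners report working around the support matrix by setting the HSA_OVERRIDE_GFX_VERSION environment variable to masquerade as a supported RDNA2 part. That is a community workaround, not something AMD guarantees, and results vary by workload.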