this post was submitted on 21 Dec 2023
58 points (79.0% liked)

Technology

all 31 comments
[–] AceSLS@ani.social 56 points 10 months ago* (last edited 10 months ago) (4 children)

This comment sums it up pretty nicely:

LOL innovative invention of swapping memory to storage…… maybe they can call it something cool like “cache”.

Apple being "innovative" my ass, lmao

[–] cheese_greater@lemmy.world 17 points 10 months ago (1 children)
[–] jcg@halubilo.social 2 points 10 months ago

How bow dat?

[–] lemmylommy@lemmy.world 12 points 10 months ago

Well, if that commenter had more than just a vague idea of caching and/or swapping, they would know that the right algorithm can make or break performance.

That paper is not “we invented caching”, but “this is how we make certain models work well despite the constraints imposed by RAM and flash storage.”

It’s a worthy job for an engineer or researcher. Not quite as innovative as the invention of the wheel, but still enough to write a paper on (and read it, if you can manage to understand it).
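As a purely illustrative sketch (not the paper's actual method; the class and file layout here are made up), this is roughly what “caching model weights from flash” means in practice, with the eviction policy being the part that actually decides performance:

```python
from collections import OrderedDict

import numpy as np


class FlashWeightCache:
    """Keep at most `capacity` layers in RAM; evict the least-recently-used ones."""

    def __init__(self, weight_files, capacity=4):
        self.weight_files = weight_files   # layer name -> path of a .npy file on flash
        self.capacity = capacity
        self.cache = OrderedDict()         # layer name -> ndarray currently held in RAM

    def get(self, layer):
        if layer in self.cache:
            self.cache.move_to_end(layer)  # mark as most recently used
            return self.cache[layer]
        weights = np.load(self.weight_files[layer])  # slow path: read from flash
        self.cache[layer] = weights
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)  # evict the least-recently-used layer
        return weights
```

Whether something like this is fast enough depends entirely on the access pattern and how much has to be read back from flash per token, which is exactly the point above: the policy, not the idea of a cache, is the hard part.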

[–] Hegar@kbin.social 10 points 10 months ago (1 children)

The easiest way to tell that something's not really innovative is if the person describing it uses the word innovative.

[–] HeartyBeast@kbin.social 4 points 10 months ago (2 children)

Can you give an example of something that actually was innovative, that no-one called innovative?

[–] Hegar@kbin.social 4 points 10 months ago (1 children)
[–] HeartyBeast@kbin.social 0 points 10 months ago (1 children)

Interestingly, it looks as if nothing was really called innovative before 1960, with usage peaking in 2000, and it's now in decline.

https://www.etymonline.com/word/innovative

[–] Hegar@kbin.social 1 points 10 months ago

Peaking in 2000 seems odd - I see or hear that word daily and that definitely wasn't the case back in 2000. Interesting!

[–] GluWu@lemm.ee 0 points 10 months ago
[–] 4am@lemm.ee -2 points 10 months ago

35 upvotes in the technology community… man, you guys really are just knee-jerk reactionaries and not really knowledgeable about tech at all. git gud

[–] guitarsarereal@sh.itjust.works 26 points 10 months ago* (last edited 10 months ago) (1 children)

Everyone likes to trash machine learning because the power requirements are high, but what they don't realize is that we're in the very first days of this technology (well, the first couple of decades of it existing, and the first few years of it being advanced enough to have anything to show off). Every technology that got bundled into your phone was just as useless when it was first invented. Honestly, compared to the development of most other technologies I've looked at, the pace of development in AI has been shocking.

Literally once a week, I see some news story about AI researchers delivering an order-of-magnitude speedup in some aspect of AI inference. The technique described here apparently allows for a 20x speedup on GPUs.

[–] cybersandwich@lemmy.world 10 points 10 months ago

Whisper.cpp runs on the ML cores of the M-series chips. It's faster than the 1080 Ti I have in a server doing the same thing, by orders of magnitude. And it sips power.

Purpose built chips can be super powerful for their specific purposes.

[–] cheese_greater@lemmy.world 15 points 10 months ago (2 children)
[–] LazaroFilm@lemmy.world 19 points 10 months ago (1 children)

Huhum..?

Still working on that…

I’m sorry, try again later.

[–] cheese_greater@lemmy.world 3 points 10 months ago (1 children)
[–] coolmojo@lemmy.world 6 points 10 months ago (1 children)

Found the following websites about “you're triggering me” lol.

[–] cheese_greater@lemmy.world 4 points 10 months ago

Sigh! [unzips] Go ahead...

[–] fruitycoder@sh.itjust.works 3 points 10 months ago

Tbh I'm more excited to see someone use WebNN, WebGPU and Petals together. Building smaller, tighter models is good too.

[–] wizzor@sopuli.xyz 1 points 10 months ago

I don't understand the innovation; I already run LLMs and Stable Diffusion on a laptop from 2011.

I have no doubt it could be run on my Android phone.
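For what it's worth, a minimal sketch of that kind of setup, assuming the llama-cpp-python bindings and a small quantized model (the model path is a placeholder):

```python
from llama_cpp import Llama  # assumes the llama-cpp-python package is installed

# Placeholder path: any small quantized GGUF model will do for CPU-only inference.
llm = Llama(model_path="models/tiny-model.Q4_K_M.gguf", n_ctx=512)

out = llm("Explain what a cache is in one sentence.", max_tokens=64)
print(out["choices"][0]["text"])
```

With a small enough quantized model, CPU-only inference is what makes hardware that old (or a phone) workable.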