this post was submitted on 14 Apr 2024
148 points (94.0% liked)

ChatGPT

Unofficial ChatGPT community to discuss anything ChatGPT

[–] pennomi@lemmy.world 1 points 6 months ago (1 children)

What we haven’t hit yet is the point of diminishing returns for model efficiency. Small, locally run models are still progressing rapidly, which means we’re going to see improvements for the everyday person instead of just for corporations with huge GPU clusters.

That in turn allows more scientists with lower budgets to experiment on LLMs, increasing the chances of the next major innovation.
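
For a concrete sense of what "small, locally run" means here, below is a minimal sketch of loading a small open-weight model on an ordinary consumer machine with the Hugging Face transformers library. The library choice and the TinyLlama model name are illustrative assumptions, not anything named in the thread.

```python
# Minimal sketch (assumed setup): run a small open-weight chat model locally.
# "TinyLlama/TinyLlama-1.1B-Chat-v1.0" is an illustrative ~1B-parameter model
# that fits in a few GB of RAM on a typical laptop; swap in any small model.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = "Explain diminishing returns in one sentence."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Roughly the same few lines are where a low-budget research setup starts, which is the point of the comment: the barrier to experimenting is a laptop, not a GPU cluster.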

[–] CeeBee@lemmy.world 1 points 6 months ago

Exactly. We're still in the very early days with this stuff.

The next few years will be wild.