this post was submitted on 19 Nov 2023
467 points (87.1% liked)
The whole point of AI is that those systems aren't fundamentally different. There is little to no human expertise that goes into them; it's all self-learned from the data. That's why we are getting AIs that can do images, music, chess, Go, chatbots, etc. all in short order. None of them are built on decades' worth of human expertise in music or art; they are created by throwing data at the problem and letting the AI algorithms figure out the rest.
The human expertise is in the data. There's no such thing as spontaneous AI generation of expertise from nothing. If you train up an AI on information that doesn't have it, the AI won't learn it. In a very real way, the profit margins of AI-generated content rest wholly on its ability to consume and derive output from source material developed by unpaid experts.
Also, when the data is the output of people with biases, the AI will reproduce those biases.
There are no art lessons in the training data, just labeled images. The AI has to figure out how to actually draw by itself. Even the labels on the data aren't strictly needed; they are just there so humans can interact with the AI through text.
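To make that concrete, here is a minimal sketch of what "training on labeled images" amounts to: the only feedback the model ever sees is (image, label) pairs and a loss number. The tiny model and the random stand-in dataset below are purely illustrative, not any real image generator's architecture or training code.

```python
# A toy training loop: the "lessons" are nothing but (image, label) pairs and a loss.
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# stand-in data: 256 fake 32x32 grayscale "images", each with one of 10 labels
images = torch.rand(256, 1, 32, 32)
labels = torch.randint(0, 10, (256,))
loader = DataLoader(TensorDataset(images, labels), batch_size=32, shuffle=True)

# stand-in conditional generator: label embedding in, image out
model = nn.Sequential(nn.Embedding(10, 64), nn.Linear(64, 32 * 32))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for epoch in range(5):
    for imgs, labs in loader:
        generated = model(labs).view(-1, 1, 32, 32)    # "draw" an image for each label
        loss = nn.functional.mse_loss(generated, imgs)  # the only feedback: a number
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

# Nowhere above is there a rule about shading, perspective or composition;
# whatever structure the model picks up, it picks up from the data alone.
```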
Same with AlphaZero: it didn't learn to play Go from humans, it learned by playing against itself, and it beat not only humans but also earlier versions that were trained on recorded human Go games.
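The self-play idea is simple enough to sketch: the agent generates its own training data by playing against itself, with no human game records anywhere. The tiny Nim game and the tabular "agent" below are illustrative stand-ins under that assumption, not DeepMind's actual method.

```python
# Self-play sketch: the agent's only training signal is the outcome of its own games.
import random
from collections import defaultdict

class Nim:
    """Trivial two-player game: take 1-3 stones, whoever takes the last stone wins."""
    def __init__(self, stones=15):
        self.stones, self.to_move, self.winner = stones, 0, None

    def legal_moves(self):
        return [n for n in (1, 2, 3) if n <= self.stones]

    def play(self, n):
        self.stones -= n
        if self.stones == 0:
            self.winner = self.to_move
        self.to_move = 1 - self.to_move

    def over(self):
        return self.stones == 0

class SelfPlayAgent:
    """Keeps a value estimate per (stones, move) pair and improves it from its own games."""
    def __init__(self):
        self.value = defaultdict(float)

    def pick_move(self, game, explore=0.1):
        moves = game.legal_moves()
        if random.random() < explore:
            return random.choice(moves)
        return max(moves, key=lambda m: self.value[(game.stones, m)])

    def learn(self, history, winner):
        # reward the eventual winner's moves, punish the loser's moves
        for player, stones, move in history:
            reward = 1.0 if player == winner else -1.0
            self.value[(stones, move)] += 0.1 * (reward - self.value[(stones, move)])

def train_by_self_play(games=5000):
    agent = SelfPlayAgent()
    for _ in range(games):
        game, history = Nim(), []
        while not game.over():
            move = agent.pick_move(game)
            history.append((game.to_move, game.stones, move))
            game.play(move)
        agent.learn(history, game.winner)   # no human games needed anywhere
    return agent
```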
That's the bitter lesson of AI research: throwing data and computation at the problem gives you much better results than building in human expert knowledge.
And we have barely begun to explore this. What ChatGPT does is still just recite information from books and websites; it can't interact with the real world to learn by itself. Being based on books written by human experts is not a benefit, it's what's holding it back.