this post was submitted on 31 Jan 2025
145 points (96.8% liked)

Technology

I wonder what his first clue was.

you are viewing a single comment's thread
[–] paraphrand@lemmy.world 86 points 6 hours ago* (last edited 5 hours ago) (2 children)

Beyond prompting OpenAI to reconsider its release philosophy, Altman said that DeepSeek has pushed the company to potentially reveal more about how its so-called reasoning models, like the o3-mini model released today, show their “thought process.” Currently, OpenAI’s models conceal their reasoning, a strategy intended to prevent competitors from scraping training data for their own models.

“In light of DeepSeek showing how straightforward the text we were hiding is, and because we actually don’t have much secret sauce, we’ll show you the text, now.

But we’ll probably do the same thing again if we figure something new out. You can’t hold us to it.”
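
Re the excerpt above: the concealment is visible directly in the API. Here is a minimal sketch, assuming the OpenAI Python client and that o3-mini is enabled for your account (the prompt is just an example): the response reports how many reasoning tokens were spent, but the reasoning text itself is never returned.

```python
# Minimal sketch (assumes OPENAI_API_KEY is set and o3-mini is available
# to the account). The API returns only the final answer; reasoning tokens
# show up as a count in usage, but the reasoning text itself is withheld.
from openai import OpenAI

client = OpenAI()

resp = client.chat.completions.create(
    model="o3-mini",
    messages=[{"role": "user", "content": "What is 17 * 24? Explain briefly."}],
)

print(resp.choices[0].message.content)       # final answer only
print(resp.usage.completion_tokens_details)  # includes a reasoning_tokens count,
                                             # but no reasoning text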

[–] eager_eagle@lemmy.world 3 points 2 hours ago

I could see this coming a parsec away lol.

How's this different from a self-reflecting agent you can write today? That's what I thought when o1 was announced, while some people got excited, even calling it some sort of proto-AGI.
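
For reference, the kind of self-reflection loop I mean fits in a few lines. This is a minimal sketch, assuming the OpenAI Python client; the model name, prompts, and number of rounds are placeholders, not anything OpenAI ships.

```python
# Minimal self-reflection loop: answer, critique your own answer, revise.
# Model name, prompts, and round count are illustrative placeholders.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def ask(prompt: str) -> str:
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # any chat model works here
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content

def reflect_and_answer(question: str, rounds: int = 2) -> str:
    answer = ask(question)
    for _ in range(rounds):
        critique = ask(
            f"Question: {question}\nDraft answer: {answer}\n"
            "List any mistakes or gaps in this draft answer."
        )
        answer = ask(
            f"Question: {question}\nDraft answer: {answer}\n"
            f"Critique: {critique}\nWrite an improved answer."
        )
    return answer

print(reflect_and_answer("How many prime numbers are there below 30?"))
```

As far as is public, the main difference with o1/o3-mini is that the critique-and-revise step happens inside a single model call and the intermediate text is hidden, rather than being orchestrated by your own code.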

OpenAI is great at raising money and just feeds off the hype.

[–] Dsklnsadog@lemmy.dbzer0.com 26 points 5 hours ago

DING DING DING DING