
We all know by now that ChatGPT is full of incorrect data, but I trusted it would not go wrong after I asked it for a list of sci-fi book recommendations (mostly short-story anthologies in Spanish), including title, publisher, print year, and of course ISBN.

Some of the books do exist, but the majority are nowhere to be found. I picked the one that caught my interest the most and contacted the publisher directly after I could not find it on their website or anywhere else.

This is what they replied (translated with Google Translate):


ChatGPT got it wrong.

We don't have any books with that title.

In the ISBN it gave you, the last digit is incorrect. The correct one (9788477028383) corresponds to "The Sacred Fount" by Henry James.

Nor have we published any science fiction anthologies in the last 25 years.


A quick search on the "old site" shows that others have experienced the same with ChatGPT and ISBN lookups... For some reason I thought it would not go wrong in this case, but it did.
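The publisher's point about the last digit can be checked mechanically: an ISBN-13 ends in a check digit computed from the first twelve digits with alternating weights of 1 and 3, so a hallucinated ISBN often fails this test outright. A minimal sketch (the function names here are my own, not from any library):

```python
def isbn13_check_digit(first12: str) -> int:
    # Weight the 12 digits alternately 1, 3, 1, 3, ...
    # then the check digit is (10 - sum mod 10) mod 10.
    total = sum((1 if i % 2 == 0 else 3) * int(d) for i, d in enumerate(first12))
    return (10 - total % 10) % 10

def is_valid_isbn13(isbn: str) -> bool:
    digits = isbn.replace("-", "").replace(" ", "")
    if len(digits) != 13 or not digits.isdigit():
        return False
    return isbn13_check_digit(digits[:12]) == int(digits[12])

print(is_valid_isbn13("9788477028383"))  # the publisher's corrected ISBN -> True
```

Note that a valid check digit only means the number is well-formed; as in my case, a checksum-valid ISBN can still be assigned to a completely different book.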

[-] Fixbeat@lemmy.ml 1 points 1 year ago

Okay, but how does a text predictor know how to program?

[-] inverimus@lemm.ee 4 points 1 year ago

Its training data had a lot of code in it. It does the same thing with code that it does with any other text: predict the next token given the previous tokens.
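To make "predict the next token given the previous tokens" concrete, here is a deliberately tiny sketch: a bigram model that counts which token follows each token in some training text and greedily emits the most frequent continuation. Real models use neural networks over long contexts, but the input/output contract is the same; everything in this snippet is an illustrative toy, not anything ChatGPT actually runs.

```python
from collections import Counter, defaultdict

# Toy "training data": a single line of tokenized code.
training = "def add ( a , b ) : return a + b".split()

# Count, for every token, which tokens follow it and how often.
follows = defaultdict(Counter)
for prev, nxt in zip(training, training[1:]):
    follows[prev][nxt] += 1

def predict_next(token: str) -> str:
    # Greedy prediction: the most frequent observed continuation.
    return follows[token].most_common(1)[0][0]

print(predict_next("return"))  # -> 'a', because 'return' was followed by 'a'
```

The point is that code is just another token stream to the model: given enough examples where "here is a program that does x" is followed by working code, predicting plausible continuations looks a lot like programming.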

[-] themoonisacheese@sh.itjust.works 1 points 1 year ago* (last edited 1 year ago)

Code is text. It's predicting text that might show up after the text "here is a program that does x". A lot of the time, in the training data, phrases like that were followed by code.

It's not particularly good at programming by the way. It's good at basic tasks but anything complex enough that you couldn't learn it in a few hours, it can't do at all, but it will sure pretend like it can!

this post was submitted on 09 Aug 2023
32 points (75.8% liked)

ChatGPT


Unofficial ChatGPT community to discuss anything ChatGPT
