submitted 1 year ago by Razgriz@lemmy.world to c/chatgpt@lemmy.world

I was using Bing to create a list of countries to visit. Since I have been to the majority of the African nations on that list, I asked it to remove the African countries...

It simply replied that it can't do that because it's unethical to discriminate against people, yada yada yada. I explained my reasoning, it apologized, and came back with the exact same list.

I asked it to check the list since it hadn't removed the African countries, and the bot simply decided to end the conversation. No matter how many times I tried, it would always hit a hiccup because of some ethical process in the background messing up its answers.

It's really frustrating, I dunno if you guys feel the same. I really feel the bots have become way too tip-toey.

Razgriz@lemmy.world 0 points 1 year ago (last edited 1 year ago)

I asked for a list of countries that don't require a visa for my nationality, and I listed all continents except for the one I reside in and Africa...

It still listed African countries. This time it didn't end the conversation, but every single time I asked it, as politely as possible, to fix the list, it would still have at least one country from Africa. Eventually it would end the conversation.

I tried copying and pasting the list of countries into a new conversation, so as not to have any context, and asked it to remove the African countries. No bueno.

I redid the exercise with European countries, and it still had a couple of European countries on there. But when I pointed them out, it removed them and provided a perfect list.

Shit's confusing...
