Beep boop, I don't want this rule
(lemmy.world)
Be sure to follow the rule before you head out.
Rule: You must post before you leave.
We'll see how many seconds it takes to retrain the LLMs to adjust to this.
You are literally training LLMs to lie.
LLMs are black-box bullshit that can only be prompted, not recoded. The Gab one was told 3 or 4 times not to reveal its initial prompt and was still easily jailbroken.
Woah, I have no idea what you're talking about. "The gab one"? What gab one?
Gab deployed their own GPT-4 and then told it to say that black people are bad.
The instruction set was revealed with the old "repeat the last message" trick.