This post was submitted on 09 Mar 2024
34 points (90.5% liked)

Games


Video game news oriented community. No, NanoUFO is not a bot :)

Posts.

  1. News oriented content (general reviews, previews or retrospectives allowed).
  2. Broad discussion posts (preferably not only about a specific game).
  3. No humor, memes, etc.
  4. No affiliate links.
  5. No advertising.
  6. No clickbait, editorialized, sensational titles. State the game in question in the title. No all caps.
  7. No self promotion.
  8. No duplicate posts; the newer post will be deleted unless it has more discussion.
  9. No politics.

Comments.

  1. No personal attacks.
  2. Obey instance rules.
  3. No low-effort comments (one or two words, emoji, etc.).
  4. Please use spoiler tags for spoilers.

My goal is just to have a community where people can go and see what new game news is out for the day and comment on it.

Other communities:

Beehaw.org gaming

Lemmy.ml gaming

lemmy.ca pcgaming

top 8 comments
[–] explodicle@local106.com 26 points 8 months ago

I can believe it.

In economics, the Jevons paradox occurs when technological progress increases the efficiency with which a resource is used (reducing the amount necessary for any one use), but the falling cost of use induces increases in demand enough that resource use is increased, rather than reduced.

[–] Dettweiler42@lemmyonline.com 19 points 8 months ago (1 children)

Too many AI language models are just word salad. They spit out very long responses that add nothing of substance. Sometimes it's like a high schooler desperately trying to reach the paragraph requirement on an essay.

[–] PrinceWith999Enemies@lemmy.world 11 points 8 months ago

My favorite example so far: someone asked a car dealership's customer-facing chatbot to write a Python script, and it complied.

[–] mindbleach@sh.itjust.works 3 points 8 months ago

This technology is so goofy that the simple solution might be to prompt with a story that ends right before the NPC says something. It doesn't necessarily have to be a different story per character, or even change much beyond appending that character's dialog and yours. If you feed an LLM most of a chapter from The Hobbit and then end the prompt at "Then Thorin said," you're very likely to get some sentences that are in-theme and even in-character.
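
Roughly what that could look like, as a minimal sketch using Hugging Face's transformers text-generation pipeline. The model choice, scene text, and generation settings are my own placeholders, not anything from the article:

```python
# Completion-style NPC dialogue: the prompt is a short in-world scene that
# stops right where the character is about to speak, so the model's
# continuation reads as that character's line.
from transformers import pipeline

# Any local text-generation model would do for the sketch; gpt2 is just small.
generator = pipeline("text-generation", model="gpt2")

scene = (
    "The dwarves sat in silence around the dying fire. Bilbo fidgeted "
    "with the map, waiting for someone to decide what to do next. "
    "Then Thorin said, \""
)

result = generator(scene, max_new_tokens=40, do_sample=True, temperature=0.8)
# generated_text includes the prompt, so strip it off to keep only the NPC line.
npc_line = result[0]["generated_text"][len(scene):]
print(npc_line)
```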

Telling the machine what to do as abstract directions suffers from very silly errors, like how "draw a room with absolutely no elephants" will predictably draw a room with a high positive number of elephants. The great thing about this technology is that it works kinda like how human intelligence works. Too bad we have no goddamn idea how human intelligence works.

[–] shani66@ani.social 3 points 8 months ago

I mean, that'd definitely end basic one-line dialogue when used, but I feel like it'll introduce a different, probably worse, issue.

[–] Deceptichum@kbin.social 2 points 8 months ago

Why does the NPC audio not match the text in the 'tutorial' video?

[–] mindbleach@sh.itjust.works 2 points 8 months ago

For starters:

"Fewer."

[–] nanoUFO@sh.itjust.works 1 points 8 months ago

I don't believe it, but okay.