this post was submitted on 21 Aug 2023
3148 points (98.3% liked)

Asklemmy

[–] MrFunnyMoustache@lemmy.ml 11 points 1 year ago (2 children)

Or replaced. If you build an automation tool to do your job more efficiently, it's fun at first, but then you get fired because your job is no longer needed.

[–] FlaminGoku@reddthat.com 12 points 1 year ago (2 children)

That's why you never tell anyone.

[–] intensely_human@lemm.ee 16 points 1 year ago (2 children)

“I’ve automated my slack communications using GPT-4. Let’s see if anyone notices”

Hey Carl do you have that file Brenda sent around?

My training data cutoff is September 2021. Unfortunately, if the file was sent after that date, I am unaware of its contents.

[–] FlaminGoku@reddthat.com 3 points 1 year ago (1 children)

Gotta do the FreeGPT thing and train it on your files, folders, and messages.

[–] intensely_human@lemm.ee 1 points 1 year ago (1 children)

If I understand LLMs right, they have a maximum prompt length, but can be trained on any amount of text data.

The only way to add knowledge that doesn't fit into a prompt is to put it in the training data and re-train.

But you could describe some sort of algorithm it can use to sleuth out data via API calls, and it would then have access to far more up-to-date data than fits in a prompt. Except each response body would itself have to become part of the prompt.

But the whole dataset it has access to doesn't have to be mentioned in the conversation, so it doesn't have to be part of the prompt. Ultimately you don't want your AI assistant telling you everything it knows in each interaction; you just want it to access some slice of your data world, make changes to it, and eventually get you an answer or a report.
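A minimal sketch of that idea, with everything hypothetical: a retrieval step pulls only the relevant slice of the data into the prompt, so the full dataset never has to fit in the context window. The document store, scoring, and names here are all made up for illustration, and the actual model call is omitted.

```python
import re

# Hypothetical "data world" -- in practice this would be files,
# folders, and messages reached via API calls.
DOCUMENTS = {
    "q3_report.txt": "Q3 revenue was up 12% over Q2.",
    "brenda_file.txt": "Brenda's file: updated onboarding checklist.",
    "lunch_menu.txt": "Tuesday: tacos. Wednesday: soup.",
}

def tokenize(text: str) -> set[str]:
    """Lowercase word tokens, punctuation stripped."""
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def retrieve(query: str, docs: dict[str, str], k: int = 1) -> list[str]:
    """Score documents by naive keyword overlap and return the top k."""
    q_words = tokenize(query)
    scored = sorted(
        docs.values(),
        key=lambda text: len(q_words & tokenize(text)),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query: str) -> str:
    """Only the retrieved slice of the data enters the prompt."""
    context = "\n".join(retrieve(query, DOCUMENTS))
    return f"Context:\n{context}\n\nQuestion: {query}"

print(build_prompt("do you have that file Brenda sent around?"))
```

The keyword overlap is a stand-in for whatever real retrieval you'd use (embeddings, search APIs, etc.); the point is just that the prompt only ever carries the retrieved slice, not everything the assistant "knows".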

What is FreeGPT by the way?

[–] FlaminGoku@reddthat.com 1 points 1 year ago

I'll try to get the actual name and repo since I want to leverage it. It's basically a reverse-engineered ChatGPT that is open source.

But yeah, I think the idea is you have prompts trigger the API call to get the additional data.
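A toy sketch of "prompts trigger the API call" (every name here is hypothetical, and the model is a stub, not any real library's API): the harness inspects the model's reply for a tool request, runs the call, and feeds the result back as extra context.

```python
import json

def fake_model(prompt: str) -> str:
    """Stand-in for an LLM. Asks for a tool when it lacks the data."""
    if "Result:" in prompt:
        return "Answer: the file says 'updated onboarding checklist'."
    return json.dumps({"tool": "get_file", "arg": "brenda_file.txt"})

def get_file(name: str) -> str:
    """Hypothetical API call the model can trigger."""
    files = {"brenda_file.txt": "updated onboarding checklist"}
    return files.get(name, "not found")

TOOLS = {"get_file": get_file}

def run(question: str) -> str:
    reply = fake_model(question)
    try:
        request = json.loads(reply)   # model asked for a tool call
    except json.JSONDecodeError:
        return reply                  # plain answer, no tool needed
    result = TOOLS[request["tool"]](request["arg"])
    # The tool's output becomes part of the next prompt.
    return fake_model(f"{question}\nResult: {result}")

print(run("Do you have that file Brenda sent around?"))
```

Real function-calling APIs structure this more carefully, but the loop is the same shape: prompt, detect tool request, execute, re-prompt with the result.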

[–] MrFunnyMoustache@lemmy.ml 7 points 1 year ago

I didn't, but I also wasn't sneaky enough.

[–] Polymath@lemm.ee 3 points 1 year ago

My sister had a marketing gig she was let go from because she was so good at selling her product that the company said they had established strong markets in her area and didn't need the publicity anymore.
Exactly that: too good at her job.