submitted 5 months ago by Ozone6363@lemmy.world to c/chatgpt@lemmy.world

It's so frustrating.

Even very basic things like "Summarize this video transcript" on GPTs built specifically for that purpose.

First, it cannot even read text files anymore. It straight up "cannot access documents". No idea why; sometimes it will act like it can, but it becomes obvious it's hallucinating or only read part of the document.

So OK, paste the info in. GPT will start giving you a detailed summary, then just skip over like 40 fucking percent of the middle and resume summarizing at the end.

I mean honestly, I'm hardly asking it to do complex shit.

I have absolutely no idea what led to this decline, but it's become so bad it is hardly even worth messing with it anymore. Such an absolute shame.

Uranium3006@kbin.social 9 points 5 months ago

Local AI is the way. It's just that current models aren't GPT-4 quality yet, and you'd probably need 1 TB of VRAM to run them.

hperrin@lemmy.world 4 points 5 months ago

Surprisingly, there’s a way to run Llama 3 70b on 4GB of VRAM.

https://huggingface.co/blog/lyogavin/llama3-airllm
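The trick behind running a 70B model in that little VRAM is layer-wise inference: only one transformer layer's weights sit on the GPU at a time. A rough back-of-envelope shows why that fits (the layer count and fp16 assumption come from Llama 3 70B's published architecture, not from the comment above):

```python
# Back-of-envelope: why layer-wise inference fits a 70B model in ~4 GB of VRAM.
# Figures are approximate; Llama 3 70B has 80 transformer layers.

total_params = 70e9      # ~70 billion parameters
n_layers = 80            # Llama 3 70B layer count
bytes_per_param = 2      # fp16/bf16 weights

full_model_gb = total_params * bytes_per_param / 1e9
per_layer_gb = full_model_gb / n_layers

print(f"Whole model in fp16: ~{full_model_gb:.0f} GB")  # far beyond any single GPU
print(f"One layer at a time: ~{per_layer_gb:.2f} GB")   # comfortably under 4 GB
```

The tradeoff is speed: the weights for each layer get reloaded from disk or system RAM on every forward pass, so it's far slower than keeping the model resident in VRAM.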

theterrasque@infosec.pub 1 points 5 months ago

Llama 3 70B is pretty good, and you can run it on 2x 3090s. Not cheap, but doable.

You could also use something like RunPod to test it out cheaply.
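The 2x 3090 figure checks out if you assume a 4-bit quantization (the comment doesn't specify one, so that's an assumption), since each 3090 has 24 GB of VRAM:

```python
# Rough VRAM estimate for a 70B model quantized to 4 bits on two RTX 3090s.
total_params = 70e9
bits_per_param = 4   # e.g. a Q4 GGUF or GPTQ quant (assumption)

weights_gb = total_params * bits_per_param / 8 / 1e9
available_gb = 2 * 24  # two RTX 3090s at 24 GB each

print(f"Quantized weights: ~{weights_gb:.0f} GB of {available_gb} GB")
```

That leaves roughly 13 GB of headroom across the two cards for the KV cache and activations, which is why the split works but a single 24 GB card doesn't.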

this post was submitted on 14 May 2024
44 points (86.7% liked)