this post was submitted on 25 Dec 2023
19 points (91.3% liked)

People Twitter


People tweeting stuff. We allow tweets from anyone.

RULES:

  1. Mark NSFW content.
  2. No doxxing people.
  3. Must be a tweet or similar.
  4. No bullying or international politics.
  5. Be excellent to each other.

founded 1 year ago
[–] candle_lighter@lemmy.ml 3 points 10 months ago (3 children)

I want said AI to be open source and run locally on my computer

[–] CeeBee@lemmy.world 1 points 10 months ago (2 children)

It's getting there. In the next few years, as hardware gets better and models get more efficient, we'll be able to run these systems entirely locally.

I'm already doing it, but I have some higher end hardware.

[–] Xanaus@lemmy.ml 1 points 10 months ago (1 children)

Could you please share your process for us mortals?

[–] CeeBee@lemmy.world 1 points 10 months ago

Stable Diffusion's SDXL Turbo model running in Automatic1111 for image generation.
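Not from the comment, but as one illustration: Automatic1111's webui exposes a REST API when launched with the `--api` flag, and a txt2img request for a Turbo model can be sketched like this. The URL/port are the webui defaults, and the single sampling step with CFG 1.0 reflects how SDXL Turbo is meant to be run; treat the exact parameter set as an assumption, not the commenter's config.

```python
import json
import urllib.request

WEBUI_URL = "http://127.0.0.1:7860"  # default Automatic1111 address; adjust if yours differs


def build_txt2img_payload(prompt: str) -> dict:
    """Build a txt2img request body tuned for SDXL Turbo."""
    return {
        "prompt": prompt,
        "steps": 1,        # Turbo models are distilled for very-few-step sampling
        "cfg_scale": 1.0,  # Turbo is trained without classifier-free guidance
        "width": 512,
        "height": 512,
    }


def txt2img(prompt: str) -> dict:
    """POST the payload to a locally running webui and return the parsed JSON reply."""
    req = urllib.request.Request(
        f"{WEBUI_URL}/sdapi/v1/txt2img",
        data=json.dumps(build_txt2img_payload(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

The returned JSON carries base64-encoded images in its `images` field, which you can decode and save locally.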

Ollama with Ollama-webui for an LLM. I like the Solar:7b model. It's lightweight, fast, and gives really good results.
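For the Ollama side, a minimal sketch of talking to its local REST API (port 11434 is Ollama's default); the model tag mirrors the solar model named above, and `stream=False` asks for one complete JSON reply instead of a token stream. This is an illustration of the API shape, not the commenter's exact setup.

```python
import json
import urllib.request

OLLAMA_URL = "http://127.0.0.1:11434"  # Ollama's default local address


def build_generate_payload(model: str, prompt: str) -> dict:
    """Request body for Ollama's /api/generate endpoint."""
    return {
        "model": model,
        "prompt": prompt,
        "stream": False,  # return a single JSON object rather than streamed chunks
    }


def generate(prompt: str, model: str = "solar") -> str:
    """Send a prompt to a locally running Ollama server and return its reply text."""
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=json.dumps(build_generate_payload(model, prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]
```

Pulling the model first (`ollama pull solar`) is all the setup the server side needs; the webui then talks to the same endpoint.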

I have some beefy hardware that I run it on, but it isn't strictly necessary.