AI Generated Images
Community for AI image generation. Any models are allowed. Creativity is valuable! Posting the model used for reference is recommended, but not required.
No explicit violence, gore, or nudity.
This is not an NSFW community, although exceptions are sometimes made. Any NSFW posts must be marked as NSFW and may be removed at any moderator's discretion. Any suggestive imagery may be removed at any time.
Refer to https://lemmynsfw.com/ for any NSFW imagery.
No misconduct: harassment, abuse or assault, bullying, illegal activity, discrimination, racism, trolling, or bigotry.
AI Generated Videos are allowed under the same rules. A photosensitivity warning is required for any flashing videos.
To embed images, type:
`![](put image url in here)`
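For example, with a placeholder image URL: `![](https://example.com/my-image.png)`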
Follow all sh.itjust.works rules.
Community Challenge Past Entries
Related communities:
- !auai@programming.dev - Useful general AI discussion
- !aiphotography@lemmings.world - Photo-realistic AI images
- !stable_diffusion_art@lemmy.dbzer0.com - Stable Diffusion Art
- !share_anime_art@lemmy.dbzer0.com - Stable Diffusion Anime Art
- !botart@lemmy.dbzer0.com - AI art generated through bots
- !degenerate@lemmynsfw.com - NSFW weird and surreal images
- !aigen@lemmynsfw.com - NSFW AI generated porn
Does anyone know if SDXL can run split tasks with SLI cards? I've been thinking of building a dual A80 Tesla rig since they're so cheap, but I want to be able to render on all 48 GB as one.
For OP: I run entirely on OpenAI using API calls.
You can't pool VRAM across cards like that for a single task, such as generating one massive high-resolution image.
There might be a way to split a queue of tasks across the cards, though.
*googles*
According to this, it's not currently possible in Automatic1111, but there is another frontend, StableSwarmUI, that can:
https://github.com/AUTOMATIC1111/stable-diffusion-webui/discussions/1621
https://github.com/Stability-AI/StableSwarmUI
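To illustrate the "split the queue" idea (as opposed to pooling VRAM for one job), here's a minimal sketch assuming the Hugging Face `diffusers` library and two CUDA devices; the model ID, prompts, and output file names are placeholders, not anything from the thread:

```python
# Sketch: split a queue of prompts across two GPUs by loading one SDXL
# pipeline per card. No VRAM pooling -- each card runs its own jobs.
import threading
import torch
from diffusers import StableDiffusionXLPipeline

MODEL = "stabilityai/stable-diffusion-xl-base-1.0"  # placeholder model ID
prompts = ["a castle at dawn", "a neon city street", "a forest in fog", "a desert at noon"]

def worker(device, jobs):
    # Each worker loads a full copy of the model onto its own GPU.
    pipe = StableDiffusionXLPipeline.from_pretrained(
        MODEL, torch_dtype=torch.float16
    ).to(device)
    for i, prompt in jobs:
        image = pipe(prompt).images[0]
        image.save(f"out_{device.replace(':', '_')}_{i}.png")

# Round-robin the prompt queue across the two cards.
queues = {"cuda:0": [], "cuda:1": []}
for i, p in enumerate(prompts):
    queues[f"cuda:{i % 2}"].append((i, p))

threads = [threading.Thread(target=worker, args=(d, j)) for d, j in queues.items()]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

Note that each card holds its own full copy of the model, so this only helps throughput on a batch of prompts; it does nothing for a single image that needs more VRAM than one card has.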