I saw Generative AI for Beginners from Microsoft on GitHub. I’ve looked at https://fmhy.pages.dev/ai but I’m not sure what I’m really looking for.

I write fiction, and I want a chatbot that functions like ChatGPT 3.5 but won’t shut down if things get bloody or sexy, as they so often do.

You know ready, aim, fire? I’m in the AIM stage.

13 points

Check out the AI Horde: https://aihorde.net, or the direct LLM frontend at https://lite.koboldai.net. It’s free, FOSS, and crowdsourced, with uncensored models that won’t ever be rugpulled.
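
If you’d rather script it than use the web UI, here’s a rough Python sketch against the Horde’s text API. The endpoint names, parameters, and response fields are from memory, so treat it as a sketch and double-check against the API docs linked from https://aihorde.net.

```python
import time
import requests

API = "https://aihorde.net/api/v2"
HEADERS = {"apikey": "0000000000"}  # anonymous key; registering gets you better queue priority

# Submit an async text-generation job to the crowdsourced workers.
payload = {
    "prompt": "The knight kicked the door open and",
    "params": {"max_length": 120},   # rough cap on generated tokens
    "models": [],                    # empty list = let the Horde pick any available model
}
job = requests.post(f"{API}/generate/text/async", json=payload, headers=HEADERS).json()
job_id = job["id"]

# Poll until a worker picks up and finishes the job, then print the text.
while True:
    status = requests.get(f"{API}/generate/text/status/{job_id}", headers=HEADERS).json()
    if status.get("done"):
        print(status["generations"][0]["text"])
        break
    time.sleep(3)
```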

5 points

I ought to have known you’d have a good answer! Thank you!

3 points

Spread the word!

3 points

Ollama helps you easily run LLMs locally: https://ollama.com/

I’m running llama2-uncensored on my laptop with 8GB of memory.
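
In case it helps anyone: once the model is pulled (ollama pull llama2-uncensored), Ollama also serves a local HTTP API on port 11434, so you can call it from a script. A minimal sketch assuming the default setup; the prompt is just an example:

```python
import requests

# Ollama listens on localhost:11434 once it's running.
# Pull the model first with: ollama pull llama2-uncensored
response = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama2-uncensored",
        "prompt": "Write a grim opening line for a heist gone wrong.",
        "stream": False,  # return a single JSON object instead of a token stream
    },
)
print(response.json()["response"])
```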


There’s a Reddit forum called LocalLLaMA.

2 points

There’s also !localllama@sh.itjust.works on here

3 points

I would look into NovelAI for writing; it’s built quite specifically for that. It’s a paid service similar to ChatGPT, but it’s uncensored and private.

You can run your own lightweight LLM on a laptop, but the output will be useless. Good output requires big-boy compute.

If you do want to run it on your own hardware, look into Ollama. There are also options to run your own LLM in the cloud, with a not-too-difficult setup process for non-techies.

Frankly, I’d find the right LLM for your needs and just pay for it per month. Maybe NovelAI, maybe something else, but ChatGPT is not great for creative fiction.

2 points

I got a little TOO MUCH involvement from NovelAI. I guess I want help with suggestions and idea spitballing, but what I’m looking for is specialized.

I want my AI to stay on the shelf with my thesaurus until I’m ready to use it.

2 points

Interesting, I’m vaguely interested in this too. I have half of a world written that I want to turn into a game, maybe (probably not, but I’m having fun). I have the hardware to turn what I have into embeddings for an open model, and the hardware to run it. So that’s the way I would go about it, though I can’t vouch for how helpful it would be (yet).
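
If anyone wants a concrete starting point for that embedding step, here’s a minimal sketch using the sentence-transformers library. The folder name, paragraph chunking, and model choice are placeholder assumptions, not recommendations:

```python
from pathlib import Path

import numpy as np
from sentence_transformers import SentenceTransformer

# Embed worldbuilding notes so an open model (or you) can retrieve relevant chunks.
model = SentenceTransformer("all-MiniLM-L6-v2")  # small embedding model, runs fine on CPU

# Placeholder layout: plain-text notes in a "worldbuilding/" folder, chunked by paragraph.
chunks = []
for path in Path("worldbuilding").glob("*.txt"):
    chunks.extend(p.strip() for p in path.read_text().split("\n\n") if p.strip())

embeddings = model.encode(chunks)  # one vector per paragraph

def most_relevant(question: str, top_k: int = 3) -> list[str]:
    """Return the top_k chunks most similar to the question (cosine similarity)."""
    q = model.encode([question])[0]
    scores = embeddings @ q / (np.linalg.norm(embeddings, axis=1) * np.linalg.norm(q))
    return [chunks[i] for i in np.argsort(scores)[::-1][:top_k]]
```

The retrieved chunks then get pasted into the prompt of whatever model you run, which is the usual retrieval-augmented setup.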

2 points

GPT4All is the easiest way in, hands down, no contest.
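
For the curious, the gpt4all Python bindings are about as short as it gets. A rough sketch; the model filename is just an example from their catalog and it downloads on first run:

```python
from gpt4all import GPT4All

# Example model name; GPT4All downloads the weights the first time you use them.
model = GPT4All("Meta-Llama-3-8B-Instruct.Q4_0.gguf")

with model.chat_session():
    reply = model.generate("Suggest three complications for a heist chapter.", max_tokens=200)
    print(reply)
```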

