
I had to try this, and it doesn't seem to think that anymore, so I guess it got fixed.

To clarify: you can still overwrite old knowledge, so maybe one of their finetune runs included data reinforcing that he's still alive. It's just that if you go a long time without training a model on something specific, it won't forget it entirely; it might just get "noisy." (I don't know exactly how this plays out in text models; I mainly work with image models.) For example, if you train a model on a dataset like LAION, which is just a bunch of random images from the internet, and then train it for a while exclusively on something specific like anime pictures, the resulting model will still be able to make "photorealistic" content, or content of subjects not in the more recent dataset, though the results might be somewhat degraded.
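A toy sketch of that dynamic (plain NumPy, nothing like a real diffusion or language model; the data and weights here are made up for illustration): a tiny linear model learns a "broad" task using both input features, then gets fine-tuned on a "narrow" dataset that only exercises one feature. The weight the narrow data touches gets overwritten, while the weight it never exercises is retained.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Broad" dataset: both features carry signal (true weights [2.0, -1.0]).
X_broad = rng.normal(size=(200, 2))
y_broad = X_broad @ np.array([2.0, -1.0])

# "Narrow" fine-tuning dataset: only feature 0 varies (feature 1 is
# always zero), and it teaches a *different* weight (3.0) for it.
X_narrow = np.zeros((200, 2))
X_narrow[:, 0] = rng.normal(size=200)
y_narrow = X_narrow @ np.array([3.0, 0.0])

def train(w, X, y, lr=0.1, steps=500):
    """Full-batch gradient descent on mean squared error."""
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

w = train(np.zeros(2), X_broad, y_broad)   # learn the broad task
w_ft = train(w, X_narrow, y_narrow)        # fine-tune on narrow data

# Feature 0's weight is overwritten toward 3.0 by the new data, but
# feature 1's weight is untouched (the narrow data never exercises it),
# so that old "knowledge" survives the fine-tune.
print(w_ft)  # ≈ [3.0, -1.0]
```

In a real network the parameters are shared across behaviors rather than neatly separated like this, so fine-tuning degrades old capabilities gradually ("noisy") instead of leaving them perfectly intact, but the retain-what-isn't-exercised intuition is the same.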


Ah ok, thanks for the info.

