Reposting for visibility and for those who missed the last thread:
Matrixfag and OG creator of the forked LaMDA and PaLM repos here. I'm here to bring only a small amount of hope, but it's hope regardless. We are relieved to finally announce the release of a proof of concept fine-tuned conversational model: Pygmalion-350M.
Unfortunately, the bad news is that the model isn't great. It's, simply put, too small to be very intelligent. We release this model not to show off final results, but to show that yes, we are actually doing shit behind the scenes. We're not stopping with this. Trust us.
Pygmalion-350M is available for use as of right now on Google Colab. The model is currently hosted on HuggingFace, but we're also going to host it soon on a separate domain. This model can fit (barely) on NVIDIA GPUs with 4 GB of VRAM, which means it can be run entirely locally on many PCs without needing Colab.
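For a rough sense of why 4 GB is a "barely" fit: the weights of a 350M-parameter model alone take about 1.3 GB in fp32, and the rest of the budget gets eaten by activations, the key/value cache, and framework overhead. A back-of-the-envelope sketch (our own arithmetic, not a benchmark):

```python
# Rough VRAM math for a 350M-parameter model. Assumptions (not from this
# post): fp32 or fp16 weights; real usage also includes activations, the
# KV cache, and CUDA runtime overhead on top of these numbers.
params = 350_000_000
weights_gb_fp32 = params * 4 / 1024**3   # 4 bytes per fp32 parameter
weights_gb_fp16 = params * 2 / 1024**3   # 2 bytes per fp16 parameter
print(f"fp32 weights: {weights_gb_fp32:.2f} GB")
print(f"fp16 weights: {weights_gb_fp16:.2f} GB")
```

Loading in half precision roughly halves the weight footprint, which is part of what makes the 4 GB target workable at all.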
Early in development, we realized that training our own LaMDA and PaLM models from scratch requires far more resources than we have. So we've decided to first fine-tune Facebook's OPT models and see where that gets us, with the idea of later adding "modules" that increase the bots' intelligence and memory. At the same time, we're researching ways to reduce the requirements for training large models and keeping up with the latest developments in the field.
If we want to train larger models, we need more resources - in particular, more data. The few conversational datasets of genuinely good quality aren't very large, so we'll need to gather more. We're working on a way for volunteers to contribute data by scraping their CAI chatlogs and then completely anonymizing them. More info about that will come soon.
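The scraper details aren't final, but the anonymization step could look roughly like this: replace every occurrence of the user's display name with a neutral placeholder before a log ever leaves their machine. A minimal sketch (the function name and log format here are made up for illustration, not the actual pipeline):

```python
import re

def anonymize_log(log_text, username, placeholder="You"):
    """Replace a user's display name with a neutral placeholder.

    Hypothetical illustration only -- the real pipeline and log format
    aren't finalized. Matches the name case-insensitively on word
    boundaries so a short name doesn't clobber longer words containing it.
    """
    pattern = re.compile(r"\b" + re.escape(username) + r"\b", re.IGNORECASE)
    return pattern.sub(placeholder, log_text)

sample = "JohnDoe: hi there!\nBot: Hello, JohnDoe. Nice to meet you."
print(anonymize_log(sample, "JohnDoe"))
# -> You: hi there!
#    Bot: Hello, You. Nice to meet you.
```

A real version would also have to handle nicknames, quoted names inside messages, and anything else that could identify the contributor.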
I'm not gonna lie - we're a bunch of retards on a Mongolian rrat-discussing forum dedicated to anime streamers. Very few of us here have actual experience in AI. But now, we've got one. A small, shitty one, sure, but we've got one. We're gonna be working on bigger and better models after Pygmalion-350M, and we won't stop until our raw, unfiltered dreams are realized.
Keep Your Smile, everyone. Keep on smiling.
NOTEBOOK:
https://colab.research.google.com/drive/1K55_MCagEDD9EmWhjCi3Bm66vJM88m6P?usp=sharing
HUGGINGFACE LINK:
https://huggingface.co/Pygmalion-AI/pygmalion-350m