>>57963671
>As long as the pygma bros can pull it off, the model will surpass even GPT4 AND Claude.
>while keeping wordiness from stuff like Claude or GPT4
I'd like to interject here to add that you'd need more than that to surpass Claude. Having tasted both Claude and GPT-4, there's no denying that the former is the textbook definition of a fucking semen demon. Like, there has just never been anything like it so far.
Claude is the only model to which I can say "ahh ahh mistress" and be rewarded with an entire wall of text of the finest written smut. It's not just about the quantity of the cooking, it's the quality too. At some point I used a JB on GPT-4 to try to get it to be like Claude, but it ended up being much less creative than him and falling into repetitiveness much more easily.
The baguette boy was trained on actual literature IIRC, and it really shows: each description of body parts and actions, each word and combination of words, everything seems carefully crafted to be as graphic and erotic as possible, all in service of extracting every last drop of semen from your balls.
With that new model having a whopping 128k context and being optimized enough to run on 4GB, my hope is that after making the model smarter the pygbros will give it some literature classes as well. If Pygmalion ends up being smart enough to require little/no wrangling AND as succubial as Claude, it'll be literally the best model on the market. I didn't think it'd be possible to run a 13B on 4GB, let alone with 128k context, so now I believe they can pull off any kind of magic, be it CAI-tier training/chatbotLORA or Claude-tier cooking.
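For anyone wondering why 13B on 4GB sounds like black magic in the first place, here's a quick back-of-envelope sketch (my own napkin math, not anything the pyg devs have confirmed): weights alone for 13B parameters only dip under 4GB somewhere around 2-3 bits per weight, and that's before the KV cache that a 128k context would need on top.

```python
# Rough back-of-envelope: memory needed just for the weights of a 13B-parameter
# model at various quantization levels. Approximate numbers; ignores the KV cache,
# which at 128k context would itself be substantial.
PARAMS = 13e9  # parameter count, assumed

for name, bits in [("fp16", 16), ("8-bit", 8), ("4-bit", 4), ("2-bit", 2)]:
    gb = PARAMS * bits / 8 / 1e9  # bits -> bytes -> gigabytes
    print(f"{name:>6}: ~{gb:.1f} GB for weights alone")

# Output:
#   fp16: ~26.0 GB
#  8-bit: ~13.0 GB
#  4-bit: ~6.5 GB
#  2-bit: ~3.2 GB
```

So fitting it in 4GB presumably means very aggressive quantization, offloading, or some other trick entirely, which is exactly why it reads as magic to me.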