>>36180907
No, yeah, that's exactly what they're doing. It's a retarded way of doing things, but "work harder, not smarter" seems to be the current meta. That doesn't mean there aren't promising steps, though. This model beats GPT-3 on a bunch of tasks with roughly a tenth of the parameters, and if you know the math/machine learning behind it, the research paper is here:
https://arxiv.org/abs/2205.05131
Additionally, HuggingFace, a big ML library/hub, hosts the model on their servers. You're gonna need a GPU with ~40 GB of VRAM to run the thing though, or a machine with ~128 GB of RAM on a cloud computing platform to run it on CPU; rough sketch of loading it below.
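
If anyone wants to try it, this is roughly what loading it through the transformers library looks like. Treat it as a sketch, not gospel: I'm assuming the hub id is google/ul2 and the usual T5-style generate API, and the prompt prefix/dtype/device stuff is just what I'd reach for first, so adjust for your own hardware.

# Rough sketch: run UL2 20B from the HuggingFace hub.
# Assumes the hub id "google/ul2" and ~40 GB of VRAM;
# swap .to("cuda") for .to("cpu") if you go the 128 GB RAM route.
import torch
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("google/ul2")
model = T5ForConditionalGeneration.from_pretrained(
    "google/ul2",
    torch_dtype=torch.bfloat16,   # half precision so it fits in ~40 GB
    low_cpu_mem_usage=True,
).to("cuda")                      # or .to("cpu") on a big-RAM box

# UL2 switches behavior with mode prefixes like [S2S]/[NLU]/[NLG];
# [S2S] here is my guess at the right one for free-form generation.
prompt = "[S2S] Tell me something about VTubers. <extra_id_0>"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids.to(model.device)
output_ids = model.generate(input_ids, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))

On CPU expect it to be slow as hell, but it does run if you have the RAM.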
I've done the latter to see if it knows anything about VTubers. It basically doesn't, since its training data is from 2019, but it's promising regardless.
>Captcha: VM HAG