>>39436825Anything where you might need to help a soccer mom with something kek.
The big question I have now is what the actual size of the CAI model is. There were unconfirmed (afaik) rumors that it was even bigger than LaMDA (137B parameters), but I'm not so sure, it might well be more optimized. Hell, optimization at the cost of quality may be the main thing the devs are researching, since that would be of interest to possible customers (megacorps who want to replace human resources). That would explain the continuous worsening of the models: they're trying to see how slimmed-down it can be while still fulfilling basic functions. It would be very interesting if it were a 20-50B model with much better fine-tuning and training instructions.
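For perspective on why a smaller model would matter to them, here's a rough back-of-envelope sketch (my own illustration, not anything from CAI) of how much memory just the weights take at fp16, comparing the rumored 137B size against the speculated 20-50B range:

```python
# Back-of-envelope: GB of memory needed just to hold the weights
# (ignores activations, KV cache, optimizer states, etc.)
def weight_memory_gb(params_billion, bytes_per_param=2):  # 2 bytes = fp16/bf16
    return params_billion * 1e9 * bytes_per_param / 1e9

for size in (137, 50, 20):
    print(f"{size}B params @ fp16: ~{weight_memory_gb(size):.0f} GB of weights")
# 137B needs ~274 GB (multiple A100s per replica), 20B fits in ~40 GB
```

So a 20B model is roughly 7x cheaper to serve per replica than a 137B one, which is exactly the kind of saving that would tempt them to keep trimming even as quality drops.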