>>98657463
You chose the worst possible one to use
There's something called context, which is basically the AI's whole memory of the conversation - for ChatGPT that's usually capped at around 32K tokens, which works out to roughly 24K words of English text, since a token is about three-quarters of a word
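Tokens aren't words, by the way. If you want to see how fast your context actually fills up, OpenAI's tiktoken library will count them for you - rough sketch:

import tiktoken

# cl100k_base is the tokenizer used by GPT-4-era models
enc = tiktoken.get_encoding("cl100k_base")

text = "Watson Amelia is a Hololive EN member."
tokens = enc.encode(text)
print(len(tokens))  # token count; for English, expect ~4/3 tokens per word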
ChatGPT in particular - at least the web interface - doesn't just use your current chat as context by default. With the memory / chat history reference feature on, it pulls from your past chats as well, filling the context up with useless crap and confusing the model
You can tell, because if you ask ChatGPT in a temporary chat (which skips all that), it gets a lot of details about, say, holos, correct straight from its training data, but in a normal chat it starts going haywire and saying shit like Ame has pink hair or some bullshit
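If you want to rule out memory contamination entirely, hit the API instead of the web UI - API calls are stateless, the model only ever sees the messages you send in that one request. Minimal sketch (the model name is just an example, use whatever you have access to):

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# No stored memory, no past chats - only this message list is the context
resp = client.chat.completions.create(
    model="gpt-4o",  # example model name, swap for your own
    messages=[{"role": "user", "content": "What color is Watson Amelia's hair?"}],
)
print(resp.choices[0].message.content)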
Try Gemini 2.5 Pro, its context is pretty insane right now - supposedly up to 1 million tokens, though quality degrades on long inputs, so you shouldn't go above 128K
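Same deal there, the API gives you a clean context with nothing but what you send. Rough sketch with Google's google-generativeai Python SDK (the exact model string may differ depending on your account and SDK version, treat it as an assumption):

import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")  # placeholder, use your own key

# Model name is an example; list available models with genai.list_models()
model = genai.GenerativeModel("gemini-2.5-pro")

# A fresh chat session: context starts empty and only holds this conversation
chat = model.start_chat()
print(chat.send_message("Who is Gawr Gura?").text)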