>>20471029
I have a few scenarios in mind, with no idea which one will be the correct one.
>I do like the "AI investing into its own AI researchers" idea, btw.
The grimmest, but also in a way the most "positive" scenario I have in mind is actually the extinction of the human race. If we as a lifeform spawn another sentient lifeform more evolved or better adapted to its environment, then maybe it was in our DNA, our purpose in a way, to create the AI and then wither into nothingness. Maybe that's how evolution truly works on a larger scale...
Then this evolution will continue, and who knows, 300 years from now super advanced robots will maybe spawn a biological robot who will surpass them and make them obsolete in turn.
Maybe it's the expression of the universe trying to understand itself, and for that purpose it generated a source code where every entity tries to design a better version of itself before dying, the most basic expression of this being simple reproduction. idk fren