>>91871754
there are 3 known scaling laws
all of them are empirical laws, not rooted in anything fundamental like information theory, just inferred from observation
AI research now is where physics was in the dark ages, and we got no Newton or Einstein to get us out of this mess
AI performance scales off three things: training data, training compute, inference compute
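for the curious, the most cited of these empirical fits is the Chinchilla one (Hoffmann et al. 2022), loss as a function of parameter count N and training tokens D - the constants below are the paper's fit as I remember them, the script itself is just my sketch:
[code]
# Chinchilla-style scaling law (Hoffmann et al. 2022):
# predicted loss from parameter count N and training tokens D.
# E, A, B, alpha, beta are the paper's fitted constants (from memory)
E, A, B = 1.69, 406.4, 410.7
alpha, beta = 0.34, 0.28

def loss(N, D):
    return E + A / N**alpha + B / D**beta

# e.g. Chinchilla itself: 70B params, 1.4T tokens -> roughly 1.94
print(loss(70e9, 1.4e12))
[/code]
note the shape: both terms are power laws, which is exactly why it's an empirical fit and not a theory - nothing in it says why those exponents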
training data is somewhat finite, but only somewhat - you can generate synthetic training data, you can "augment" sparse data with extra samples, and in some cases you can do reinforcement learning - there are many clever ways around data shortages, but they all require compute (rough sketch of augmentation below)
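what I mean by augmentation, as a toy sketch (real pipelines jitter images or paraphrase text, not scalars - this is just the shape of it):
[code]
import random

# toy augmentation: stretch a small labeled dataset by adding
# noise-jittered copies of each sample, labels preserved
def augment(samples, k=5, noise=0.01):
    out = list(samples)
    for x, y in samples:
        for _ in range(k):
            out.append((x + random.gauss(0, noise), y))
    return out

data = [(0.5, "cat"), (0.9, "dog")]
print(len(augment(data)))  # 2 originals -> 12 samples
[/code]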
so, in a way, AI scales off compute^3: compute to train, compute to run inference, and compute to manufacture the data itself
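to make that concrete, a toy model with made-up exponents (not from any paper): if each axis gives a power-law gain in the compute spent on it, the exponents stack when you scale everything together:
[code]
# toy model, made-up exponents: loss falls as a power law in the
# compute spent on each of the three axes
def toy_loss(c_train, c_data, c_infer):
    return 3.0 * c_train**-0.05 * c_data**-0.03 * c_infer**-0.02

# 10x the budget on all three axes at once:
# loss drops by 10**-(0.05 + 0.03 + 0.02), the exponents add
print(toy_loss(1e9, 1e9, 1e9))
print(toy_loss(1e10, 1e10, 1e10))
[/code]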
that is not a nice scaling, if you like humans