>>19066502
Turns out that machine learning is fairly simple and straightforward: you just have to be able to see how it works in your mind's eye, and then program it. Doing it is practically trivial and was done as far back as the 1960s. You make a model structure that transforms what you've got and what you know into the right answers, then you put that into a for loop for every node: load all the training rows and get outputs, compare the outputs to the right answers, make a 2D surface mapping the given answers to the right answers with context as the second dimension, the partial derivative consumes Bayes' rule and produces a direction, use the direction to modify the brain to be less wrong in this context. Repeat for all nodes: if it's less wrong, keep it; if not, try something else.
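Here's a minimal sketch of that loop in plain numpy. The single sigmoid "node", the squared-error wrongness measure, and the toy data are my own assumptions, not something spelled out above; the loop itself follows the get-outputs / compare / take-a-direction / keep-it-if-less-wrong recipe.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))                             # what you've got: training rows
y = (X @ np.array([1.0, -2.0, 0.5]) > 0).astype(float)    # the right answers

w = np.zeros(3)      # the "brain" for this node
lr = 0.1

def outputs(w):
    return 1.0 / (1.0 + np.exp(-(X @ w)))    # node's answers for every training row

def wrongness(w):
    return np.mean((outputs(w) - y) ** 2)    # how far the answers are from the right ones

for step in range(1000):
    p = outputs(w)
    # partial derivative of the wrongness w.r.t. the weights: a direction
    grad = (2.0 / len(y)) * (X.T @ ((p - y) * p * (1 - p)))
    candidate = w - lr * grad                 # modify the brain in that direction
    if wrongness(candidate) < wrongness(w):   # less wrong? keep it
        w = candidate
    else:                                     # otherwise try something else
        lr *= 0.5                             # here: a smaller step

print("final wrongness:", wrongness(w))
```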
Problems:
1. Enough compute to feel like a person is giving the entire space a fair sampling will set you back 100 megawatt-hours per day. Any problem with more than 3 or 4 independently hinged moving parts immediately flails around and tears itself apart.
Information gain on a neuron comes from feeding Bayes' rule through a partial derivative in a loop, for each neuron.
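One reading of that claim (my assumption, the post doesn't spell it out): treat each neuron's sigmoid output as a posterior probability p(y=1 | x), take the negative log-likelihood as the loss, and the per-weight partial derivative collapses to (p - y) * x, which is the direction fed back in the loop.

```python
import numpy as np

def neuron_direction(w, x, y):
    """Gradient of -log p(y | x, w) for a single sigmoid neuron."""
    p = 1.0 / (1.0 + np.exp(-np.dot(w, x)))   # posterior-style output via Bayes'-rule reading
    return (p - y) * x                        # direction that makes this neuron less wrong

w = np.array([0.2, -0.1, 0.4])
x = np.array([1.0, 0.5, -1.5])
print(neuron_direction(w, x, 1.0))
```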
https://www.youtube.com/watch?v=hmtQPrH-gC4
You can trade energy and compute for intelligence.