https://www.youtube.com/watch?v=h5EtHLXSbHI

Can you be more specific as to which parts of this AI are the non-existent parts?
# === for the base model there are 5 steps: ===
training_data, testing_data, production_data = baseload(training_csv)
theta = base_model(training_data)
hyperparameters = guess_hyperparameters()
while accuracy_still_improving:
    theta = train(theta, training_data, hyperparameters)
    theta = iterate_gradient_ascent(theta, alphas_and_greeks)
predictions = predict(theta, testing_data)
accuracy = compute_accuracy(correct_answers, predictions)
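The five steps above can be sketched as actual runnable code. This is a minimal toy, not the poster's real pipeline: the "base model" is assumed to be a one-feature logistic regression, the data is made up, and gradient ascent maximizes the log-likelihood until training accuracy stops improving.

```python
import math

# step 1: load (hard-coded stand-in for baseload(training_csv))
training_data = [(-2.0, 0), (-1.0, 0), (-0.5, 0), (0.5, 1), (1.0, 1), (2.0, 1)]
testing_data = [(-1.5, 0), (1.5, 1)]

def predict_prob(theta, x):
    # sigmoid of theta[0] + theta[1] * x
    return 1.0 / (1.0 + math.exp(-(theta[0] + theta[1] * x)))

def compute_accuracy(theta, data):
    return sum((predict_prob(theta, x) > 0.5) == bool(y) for x, y in data) / len(data)

# steps 2 and 3: initialise the model, guess a hyperparameter
theta = [0.0, 0.0]
learning_rate = 0.1        # the "alpha" among the alphas_and_greeks

# step 4: loop until accuracy stops improving
best = -1.0
while True:
    # one gradient-ascent step on the log-likelihood
    grad = [0.0, 0.0]
    for x, y in training_data:
        err = y - predict_prob(theta, x)
        grad[0] += err
        grad[1] += err * x
    theta = [theta[0] + learning_rate * grad[0],
             theta[1] + learning_rate * grad[1]]
    acc = compute_accuracy(theta, training_data)
    if acc <= best:
        break
    best = acc

# step 5: predict and score on held-out data
print(compute_accuracy(theta, testing_data))
```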
Gradient ascent is a mind-bendingly clever application of the partial derivative in multiple dimensions against an array: it shits out information gain if you give it two-point sample access to the function being approximated, plus a cost function for what counts as less wrong.
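Concretely, "two-point sample access" means you can estimate each partial derivative by probing the function at two nearby points, then stepping uphill. A sketch, where `f` is a made-up stand-in for the function being approximated:

```python
def f(p):
    # a toy objective with its peak at (3, -1)
    x, y = p
    return -(x - 3.0) ** 2 - (y + 1.0) ** 2

def numerical_gradient(func, p, h=1e-5):
    # estimate each partial derivative from two samples per dimension
    grad = []
    for i in range(len(p)):
        hi = list(p); hi[i] += h
        lo = list(p); lo[i] -= h
        grad.append((func(hi) - func(lo)) / (2 * h))
    return grad

p = [0.0, 0.0]
alpha = 0.1                 # learning rate
for _ in range(200):
    g = numerical_gradient(f, p)
    p = [pi + alpha * gi for pi, gi in zip(p, g)]   # step uphill

print([round(v, 2) for v in p])   # climbs toward the peak near [3.0, -1.0]
```

The same machinery run with a minus sign is gradient descent on a cost function, which is the usual framing in training code.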
The rest really is just plumbing, speed, matrix multiplication, and linear algebra: nothing too outlandish for even a high school student.
https://www.youtube.com/watch?v=ta5fdaqDT3M

The barrier to entry is huge, but once you get through it, you find it's simple enough to understand and explain to a kid.
If I give you a bowl and a marble, you can use them, plus gravity, to find the bottom. Neurons somehow do all this with molecules.
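The bowl-and-marble picture maps directly onto gradient descent. Here the bowl is assumed to be f(x, y) = x² + y², "gravity" is the negative gradient, and the marble settles at the bottom:

```python
def bowl(x, y):
    return x * x + y * y

x, y = 2.0, -1.5        # drop the marble somewhere on the bowl's side
step = 0.1              # how far it rolls per tick

for _ in range(100):
    # gradient of x^2 + y^2 is (2x, 2y); roll downhill against it
    x -= step * 2 * x
    y -= step * 2 * y

print(round(x, 4), round(y, 4))  # effectively at the bottom, (0, 0)
```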