Most optimization problems can trivially be turned into statistics problems.
Sure, if you mean turning your error function into some sort of likelihood by choosing a probability distribution that corresponds to it.
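To make that correspondence concrete: minimizing squared error is the same as maximizing a Gaussian likelihood. Here's a minimal numpy sketch (the toy data and variable names are mine, just for illustration) fitting the same linear model both ways and checking that the two answers agree:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
true_w = np.array([2.0, -1.0])
sigma = 0.5
y = X @ true_w + rng.normal(scale=sigma, size=100)

# Closed-form least squares: minimizes the squared-error function.
w_ls, *_ = np.linalg.lstsq(X, y, rcond=None)

# Gaussian negative log-likelihood is, up to constants, the squared
# error divided by 2*sigma^2 -- so it has the exact same minimizer.
def nll_grad(w):
    return -X.T @ (y - X @ w) / sigma**2

# Plain gradient descent on the negative log-likelihood.
w_mle = np.zeros(2)
for _ in range(2000):
    w_mle -= 1e-3 * nll_grad(w_mle)

# Both routes land on the same weights.
```

Same objective, two framings; the statistical one is what lets you talk about distributions over the weights at all.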
But that is only the beginning. Apart from maybe Bayesian neural networks, I haven't seen much, if any, work on things like confidence intervals for the weights or prediction intervals for the response (that said, I'm only a casual follower of this topic).
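For the linear-Gaussian case at least, confidence intervals for the weights come almost for free from the classical asymptotics: the inverse Fisher information gives the covariance of the maximum-likelihood weights. A rough sketch (toy data and names are mine; this is the textbook OLS result, not anything neural-network-specific):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept + one feature
true_w = np.array([1.0, 3.0])
y = X @ true_w + rng.normal(scale=2.0, size=n)

# Maximum-likelihood (= least-squares) weights.
w_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

# Residual variance estimate; plugged into the inverse Fisher
# information it gives the (asymptotic) covariance of the weights.
resid = y - X @ w_hat
sigma2 = resid @ resid / (n - X.shape[1])
cov = sigma2 * np.linalg.inv(X.T @ X)
se = np.sqrt(np.diag(cov))

# Approximate 95% confidence intervals (normal approximation).
ci_lo = w_hat - 1.96 * se
ci_hi = w_hat + 1.96 * se
```

Doing the analogue for a deep net means inverting (or approximating) a huge Hessian, which is presumably part of why you don't see it much outside the Bayesian NN literature.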
One of the better uses I've seen took this perspective to turn what was effectively a maximum-likelihood fit into a full Gaussian model, making the predicted probabilities more meaningful.
Not that it really matters much how the perspective is used; what's important is that it's there.
Just because you don’t know what the uncertainties are doesn’t mean they’re not there.