[Machine Learning] - Week 3, 6. Advanced Optimization

2019-12-29  Kitty_风花

Quasi-Newton methods are among the most effective approaches for solving nonlinear optimization problems.

Advanced Optimization

"Conjugate gradient"(共轭梯度), "BFGS", and "L-BFGS" are more sophisticated, faster ways to optimize θ that can be used instead of gradient descent. We suggest that you should not write these more sophisticated algorithms yourself (unless you are an expert in numerical computing) but use the libraries instead, as they're already tested and highly optimized. Octave provides them.

We first need to provide a function that, for a given input value θ, evaluates the following two quantities: the cost J(θ) and the partial derivatives ∂J(θ)/∂θⱼ for j = 0, 1, …, n.

We can write a single function that returns both of these:
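A minimal sketch in Octave, assuming the quadratic example used in the lecture, J(θ) = (θ₁ − 5)² + (θ₂ − 5)², whose minimum is at θ₁ = θ₂ = 5; any cost function with a gradient you can compute fits the same template:

```
function [jVal, gradient] = costFunction(theta)
  % jVal: the scalar cost J(theta).
  % gradient: column vector of partial derivatives of J with respect to theta.
  % Illustrative cost (lecture example): J(theta) = (theta(1)-5)^2 + (theta(2)-5)^2
  jVal = (theta(1) - 5)^2 + (theta(2) - 5)^2;
  gradient = zeros(2, 1);
  gradient(1) = 2 * (theta(1) - 5);   % dJ/dtheta_1
  gradient(2) = 2 * (theta(2) - 5);   % dJ/dtheta_2
end
```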

Then we can use Octave's "fminunc()" optimization algorithm together with the "optimset()" function, which creates an object containing the options we want to send to "fminunc()". (Note: the value for MaxIter should be an integer, not a character string.)

We pass "fminunc()" our cost function, our initial vector of theta values, and the "options" object that we created beforehand.
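For example, a sketch matching the course's usage: 'GradObj' set to 'on' tells "fminunc()" that our cost function also returns the gradient (so it need not estimate it numerically), and 'MaxIter' caps the number of iterations.

```
options = optimset('GradObj', 'on', 'MaxIter', 100);
initialTheta = zeros(2, 1);

% fminunc returns the optimized theta, the cost at that theta,
% and an exit flag indicating whether the algorithm converged.
[optTheta, functionVal, exitFlag] = fminunc(@costFunction, initialTheta, options);
```

With the illustrative cost above, optTheta should come back close to [5; 5] and functionVal close to 0.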



Source: Coursera, Stanford, Andrew Ng, Machine Learning
