function [theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters)
% Runs gradient descent for multivariate linear regression, returning the
% learned theta and the cost recorded after each iteration.
m = length(y);                    % number of training examples
J_history = zeros(num_iters, 1); % cost after each iteration

for iter = 1:num_iters
    % Vectorized update: theta := theta - (alpha/m) * X' * (X*theta - y)
    theta = theta - alpha * X' * (X * theta - y) / m;
    J_history(iter) = computeCostMulti(X, y, theta);
end

end
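
The function above calls computeCostMulti, which is not shown in the original post. Below is a minimal sketch, assuming the standard vectorized squared-error cost, followed by a hypothetical call; theta_init, alpha = 0.01, and num_iters = 400 are illustrative values, not taken from the source.

function J = computeCostMulti(X, y, theta)
% Computes the squared-error cost J(theta) = (1/(2m)) * sum((X*theta - y).^2)
m = length(y);                     % number of training examples
errors = X * theta - y;           % prediction errors, m x 1 vector
J = (errors' * errors) / (2 * m); % vectorized squared-error cost
end

% Example usage (assumes X already includes a leading column of ones for the intercept):
% theta_init = zeros(size(X, 2), 1);
% [theta, J_history] = gradientDescentMulti(X, y, theta_init, 0.01, 400);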
[ML] Gradient Descent Algorithm [Octave code]
Original post: http://www.cnblogs.com/KennyRom/p/6523378.html