Gradient Descent in MATLAB



function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
%GRADIENTDESCENT Performs gradient descent to learn theta
%   theta = GRADIENTDESCENT(X, y, theta, alpha, num_iters) updates theta by
%   taking num_iters gradient steps with learning rate alpha

% Initialize some useful values
m = length(y);                   % number of training examples
J_history = zeros(num_iters, 1); % cost recorded at every iteration

for iter = 1:num_iters

    % ====================== YOUR CODE HERE ======================
    % Instructions: Perform a single gradient step on the parameter vector
    %               theta.
    %
    % Hint: While debugging, it can be useful to print out the values
    %       of the cost function (computeCost) and gradient here.

    % Compute both updates from the current theta before overwriting it,
    % so the two parameters are updated simultaneously.
    p = theta(1) - alpha * (1/m) * sum((X*theta - y) .* X(:,1));
    q = theta(2) - alpha * (1/m) * sum((X*theta - y) .* X(:,2));
    theta(1) = p;
    theta(2) = q;

    % ============================================================

    % Save the cost J in every iteration
    J_history(iter) = computeCost(X, y, theta);

end

end
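Note that both components are computed from the same (old) theta before either is overwritten, which is the simultaneous update gradient descent requires; an equivalent fully vectorized update would be theta = theta - (alpha/m) * X' * (X*theta - y).

The loop calls computeCost, which is not shown in the original post. Below is a minimal sketch of a squared-error cost consistent with how it is used here; the 1/(2m) scaling is an assumption matching the usual linear-regression cost, not something stated in the original.

function J = computeCost(X, y, theta)
%COMPUTECOST Squared-error cost for linear regression (assumed implementation)
%   J = COMPUTECOST(X, y, theta) returns the 1/(2m)-scaled sum of squared
%   residuals of the hypothesis X*theta against the targets y.

m = length(y);                   % number of training examples
errors = X*theta - y;            % residuals of the linear hypothesis
J = (1/(2*m)) * sum(errors.^2);  % standard 1/(2m) * sum of squared errors

end

A usage sketch follows, assuming a single feature stored in a column vector x with targets y; the values of alpha and num_iters are illustrative only.

% Example call (illustrative values; x and y are assumed column vectors)
m = length(y);
X = [ones(m, 1), x];        % add a column of ones for the intercept term
theta = zeros(2, 1);        % initialize both parameters at zero
alpha = 0.01;               % learning rate
num_iters = 1500;           % number of gradient steps

[theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters);
plot(1:num_iters, J_history); % cost should decrease steadily if alpha is small enough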

