
Linear Regression Implementation (MATLAB)


1 Cost Function Implementation

 

function J = computeCost(X, y, theta)
%COMPUTECOST Compute cost for linear regression
%   J = COMPUTECOST(X, y, theta) computes the cost of using theta as the
%   parameter for linear regression to fit the data points in X and y

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly 
J = 0; 

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta
%               You should set J to the cost.

predictions = X * theta;
sqrErrors = (predictions-y) .^ 2;

J = 1/(2*m) * sum(sqrErrors);



% =========================================================================

end
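
For readers working outside MATLAB, the same cost computation can be sketched in Python/NumPy; the toy data below is made up purely for illustration:

```python
import numpy as np

def compute_cost(X, y, theta):
    """Mean squared error cost J = 1/(2m) * sum((X@theta - y)^2)."""
    m = len(y)
    predictions = X @ theta            # X is the m-by-n design matrix
    sqr_errors = (predictions - y) ** 2
    return sqr_errors.sum() / (2 * m)

# Toy data where y = 2x exactly, so theta = [0, 2] is the optimum
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])  # first column is the bias term
y = np.array([2.0, 4.0, 6.0])
print(compute_cost(X, y, np.array([0.0, 2.0])))  # -> 0.0
print(compute_cost(X, y, np.array([0.0, 0.0])))  # cost is 56/6, about 9.33
```

The `@` operator plays the role of MATLAB's `*` for matrices, and `** 2` is the element-wise `.^ 2`.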

  1.1 Detailed Explanation

The computation is written in vectorized (matrix) form; in a language without matrix operations, the same thing could be implemented with loops.

predictions = X * theta;        % X here is the design matrix (one row per example)

sqrErrors = (predictions-y) .^ 2;

  The element-wise operator .^ squares each prediction error, giving one squared error per training example.

2 Gradient Descent

function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
%GRADIENTDESCENT Performs gradient descent to learn theta
%   theta = GRADIENTDESCENT(X, y, theta, alpha, num_iters) updates theta by 
%   taking num_iters gradient steps with learning rate alpha

% Initialize some useful values
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);

for iter = 1:num_iters

    % ====================== YOUR CODE HERE ======================
    % Instructions: Perform a single gradient step on the parameter vector
    %               theta. 
    %
    % Hint: While debugging, it can be useful to print out the values
    %       of the cost function (computeCost) and gradient here.
    %
    theta_temp = theta;
    for j = 1:size(X, 2)
        theta_temp(j) = theta(j)-alpha*(1/m)*(X*theta - y)' * X(:, j);
    end
    theta = theta_temp;





    % ============================================================

    % Save the cost J in every iteration    
    J_history(iter) = computeCost(X, y, theta);

end

end

  2.1 Explanation

J_history = zeros(num_iters, 1);

  Preallocate a column vector to record the cost after every iteration.

theta_temp = theta;

  Store a copy of theta so that all parameters are updated simultaneously from the same old values.

for j = 1:size(X, 2)
    theta_temp(j) = theta(j) - alpha*(1/m)*(X*theta - y)' * X(:, j);
end

  Update each parameter theta(j).

(X*theta - y)' is the transpose of the error vector.

(X*theta - y)' * X(:, j);

  This inner product carries out the summation over all training examples, equivalent to an explicit sum.
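
The claim that this inner product replaces an explicit sum can be checked numerically; here is a small NumPy sketch with arbitrary made-up values:

```python
import numpy as np

# Arbitrary illustrative data
X = np.array([[1.0, 5.0], [1.0, 8.0], [1.0, 3.0]])
y = np.array([6.0, 9.0, 4.0])
theta = np.array([0.5, 1.5])
j = 1  # feature column (0-based in Python, unlike MATLAB)

errors = X @ theta - y                                    # X*theta - y
inner = errors @ X[:, j]                                  # (X*theta - y)' * X(:, j)
looped = sum(errors[i] * X[i, j] for i in range(len(y)))  # explicit sum over examples
print(inner, looped)  # both 41.0
```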

 

  J_history(iter) = computeCost(X, y, theta);

  Record the cost at every iteration.

As the iteration count grows, the cost function converges and theta settles to its final values: while the cost is still decreasing, theta keeps changing; once the cost stops changing, the algorithm has converged and theta is determined.
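
The whole loop translates directly to Python/NumPy. The sketch below uses the fully vectorized update `theta - alpha/m * X'*(X*theta - y)`, which is equivalent to the per-parameter j-loop; the data and hyperparameters are illustrative:

```python
import numpy as np

def gradient_descent(X, y, theta, alpha, num_iters):
    """Batch gradient descent for linear regression.

    Returns the learned theta and the cost recorded after each step.
    """
    m = len(y)
    J_history = np.zeros(num_iters)
    for it in range(num_iters):
        errors = X @ theta - y
        # Vectorized simultaneous update: all parameters use the same old theta
        theta = theta - (alpha / m) * (X.T @ errors)
        # Record the cost with the updated theta, as in the MATLAB version
        J_history[it] = ((X @ theta - y) ** 2).sum() / (2 * m)
    return theta, J_history

# Toy data where y = 2x exactly, so the optimum is theta = [0, 2]
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([2.0, 4.0, 6.0])
theta, J_history = gradient_descent(X, y, np.zeros(2), alpha=0.1, num_iters=2000)
print(theta)           # approaches [0, 2]
print(J_history[-1])   # near zero: the cost has converged
```

Note that `X.T @ errors` computes all the partial derivatives at once, which is why no explicit loop over j is needed.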

 

