Implementation:
figure;
pos = find(label == 1); neg = find(label == 0);
plot(data(pos, 1), data(pos, 2), '+')
hold on
plot(data(neg, 1), data(neg, 2), 'o')
hold on
xlabel('Exam 1 score')
ylabel('Exam 2 score')
% Call the function to obtain theta and the cost at each iteration
[theta, J] = logsticTest(data, label);
plot_x = [min(data(:,1))-2, max(data(:,1))+2];
plot_y = (-1./theta(3)).*(theta(2).*plot_x + theta(1)); % theta(1) + theta(2)*x + theta(3)*y = 0, rewritten as a line in x
plot(plot_x, plot_y, 'g');
hold off;
MAX_ITR = 7;
figure;
plot(0:MAX_ITR-1, J, 'o--', 'MarkerFaceColor', 'r', 'MarkerSize', 8)
xlabel('Iteration'); ylabel('J');
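The green line plotted above is the decision boundary. It comes from setting the argument of the sigmoid to zero (the point where the predicted probability is 0.5) and solving for the second feature, using MATLAB's 1-indexed theta:

$$
\theta_1 + \theta_2 x + \theta_3 y = 0
\;\Longrightarrow\;
y = -\frac{1}{\theta_3}\bigl(\theta_1 + \theta_2 x\bigr),
$$

which is exactly the `plot_y` expression in the script.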
The logsticTest function:
function [theta, J] = logsticTest(x, y)
[m, n] = size(x);
x = [ones(m,1), x]; % prepend a column of ones (intercept term)
theta = zeros(n+1, 1);
% Define the sigmoid function
g = inline('1.0 ./ (1.0 + exp(-z))');
% Newton's method
MAX_ITR = 7; % maximum number of iterations
J = zeros(MAX_ITR, 1);
for i = 1:MAX_ITR
    % Calculate the hypothesis function
    z = x * theta; % column vector
    h = g(z);      % sigmoid values, column vector
    % Calculate gradient and Hessian.
    % The formulas below are equivalent to the summation formulas
    % given in the lecture videos.
    grad = (1/m) .* x' * (h - y); % column vector of partial derivatives w.r.t. theta(1), theta(2), ...
    H = (1/m) .* (x' * diag(h) * diag(1-h) * x); % the diagonal matrices apply the per-example products h_i*(1-h_i)
    % Calculate J (for testing convergence)
    J(i) = (1/m) * sum(-y.*log(h) - (1-y).*log(1-h));
    theta = theta - H \ grad;
end
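The original code is MATLAB/Octave. For readers working in Python, the same Newton's-method update can be sketched with NumPy as follows (the function name `newton_logistic` and the variable names are my own, not from the original post; the gradient, Hessian, and cost mirror the formulas above):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def newton_logistic(X, y, max_iter=7):
    """Fit logistic regression by Newton's method.

    X: (m, n) feature matrix; y: (m,) labels in {0, 1}.
    Returns (theta, J) where J holds the cost at each iteration.
    """
    m, n = X.shape
    X = np.hstack([np.ones((m, 1)), X])  # prepend intercept column
    theta = np.zeros(n + 1)
    J = np.zeros(max_iter)
    for i in range(max_iter):
        h = sigmoid(X @ theta)
        grad = X.T @ (h - y) / m              # gradient of the cost
        # Hessian: equivalent to (1/m) * X' * diag(h) * diag(1-h) * X,
        # but scaling columns of X.T avoids building the m-by-m diagonals
        H = (X.T * (h * (1 - h))) @ X / m
        J[i] = np.mean(-y * np.log(h) - (1 - y) * np.log(1 - h))
        theta = theta - np.linalg.solve(H, grad)  # Newton step: theta -= H \ grad
    return theta, J
```

On a small non-separable 1-D dataset a handful of Newton iterations already drives the cost well below its starting value of log(2).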
Classification result:
Iteration process:
The data is available in the link below.
References: