Coursera-MachineLearning-Week3 Programming Exercise Notes

秦博延
2023-12-01

sigmoid.m

g = 1./(1+exp(-z));  % applied element-wise, so z may be a scalar, vector, or matrix
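A quick sanity check (a sketch, assuming sigmoid.m is on the Octave path): sigmoid should return 0.5 at z = 0 and work element-wise on vectors.

sigmoid(0)            % ans = 0.5000
sigmoid([-10 0 10])   % ans ≈ 0.0000  0.5000  1.0000, computed element-wise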

costFunction.m

J = 1/m * (-y'*log(sigmoid(X*theta)) - (1-y)'*log(1-sigmoid(X*theta)));   % vectorized logistic regression cost
grad = 1/m * X'*(sigmoid(X*theta) - y);   % vectorized gradient, same size as theta
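A minimal sanity check with a tiny made-up dataset (hypothetical values, not the course data): at theta = zeros the hypothesis is 0.5 for every example, so the cost comes out to log(2) ≈ 0.693 regardless of the data.

X = [ones(3,1), [1; 2; 3]];      % hypothetical design matrix: intercept column plus one feature
y = [0; 1; 1];                   % hypothetical labels
initial_theta = zeros(2, 1);
[J, grad] = costFunction(initial_theta, X, y);
% J ≈ 0.6931 (= log(2)), since sigmoid(X*initial_theta) is 0.5 for every example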

predict.m

p = round(sigmoid(X * theta));   % the threshold is 0.5, so rounding to the nearest integer gives the 0/1 prediction
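An equivalent way to write the threshold, shown only as a sketch, is an explicit comparison; it gives the same result because sigmoid(z) >= 0.5 exactly when z >= 0.

p = double(sigmoid(X * theta) >= 0.5);   % explicit comparison against the 0.5 threshold
% equivalently: p = double(X * theta >= 0), since sigmoid(z) >= 0.5 exactly when z >= 0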

costFunctionReg.m

[J, grad] = costFunction(theta, X, y);   % first compute the unregularized cost J and gradient grad
 
J = J + lambda/(2*m)*(sum(theta.^2) - theta(1)^2);   % do not regularize theta(1), i.e. the bias term theta_0
grad = grad + lambda/m*theta;
grad(1) = grad(1) - lambda/m*theta(1);   % undo the penalty on theta(1): the bias term is not regularized
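A minimal check with hypothetical values (assuming the standard ex2 signature [J, grad] = costFunctionReg(theta, X, y, lambda)): with lambda = 0 the regularized cost and gradient should reduce to the unregularized ones from costFunction.

X = [ones(3,1), [1; 2; 3]];      % same hypothetical data as above
y = [0; 1; 1];
theta = [0.1; -0.2];
[J0, grad0] = costFunction(theta, X, y);
[Jr, gradr] = costFunctionReg(theta, X, y, 0);   % lambda = 0
% Jr == J0 and gradr == grad0; with lambda > 0 only theta(2:end) is penalized,
% so grad(1) always matches the unregularized gradient component.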