Coursera: Machine Learning (Week 3) [Assignment Solution] - Andrew NG






Don't just copy-paste the code for the sake of completion.
Make sure you understand the code first.

In this exercise, you will implement logistic regression and apply it to two different datasets. Before starting on the programming exercise, we strongly recommend watching the video lectures and completing the review questions for the associated topics.

This exercise consists of the following files:
  • ex2.m - Octave/MATLAB script that steps you through the exercise
  • ex2_reg.m - Octave/MATLAB script for the later parts of the exercise
  • ex2data1.txt - Training set for the first half of the exercise
  • ex2data2.txt - Training set for the second half of the exercise
  • submit.m - Submission script that sends your solutions to our servers
  • mapFeature.m - Function to generate polynomial features
  • plotDecisionBoundary.m - Function to plot classifier's decision boundary
  • [*] plotData.m - Function to plot 2D classification data
  • [*] sigmoid.m - Sigmoid Function
  • [*] costFunction.m - Logistic Regression Cost Function
  • [*] predict.m - Logistic Regression Prediction Function
  • [*] costFunctionReg.m - Regularized Logistic Regression Cost
* indicates files you will need to complete

plotData.m :

function plotData(X, y)
  %PLOTDATA Plots the data points X and y into a new figure 
  %   PLOTDATA(x,y) plots the data points with + for the positive examples
  %   and o for the negative examples. X is assumed to be a Mx2 matrix.
  
  % ====================== YOUR CODE HERE ======================
  % Instructions: Plot the positive and negative examples on a
  %               2D plot, using the option 'k+' for the positive
  %               examples and 'ko' for the negative examples.
  %
  
  % Separate the indices of the positive and negative examples
  pos = find(y==1); % indices of positive examples (y = 1)
  neg = find(y==0); % indices of negative examples (y = 0)
  
  % Create New Figure
  figure;
  
  % Plot the positive examples:
  %    X-axis: Exam 1 score = X(pos,1)
  %    Y-axis: Exam 2 score = X(pos,2)
  % ('g+' draws green crosses; the suggested 'k+' would draw black ones)
  plot(X(pos,1), X(pos,2), 'g+');
  
  % Hold the figure so the negative examples land on the same axes
  hold on;
  
  % Plot the negative examples:
  %    X-axis: Exam 1 score = X(neg,1)
  %    Y-axis: Exam 2 score = X(neg,2)
  % ('ro' draws red circles; the suggested 'ko' would draw black ones)
  plot(X(neg,1), X(neg,2), 'ro');
  
  % =========================================================================
  
  hold off;
end
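
As a quick usage check, plotData can be driven the way ex2.m drives it, loading ex2data1.txt (two exam scores per row plus a 0/1 admission label). This is a minimal sketch; the axis labels and legend text are assumptions based on that interpretation of the data:

data = load('ex2data1.txt');   % columns: exam 1 score, exam 2 score, label
X = data(:, [1, 2]);
y = data(:, 3);
plotData(X, y);
xlabel('Exam 1 score');        % assumed labels matching ex2data1.txt
ylabel('Exam 2 score');
legend('Admitted', 'Not admitted');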

sigmoid.m :


function g = sigmoid(z)
  %SIGMOID Compute sigmoid function
  %   g = SIGMOID(z) computes the sigmoid of z.
  
  % You need to return the following variables correctly 
  g = zeros(size(z));
  
  % ====================== YOUR CODE HERE ======================
  % Instructions: Compute the sigmoid of each value of z (z can be a matrix,
  %               vector or scalar).
  g = 1./(1+exp(-z));
  
  % =============================================================
end
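
A quick sanity check (not part of the submission): by definition sigmoid(0) = 0.5, large positive inputs approach 1, and the element-wise ./ and exp make the same code work for scalars, vectors, and matrices alike:

sigmoid(0)          % ans = 0.50000
sigmoid(10)         % ans = 0.99995  (approaches 1)
sigmoid([-1 0 1])   % ans = 0.26894   0.50000   0.73106  (element-wise)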

costFunction.m :


function [J, grad] = costFunction(theta, X, y)
  %COSTFUNCTION Compute cost and gradient for logistic regression
  %   J = COSTFUNCTION(theta, X, y) computes the cost of using theta as the
  %   parameter for logistic regression and the gradient of the cost
  %   w.r.t. to the parameters.
  
  % Initialize some useful values
  m = length(y); % number of training examples
  
  % You need to return the following variables correctly 
  J = 0;
  grad = zeros(size(theta));
  
  % ====================== YOUR CODE HERE ======================
  % Instructions: Compute the cost of a particular choice of theta.
  %               You should set J to the cost.
  %               Compute the partial derivatives and set grad to the partial
  %               derivatives of the cost w.r.t. each parameter in theta
  %
  % Note: grad should have the same dimensions as theta
  %
  %DIMENSIONS: 
  %   theta = (n+1) x 1
  %   X     = m x (n+1)
  %   y     = m x 1
  %   grad  = (n+1) x 1
  %   J     = Scalar
  
  z = X * theta;      % m x 1
  h_x = sigmoid(z);   % m x 1 
  
  J = (1/m)*sum((-y.*log(h_x))-((1-y).*log(1-h_x))); % scalar
  
  grad = (1/m)* (X'*(h_x-y));     % (n+1) x 1
  
  % =============================================================
  
end
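
Because costFunction returns both the cost and its gradient, it can be handed straight to Octave/MATLAB's fminunc, which is how ex2.m finds the optimal theta. A minimal sketch of that call (a zero initial_theta and the 400-iteration cap follow the exercise script):

% 'GradObj' 'on' tells fminunc that costFunction also returns the gradient
initial_theta = zeros(size(X, 2), 1);   % X here already includes the intercept column
options = optimset('GradObj', 'on', 'MaxIter', 400);
[theta, cost] = fminunc(@(t)(costFunction(t, X, y)), initial_theta, options);
fprintf('Cost at theta found by fminunc: %f\n', cost);

A handy sanity check: with theta initialized to zeros, h_x is 0.5 everywhere, so the initial cost should come out to about 0.693 (i.e., log 2).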

predict.m :


function p = predict(theta, X)
  %PREDICT Predict whether the label is 0 or 1 using learned logistic 
  %regression parameters theta
  %   p = PREDICT(theta, X) computes the predictions for X using a 
  %   threshold at 0.5 (i.e., if sigmoid(theta'*x) >= 0.5, predict 1)
  
  m = size(X, 1); % Number of training examples
  
  % You need to return the following variables correctly
  p = zeros(m, 1);
  
  % ====================== YOUR CODE HERE ======================
  % Instructions: Complete the following code to make predictions using
  %               your learned logistic regression parameters. 
  %               You should set p to a vector of 0's and 1's
  %
  % DIMENSIONS:
  %   X     = m x (n+1)
  %   theta = (n+1) x 1
  
  h_x = sigmoid(X*theta);   % m x 1, predicted probabilities
  p = (h_x >= 0.5);         % logical vector of 0's and 1's
  
  % One-liner alternative: p = double(sigmoid(X * theta) >= 0.5);
  % =========================================================================
end
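
With predict working, training-set accuracy is just the fraction of examples where the prediction matches the label; ex2.m reports it as a percentage:

p = predict(theta, X);                  % theta from the fminunc step above
fprintf('Train Accuracy: %f\n', mean(double(p == y)) * 100);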

costFunctionReg.m :

function [J, grad] = costFunctionReg(theta, X, y, lambda)
  %COSTFUNCTIONREG Compute cost and gradient for logistic regression with regularization
  %   J = COSTFUNCTIONREG(theta, X, y, lambda) computes the cost of using
  %   theta as the parameter for regularized logistic regression and the
  %   gradient of the cost w.r.t. to the parameters. 
  
  % Initialize some useful values
  m = length(y); % number of training examples
  
  % You need to return the following variables correctly 
  J = 0;
  grad = zeros(size(theta));
  
  % ====================== YOUR CODE HERE ======================
  % Instructions: Compute the cost of a particular choice of theta.
  %               You should set J to the cost.
  %               Compute the partial derivatives and set grad to the partial
  %               derivatives of the cost w.r.t. each parameter in theta
  
  %DIMENSIONS: 
  %   theta = (n+1) x 1
  %   X     = m x (n+1)
  %   y     = m x 1
  %   grad  = (n+1) x 1
  %   J     = Scalar
  
  z = X * theta;      % m x 1
  h_x = sigmoid(z);  % m x 1 
  
  % Regularization term: note that theta(1), the bias term, is NOT regularized
  reg_term = (lambda/(2*m)) * sum(theta(2:end).^2);
  
  J = (1/m)*sum((-y.*log(h_x))-((1-y).*log(1-h_x))) + reg_term; % scalar
  
  grad(1)     = (1/m)* (X(:,1)'*(h_x-y));                                 % 1 x 1 (bias: no regularization)
  grad(2:end) = (1/m)* (X(:,2:end)'*(h_x-y)) + (lambda/m)*theta(2:end);   % n x 1
  
  % =============================================================
end
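
For the second dataset, ex2_reg.m first expands the two input features with the provided mapFeature (degree-6 polynomial terms, intercept column included) and then optimizes costFunctionReg the same way as before. A minimal sketch; lambda = 1 is the value the exercise starts with:

data = load('ex2data2.txt');
X = mapFeature(data(:,1), data(:,2));   % 28 polynomial features incl. intercept
y = data(:,3);

initial_theta = zeros(size(X, 2), 1);
lambda = 1;   % try 0 (overfitting) or 100 (underfitting) to see the effect
options = optimset('GradObj', 'on', 'MaxIter', 400);
[theta, J, exit_flag] = ...
    fminunc(@(t)(costFunctionReg(t, X, y, lambda)), initial_theta, options);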

Machine Learning Coursera - All weeks solutions [Assignment + Quiz]: click here
Coursera Google Data Analytics Professional Quiz Answers: click here


Feel free to ask any doubts in the comment section; I will do my best to answer them.
If you found this post helpful, kindly comment and share it.
That is the simplest way to encourage me to keep doing such work.


Thanks & Regards,
- Wolf
