Machine Learning Problem 2

IMPLEMENTING LINEAR REGRESSION WITH ONE VARIABLE USING MATLAB 

Reference: https://www.coursera.org/learn/machine-learning

1. MATLAB Application

       This is an implementation of the single-variable linear regression algorithm in MATLAB.
The algorithm predicts the profit that can be earned in a city based on its population.

Program Listing:


clear; close all; clc;

% read the training data
data = load('examples.txt');

% initialize Matrices and Variables
X = data(:, 1);     % feature matrix (city populations)
y = data(:, 2);     % results matrix
m = length(y);      % number of training examples
theta = zeros(2, 1);     % initial weights
iterations = 1500;  % Iterations needed for Gradient Descent
alpha = 0.01;       % Learning Rate

% Plot the Data
plot(X, y, 'rx', 'MarkerSize', 10);
title('Training Examples');
xlabel('Population in 10,000s');
ylabel('Profit in $10,000s');

% Add the intercept column of ones, then compute the initial cost
X = [ones(m, 1), data(:, 1)];
J = ComputeCost(X, y, theta);

% Run Gradient Descent
[theta, Js] = GradientDescent(X, y, theta, alpha, iterations);
hold on;
plot(X(:, 2), X * theta, '-');
legend('Training data', 'Linear regression');
hold off;

% Plot the cost history (Js returned by GradientDescent) on a new figure
figure;
plot(1:iterations, Js, '-b');
xlabel('Iteration');
ylabel('Cost J');

% Predicting Profits
fprintf('Prediction for 35000:\t%f\n', ([1, 3.5] * theta) * 10000);
fprintf('Prediction for 70000:\t%f\n', ([1, 7] * theta) * 10000);



Cost Computation Program:

function J = ComputeCost(X, y, theta)
    % Prepare Variables
    m = length(y);
   
    % Calculate Hypothesis
    h = X * theta;
   
    % Calculate Cost
    J = 1 / (2 * m) * sum((h - y) .^ 2);
end
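The cost computation can be cross-checked outside MATLAB. The sketch below mirrors ComputeCost in Python/NumPy (NumPy stands in for MATLAB here; it is not part of the original listing) and verifies it on a tiny hand-made dataset:

```python
import numpy as np

def compute_cost(X, y, theta):
    # J = 1/(2m) * sum((X*theta - y).^2), mirroring the MATLAB ComputeCost
    m = len(y)
    h = X @ theta                      # hypothesis for every example at once
    return (1.0 / (2 * m)) * np.sum((h - y) ** 2)

# tiny check: three points exactly on the line y = 2x
X = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])            # intercept column + one feature
y = np.array([2.0, 4.0, 6.0])

print(compute_cost(X, y, np.array([0.0, 2.0])))  # exact fit -> 0.0
print(compute_cost(X, y, np.zeros(2)))           # (4 + 16 + 36) / 6 = 9.333...
```

A perfect fit gives zero cost, and theta = [0, 0] gives the mean squared residual of the raw targets, which is a handy sanity check before running gradient descent.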



Cost Function Formula:

    J(θ) = (1 / (2m)) · Σᵢ (h_θ(x⁽ⁱ⁾) − y⁽ⁱ⁾)²,  summed over the m training examples

Hypothesis Formula:

    h_θ(x) = θ₀ + θ₁x,  or in matrix form  h = Xθ

This is called vectorization, and it is more efficient than a loop: instead of computing one prediction at a time, we obtain the entire hypothesis vector at once by multiplying the examples matrix X by the column vector theta.
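The loop-versus-vectorization point can be demonstrated directly. A Python/NumPy sketch (NumPy standing in for MATLAB) shows that the element-by-element loop and the single matrix-vector product produce the same hypothesis vector:

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.c_[np.ones(5), rng.random(5)]   # 5 examples: intercept column + one feature
theta = np.array([1.0, 2.0])

# loop version: one prediction per training example
h_loop = np.array([theta[0] * X[i, 0] + theta[1] * X[i, 1]
                   for i in range(len(X))])

# vectorized version: a single matrix-vector product, h = X @ theta
h_vec = X @ theta

print(np.allclose(h_loop, h_vec))  # True
```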

Gradient Descent Formula:

    θⱼ := θⱼ − α · (1/m) · Σᵢ (h_θ(x⁽ⁱ⁾) − y⁽ⁱ⁾) · xⱼ⁽ⁱ⁾,  updating θ₀ and θ₁ simultaneously

Gradient Descent Program:

function [theta, Js] = GradientDescent(X, y, theta, alpha, iterations)
    % Prepare Variables
    m = length(y);
    Js = zeros(iterations, 1);
   
    for i = 1 : iterations
        h = X * theta;
        t1 = theta(1) - (alpha * (1 / m) * sum(h - y));
        t2 = theta(2) - (alpha * (1 / m) * sum((h - y) .* X(:, 2)));
        theta(1) = t1;
        theta(2) = t2;
       
        Js(i) = ComputeCost(X, y, theta);
    end
end
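The two per-parameter updates (t1 and t2) collapse into one vectorized update, theta := theta − (α/m) · Xᵀ(Xθ − y). Below is a Python/NumPy sketch of that equivalent form (not the original MATLAB listing), checked on noiseless toy data where the true parameters are known:

```python
import numpy as np

def gradient_descent(X, y, theta, alpha, iterations):
    # vectorized form of the MATLAB loop: both parameters update simultaneously
    m = len(y)
    for _ in range(iterations):
        theta = theta - (alpha / m) * (X.T @ (X @ theta - y))
    return theta

# noiseless toy data on the line y = 1 + 2x; descent should recover [1, 2]
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
X = np.c_[np.ones_like(x), x]
y = 1.0 + 2.0 * x

theta = gradient_descent(X, y, np.zeros(2), alpha=0.1, iterations=2000)
print(np.round(theta, 4))  # [1. 2.]
```

Once theta has converged, a prediction is just the dot product used in the main script, e.g. ([1, 3.5] · theta) · 10000 for a population of 35,000.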



The result of training the linear regression model is shown in the figure:




The graph produced by the program in MATLAB:


EXPLANATION VIDEO




Dataset link: Download
Program link: Download
Cost computation link: Download
Gradient descent link: Download




