This is done in Octave 6.2.0, since Professor Andrew Ng found that students understood the material better when the course used Octave, but I intend to redo it in Python.
In this part of this exercise, you will implement linear regression with one variable to predict profits for a food truck. Suppose you are the CEO of a restaurant franchise and are considering different cities for opening a new outlet. The chain already has trucks in various cities and you have data for profits and populations from the cities. You would like to use this data to help you select which city to expand to next.
Exercise Part | Exercise | Submitted File | Done |
---|---|---|---|
1 | Warm up exercise | warmUpExercise.m | |
2 | Compute cost for one variable | computeCost.m | |
3 | Gradient descent for one variable | gradientDescent.m |
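A minimal sketch of the two graded pieces, assuming `X` already includes a leading bias column of ones; the names mirror the assignment files, but this is illustrative, not the submitted code:

```octave
% Squared-error cost for linear regression; X is m x 2 with a leading
% column of ones, theta is 2 x 1.
function J = computeCost(X, y, theta)
  m = length(y);
  J = sum((X * theta - y) .^ 2) / (2 * m);
end

% Batch gradient descent: repeatedly step theta against the gradient.
function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
  m = length(y);
  J_history = zeros(num_iters, 1);
  for iter = 1:num_iters
    theta = theta - (alpha / m) * (X' * (X * theta - y));
    J_history(iter) = computeCost(X, y, theta);   % track convergence
  end
end
```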
Training Data |
---|
Training Linear Regression with One Variable Result |
Surface | Contour with Minimum (X) |
---|---|
In this part of the exercise, you will build a logistic regression model to predict whether a student gets admitted into a university. Suppose that you are the administrator of a university department and you want to determine each applicant's chance of admission based on their results on two exams. You have historical data from previous applicants that you can use as a training set for logistic regression. For each training example, you have the applicant's scores on two exams and the admissions decision. Your task is to build a classification model that estimates an applicant's probability of admission based on the scores from those two exams.
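A minimal sketch of the sigmoid hypothesis and the unregularized cost and gradient (the names mirror `sigmoid.m` and `costFunction.m` from the assignment, but this is an illustrative version, not the submitted code):

```octave
% Element-wise logistic function.
function g = sigmoid(z)
  g = 1 ./ (1 + exp(-z));
end

% Cross-entropy cost and gradient for logistic regression.
function [J, grad] = costFunction(theta, X, y)
  m = length(y);
  h = sigmoid(X * theta);                           % predicted probabilities
  J = (-y' * log(h) - (1 - y)' * log(1 - h)) / m;
  grad = (X' * (h - y)) / m;
end
```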
Training Data |
---|
Non Regularized Logistic Regression Decision Boundary |
Training Data |
---|
Regularized Logistic Regression Decision Boundary |
Underfitting | Overfitting |
---|---|
Exercise Part | Exercise | Submitted File | Done |
---|---|---|---|
5 | Compute cost for regularized LR | costFunctionReg.m | |
6 | Gradient for regularized LR | costFunctionReg.m |
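A sketch of the regularized version: cost and gradient gain a penalty term that, by convention, skips the bias parameter `theta(1)`. Illustrative only:

```octave
function [J, grad] = costFunctionReg(theta, X, y, lambda)
  m = length(y);
  h = sigmoid(X * theta);
  reg = (lambda / (2 * m)) * sum(theta(2:end) .^ 2);   % no penalty on bias
  J = (-y' * log(h) - (1 - y)' * log(1 - h)) / m + reg;
  grad = (X' * (h - y)) / m;
  grad(2:end) = grad(2:end) + (lambda / m) * theta(2:end);
end
```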
In this exercise, you will implement one-vs-all logistic regression and neural networks to recognize hand-written digits.
Training Data Sample (100 from 5000 images) |
---|
Exercise Part | Exercise | Submitted File | Done |
---|---|---|---|
1 | Regularized logistic regression | lrCostFunction.m | |
2 | One-vs-all classifier training | oneVsAll.m | |
3 | One-vs-all classifier prediction | predictOneVsAll.m |
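A sketch of the one-vs-all pair: train one regularized logistic classifier per class with the course-provided `fmincg` optimizer, then predict whichever class scores highest. Illustrative, not the submitted code:

```octave
% Train num_labels binary classifiers; row c of all_theta separates class c
% from everything else.
function all_theta = oneVsAll(X, y, num_labels, lambda)
  [m, n] = size(X);
  X = [ones(m, 1) X];                        % add bias column
  all_theta = zeros(num_labels, n + 1);
  options = optimset('GradObj', 'on', 'MaxIter', 50);
  for c = 1:num_labels
    initial_theta = zeros(n + 1, 1);
    all_theta(c, :) = fmincg(@(t) lrCostFunction(t, X, (y == c), lambda), ...
                             initial_theta, options)';
  end
end

% Pick the class whose classifier gives the highest probability.
function p = predictOneVsAll(all_theta, X)
  X = [ones(size(X, 1), 1) X];
  [~, p] = max(sigmoid(X * all_theta'), [], 2);
end
```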
In this part of the exercise, you will implement a neural network to recognize handwritten digits using the same training set as before. The neural network will be able to represent complex models that form non-linear hypotheses. For this week, you will be using parameters from a neural network that we have already trained. Your goal is to implement the feedforward propagation algorithm to use our weights for prediction.
Exercise Part | Exercise | Submitted File | Done |
---|---|---|---|
4 | Neural network prediction function | predict.m |
Octave indexing is 1-based (there is no index 0), so the label 10 is used to represent the digit 0.
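A sketch of the feedforward pass using the pretrained weights `Theta1` and `Theta2` provided with the exercise; the max over the output layer yields the predicted label (with 10 standing in for the digit 0, as noted above):

```octave
function p = predict(Theta1, Theta2, X)
  m = size(X, 1);
  a1 = [ones(m, 1) X];                        % input layer plus bias unit
  a2 = [ones(m, 1) sigmoid(a1 * Theta1')];    % hidden layer plus bias unit
  a3 = sigmoid(a2 * Theta2');                 % one output column per class
  [~, p] = max(a3, [], 2);                    % most probable label
end
```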
GIF |
---|
In this exercise, you will implement the backpropagation algorithm for neural networks and apply it to the task of hand-written digit recognition.
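A compact sketch of the vectorized gradient computation at the heart of `nnCostFunction.m`, assuming a feedforward pass that saved `a1`, `z2`, `a2`, `a3` and a one-hot label matrix `Y`; the graded version also adds the regularization terms:

```octave
delta3 = a3 - Y;                                 % output-layer error
delta2 = (delta3 * Theta2(:, 2:end)) ...         % drop the bias column
         .* sigmoid(z2) .* (1 - sigmoid(z2));    % sigmoid gradient of z2
Theta1_grad = (delta2' * a1) / m;                % averaged gradient matrices
Theta2_grad = (delta3' * a2) / m;
```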
In this exercise, you will implement regularized linear regression and use it to study models with different bias-variance properties.
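A sketch of the regularized cost and gradient (`linearRegCostFunction.m` in the assignment); as before, the penalty skips the bias term. Illustrative only:

```octave
function [J, grad] = linearRegCostFunction(X, y, theta, lambda)
  m = length(y);
  h = X * theta;
  J = sum((h - y) .^ 2) / (2 * m) ...
      + (lambda / (2 * m)) * sum(theta(2:end) .^ 2);   % no penalty on bias
  grad = (X' * (h - y)) / m;
  grad(2:end) = grad(2:end) + (lambda / m) * theta(2:end);
end
```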
Training Data |
---|
Linear Fit | Linear Regression Learning Curve |
---|---|
Underfitting | High Bias Problem (Underfit) |
Exercise Part | Exercise | Submitted File | Done |
---|---|---|---|
4 | Polynomial Feature Mapping | polyFeatures.m |
Polynomial feature mapping is used to make the model more complex.
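A sketch of the mapping: each new column is a successive power of the single original feature (in the exercise the result is then feature-normalized before training):

```octave
% Map a single feature x into the powers x, x.^2, ..., x.^p.
function X_poly = polyFeatures(X, p)
  X_poly = zeros(numel(X), p);
  for j = 1:p
    X_poly(:, j) = X(:) .^ j;
  end
end
```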
Polynomial Fit - lambda = 0 | Polynomial Learning Curve |
---|---|
Overfitting | High Variance Problem (Overfit) |
Polynomial Fit - lambda = 3 | Polynomial Learning Curve |
---|---|
Good Fit | Low Bias and Low Variance |
Exercise Part | Exercise | Submitted File | Done |
---|---|---|---|
5 | Cross Validation Curve | validationCurve.m |
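A sketch of the validation curve: fit at each candidate lambda (`trainLinearReg` is a helper provided with the exercise), then measure training and cross-validation error with the regularization term switched off (lambda = 0) so the errors are comparable:

```octave
function [lambda_vec, error_train, error_val] = validationCurve(X, y, Xval, yval)
  lambda_vec = [0 0.001 0.003 0.01 0.03 0.1 0.3 1 3 10]';
  error_train = zeros(length(lambda_vec), 1);
  error_val   = zeros(length(lambda_vec), 1);
  for i = 1:length(lambda_vec)
    theta = trainLinearReg(X, y, lambda_vec(i));
    error_train(i) = linearRegCostFunction(X, y, theta, 0);       % no penalty
    error_val(i)   = linearRegCostFunction(Xval, yval, theta, 0);
  end
end
```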
In this exercise, you will be using support vector machines (SVMs) to build a spam classifier.
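One piece typically implemented along the way is the Gaussian (RBF) kernel used by the non-linear SVM experiments; a minimal sketch (an assumption here, since the files for this part are not listed above), treating both inputs as column vectors:

```octave
% Gaussian kernel: similarity decays with squared distance between x1 and x2.
function sim = gaussianKernel(x1, x2, sigma)
  diff = x1(:) - x2(:);
  sim = exp(-(diff' * diff) / (2 * sigma ^ 2));
end
```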
In this exercise, you will implement the K-means clustering algorithm and apply it to compress an image. In the second part, you will use principal component analysis to find a low-dimensional representation of face images.
Exercise Part | Exercise | Submitted File | Done |
---|---|---|---|
1 | Find Closest Centroids | findClosestCentroids.m | |
2 | Compute Centroid Means | computeCentroids.m |
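Minimal sketches of the two alternating K-means steps (the distance computation relies on Octave's automatic broadcasting); illustrative, not the submitted code:

```octave
% Assignment step: index of the nearest centroid for every example.
function idx = findClosestCentroids(X, centroids)
  m = size(X, 1);
  idx = zeros(m, 1);
  for i = 1:m
    dists = sum((centroids - X(i, :)) .^ 2, 2);   % squared distances
    [~, idx(i)] = min(dists);
  end
end

% Update step: move each centroid to the mean of its assigned points.
function centroids = computeCentroids(X, idx, K)
  centroids = zeros(K, size(X, 2));
  for k = 1:K
    centroids(k, :) = mean(X(idx == k, :), 1);
  end
end
```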
K-means iteration GIF |
---|
Image compression with K-means |
---|
In this exercise, you will use principal component analysis (PCA) to perform dimensionality reduction. You will first experiment with an example 2D dataset to get intuition on how PCA works, and then use it on a larger dataset of 5,000 face images.
Exercise Part | Exercise | Submitted File | Done |
---|---|---|---|
3 | PCA | pca.m |
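A sketch of `pca.m`: with the data already mean-normalized, the principal components are the singular vectors of the covariance matrix:

```octave
function [U, S] = pca(X)
  m = size(X, 1);
  Sigma = (X' * X) / m;      % covariance matrix (X assumed mean-normalized)
  [U, S, ~] = svd(Sigma);    % columns of U are the principal directions
end
```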
Dataset with Computed eigenvectors |
---|
Exercise Part | Exercise | Submitted File | Done |
---|---|---|---|
4 | Project Data | projectData.m | |
5 | Recover Data | recoverData.m |
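Sketches of the two mappings: projection keeps only the coordinates along the top K components, and recovery maps them back to an approximation in the original space:

```octave
% Reduce each example to its coordinates on the first K components.
function Z = projectData(X, U, K)
  Z = X * U(:, 1:K);
end

% Approximate reconstruction back in the original n-dimensional space.
function X_rec = recoverData(Z, U, K)
  X_rec = Z * U(:, 1:K)';
end
```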
Projected(red) and Reconstructed(blue) data |
---|
PCA on the face dataset |
---|
Original and Reconstructed Images |
---|
In this exercise, you will implement the anomaly detection algorithm and apply it to detect failing servers on a network. In the second part, you will use collaborative filtering to build a recommender system for movies.
Training Data |
---|
Training Data with Gaussian Estimation Contours |
Exercise Part | Exercise | Submitted File | Done |
---|---|---|---|
1 | Estimate Gaussian Parameters | estimateGaussian.m | |
2 | Select Threshold | selectThreshold.m |
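Sketches of the two steps: fit a per-feature Gaussian, then sweep thresholds on the labeled validation set and keep the epsilon with the best F1 score. Illustrative, not the submitted code:

```octave
% Per-feature mean and variance (maximum-likelihood, i.e. divide by m).
function [mu, sigma2] = estimateGaussian(X)
  m = size(X, 1);
  mu = mean(X)';
  sigma2 = (sum((X - mu') .^ 2) / m)';
end

% Choose epsilon by F1 score on validation probabilities pval.
function [bestEpsilon, bestF1] = selectThreshold(yval, pval)
  bestEpsilon = 0;  bestF1 = 0;
  stepsize = (max(pval) - min(pval)) / 1000;
  for epsilon = min(pval):stepsize:max(pval)
    pred = (pval < epsilon);                 % low probability => anomaly
    tp = sum((pred == 1) & (yval == 1));
    fp = sum((pred == 1) & (yval == 0));
    fn = sum((pred == 0) & (yval == 1));
    prec = tp / (tp + fp);
    rec  = tp / (tp + fn);
    F1 = 2 * prec * rec / (prec + rec);
    if F1 > bestF1
      bestF1 = F1;  bestEpsilon = epsilon;
    end
  end
end
```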
Detected Anomaly |
---|
In this part of the exercise, you will implement the collaborative filtering learning algorithm and apply it to a dataset of movie ratings.
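A sketch of the regularized cost and gradients with a simplified signature (the graded `cofiCostFunc.m` takes the parameters unrolled into a single vector); `R(i, j)` is 1 exactly when user j rated movie i:

```octave
function [J, X_grad, Theta_grad] = cofiCostFunc(X, Theta, Y, R, lambda)
  err = (X * Theta' - Y) .* R;              % errors only where a rating exists
  J = sum(err(:) .^ 2) / 2 ...
      + (lambda / 2) * (sum(X(:) .^ 2) + sum(Theta(:) .^ 2));
  X_grad = err * Theta + lambda * X;        % gradient w.r.t. movie features
  Theta_grad = err' * X + lambda * Theta;   % gradient w.r.t. user parameters
end
```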