Machine Learning

This machine learning note records some important concepts and commonly used methods, including: Bias variance tradeoff

Image

Notes about image processing.

Modules


Math

Math notes.

Modules


Datasets

Image datasets: MNIST, ImageNet, ImageNet labels

Calculus

I can’t stand my poor math anymore, so I am starting to re-learn math from calculus.

These notes will record formulas and anything interesting associated with calculus.


Data Augmentation

Data augmentation is the process of increasing the effective size of a dataset by applying label-preserving transformations that a neural network is unlikely to learn by itself.

This article will introduce:

  • Common data augmentation methods.
  • Image augmentation with imgaug.
  • Popular tools for data augmentation.
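Before reaching for a dedicated library like imgaug, the idea can be shown with a minimal NumPy-only sketch of two common image augmentations (random horizontal flip and random crop with padding); the function name `augment` and the 2-pixel padding are illustrative choices, not from any particular library.

```python
import numpy as np

def augment(image, rng):
    """Apply simple label-preserving transforms: a random
    horizontal flip and a random crop with zero padding."""
    # Random horizontal flip with probability 0.5.
    if rng.random() < 0.5:
        image = image[:, ::-1]
    # Pad by 2 pixels on each side, then take a random crop back
    # to the original size (a common CIFAR-style augmentation).
    h, w = image.shape[:2]
    padded = np.pad(image, ((2, 2), (2, 2)), mode="constant")
    top = rng.integers(0, 5)    # 0..4 inclusive
    left = rng.integers(0, 5)
    return padded[top:top + h, left:left + w]

rng = np.random.default_rng(0)
img = np.arange(16, dtype=np.float32).reshape(4, 4)
aug = augment(img, rng)
print(aug.shape)  # same shape as the input: (4, 4)
```

Libraries such as imgaug wrap many more transforms (blur, affine warps, color jitter) behind a similar "apply a random pipeline per sample" pattern.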


Linear Algebra

Linear Algebra notes.


Metrics

Sometimes it’s hard to tell the differences between precision, accuracy, recall, and so on, especially for newbies like me.

But let’s try to distinguish them with stories. In this article, you will see some commonly used metrics, including:

  • Metrics for binary classification: accuracy, precision, recall, F1-score, and so on
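To make the definitions concrete, here is a minimal sketch computing these binary-classification metrics from the four confusion-matrix counts (the function name `binary_metrics` is my own, not from a library):

```python
def binary_metrics(y_true, y_pred):
    """Compute accuracy, precision, recall, and F1 from two
    equal-length lists of 0/1 labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    # Precision: of everything predicted positive, how much was right?
    precision = tp / (tp + fp) if tp + fp else 0.0
    # Recall: of all actual positives, how many did we find?
    recall = tp / (tp + fn) if tp + fn else 0.0
    # F1: harmonic mean of precision and recall.
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return accuracy, precision, recall, f1

acc, prec, rec, f1 = binary_metrics([1, 0, 1, 1, 0], [1, 0, 0, 1, 1])
print(acc)  # 0.6 (3 of 5 predictions correct)
```

Note that accuracy alone can be misleading on imbalanced data, which is exactly why precision and recall are reported separately.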


Loss Function

This article will introduce:

  • expected risk
  • some common loss functions

Loss function in classification

The goal of a classification problem, or of many machine learning problems, is: given a training set \(\{(x^{(i)}, y^{(i)}); i=1,\cdots,m\}\), find a good predictor \(f\) so that \(f(x^{(i)})\) is a good estimate of \(y^{(i)}\).

Why do we need a loss function?

We need a loss function to measure how “close” the estimate \(\hat y^{(i)}\) is to the target value \(y^{(i)}\), and we usually optimize our model by minimizing the loss.
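As a sketch of this idea, two losses commonly used for regression and binary classification can be written in a few lines of plain Python (a minimal illustration; the specific losses the article covers may differ):

```python
import math

def squared_loss(y_hat, y):
    # L(y_hat, y) = (y_hat - y)^2, common in regression.
    return (y_hat - y) ** 2

def cross_entropy(p_hat, y, eps=1e-12):
    # L(p_hat, y) = -[y log(p_hat) + (1 - y) log(1 - p_hat)],
    # common in binary classification; eps avoids log(0).
    p_hat = min(max(p_hat, eps), 1 - eps)
    return -(y * math.log(p_hat) + (1 - y) * math.log(1 - p_hat))

# A confident correct prediction incurs low loss; a confident
# wrong one incurs high loss.
print(cross_entropy(0.9, 1) < cross_entropy(0.1, 1))  # True
```

Minimizing the average of such a loss over the training set is the empirical counterpart of minimizing the expected risk mentioned above.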


Python

Some python notes.