- Ensembles, PCA, and K-Means on MNIST (9 min): Gradient descent for kernel learning, decision tree ensembles with bagging and boosting, PCA eigendigits, and k-means clustering on handwritten digits.
- Kernel Methods and Random Fourier Features (11 min): Linear SVM, least-squares classifiers, Gaussian and Laplacian kernels, random Fourier feature approximation, and RKHS theory on MNIST digits.
- Bayes Classifiers, k-NN, and VC Dimension (11 min): Bayes-optimal classifiers, k-NN error bounds, the curse of dimensionality, and the VC dimension of disks and boxes.
- Logistic Regression Decision Boundaries (6 min): Building logistic regression from scratch in NumPy with gradient checking, feature scaling, and decision boundary visualization on exam data.
- Soft-Margin SVMs: Linear and Kernel (6 min): Implementing soft-margin linear SVM and polynomial kernel SVM from scratch with gradient descent, plus grid search over learning rates and regularization strengths.
- Logistic Regression from Scratch on MNIST (4 min): Implementing binary logistic regression with gradient descent on 784-dimensional image data, with numerically stable gradients and learning rate tuning.
- Bayes vs. LDA and VC Generalization Bounds (6 min): Comparing the Bayes-optimal quadratic boundary to Fisher LDA on 2D Gaussians, then testing VC dimension bounds on a rectangle concept learner.
- Linear Regression and Bayes Decision Boundaries (4 min): OLS regression on synthetic data, computing the Bayes error probability with the error function, and visualizing the Bayes decision boundary for two Gaussians.