AdaBoost with Decision Stumps

Implementation from scratch with MNIST digit classification

Algorithm Overview

Decision Stumps

Weak learners (depth-1 decision trees) that make predictions based on a single feature threshold.
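A depth-1 stump can be sketched as follows. This is a minimal NumPy implementation, not necessarily the exact class used in this project: it scans every feature, every candidate threshold, and both polarities, keeping the combination with the lowest weighted error.

```python
import numpy as np

class DecisionStump:
    """Depth-1 tree: thresholds a single feature (minimal sketch)."""

    def __init__(self):
        self.feature = 0      # index of the feature to split on
        self.threshold = 0.0  # split value
        self.polarity = 1     # +1: predict +1 above threshold; -1: flipped

    def predict(self, X):
        preds = np.ones(X.shape[0])
        col = X[:, self.feature]
        if self.polarity == 1:
            preds[col < self.threshold] = -1
        else:
            preds[col >= self.threshold] = -1
        return preds

    def fit(self, X, y, w):
        """Exhaustively pick (feature, threshold, polarity) minimizing
        the weighted error; returns that error."""
        best_err = np.inf
        for f in range(X.shape[1]):
            for t in np.unique(X[:, f]):
                for p in (1, -1):
                    preds = np.ones(X.shape[0])
                    if p == 1:
                        preds[X[:, f] < t] = -1
                    else:
                        preds[X[:, f] >= t] = -1
                    err = w[preds != y].sum()
                    if err < best_err:
                        best_err = err
                        self.feature, self.threshold, self.polarity = f, t, p
        return best_err
```

The exhaustive scan is O(features × thresholds), which is why reducing MNIST to a few PCA components (see below) keeps training fast.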

Weighted Error

Sample weights are updated to focus on misclassified examples in each boosting round.
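The standard AdaBoost update multiplies each sample's weight by exp(−β·y·h(x)) and renormalizes, so misclassified samples (where y·h(x) = −1) gain weight and correctly classified ones lose it. A sketch, assuming ±1 labels and predictions:

```python
import numpy as np

def update_weights(w, beta, y, preds):
    """One boosting-round weight update (sketch).

    w:     current sample weights (sums to 1)
    beta:  the stump's classifier weight for this round
    y:     true labels in {-1, +1}
    preds: stump predictions in {-1, +1}
    """
    w = w * np.exp(-beta * y * preds)  # up-weight mistakes
    return w / w.sum()                 # renormalize to a distribution
```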

Classifier Weights

Each stump's contribution is weighted by its weighted training error: β = ½ ln((1-err)/err), so stumps with lower error receive larger weight.
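The formula is a one-liner; the only practical wrinkle is clipping the error away from 0 and 1 so the logarithm stays finite. A sketch (the epsilon guard is an implementation choice, not part of the formula):

```python
import numpy as np

def stump_weight(err, eps=1e-10):
    """Classifier weight beta = 0.5 * ln((1 - err) / err).

    err = 0.5 (chance level) gives beta = 0; err < 0.5 gives a positive
    weight; err > 0.5 flips the stump's vote. eps avoids log(0)/div-by-0.
    """
    err = np.clip(err, eps, 1 - eps)
    return 0.5 * np.log((1 - err) / err)
```

The final classifier is then sign(Σ βₜ·hₜ(x)) over all boosting rounds.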

MNIST Dataset

Classes 0 and 1

Binary classification task distinguishing between digits 0 and 1.

Dimensionality Reduction

PCA applied to reduce 784 features to 5 principal components.
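The 784-to-5 reduction can be done with any PCA implementation; a self-contained NumPy version via SVD of the centered data looks like this (a sketch, the project may instead use a library such as scikit-learn):

```python
import numpy as np

def pca_reduce(X, k=5):
    """Project X (n_samples, n_features) onto its top-k principal
    components. Rows of Vt are the principal axes."""
    Xc = X - X.mean(axis=0)                           # center features
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)  # thin SVD
    return Xc @ Vt[:k].T                               # (n_samples, k)
```

For new (test) data, the same mean and the same top-k axes fitted on the training set should be reused rather than refit.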

Training Size

1000 samples per class for training, full test set for evaluation.
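Drawing 1000 samples per class can be sketched with a small helper; `balanced_subset` is a hypothetical name, and the actual MNIST loading (e.g. via `fetch_openml`) is omitted here:

```python
import numpy as np

def balanced_subset(X, y, per_class=1000, classes=(0, 1), seed=0):
    """Sample per_class examples of each listed class without
    replacement, then shuffle (hypothetical helper)."""
    rng = np.random.default_rng(seed)
    idx = np.concatenate([
        rng.choice(np.flatnonzero(y == c), per_class, replace=False)
        for c in classes
    ])
    rng.shuffle(idx)
    return X[idx], y[idx]
```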
