
Machine Learning & AI Algorithms

The algorithms that learn from data — from gradient descent optimization and neural networks to decision trees, clustering, and the bias-variance tradeoff.

machine learning, AI, neural networks, gradient descent, decision trees, clustering, overfitting

Machine learning is the engine behind modern artificial intelligence. Rather than programming explicit rules, ML algorithms discover patterns in data — adjusting millions of parameters to minimize error, partition feature spaces, or find hidden structure. The field spans supervised learning (labeled examples), unsupervised learning (discovering clusters), and the deep learning revolution that powers language models, image recognition, and autonomous systems.

These simulations let you see ML algorithms at work. Watch gradient descent navigate a loss landscape. Build a neural network and observe activation propagation. Grow a decision tree that splits data into pure regions. Cluster points with k-means. Explore overfitting — the central tension of machine learning — by adjusting model complexity against training data.

5 interactive simulations

simulator

K-Means Clustering Visualizer

Watch k-means clustering assign points to clusters — adjust k, data distribution, and initialization to see centroid convergence and Voronoi partitions
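The loop the simulator animates can be sketched in a few lines. This is a minimal pure-Python version (an illustration, not the simulator's code): alternate an assignment step and a centroid-update step until the centroids stop moving.

```python
import random

def kmeans(points, k, iters=100, seed=0):
    rng = random.Random(seed)
    centroids = rng.sample(points, k)  # random initialization from the data
    clusters = []
    for _ in range(iters):
        # Assignment step: each point joins its nearest centroid's cluster.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: sum(
                (a - b) ** 2 for a, b in zip(p, centroids[c])))
            clusters[nearest].append(p)
        # Update step: move each centroid to the mean of its cluster
        # (an empty cluster keeps its old centroid).
        new = [tuple(sum(dim) / len(cl) for dim in zip(*cl)) if cl else centroids[c]
               for c, cl in enumerate(clusters)]
        if new == centroids:  # converged: no centroid moved
            break
        centroids = new
    return centroids, clusters
```

The assignment step is what induces the Voronoi partition the visualizer draws: each centroid "owns" the region of the plane closer to it than to any other centroid.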

simulator

Decision Tree Classifier Visualizer

Grow a decision tree on 2D data — adjust tree depth, split criteria, and data complexity to see how the tree partitions feature space
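The tree-growing procedure is greedy recursion: at each node, try every axis-aligned split, keep the one that most reduces impurity, and recurse until the node is pure or a depth limit is hit. A minimal sketch using Gini impurity (hypothetical code, not the simulator's implementation):

```python
def gini(labels):
    # Gini impurity: 1 - sum of squared class proportions.
    n = len(labels)
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

def best_split(X, y):
    # Try every (feature, threshold) pair; keep the lowest weighted impurity.
    best = None
    for feat in range(len(X[0])):
        for thresh in sorted({x[feat] for x in X}):
            left = [yi for xi, yi in zip(X, y) if xi[feat] <= thresh]
            right = [yi for xi, yi in zip(X, y) if xi[feat] > thresh]
            if not left or not right:
                continue
            score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
            if best is None or score < best[0]:
                best = (score, feat, thresh)
    return best

def grow(X, y, depth, max_depth):
    # Leaf: pure node or depth limit reached -> predict the majority class.
    if depth == max_depth or len(set(y)) == 1:
        return max(set(y), key=y.count)
    split = best_split(X, y)
    if split is None:
        return max(set(y), key=y.count)
    _, feat, thresh = split
    L = [(xi, yi) for xi, yi in zip(X, y) if xi[feat] <= thresh]
    R = [(xi, yi) for xi, yi in zip(X, y) if xi[feat] > thresh]
    return (feat, thresh,
            grow([x for x, _ in L], [c for _, c in L], depth + 1, max_depth),
            grow([x for x, _ in R], [c for _, c in R], depth + 1, max_depth))

def predict(node, x):
    # Walk internal (feat, thresh, left, right) nodes down to a leaf label.
    while isinstance(node, tuple):
        feat, thresh, left, right = node
        node = left if x[feat] <= thresh else right
    return node
```

On 2D data each split is a horizontal or vertical line, so the grown tree carves the plane into axis-aligned rectangles, which is exactly the partitioning the visualizer shows.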

simulator

Gradient Descent Optimizer Visualizer

Watch gradient descent navigate a 2D loss landscape — adjust learning rate, momentum, and surface shape to see convergence, oscillation, and divergence
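The update rule behind that animation is short. A sketch of momentum gradient descent on an assumed toy surface f(x, y) = x² + 10y² (the elongated bowl is chosen to make the steep direction oscillate when the learning rate is too large):

```python
def gd(start, lr=0.05, momentum=0.9, steps=200):
    x, y = start
    vx = vy = 0.0
    for _ in range(steps):
        gx, gy = 2 * x, 20 * y            # gradient of x^2 + 10*y^2
        vx = momentum * vx - lr * gx      # velocity accumulates past gradients
        vy = momentum * vy - lr * gy
        x, y = x + vx, y + vy             # step along the velocity
    return x, y
```

With a small learning rate the iterate spirals into the minimum at the origin; raise the rate past the stability limit of the steep y-direction and the updates overshoot more each step and diverge, which is the oscillation-versus-divergence behavior the simulator lets you explore.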

simulator

Neural Network Forward Pass Visualizer

Build a neural network and watch activations propagate — adjust layer sizes, activation functions, and input values to see how networks transform data
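The forward pass being visualized reduces to repeated "weighted sum, then nonlinearity". A minimal sketch (layer shapes and weights here are made up for illustration): each layer computes activation(W·x + b) and hands the result to the next layer.

```python
import math

def relu(v):
    # Rectified linear unit, applied elementwise.
    return [max(0.0, x) for x in v]

def sigmoid(v):
    # Logistic squashing to (0, 1), applied elementwise.
    return [1.0 / (1.0 + math.exp(-x)) for x in v]

def dense(W, b, x):
    # One fully connected layer: matrix-vector product plus bias.
    return [sum(w * xj for w, xj in zip(row, x)) + bi
            for row, bi in zip(W, b)]

def forward(layers, x):
    # layers: list of (W, b, activation); activations propagate layer by layer.
    for W, b, act in layers:
        x = act(dense(W, b, x))
    return x

# A tiny 2 -> 2 -> 1 network with hand-picked weights (assumed values):
net = [
    ([[1.0, 1.0], [1.0, -1.0]], [0.0, 0.0], relu),
    ([[1.0, 1.0]], [0.0], sigmoid),
]
```

Changing a single input value perturbs every downstream activation, which is why the visualizer's whole right-hand side lights up differently when you nudge one input node.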

simulator

Overfitting & Bias-Variance Tradeoff

Explore overfitting by fitting polynomials to noisy data — adjust model complexity, data size, and noise to see training vs test error diverge
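The extreme end of that complexity slider can be shown with interpolation: a degree-(n-1) polynomial through n noisy training points has zero training error but swings wildly between the points, so test error explodes. A sketch (illustrative only; the data and noise values are made up):

```python
def lagrange(xs, ys, x):
    # Evaluate the unique degree-(n-1) polynomial through (xs, ys) at x.
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        term = yi
        for j, xj in enumerate(xs):
            if j != i:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

def mse(pred, ys):
    # Mean squared error between predictions and targets.
    return sum((p - y) ** 2 for p, y in zip(pred, ys)) / len(ys)

# True function y = x, corrupted by fixed "noise" at the training points.
xs = [float(i) for i in range(8)]
noise = [0.2, -0.3, 0.1, -0.2, 0.3, -0.1, 0.2, -0.3]
ys = [x + n for x, n in zip(xs, noise)]
```

The interpolant memorizes the noise (training MSE is zero), yet at held-out points between the training x-values its predictions drift far from the true line y = x: high variance from excess model capacity, the same divergence of training and test error the simulation plots as you raise polynomial degree.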