Gradient Descent 3D Explorer
Optimize a Linear Regression model $h(x) = θ₀ + θ₁x$ on a synthetic dataset. Watch how the parameters $θ₀$ and $θ₁$ slide down the 3D Mean Squared Error (MSE) cost surface to find the global minimum.
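The model and its cost can be sketched in a few lines. This is a minimal sketch, not the explorer's actual code: the synthetic dataset (true parameters θ₀ = 2, θ₁ = 3, unit Gaussian noise, x ∈ [-3, 3]) is an assumption, since the page does not specify how its data is generated.

```python
import numpy as np

# Hypothetical synthetic dataset: y = 2 + 3x + noise.
# The explorer's true parameters and noise level are not specified;
# these values are assumptions for illustration.
rng = np.random.default_rng(seed=0)
x = rng.uniform(-3, 3, size=100)
y = 2.0 + 3.0 * x + rng.normal(0, 1.0, size=100)

def h(theta0, theta1, x):
    """Linear hypothesis h(x) = θ₀ + θ₁x."""
    return theta0 + theta1 * x

def mse(theta0, theta1, x, y):
    """Mean squared error cost J(θ₀, θ₁)."""
    return np.mean((h(theta0, theta1, x) - y) ** 2)
```

Evaluating `mse` at every (θ₀, θ₁) pair is exactly what produces the cost surface shown below.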
Model Hyperparameters
- Initial Bias (θ₀): -4.00
- Initial Weight (θ₁): -4.00
- Learning Rate (α): 0.050
Live Metrics
- Current Loss (MSE): 48.5000
- Epochs (Steps): 0
- Bias (θ₀): -4.0000
- Weight (θ₁): -4.0000
Understanding the Math
Gradient Descent is an iterative optimization algorithm that minimizes the cost function by repeatedly stepping in the direction of steepest descent, i.e. along the negative of the gradient.
θ := θ - α * ∇J(θ)
- α (Learning Rate) controls the step size.
- The Cost Surface plots the MSE for every combination of $θ₀$ and $θ₁$.
- Linear regression MSE is strictly convex, guaranteeing a global minimum.
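The update rule above can be sketched as a short loop. For the MSE cost $J(θ₀, θ₁) = \frac{1}{m}\sum (h(x_i) - y_i)^2$, the partial derivatives are $\frac{∂J}{∂θ₀} = \frac{2}{m}\sum (h(x_i) - y_i)$ and $\frac{∂J}{∂θ₁} = \frac{2}{m}\sum (h(x_i) - y_i)\,x_i$; the defaults below mirror the page's sliders (θ₀ = θ₁ = -4, α = 0.05), but the function itself is an illustrative sketch, not the explorer's source:

```python
import numpy as np

def gradient_descent(x, y, theta0=-4.0, theta1=-4.0, alpha=0.05, epochs=500):
    """Minimize the MSE cost via the update θ := θ - α∇J(θ).

    Defaults match the page's slider values; they are assumptions
    about the explorer's internals, not its actual code.
    """
    for _ in range(epochs):
        err = theta0 + theta1 * x - y        # residual h(x) - y
        grad0 = 2.0 * np.mean(err)           # ∂J/∂θ₀
        grad1 = 2.0 * np.mean(err * x)       # ∂J/∂θ₁
        theta0 -= alpha * grad0
        theta1 -= alpha * grad1
    return theta0, theta1
```

Because the MSE surface is convex, this loop converges to the global minimum for any sufficiently small α; too large an α makes the steps overshoot and diverge.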
3D Cost Surface
2D Contour Projection
(Both panels plot the same axes: θ₀ ∈ [-5, 5] and θ₁ ∈ [-5, 5].)
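The grid behind both panels can be computed by evaluating the cost at every (θ₀, θ₁) pair over the plotted range. This sketch reuses the assumed synthetic dataset from earlier (y = 2 + 3x + noise), so the minimum it finds reflects those assumed true parameters, not necessarily the explorer's:

```python
import numpy as np

# Assumed dataset, as before: y = 2 + 3x + Gaussian noise.
rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, 100)
y = 2.0 + 3.0 * x + rng.normal(0, 1.0, 100)

# Evaluate J(θ₀, θ₁) on a 50×50 grid over the plotted range [-5, 5].
theta0s = np.linspace(-5, 5, 50)
theta1s = np.linspace(-5, 5, 50)
T0, T1 = np.meshgrid(theta0s, theta1s)

# Broadcast over the data axis: shape (50, 50, 100) -> mean -> (50, 50).
J = np.mean((T0[..., None] + T1[..., None] * x - y) ** 2, axis=-1)

# Convexity: the grid's minimum sits near the true parameters.
i, j = np.unravel_index(np.argmin(J), J.shape)
```

Feeding `T0`, `T1`, and `J` to a surface plot gives the 3D panel; contouring `J` gives the 2D projection.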