This post walks through building a linear regression model from scratch using gradient descent in Python.
- A simple dataset of 5 points is used to train a model y = mx + b by minimizing mean-squared error (MSE)
- Gradients of MSE with respect to slope (m) and intercept (b) are computed manually and used to update parameters each epoch
- The algorithm runs for 1000 epochs with a learning rate of 0.01, converging to slope ≈ 0.60 and intercept ≈ 1.20
- Learning rate sensitivity is demonstrated: lr = 0.0001 converges slowly, while lr = 1 causes divergence
- The final model predicts y = 4.8 for x = 6, validating the learned parameters
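The training loop the summary describes can be sketched as follows. The original article's 5-point dataset is not reproduced here, so the points below are an assumption: they lie exactly on y = 0.6x + 1.2, consistent with the reported converged parameters.

```python
# Gradient descent for y = m*x + b, minimizing mean-squared error.
# NOTE: this dataset is an assumption (the summary does not list the
# actual points); it lies exactly on y = 0.6x + 1.2.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [0.6 * x + 1.2 for x in xs]

def train(xs, ys, lr=0.01, epochs=1000):
    m, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        # Gradients of MSE = (1/n) * sum((y - (m*x + b))^2)
        grad_m = (-2.0 / n) * sum(x * (y - (m * x + b)) for x, y in zip(xs, ys))
        grad_b = (-2.0 / n) * sum(y - (m * x + b) for x, y in zip(xs, ys))
        m -= lr * grad_m
        b -= lr * grad_b
    return m, b

m, b = train(xs, ys)      # approaches m = 0.6, b = 1.2 after 1000 epochs
prediction = m * 6 + b    # close to 4.8, matching the article's check
```

With lr = 0.01 the intercept is the slow direction of the loss surface, so after 1000 epochs the parameters are near, but not exactly at, (0.6, 1.2).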
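The learning-rate sensitivity noted in the summary can be reproduced with the same update rule. This sketch again assumes a hypothetical dataset on y = 0.6x + 1.2: a tiny learning rate leaves the loss well above its minimum after 1000 epochs, while lr = 1 overshoots on every step and the loss explodes within a few updates.

```python
# Learning-rate sensitivity demo on an assumed dataset (y = 0.6x + 1.2);
# the original article's data points are not given in the summary.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [0.6 * x + 1.2 for x in xs]
n = len(xs)

def mse(m, b):
    return sum((y - (m * x + b)) ** 2 for x, y in zip(xs, ys)) / n

def step(m, b, lr):
    grad_m = (-2.0 / n) * sum(x * (y - (m * x + b)) for x, y in zip(xs, ys))
    grad_b = (-2.0 / n) * sum(y - (m * x + b) for x, y in zip(xs, ys))
    return m - lr * grad_m, b - lr * grad_b

initial_loss = mse(0.0, 0.0)

# lr = 0.0001: the loss drops, but is still far from zero after 1000 epochs.
m, b = 0.0, 0.0
for _ in range(1000):
    m, b = step(m, b, 0.0001)
slow_loss = mse(m, b)

# lr = 1: each update overshoots the minimum; a handful of steps is
# enough for the loss to grow by many orders of magnitude.
m, b = 0.0, 0.0
for _ in range(10):
    m, b = step(m, b, 1.0)
diverged_loss = mse(m, b)
```

Divergence at lr = 1 happens because the step size exceeds the stable range set by the curvature of the MSE surface, so each update moves farther from the minimum than the last.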
This summary was automatically generated by AI based on the original article and may not be fully accurate.