By the end of this lab, you will be able to:
Understand the purpose of polynomial regression
Use scikit-learn to transform input features into polynomial features
Fit a regression model to non-linear data
Visualise the model's fit using Matplotlib
Compare how polynomial regression improves over simple linear models
Sometimes the relationship in data is not a straight line: it curves or bends. Plain linear regression cannot capture such a curve well.
Polynomial regression extends linear regression by adding powers of the input features so that it can fit curves instead of just straight lines.
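To see what "adding powers of the input features" means concretely, here is a minimal sketch using scikit-learn's `PolynomialFeatures` (the input values 2 and 3 are just illustrative): with `degree=2`, a single feature x is expanded into the columns [1, x, x²].

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures

# Expand a single feature x into [1, x, x^2].
X = np.array([[2.0], [3.0]])
poly = PolynomialFeatures(degree=2)
X_poly = poly.fit_transform(X)
print(X_poly)
# [[1. 2. 4.]
#  [1. 3. 9.]]
```

The linear model is then fit on these expanded columns, which is why polynomial regression is still "linear" in its coefficients.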
Run this only once in your environment.
pip install scikit-learn
import numpy as np
import matplotlib.pyplot as plt
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures
Let’s start with a curved dataset generated from a quadratic function.
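One way to generate such a dataset is sketched below; the quadratic coefficients (0.5, 1, 2), the noise level, and the random seed are all illustrative choices, not fixed by the lab.

```python
import numpy as np

rng = np.random.default_rng(42)  # seed chosen only for reproducibility

# 50 evenly spaced x values, reshaped to a column for scikit-learn.
X = np.linspace(-3, 3, 50).reshape(-1, 1)

# y follows a quadratic curve plus a little Gaussian noise.
y = 0.5 * X**2 + X + 2 + rng.normal(0, 1, size=X.shape)
```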
Let’s now apply polynomial regression with a degree of 2 (a parabola).
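A sketch of the degree-2 fit, assuming the quadratic dataset generated above; it also fits a plain linear model so you can see the improvement on the plot and in the R² scores.

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

# Illustrative quadratic-plus-noise dataset (same as earlier).
rng = np.random.default_rng(42)
X = np.linspace(-3, 3, 50).reshape(-1, 1)
y = 0.5 * X**2 + X + 2 + rng.normal(0, 1, size=X.shape)

# Expand x into [1, x, x^2], then fit ordinary linear regression
# on the expanded features -- that is polynomial regression.
poly = PolynomialFeatures(degree=2)
X_poly = poly.fit_transform(X)
model = LinearRegression().fit(X_poly, y)

# A plain straight-line model for comparison.
linear = LinearRegression().fit(X, y)

plt.scatter(X, y, label="data")
plt.plot(X, model.predict(X_poly), color="red", label="degree-2 fit")
plt.plot(X, linear.predict(X), color="green", label="linear fit")
plt.legend()
plt.show()

print("R^2 (polynomial):", model.score(X_poly, y))
print("R^2 (linear):", linear.score(X, y))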
Try Different Values of k
Change n_neighbors to 1, 3, 5, etc.
How does the prediction change?
Which value of k seems most stable?
Add New Training Points
Manually add more points to X and y. Try to:
Add more class 0 and class 1 points in different areas
Make the classes overlap and observe what KNN does
Use Real Categories
Create a KNN classifier for:
“Is this fruit citrus?”
Features: has thick skin, tastes sour, round shape (1 or 0)
Labels: 1 = citrus, 0 = not citrus
Invent 8 fruits with these properties and classify a new one