Multiple Linear Regression (Toy Example)
Experiment: Multiple Linear Regression
Aim
To implement Multiple Linear Regression using multiple input features and evaluate the model using:
- Mean Squared Error (MSE)
- R-squared (R²)
Objectives
- Understand multiple linear regression
- Implement regression using the matrix method
- Predict output using multiple features
- Compute MSE and R² manually
- Visualize actual vs predicted values
🛠️ Tools Required
- Python
- NumPy
- Matplotlib
📖 Theory
🔹 Multiple Linear Regression
Multiple Linear Regression models the relationship between one dependent variable and multiple independent variables:
🔹 Vector / Matrix Form

ŷ = Xθ

Where:
- ŷ → Predicted output vector
- X → Feature matrix (including bias column)
- θ → Parameter vector
🔹 Expanded Form (for 2 features)

ŷ = θ₀ + θ₁x₁ + θ₂x₂
🔹 General Form (n features)

ŷ = θ₀ + θ₁x₁ + θ₂x₂ + … + θₙxₙ
🔹 Structure of θ (Theta Vector)

θ = [θ₀, θ₁, θ₂, …, θₙ]ᵀ

- θ₀ → Intercept (bias term)
- θ₁ … θₙ → Feature coefficients
🔹 Structure of X (Feature Matrix)
- First column = 1s (bias term)
- Each row = one training example
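
As a quick illustration of the layout just described, the short sketch below (the numeric values are made-up toy assumptions) builds X with a leading column of ones and evaluates ŷ = Xθ with NumPy:

```python
import numpy as np

# Two features, four training examples (illustrative toy values)
X_raw = np.array([[1.0, 2.0],
                  [2.0, 1.0],
                  [3.0, 4.0],
                  [4.0, 3.0]])

# Prepend a column of ones so that theta_0 acts as the intercept
X = np.c_[np.ones(X_raw.shape[0]), X_raw]   # shape (4, 3)

theta = np.array([1.0, 2.0, 2.0])           # [theta_0, theta_1, theta_2] (example values)
y_pred = X @ theta                          # y_hat = X * theta
print(y_pred)                               # [ 7.  7. 15. 15.]
```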
🔹 Final Parameter Equation (Normal Equation)
In matrix form:

y = Xθ

Where:
- X = Feature matrix (with bias term)
- θ = Parameter vector
- y = Output vector
🔹 Normal Equation
To compute the optimal parameters:

θ = (XᵀX)⁻¹ Xᵀ y
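
A minimal sketch of this step, assuming X already contains the bias column and y is the output vector, might be (np.linalg.pinv is used instead of a plain inverse for numerical safety):

```python
import numpy as np

def normal_equation(X, y):
    """theta = (X^T X)^(-1) X^T y for a bias-augmented feature matrix X."""
    return np.linalg.pinv(X.T @ X) @ X.T @ y
```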
🔹 Mean Squared Error (MSE)

MSE = (1/m) Σᵢ (yᵢ − ŷᵢ)², where m is the number of training examples.
🔹 R-squared (R² Score)

R² = 1 − SS_res / SS_tot, where SS_res = Σᵢ (yᵢ − ŷᵢ)² and SS_tot = Σᵢ (yᵢ − ȳ)².
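
Both metrics can be computed by hand in a few lines; the sketch below assumes y (actual values) and y_pred (predictions) are NumPy arrays of equal length:

```python
import numpy as np

def mse(y, y_pred):
    # Mean of squared residuals
    return np.mean((y - y_pred) ** 2)

def r2_score(y, y_pred):
    ss_res = np.sum((y - y_pred) ** 2)        # residual sum of squares
    ss_tot = np.sum((y - np.mean(y)) ** 2)    # total sum of squares
    return 1 - ss_res / ss_tot
```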
📋 Procedure
- Define a dataset with multiple features
- Add the bias (intercept) term
- Compute parameters using the Normal Equation
- Predict values
- Calculate MSE
- Calculate R²
- Plot actual vs predicted values
💻 Program
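
The original listing is not reproduced here; a minimal NumPy/Matplotlib sketch that follows the procedure above (the toy dataset values are illustrative assumptions, not the author's data) could look like this:

```python
import numpy as np
import matplotlib.pyplot as plt

# 1. Toy dataset with two input features (illustrative values, roughly y ≈ 1 + 2*x1 + 2*x2)
X_raw = np.array([[1.0, 2.0],
                  [2.0, 1.0],
                  [3.0, 4.0],
                  [4.0, 3.0],
                  [5.0, 5.0]])
y = np.array([7.1, 6.9, 15.2, 14.8, 21.0])

# 2. Add the bias (intercept) column of ones
X = np.c_[np.ones(X_raw.shape[0]), X_raw]

# 3. Normal Equation: theta = (X^T X)^(-1) X^T y
theta = np.linalg.pinv(X.T @ X) @ X.T @ y
print("Parameters (theta):", theta)

# 4. Predictions
y_pred = X @ theta

# 5. Mean Squared Error
mse = np.mean((y - y_pred) ** 2)
print("MSE:", mse)

# 6. R-squared
ss_res = np.sum((y - y_pred) ** 2)
ss_tot = np.sum((y - np.mean(y)) ** 2)
r2 = 1 - ss_res / ss_tot
print("R2:", r2)

# 7. Actual vs predicted scatter plot
plt.scatter(y, y_pred)
plt.plot([y.min(), y.max()], [y.min(), y.max()], linestyle="--")  # perfect-fit reference line
plt.xlabel("Actual y")
plt.ylabel("Predicted y")
plt.title("Actual vs Predicted")
plt.show()
```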
Output
- Model parameters (θ values)
- Mean Squared Error (MSE)
- R² score
- Scatter plot of actual vs predicted values
Result
The multiple linear regression model was successfully implemented using the Normal Equation.
- Multiple regression handles more than one feature
- Matrix form simplifies computation
- R² indicates how well multiple features explain the target
