Linear Regression

Dr. Ajay Kumar Koli, PhD | SARA Institute of Data Science, India

Regression


Regression is a statistical method to model and analyze relationships between variables.

Types of Regression

  • Linear Regression: Simple (1 predictor) and Multiple (more than 1 predictor).

  • Logistic Regression, Polynomial Regression, etc.

Objectives of Regression

  • Predict outcomes based on predictor variables.

  • Understand the strength and direction of relationships.

  • Make informed decisions using models.

Linear Regression Model

  • Simple Linear Regression Equation:

y = β0 + β1x + ϵ

  • y = Dependent variable (outcome).

  • x = Independent variable (predictor).

  • β0 = Intercept.

  • β1 = Slope.

  • ϵ = Error term.
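The least-squares estimates of β0 and β1 can be computed directly from the formulas above. A minimal sketch using NumPy on hypothetical data generated with a known intercept (2) and slope (3):

```python
import numpy as np

# Hypothetical data: y = 2 + 3x plus random noise (epsilon)
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2 + 3 * x + rng.normal(0, 1, size=x.size)

# Least-squares estimates:
#   beta1 = Sxy / Sxx (slope), beta0 = mean(y) - beta1 * mean(x) (intercept)
beta1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
beta0 = y.mean() - beta1 * x.mean()

print(f"intercept ~ {beta0:.2f}, slope ~ {beta1:.2f}")
```

Because the noise is small, the estimates land close to the true values 2 and 3 used to generate the data.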

Assumptions of Linear Regression

  • Linearity: Relationship between x and y is linear.

  • Independence: Observations are independent.

  • Homoscedasticity: Errors have constant variance.

  • Normality: Residuals are normally distributed.
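The assumptions above are usually checked on the residuals of a fitted model. A minimal sketch (hypothetical data; `np.polyfit` used as the fitting routine) that inspects two of them, zero-centered residuals and roughly constant spread across the range of x:

```python
import numpy as np

# Hypothetical data with constant-variance noise
rng = np.random.default_rng(1)
x = np.linspace(0, 10, 200)
y = 5 + 2 * x + rng.normal(0, 1.5, size=x.size)

# np.polyfit with degree 1 returns (slope, intercept)
beta1, beta0 = np.polyfit(x, y, 1)
residuals = y - (beta0 + beta1 * x)

# Homoscedasticity check: residual spread should be similar
# in the lower and upper halves of x
half = x.size // 2
spread_ratio = residuals[:half].std() / residuals[half:].std()
print(round(residuals.mean(), 4), round(spread_ratio, 2))
```

In practice these checks are done graphically (residuals-vs-fitted and Q-Q plots); the ratio here is just a crude numeric stand-in.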

Practice GLM

General Explanation of Regression Metrics:

  • Intercept: The baseline prediction when all predictors are zero.

  • Coefficients: The change in the dependent variable for a one-unit increase in the predictor.

  • P-value: Determines statistical significance (p < 0.05 is typically significant).

  • R-squared: Explains the proportion of variance captured by the model (higher values indicate a better fit).
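The intercept, coefficient, and R-squared metrics above can be computed by hand. A minimal sketch on a small hypothetical dataset (exam score vs. hours studied; p-values would additionally require a package such as statsmodels or scipy):

```python
import numpy as np

# Hypothetical data: exam score vs. hours studied
hours = np.array([1, 2, 3, 4, 5, 6, 7, 8], dtype=float)
score = np.array([52, 55, 61, 64, 70, 74, 79, 83], dtype=float)

# Coefficient (slope) and intercept via least squares
slope, intercept = np.polyfit(hours, score, 1)
pred = intercept + slope * hours

# R-squared = 1 - SS_residual / SS_total
ss_res = np.sum((score - pred) ** 2)
ss_tot = np.sum((score - score.mean()) ** 2)
r2 = 1 - ss_res / ss_tot

print(f"intercept={intercept:.2f}, slope={slope:.2f}, R^2={r2:.3f}")
```

Here the slope reads as "predicted score rises by about 4.5 points per extra hour studied", and an R-squared near 1 indicates the line captures almost all of the variance in the scores.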
