# Discuss the use of odds in logistic regression

Discuss the use of odds in logistic regression. Using some very simple numbers, make up a simple numerical example and explain how odds and probabilities were calculated. How are odds different from an odds ratio?

Logistic regression is a widely used statistical method for modeling binary outcomes, where the dependent variable takes on only two possible values, typically denoted as 0 and 1. In logistic regression, odds play a crucial role in estimating the relationship between the predictors or independent variables and the probability of the outcome.

To understand the use of odds in logistic regression, let’s consider a simple numerical example. Suppose we are interested in predicting the probability of a student getting admitted to a university based on their high school grade and SAT score. Let’s assume we have data for 100 students.

First, we need to calculate the odds, which represent the ratio of the probability of success (getting admitted) to the probability of failure (not getting admitted). In this case, success is represented by the event of a student getting admitted (denoted as 1), and failure is represented by the event of not getting admitted (denoted as 0).

Let’s assume that out of the 100 students, 70 were admitted and 30 were not admitted. The probability of admission, denoted as P(Y=1), can be calculated by dividing the number of students admitted by the total number of students:

P(Y=1) = 70/100 = 0.70

Similarly, the probability of not getting admitted, denoted as P(Y=0), can be calculated as:

P(Y=0) = 30/100 = 0.30

Now, we can calculate the odds, denoted as O(Y=1), by dividing the probability of success by the probability of failure:

O(Y=1) = P(Y=1)/P(Y=0) = 0.70/0.30 = 2.33

So, the odds of getting admitted for a student in this example are approximately 2.33 (exactly 7/3). This means that the probability of getting admitted is about 2.33 times the probability of not getting admitted: for roughly every seven students admitted, three are not.
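The probability and odds calculations above can be reproduced in a few lines of Python, using the counts from the example:

```python
# Admission counts from the example: 70 admitted, 30 not, out of 100.
admitted, not_admitted = 70, 30
total = admitted + not_admitted

p_admit = admitted / total        # P(Y=1) = 0.70
p_reject = not_admitted / total   # P(Y=0) = 0.30
odds = p_admit / p_reject         # 0.70 / 0.30, roughly 2.33

print(round(p_admit, 2), round(p_reject, 2), round(odds, 2))
```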

In logistic regression, we use the odds to estimate the relationship between the predictors (high school grade and SAT score) and the probability of getting admitted. The logistic regression model estimates the log odds (logarithm of the odds) as a linear combination of the predictors, and then transforms the log odds into probabilities using the logistic function.
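A minimal sketch of this transformation, using the 0.70 admission probability from the example: the logit maps a probability to log odds, and the logistic (sigmoid) function maps log odds back to a probability.

```python
import math

def logit(p):
    """Log odds: log(p / (1 - p))."""
    return math.log(p / (1 - p))

def sigmoid(z):
    """Logistic function: maps log odds back to a probability."""
    return 1 / (1 + math.exp(-z))

p = 0.70
z = logit(p)                  # log(0.70 / 0.30), about 0.847
print(round(z, 3))
print(round(sigmoid(z), 2))   # recovers the original probability, 0.70
```

In a fitted model, the linear combination of predictors plays the role of `z` here, and the logistic function converts it into a predicted probability.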

The odds ratio, on the other hand, is a measure of the strength and direction of the association between two binary variables. It compares the odds of an event occurring in one group to the odds of the same event occurring in another group.

Continuing with our example, let's examine the relationship between the SAT score and the odds of getting admitted. We divide the students into two groups: Group A with an SAT score above a certain threshold and Group B with an SAT score below the threshold.

In Group A, let’s assume that out of 40 students, 30 were admitted and 10 were not admitted. The odds of getting admitted for Group A can be calculated as:

O(Y=1|Group A) = 30/10 = 3.00

Similarly, in Group B, let’s assume that out of 60 students, 40 were admitted and 20 were not admitted. The odds of getting admitted for Group B can be calculated as:

O(Y=1|Group B) = 40/20 = 2.00

The odds ratio for this example is calculated as the ratio of the odds in Group A to the odds in Group B:

Odds ratio = O(Y=1|Group A) / O(Y=1|Group B) = 3.00 / 2.00 = 1.50

An odds ratio of 1.50 indicates that the odds of getting admitted for students scoring above the threshold are 1.50 times the odds for students scoring below it, that is, 50% higher.
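The group-wise odds and the resulting odds ratio follow directly from the counts in the example:

```python
# Group A: 30 admitted, 10 not admitted; Group B: 40 admitted, 20 not admitted.
odds_a = 30 / 10              # odds of admission in Group A: 3.0
odds_b = 40 / 20              # odds of admission in Group B: 2.0
odds_ratio = odds_a / odds_b  # 3.0 / 2.0 = 1.5

print(odds_ratio)
```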

In logistic regression, odds ratios are used to interpret the relationship between the predictors and the odds of the outcome: exponentiating an estimated coefficient gives the odds ratio associated with a one-unit increase in that predictor. A significant odds ratio greater than 1 indicates a positive relationship, meaning the odds of success increase as the predictor increases; a significant odds ratio less than 1 indicates a negative relationship, meaning the odds of success decrease as the predictor increases; and an odds ratio of exactly 1 indicates no association.
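As a sketch of this interpretation, suppose a fitted model produced a coefficient of roughly 0.405 for one of the predictors (a hypothetical value, not from an actual fit). Exponentiating it recovers an odds ratio close to the 1.50 computed above:

```python
import math

# Hypothetical coefficient on the log-odds scale: a one-unit increase
# in the predictor adds beta to the log odds of admission.
beta = 0.405
odds_ratio = math.exp(beta)   # odds multiply by about 1.5 per unit increase

print(round(odds_ratio, 2))
```

This is why software for logistic regression reports coefficients on the log-odds scale: exponentiating them turns additive effects on the log odds into multiplicative effects on the odds.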