Mathematics/Statistics/Regression Lines

== Correlation Coefficients ==
{{ Tip | This is often represented as '''r'''.}}
The '''Correlation Coefficient''' is a value that describes "how well can a straight line fit this data", and will always be in the interval [-1, 1].
A value of exactly 1 indicates a perfect positive linear correlation between x and y. That is, as x increases, so does y. And as y increases, so does x.
A value of exactly -1 indicates a perfect negative linear correlation between x and y. That is, as x increases, y decreases. And as y increases, x decreases.
As values approach 0, the correlation grows weaker and weaker, with 0 indicating that there is no linear correlation between x and y at all.
The equation to calculate the correlation coefficient is the following:
 <math>r = \frac{1}{n - 1}\sum_{i=1}^{n}\left(\frac{x_i-\bar{x}}{S_x}\right)\left(\frac{y_i-\bar{y}}{S_y}\right)</math>
For further explanation, see [https://www.khanacademy.org/math/statistics-probability/describing-relationships-quantitative-data/scatterplots-and-correlation/v/calculating-correlation-coefficient-r this Khan Academy video].
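As a concrete illustration, the formula above can be computed directly. This is a minimal sketch in Python; the dataset and function name are made up for the example, and <code>statistics.stdev</code> supplies the sample standard deviations <math>S_x</math> and <math>S_y</math>:

```python
from statistics import mean, stdev

def correlation_coefficient(xs, ys):
    """Sample correlation coefficient r, per the formula above."""
    n = len(xs)
    x_bar, y_bar = mean(xs), mean(ys)
    s_x, s_y = stdev(xs), stdev(ys)  # sample standard deviations
    return sum((x - x_bar) / s_x * (y - y_bar) / s_y
               for x, y in zip(xs, ys)) / (n - 1)

# A perfectly linear dataset should give r ≈ 1.
print(correlation_coefficient([1, 2, 3, 4], [2, 4, 6, 8]))
```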





Latest revision as of 17:21, 25 October 2020


A regression line is effectively an attempt to fit some data to a straight line.

Basically, given some set of data points, you create a line that fits the data as closely as possible. This is accomplished by trying to minimize the total distance between all points and the line itself.

Since the regression line is a standard straight line, it can be represented by the standard <math>y = mx + b</math> equation.


== Residuals ==

Given a single data point, a residual is the distance between that single point and the regression line.

A positive residual value indicates that the data point is somewhere above the regression line. A negative residual value indicates that the data point is somewhere below the regression line. Larger values indicate the point is farther away.

To calculate the residual for some point at <math>(x_i, y_i)</math>, we have the following equation for some <math>m</math> and <math>b</math> from our regression line:

 <math>\text{residual}_i = y_i - (mx_i + b)</math>

We can then take this a step further and combine all residuals in our dataset to get an overall view of "how closely the regression line matches our dataset". To do so, we square each residual and sum them:

 <math>\sum_{i=1}^{n}(y_i - (mx_i + b))^2</math>

Squaring means that negative residual values don't cancel out positive values. It also means points that are farther away from the line end up with more weight than points closer to the line.
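The residual calculation can be sketched in a few lines of Python. The point set, slope, and intercept here are arbitrary example values:

```python
def residual(x, y, m, b):
    """Vertical distance from the point (x, y) to the line y = m*x + b."""
    return y - (m * x + b)

def sum_squared_residuals(points, m, b):
    """Combine all residuals by squaring and summing them."""
    return sum(residual(x, y, m, b) ** 2 for x, y in points)

# Points that lie close to (but not exactly on) the line y = 2x.
points = [(1, 2.1), (2, 3.9), (3, 6.2)]
print(sum_squared_residuals(points, 2.0, 0.0))  # small value ≈ 0.06
```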


== Least Squares Regression ==

Least Squares Regression is one of the more popular ways to fit a regression line; it minimizes the sum of all squared residuals in our dataset. We can calculate this regression line with a few steps:

First, we note that the equation we want to calculate is ultimately some form of <math>y = mx + b</math>.

Our regression line will always go through the point <math>(\bar{x}, \bar{y})</math>, which denotes the means of our x and y. So it's safe to use that as the <math>x</math> and <math>y</math> in our equation. From there, we can move to calculating our <math>m</math> and <math>b</math>.

=== Calculating our M and B ===

Next, we can calculate the line slope <math>m</math> with the equation

 <math>m = r \frac{S_y}{S_x}</math>

Where
* <math>r</math> is the [[#Correlation Coefficients|correlation coefficient]] for our dataset.
* <math>S_y</math> is the [[Statistics/Core_Measurements#Standard Deviation|standard deviation]] of our y.
* <math>S_x</math> is the [[Statistics/Core_Measurements#Standard Deviation|standard deviation]] of our x.
We can then proceed to get our <math>b</math> with simple algebra. Since the line passes through <math>(\bar{x}, \bar{y})</math>, we know <math>\bar{y} = m\bar{x} + b</math>, and solving for b gives:

 <math>b = \bar{y} - m\bar{x}</math>

where
* <math>\bar{x}</math> is the mean of our x.
* <math>\bar{y}</math> is the mean of our y.
* <math>m</math> is the m we just calculated, above.


Putting all of this together, we can replace the m and b in our <math>y = mx + b</math> equation, which will give us a line equation for our regression.

For further explanation, see this Khan Academy video.
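The full procedure above can be sketched end to end in Python. The function name and dataset are illustrative; <code>statistics.stdev</code> provides the sample standard deviations, and the correlation coefficient is computed inline using the formula from the section above:

```python
from statistics import mean, stdev

def least_squares_line(xs, ys):
    """Fit y = m*x + b via m = r * S_y / S_x and b = y_bar - m * x_bar."""
    n = len(xs)
    x_bar, y_bar = mean(xs), mean(ys)
    s_x, s_y = stdev(xs), stdev(ys)
    # Correlation coefficient r for the dataset.
    r = sum((x - x_bar) / s_x * (y - y_bar) / s_y
            for x, y in zip(xs, ys)) / (n - 1)
    m = r * s_y / s_x          # slope
    b = y_bar - m * x_bar      # intercept, from y_bar = m * x_bar + b
    return m, b

# This data lies exactly on y = 2x + 1, so the fit recovers m ≈ 2, b ≈ 1.
m, b = least_squares_line([1, 2, 3, 4], [3, 5, 7, 9])
print(m, b)
```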