Regression


Regression Topics


Statistical correlation

For two random variates $X$ and $Y$, the correlation is defined by

$\operatorname{cor}(X,Y) = \dfrac{\operatorname{cov}(X,Y)}{\sigma_X \sigma_Y}$   (1)

where $\sigma$ denotes standard deviation and $\operatorname{cov}(X,Y)$ is the covariance of these two variables. For the general case of variables $X_i$ and $X_j$, where $i, j = 1$, 2, ..., $n$,

$\operatorname{cor}(X_i, X_j) = \dfrac{C_{ij}}{\sqrt{C_{ii} C_{jj}}}$   (2)

where $C_{ij}$ are elements of the covariance matrix. In general, a correlation gives the strength of the relationship between variables. For $i = j$,

$\operatorname{cor}(X,X) = \dfrac{\operatorname{cov}(X,X)}{\sigma_X^2} = \dfrac{\sigma_X^2}{\sigma_X^2} = 1.$   (3)

The variance of any quantity is always nonnegative by definition, so

$\operatorname{var}\!\left(\dfrac{X}{\sigma_X} + \dfrac{Y}{\sigma_Y}\right) \ge 0.$   (4)

From a property of variances, the sum can be expanded

$\operatorname{var}\!\left(\dfrac{X}{\sigma_X}\right) + \operatorname{var}\!\left(\dfrac{Y}{\sigma_Y}\right) + 2\operatorname{cov}\!\left(\dfrac{X}{\sigma_X}, \dfrac{Y}{\sigma_Y}\right) \ge 0$   (5)

$\dfrac{\operatorname{var}(X)}{\sigma_X^2} + \dfrac{\operatorname{var}(Y)}{\sigma_Y^2} + \dfrac{2}{\sigma_X \sigma_Y}\operatorname{cov}(X,Y) \ge 0$   (6)

$2 + 2\operatorname{cor}(X,Y) \ge 0.$   (7)

Therefore,

$\operatorname{cor}(X,Y) \ge -1.$   (8)

Similarly,

$\operatorname{var}\!\left(\dfrac{X}{\sigma_X} - \dfrac{Y}{\sigma_Y}\right) \ge 0$   (9)

$\operatorname{var}\!\left(\dfrac{X}{\sigma_X}\right) + \operatorname{var}\!\left(\dfrac{Y}{\sigma_Y}\right) - 2\operatorname{cov}\!\left(\dfrac{X}{\sigma_X}, \dfrac{Y}{\sigma_Y}\right) \ge 0$   (10)

$\dfrac{\operatorname{var}(X)}{\sigma_X^2} + \dfrac{\operatorname{var}(Y)}{\sigma_Y^2} - \dfrac{2}{\sigma_X \sigma_Y}\operatorname{cov}(X,Y) \ge 0$   (11)

$2 - 2\operatorname{cor}(X,Y) \ge 0.$   (12)

Therefore,

$\operatorname{cor}(X,Y) \le 1,$   (13)

so $-1 \le \operatorname{cor}(X,Y) \le 1$. For a linear combination of two variables,

$\operatorname{var}(y - bx) = \operatorname{var}(y) + \operatorname{var}(-bx) + 2\operatorname{cov}(y, -bx)$   (14)

$= \operatorname{var}(y) + b^2 \operatorname{var}(x) - 2b\operatorname{cov}(x,y)$   (15)

$= \sigma_y^2 + b^2 \sigma_x^2 - 2b\operatorname{cov}(x,y)$   (16)

$= \sigma_y^2 + b^2 \sigma_x^2 - 2b\sigma_x\sigma_y \operatorname{cor}(x,y).$   (17)

Examine the cases where $\operatorname{cor}(x,y) = \pm 1$,

$\operatorname{var}(y - bx) = \sigma_y^2 + b^2\sigma_x^2 \mp 2b\sigma_x\sigma_y$   (18)

$= (\sigma_y \mp b\sigma_x)^2.$   (19)

The variance will be zero if $b = \pm\sigma_y/\sigma_x$, and a variance of zero requires that the argument of the variance be a constant. Therefore, $y - bx = c$, so $y = bx + c$. If $\operatorname{cor}(x,y) = \pm 1$, $y$ is either perfectly correlated ($b > 0$) or perfectly anticorrelated ($b < 0$) with $x$...
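The bound $-1 \le \operatorname{cor}(X,Y) \le 1$ is easy to check numerically. Below is a minimal sketch (assuming NumPy; the simulated data are purely illustrative) that computes the correlation from the covariance matrix exactly as in equation (2):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sample: y is a noisy linear function of x, so cor(x, y)
# should be strongly positive but still bounded by 1.
x = rng.normal(size=1000)
y = 2.0 * x + rng.normal(scale=0.5, size=1000)

C = np.cov(x, y)                            # 2x2 covariance matrix, elements C_ij
cor = C[0, 1] / np.sqrt(C[0, 0] * C[1, 1])  # cor = C_12 / sqrt(C_11 C_22), eq. (2)

print(cor)                      # close to +1, and never outside [-1, 1]
print(np.corrcoef(x, y)[0, 1])  # NumPy's built-in routine gives the same value
```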

Least squares fitting--exponential

To fit a functional form

$y = A e^{Bx},$   (1)

take the logarithm of both sides

$\ln y = \ln A + Bx.$   (2)

The best-fit values are then

$a = \dfrac{\sum \ln y_i \sum x_i^2 - \sum x_i \sum x_i \ln y_i}{n \sum x_i^2 - \left(\sum x_i\right)^2}$   (3)

$b = \dfrac{n \sum x_i \ln y_i - \sum x_i \sum \ln y_i}{n \sum x_i^2 - \left(\sum x_i\right)^2},$   (4)

where $B = b$ and $A = e^a$. This fit gives greater weights to small $y$ values, so, in order to weight the points equally, it is often better to minimize the function

$\sum y_i \left(\ln y_i - a - b x_i\right)^2.$   (5)

Applying least squares fitting gives

$a \sum y_i + b \sum x_i y_i = \sum y_i \ln y_i$   (6)

$a \sum x_i y_i + b \sum x_i^2 y_i = \sum x_i y_i \ln y_i.$   (7)

Solving for $a$ and $b$,

$a = \dfrac{\sum x_i^2 y_i \sum y_i \ln y_i - \sum x_i y_i \sum x_i y_i \ln y_i}{\sum y_i \sum x_i^2 y_i - \left(\sum x_i y_i\right)^2}$   (9)

$b = \dfrac{\sum y_i \sum x_i y_i \ln y_i - \sum x_i y_i \sum y_i \ln y_i}{\sum y_i \sum x_i^2 y_i - \left(\sum x_i y_i\right)^2}.$   (10)

In the plot shown in the original entry, the short-dashed curve is the fit computed from (3) and (4) and the long-dashed curve is the fit computed from (9) and (10).
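Both variants translate directly into code from the sums above. A minimal sketch, assuming NumPy (the function name fit_exponential and the test data are illustrative, not part of the original entry):

```python
import numpy as np

def fit_exponential(x, y, weighted=False):
    """Fit y = A exp(B x) by least squares on ln y.

    weighted=False uses equations (3)-(4); weighted=True uses the
    y-weighted normal equations (9)-(10), which weight the points
    equally in the original (not log) space.
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    ln_y = np.log(y)
    n = len(x)
    if not weighted:
        d = n * np.sum(x**2) - np.sum(x)**2
        a = (np.sum(ln_y) * np.sum(x**2) - np.sum(x) * np.sum(x * ln_y)) / d
        b = (n * np.sum(x * ln_y) - np.sum(x) * np.sum(ln_y)) / d
    else:
        d = np.sum(y) * np.sum(x**2 * y) - np.sum(x * y)**2
        a = (np.sum(x**2 * y) * np.sum(y * ln_y)
             - np.sum(x * y) * np.sum(x * y * ln_y)) / d
        b = (np.sum(y) * np.sum(x * y * ln_y)
             - np.sum(x * y) * np.sum(y * ln_y)) / d
    return np.exp(a), b   # A = e^a, B = b

# Example: recover A = 2, B = 0.5 from noiseless data.
x = np.linspace(0.0, 3.0, 20)
y = 2.0 * np.exp(0.5 * x)
print(fit_exponential(x, y))            # approximately (2.0, 0.5)
print(fit_exponential(x, y, True))      # same here; differs on noisy data
```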

Least squares fitting

A mathematical procedure for finding the best-fitting curve to a given set of points by minimizing the sum of the squares of the offsets ("the residuals") of the points from the curve. The sum of the squares of the offsets is used instead of the offset absolute values because this allows the residuals to be treated as a continuous differentiable quantity. However, because squares of the offsets are used, outlying points can have a disproportionate effect on the fit, a property which may or may not be desirable depending on the problem at hand. In practice, the vertical offsets from a line (polynomial, surface, hyperplane, etc.) are almost always minimized instead of the perpendicular offsets. This provides a fitting function for the independent variable $x$ that estimates $y$ for a given $x$ (most often what an experimenter wants), allows uncertainties of the data points along the $x$- and $y$-axes to be incorporated simply, and also provides a much simpler analytic form for the fitting parameters than would be obtained using a fit based on perpendicular offsets.
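As a small illustration of minimizing squared vertical offsets, here is a sketch of the closed-form straight-line fit (assuming NumPy; the data points are hypothetical):

```python
import numpy as np

# Fit y = a + b x by minimizing the sum of squared vertical offsets,
# using the standard closed-form (normal-equation) solution.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.9, 4.1, 6.0, 8.2, 9.9])

n = len(x)
b = (n * np.sum(x * y) - np.sum(x) * np.sum(y)) / (n * np.sum(x**2) - np.sum(x)**2)
a = (np.sum(y) - b * np.sum(x)) / n

residuals = y - (a + b * x)            # the vertical offsets
print(a, b, np.sum(residuals**2))      # b is close to 2, a close to 0
```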

Correlation ratio

Let there be $n_i$ observations of the $i$th phenomenon, where $i = 1$, ..., $p$ and

$N = \sum_i n_i$   (1)

$\bar{y}_i = \dfrac{1}{n_i} \sum_j y_{ij}$   (2)

$\bar{y} = \dfrac{1}{N} \sum_i n_i \bar{y}_i.$   (3)

Then the sample correlation ratio is defined by

$E^2 = \dfrac{\sum_i n_i (\bar{y}_i - \bar{y})^2}{\sum_i \sum_j (y_{ij} - \bar{y})^2}.$   (4)

Let $\eta$ be the population correlation ratio. The sampling distribution of $E^2$ can be written in terms of the confluent hypergeometric limit function $_0F_1$; it takes one form when $\eta = 0$ and another when $\eta \ne 0$ (Kenney and Keeping 1951, pp. 323-324).
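Equations (1)-(4) translate directly into code. A sketch, assuming NumPy (the helper name correlation_ratio and the group values are illustrative):

```python
import numpy as np

def correlation_ratio(groups):
    """Sample correlation ratio E^2 per equations (1)-(4): the sum of
    squares of the group means about the grand mean, weighted by group
    size, divided by the total sum of squares about the grand mean."""
    groups = [np.asarray(g, float) for g in groups]
    all_y = np.concatenate(groups)
    y_bar = all_y.mean()                                   # grand mean, eq. (3)
    between = sum(len(g) * (g.mean() - y_bar)**2 for g in groups)
    total = np.sum((all_y - y_bar)**2)
    return between / total

# Hypothetical example: three groups with well-separated means give E^2 near 1.
print(correlation_ratio([[1.0, 1.1, 0.9], [5.0, 5.2, 4.8], [9.0, 9.1, 8.9]]))
```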

Normal equation

Given a matrix equation

$\mathsf{A}\mathbf{x} = \mathbf{b},$

the normal equation is that which minimizes the sum of the square differences between the left and right sides:

$\mathsf{A}^{\mathrm T}\mathsf{A}\,\mathbf{x} = \mathsf{A}^{\mathrm T}\mathbf{b}.$

It is called a normal equation because $\mathbf{b} - \mathsf{A}\mathbf{x}$ is normal to the range of $\mathsf{A}$. Here, $\mathsf{A}^{\mathrm T}\mathsf{A}$ is a normal matrix.
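A minimal sketch of forming and solving the normal equation, assuming NumPy (the matrix and right-hand side are hypothetical):

```python
import numpy as np

# Overdetermined system A x = b: four equations, two unknowns.
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])
b = np.array([2.1, 3.9, 6.1, 8.0])

x = np.linalg.solve(A.T @ A, A.T @ b)        # solve A^T A x = A^T b
print(x)
print(np.linalg.lstsq(A, b, rcond=None)[0])  # same result, more stable routine

# The residual b - A x is normal (orthogonal) to the range of A:
print(A.T @ (b - A @ x))                     # approximately the zero vector
```

In practice a QR- or SVD-based routine such as numpy.linalg.lstsq is preferred numerically, since forming $\mathsf{A}^{\mathrm T}\mathsf{A}$ squares the condition number of the problem.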

Least squares fitting--power law

Given a function of the form

$y = A x^B,$   (1)

least squares fitting gives the coefficients as

$b = \dfrac{n \sum \ln x_i \ln y_i - \sum \ln x_i \sum \ln y_i}{n \sum (\ln x_i)^2 - \left(\sum \ln x_i\right)^2}$   (2)

$a = \dfrac{\sum \ln y_i - b \sum \ln x_i}{n},$   (3)

where $B = b$ and $A = e^a$.
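A sketch of equations (2)-(3), assuming NumPy (the function name fit_power_law and the test data are illustrative):

```python
import numpy as np

def fit_power_law(x, y):
    """Fit y = A x^B by least squares on (ln x, ln y), per equations (2)-(3)."""
    lx, ly = np.log(x), np.log(y)
    n = len(lx)
    b = (n * np.sum(lx * ly) - np.sum(lx) * np.sum(ly)) / \
        (n * np.sum(lx**2) - np.sum(lx)**2)
    a = (np.sum(ly) - b * np.sum(lx)) / n
    return np.exp(a), b   # A = e^a, B = b

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
print(fit_power_law(x, 3.0 * x**1.5))   # recovers (3.0, 1.5)
```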

Correlation coefficient--bivariate normal distribution

For a bivariate normal distribution, the distribution of correlation coefficients $r$ for samples of size $n$ is given by

$P(r) = \dfrac{(n-2)\,\Gamma(n-1)\,(1-\rho^2)^{(n-1)/2}\,(1-r^2)^{(n-4)/2}}{\sqrt{2\pi}\,\Gamma\!\left(n-\frac{1}{2}\right)\,(1-\rho r)^{n-3/2}}\; {}_2F_1\!\left(\tfrac{1}{2}, \tfrac{1}{2}; n-\tfrac{1}{2}; \tfrac{\rho r + 1}{2}\right)$   (1)

where $\rho$ is the population correlation coefficient, ${}_2F_1(a,b;c;x)$ is a hypergeometric function, and $\Gamma(z)$ is the gamma function (Kenney and Keeping 1951, pp. 217-221). The moments of $r$ can likewise be written in closed form in terms of $\rho$ and $n$. If the variates are uncorrelated, then $\rho = 0$, and applying the Legendre duplication formula

$\Gamma(n-1) = \dfrac{2^{n-2}}{\sqrt{\pi}}\,\Gamma\!\left(\dfrac{n-1}{2}\right)\Gamma\!\left(\dfrac{n}{2}\right)$   (2)

reduces (1) to

$P(r) = \dfrac{\Gamma\!\left(\frac{n-1}{2}\right)}{\sqrt{\pi}\,\Gamma\!\left(\frac{n-2}{2}\right)}\,(1-r^2)^{(n-4)/2}.$   (3)

The uncorrelated case can be derived more simply by letting $\beta$ be the true slope, so that $\eta = \alpha + \beta x$. Then

$t = \dfrac{(b - \beta)\sqrt{n-2}}{\sqrt{1 - r^2}}\,\dfrac{s_x}{s_y}$   (4)

is distributed as Student's $t$ with $\nu = n - 2$ degrees of freedom. Let the population regression coefficient $\beta$ be 0; then $b = r\,s_y/s_x$, so

$t = \dfrac{r\sqrt{n-2}}{\sqrt{1-r^2}},$   (5)

and the distribution is

$P(t) = \dfrac{\Gamma\!\left(\frac{n-1}{2}\right)}{\sqrt{(n-2)\pi}\,\Gamma\!\left(\frac{n-2}{2}\right)}\left(1 + \dfrac{t^2}{n-2}\right)^{-(n-1)/2}.$   (6)

Plugging in for $t$ and using

$dt = \sqrt{n-2}\,(1-r^2)^{-3/2}\,dr$   (7)

gives

$P(r) = P(t)\,\dfrac{dt}{dr} = \dfrac{\Gamma\!\left(\frac{n-1}{2}\right)}{\sqrt{(n-2)\pi}\,\Gamma\!\left(\frac{n-2}{2}\right)}\,(1-r^2)^{(n-1)/2}\,\sqrt{n-2}\,(1-r^2)^{-3/2}$   (8)

$= \dfrac{\Gamma\!\left(\frac{n-1}{2}\right)}{\sqrt{\pi}\,\Gamma\!\left(\frac{n-2}{2}\right)}\,(1-r^2)^{(n-4)/2},$   (9)

as before. See Bevington (1969, pp. 122-123) or Pugh and Winslow (1966, §12-8). If we are interested instead in the probability that a correlation coefficient of magnitude at least $|r|$ would be obtained, where $r$ is the observed..
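The practical consequence of equation (5) is the standard significance test for an observed correlation. A sketch, assuming NumPy and SciPy are available (the observed values of $r$ and $n$ are hypothetical):

```python
import numpy as np
from scipy import stats

# Under the null hypothesis rho = 0, t = r sqrt(n-2) / sqrt(1 - r^2)
# follows Student's t with nu = n - 2 degrees of freedom, per eq. (5).
r, n = 0.45, 30                                  # hypothetical observed values
t = r * np.sqrt(n - 2) / np.sqrt(1 - r**2)
p = 2 * stats.t.sf(abs(t), df=n - 2)             # two-sided p-value
print(t, p)
```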

Nonlinear least squares fitting

Given a function $f(x; \lambda_1, \ldots, \lambda_n)$ of a variable $x$ tabulated at $m$ values $x_1$, ..., $x_m$, assume the function is of known analytic form depending on $n$ parameters $\lambda_i$, and consider the overdetermined set of $m$ equations

$y_1 = f(x_1; \lambda_1, \ldots, \lambda_n)$   (1)

$\vdots$

$y_m = f(x_m; \lambda_1, \ldots, \lambda_n).$   (2)

We desire to solve these equations to obtain the values $\lambda_1$, ..., $\lambda_n$ which best satisfy this system of equations. Pick an initial guess for the $\lambda_i$ and then define

$d\beta_i = y_i - f(x_i; \lambda_1, \ldots, \lambda_n).$   (3)

Now obtain a linearized estimate for the changes $d\lambda_j$ needed to reduce $d\beta_i$ to 0,

$d\beta_i = \sum_{j=1}^{n} \left.\dfrac{\partial f}{\partial \lambda_j}\right|_{x_i} d\lambda_j$   (4)

for $i = 1$, ..., $m$, where the partial derivatives are evaluated at $x_i$. This can be written in component form as

$d\beta_i = A_{ij}\, d\lambda_j,$   (5)

where $\mathsf{A}$ is the $m \times n$ matrix

$A_{ij} = \left.\dfrac{\partial f}{\partial \lambda_j}\right|_{x_i}.$   (6)

In more concise matrix form,

$d\boldsymbol{\beta} = \mathsf{A}\, d\boldsymbol{\lambda},$   (7)

where $d\boldsymbol{\beta}$ is an $m$-vector and $d\boldsymbol{\lambda}$ is an $n$-vector. Applying the transpose of $\mathsf{A}$ to both sides gives

$\mathsf{A}^{\mathrm T}\, d\boldsymbol{\beta} = (\mathsf{A}^{\mathrm T}\mathsf{A})\, d\boldsymbol{\lambda}.$   (8)

Defining

$\mathsf{a} = \mathsf{A}^{\mathrm T}\mathsf{A}$   (9)

$\mathbf{b} = \mathsf{A}^{\mathrm T}\, d\boldsymbol{\beta}$   (10)

in terms of the known quantities $\mathsf{A}$ and $d\boldsymbol{\beta}$ then gives the matrix equation

$\mathsf{a}\, d\boldsymbol{\lambda} = \mathbf{b},$   (11)

which can be solved for $d\boldsymbol{\lambda}$ using standard matrix techniques such as Gaussian elimination. This offset is then applied to $\boldsymbol{\lambda}$ and a new $d\boldsymbol{\beta}$ is calculated. By iteratively applying this procedure until the elements of $d\boldsymbol{\lambda}$ become smaller than some prescribed limit, a solution...
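The iteration described above is the Gauss-Newton method. A compact sketch, assuming NumPy (the names gauss_newton, model, and jacobian are illustrative, not from the original entry):

```python
import numpy as np

def gauss_newton(f, jac, x, y, lam, tol=1e-10, max_iter=50):
    """Sketch of the iteration above: repeatedly solve (A^T A) dlam = A^T dbeta
    (equations (8)-(11)) and update lam until dlam is small.
    f(x, lam) evaluates the model; jac(x, lam) returns the m x n matrix A
    of partial derivatives df/dlam_j at each x_i (equation (6))."""
    lam = np.asarray(lam, float)
    for _ in range(max_iter):
        dbeta = y - f(x, lam)                         # equation (3)
        A = jac(x, lam)
        dlam = np.linalg.solve(A.T @ A, A.T @ dbeta)  # equations (9)-(11)
        lam = lam + dlam
        if np.max(np.abs(dlam)) < tol:
            break
    return lam

# Hypothetical example: fit y = lam0 * exp(lam1 * x).
model = lambda x, lam: lam[0] * np.exp(lam[1] * x)
jacobian = lambda x, lam: np.column_stack(
    [np.exp(lam[1] * x),                  # df/dlam0
     lam[0] * x * np.exp(lam[1] * x)])    # df/dlam1

x = np.linspace(0.0, 1.0, 25)
y = 2.0 * np.exp(-1.5 * x)
print(gauss_newton(model, jacobian, x, y, lam=[1.0, -1.0]))  # near (2, -1.5)
```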

Least squares fitting--polynomial

Generalizing from a straight line (i.e., first degree polynomial) to a $k$th degree polynomial

$y = a_0 + a_1 x + \cdots + a_k x^k,$   (1)

the residual is given by

$R^2 = \sum_{i=1}^{n} \left[y_i - (a_0 + a_1 x_i + \cdots + a_k x_i^k)\right]^2.$   (2)

The partial derivatives (again dropping superscripts) are

$\dfrac{\partial (R^2)}{\partial a_0} = -2 \sum \left[y - (a_0 + a_1 x + \cdots + a_k x^k)\right] = 0$   (3)

$\dfrac{\partial (R^2)}{\partial a_1} = -2 \sum \left[y - (a_0 + a_1 x + \cdots + a_k x^k)\right] x = 0$   (4)

$\dfrac{\partial (R^2)}{\partial a_k} = -2 \sum \left[y - (a_0 + a_1 x + \cdots + a_k x^k)\right] x^k = 0.$   (5)

These lead to the equations

$a_0 n + a_1 \sum x_i + \cdots + a_k \sum x_i^k = \sum y_i$   (6)

$a_0 \sum x_i + a_1 \sum x_i^2 + \cdots + a_k \sum x_i^{k+1} = \sum x_i y_i$   (7)

$a_0 \sum x_i^k + a_1 \sum x_i^{k+1} + \cdots + a_k \sum x_i^{2k} = \sum x_i^k y_i,$   (8)

or, in matrix form,

$\begin{pmatrix} n & \sum x_i & \cdots & \sum x_i^k \\ \sum x_i & \sum x_i^2 & \cdots & \sum x_i^{k+1} \\ \vdots & \vdots & \ddots & \vdots \\ \sum x_i^k & \sum x_i^{k+1} & \cdots & \sum x_i^{2k} \end{pmatrix} \begin{pmatrix} a_0 \\ a_1 \\ \vdots \\ a_k \end{pmatrix} = \begin{pmatrix} \sum y_i \\ \sum x_i y_i \\ \vdots \\ \sum x_i^k y_i \end{pmatrix}.$   (9)

This is a Vandermonde matrix. We can also obtain the matrix for a least squares fit by writing

$\begin{pmatrix} 1 & x_1 & \cdots & x_1^k \\ 1 & x_2 & \cdots & x_2^k \\ \vdots & \vdots & \ddots & \vdots \\ 1 & x_n & \cdots & x_n^k \end{pmatrix} \begin{pmatrix} a_0 \\ a_1 \\ \vdots \\ a_k \end{pmatrix} = \begin{pmatrix} y_1 \\ y_2 \\ \vdots \\ y_n \end{pmatrix}.$   (10)

Premultiplying both sides by the transpose of the first matrix then gives

$\mathsf{X}^{\mathrm T}\mathsf{X}\,\mathbf{a} = \mathsf{X}^{\mathrm T}\mathbf{y},$   (11)

so

$\mathbf{a} = (\mathsf{X}^{\mathrm T}\mathsf{X})^{-1}\mathsf{X}^{\mathrm T}\mathbf{y}.$   (12)

As before, given $n$ points $(x_i, y_i)$ and fitting with polynomial coefficients $a_0$, ..., $a_k$ gives

$y_i = a_0 + a_1 x_i + \cdots + a_k x_i^k.$   (13)

In matrix notation, the equation for a polynomial fit is given by

$\mathbf{y} = \mathsf{X}\,\mathbf{a}.$   (14)

This can be solved by premultiplying by the transpose $\mathsf{X}^{\mathrm T}$,

$\mathsf{X}^{\mathrm T}\mathbf{y} = \mathsf{X}^{\mathrm T}\mathsf{X}\,\mathbf{a}.$   (15)

This matrix equation can be solved numerically, or can be inverted directly if it is well formed, to yield the solution vector

$\mathbf{a} = (\mathsf{X}^{\mathrm T}\mathsf{X})^{-1}\mathsf{X}^{\mathrm T}\mathbf{y}.$   (16)

Setting $k = 1$ in the above equations reproduces the linear solution...
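A sketch of equations (14)-(16), assuming NumPy (the helper name polyfit_normal_equations and the test data are illustrative):

```python
import numpy as np

def polyfit_normal_equations(x, y, k):
    """Fit a k-th degree polynomial per equations (14)-(16): build the
    Vandermonde design matrix X, then solve X^T X a = X^T y."""
    X = np.vander(x, k + 1, increasing=True)  # columns 1, x, x^2, ..., x^k
    return np.linalg.solve(X.T @ X, X.T @ y)  # a = (X^T X)^{-1} X^T y

x = np.linspace(-1.0, 1.0, 30)
y = 1.0 - 2.0 * x + 3.0 * x**2
print(polyfit_normal_equations(x, y, 2))      # recovers [1, -2, 3]
```

Inverting $\mathsf{X}^{\mathrm T}\mathsf{X}$ directly is only advisable when the system is well conditioned; for higher degrees, QR-based routines such as numpy.linalg.lstsq are generally preferred.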

Correlation coefficient

The correlation coefficient, sometimes also called the cross-correlation coefficient, Pearson correlation coefficient (PCC), Pearson's $r$, the Pearson product-moment correlation coefficient (PPMCC), or the bivariate correlation, is a quantity that gives the quality of a least squares fitting to the original data. To define the correlation coefficient, first consider the sums of squares $ss_{xx}$, $ss_{yy}$, and $ss_{xy}$ of a set of $n$ data points $(x_i, y_i)$ about their respective means,

$ss_{xx} = \sum (x_i - \bar{x})^2$   (1)

$= \sum x_i^2 - 2\bar{x}\sum x_i + n\bar{x}^2$   (2)

$= \sum x_i^2 - 2n\bar{x}^2 + n\bar{x}^2$   (3)

$= \sum x_i^2 - n\bar{x}^2$   (4)

$ss_{yy} = \sum (y_i - \bar{y})^2$   (5)

$= \sum y_i^2 - 2\bar{y}\sum y_i + n\bar{y}^2$   (6)

$= \sum y_i^2 - 2n\bar{y}^2 + n\bar{y}^2$   (7)

$= \sum y_i^2 - n\bar{y}^2$   (8)

$ss_{xy} = \sum (x_i - \bar{x})(y_i - \bar{y})$   (9)

$= \sum x_i y_i - \bar{y}\sum x_i - \bar{x}\sum y_i + n\bar{x}\bar{y}$   (10)

$= \sum x_i y_i - 2n\bar{x}\bar{y} + n\bar{x}\bar{y}$   (11)

$= \sum x_i y_i - n\bar{x}\bar{y}.$   (12)

These quantities are simply unnormalized forms of the variances and covariance of $x$ and $y$ given by

$\operatorname{var}(x) = \dfrac{ss_{xx}}{n}$   (13)

$\operatorname{var}(y) = \dfrac{ss_{yy}}{n}$   (14)

$\operatorname{cov}(x,y) = \dfrac{ss_{xy}}{n}.$   (15)

For linear least squares fitting, the coefficient $b$ in

$y = a + bx$   (16)

is given by

$b = \dfrac{n\sum x_i y_i - \sum x_i \sum y_i}{n\sum x_i^2 - \left(\sum x_i\right)^2}$   (17)

$= \dfrac{ss_{xy}}{ss_{xx}},$   (18)

and the coefficient $b'$ in

$x = a' + b'y$   (19)

is given by

$b' = \dfrac{ss_{xy}}{ss_{yy}}.$   (20)

The correlation coefficient $r$ (sometimes also denoted $R$) is then defined by

$r = \sqrt{b b'}$   (21)

$= \dfrac{ss_{xy}}{\sqrt{ss_{xx}\, ss_{yy}}}.$   (22)

The correlation coefficient is also known as the product-moment coefficient of correlation or Pearson's correlation. The correlation..
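A direct sketch of equation (22), assuming NumPy (the helper name correlation_coefficient and the data are illustrative):

```python
import numpy as np

def correlation_coefficient(x, y):
    """r = ss_xy / sqrt(ss_xx ss_yy), per equation (22)."""
    ss_xx = np.sum((x - x.mean())**2)
    ss_yy = np.sum((y - y.mean())**2)
    ss_xy = np.sum((x - x.mean()) * (y - y.mean()))
    return ss_xy / np.sqrt(ss_xx * ss_yy)

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 4.1, 5.9, 8.2, 9.8])
print(correlation_coefficient(x, y))     # close to 1 for near-linear data
print(np.corrcoef(x, y)[0, 1])           # cross-check against NumPy
```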

Least squares fitting--perpendicular offsets

In practice, the vertical offsets from a line (polynomial, surface, hyperplane, etc.) are almost always minimized instead of the perpendicular offsets. This provides a fitting function for the independent variable $x$ that estimates $y$ for a given $x$ (most often what an experimenter wants), allows uncertainties of the data points along the $x$- and $y$-axes to be incorporated simply, and also provides a much simpler analytic form for the fitting parameters than would be obtained using a fit based on perpendicular offsets. The residuals of the best-fit line for a set of $n$ points using unsquared perpendicular distances $d_i$ of points $(x_i, y_i)$ are given by

$R_\perp = \sum_{i=1}^{n} d_i.$   (1)

Since the perpendicular distance from a line $y = a + bx$ to point $i$ is given by

$d_i = \dfrac{\left|y_i - (a + b x_i)\right|}{\sqrt{1 + b^2}},$   (2)

the function to be minimized is

$R_\perp = \sum_{i=1}^{n} \dfrac{\left|y_i - (a + b x_i)\right|}{\sqrt{1 + b^2}}.$   (3)

Unfortunately, because the absolute value function does not have continuous derivatives, minimizing $R_\perp$ is not amenable to analytic solution. However, if the square of the perpendicular distances

$R_\perp^2 = \sum_{i=1}^{n} \dfrac{\left[y_i - (a + b x_i)\right]^2}{1 + b^2}$   (4)

is minimized instead,..
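Although minimizing the squared perpendicular distances of equation (4) does admit an analytic solution, a short numerical minimization shows the structure clearly. A sketch assuming NumPy and SciPy (the data are hypothetical); it uses the fact that for a fixed slope $b$, the optimal intercept is $a = \bar{y} - b\bar{x}$, reducing the problem to one dimension:

```python
import numpy as np
from scipy.optimize import minimize_scalar

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.8, 4.2, 5.9, 8.3, 9.7])
xbar, ybar = x.mean(), y.mean()

def r_perp_squared(b):
    """Squared perpendicular residual of equation (4), with the intercept
    already optimized out as a = ybar - b * xbar."""
    a = ybar - b * xbar
    return np.sum((y - (a + b * x))**2) / (1 + b**2)

b = minimize_scalar(r_perp_squared).x
a = ybar - b * xbar
print(a, b)
```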

Least squares fitting--logarithmic

Given a function of the form

$y = a + b \ln x,$   (1)

the coefficients can be found from least squares fitting as

$b = \dfrac{n \sum (y_i \ln x_i) - \sum y_i \sum \ln x_i}{n \sum (\ln x_i)^2 - \left(\sum \ln x_i\right)^2}$   (2)

$a = \dfrac{\sum y_i - b \sum \ln x_i}{n}.$   (3)
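A sketch of equations (2)-(3), assuming NumPy (the helper name fit_logarithmic and the test data are illustrative):

```python
import numpy as np

def fit_logarithmic(x, y):
    """Fit y = a + b ln x per equations (2)-(3)."""
    lx = np.log(x)
    n = len(lx)
    b = (n * np.sum(y * lx) - np.sum(y) * np.sum(lx)) / \
        (n * np.sum(lx**2) - np.sum(lx)**2)
    a = (np.sum(y) - b * np.sum(lx)) / n
    return a, b

x = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
print(fit_logarithmic(x, 1.0 + 2.0 * np.log(x)))  # recovers (1.0, 2.0)
```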
