Why are regression problems called "regression" problems?
I was just wondering why regression problems are called "regression" problems. What is the story behind the name? One definition for regression: "Relapse to a less perfect or developed state."
Regression with multiple dependent variables? - Cross Validated
Is it possible to have a (multiple) regression equation with two or more dependent variables? Sure, you could run two separate regression equations, one for each DV, but that doesn't seem like it ...
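For the purely mechanical side of fitting several DVs at once (leaving aside the inferential questions the thread is about), here is a minimal sketch assuming scikit-learn and synthetic data: LinearRegression accepts a two-dimensional target, which amounts to fitting one OLS equation per column of Y.

```python
# Minimal sketch, assuming scikit-learn and synthetic data: a 2-D target Y
# makes LinearRegression fit one coefficient vector per dependent variable.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))                                            # three predictors
Y = X @ rng.normal(size=(3, 2)) + rng.normal(scale=0.1, size=(100, 2))   # two DVs

model = LinearRegression().fit(X, Y)
print(model.coef_.shape)   # (2, 3): one row of coefficients per dependent variable
```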
regression - When is R squared negative? - Cross Validated
Also, for OLS regression, $R^2$ is the squared correlation between the predicted and the observed values. Hence, it must be non-negative. For simple OLS regression with one predictor, this is equivalent to the squared correlation between the predictor and the dependent variable -- again, this must be non-negative.
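A quick numerical check of that claim, assuming NumPy and an OLS fit that includes an intercept (the equivalence can fail without one):

```python
# R^2 from the usual 1 - SS_res/SS_tot definition vs. the squared correlation
# between fitted and observed values; they agree for OLS with an intercept.
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=200)
y = 2.0 + 0.5 * x + rng.normal(size=200)

X = np.column_stack([np.ones_like(x), x])        # design matrix with intercept
beta, *_ = np.linalg.lstsq(X, y, rcond=None)     # OLS coefficients
y_hat = X @ beta

r_squared = 1.0 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
corr_sq = np.corrcoef(y_hat, y)[0, 1] ** 2
print(np.isclose(r_squared, corr_sq))            # True
```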
How should outliers be dealt with in linear regression analysis ...
What statistical tests or rules of thumb can be used as a basis for excluding outliers in linear regression analysis? Are there any special considerations for multiple linear regression?
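One common screening approach (a sketch assuming statsmodels, meant to flag candidates for inspection rather than to justify automatic deletion) is to look at externally studentized residuals and Cook's distance:

```python
# Flag outlying/influential points via studentized residuals and Cook's distance.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
x = rng.normal(size=50)
y = 1.0 + 2.0 * x + rng.normal(size=50)
y[0] += 10                                     # inject an artificial outlier

fit = sm.OLS(y, sm.add_constant(x)).fit()
infl = fit.get_influence()

stud_resid = infl.resid_studentized_external   # externally studentized residuals
cooks_d, _ = infl.cooks_distance               # Cook's distance per observation

# Common rules of thumb: |studentized residual| > 3 or Cook's D > 4/n.
flagged = np.where((np.abs(stud_resid) > 3) | (cooks_d > 4 / len(y)))[0]
print(flagged)
```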
correlation - What is the difference between linear regression on y ...
The Pearson correlation coefficient of x and y is the same, whether you compute pearson(x, y) or pearson(y, x). This suggests that doing a linear regression of y given x or x given y should be the ...
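The asymmetry is easy to see numerically (a sketch assuming NumPy): the correlation is the same either way, but the slope of y on x and the slope of x on y differ unless the two variables have equal variance.

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(size=500)
y = 0.5 * x + rng.normal(size=500)

# Correlation is symmetric in its arguments.
print(np.isclose(np.corrcoef(x, y)[0, 1], np.corrcoef(y, x)[0, 1]))   # True

# The two regression slopes are not: cov(x, y)/var(x) vs. cov(x, y)/var(y).
slope_y_on_x = np.cov(x, y)[0, 1] / np.var(x, ddof=1)
slope_x_on_y = np.cov(x, y)[0, 1] / np.var(y, ddof=1)
print(slope_y_on_x, slope_x_on_y)
```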
How to describe or visualize a multiple linear regression model
Then this simplified version can be shown visually as a simple regression. I'm still confused about this in spite of going through appropriate material on the topic. Can someone please explain how to "explain" a multiple linear regression model and how to show it visually?
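One standard way to show a multiple regression visually is an added-variable (partial regression) plot per predictor: plot the residuals of y regressed on the other predictors against the residuals of the chosen predictor regressed on the same set. A sketch assuming NumPy and matplotlib on synthetic data (statsmodels' plot_partregress_grid produces similar plots):

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(4)
X = rng.normal(size=(200, 3))
y = 1.0 + X @ np.array([0.5, -1.0, 0.0]) + rng.normal(size=200)

def residuals(target, predictors):
    """Residuals of an OLS fit of target on predictors (with an intercept)."""
    Z = np.column_stack([np.ones(len(target)), predictors])
    beta, *_ = np.linalg.lstsq(Z, target, rcond=None)
    return target - Z @ beta

j = 0                                      # which predictor to display
others = np.delete(X, j, axis=1)
plt.scatter(residuals(X[:, j], others), residuals(y, others), s=10)
plt.xlabel(f"x{j} adjusted for the other predictors")
plt.ylabel("y adjusted for the other predictors")
plt.show()
```

The least-squares slope through this scatter equals that predictor's coefficient in the full multiple regression, which is what makes the plot a faithful one-predictor-at-a-time summary.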
regression - Trying to understand the fitted vs residual plot? - Cross Validated
A good residual vs fitted plot has three characteristics: The residuals "bounce randomly" around the 0 line. This suggests that the assumption that the relationship is linear is reasonable. The res...
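A minimal way to produce such a plot, assuming statsmodels and matplotlib on synthetic data where the linear assumption holds, so the residuals should indeed scatter randomly around zero:

```python
import numpy as np
import statsmodels.api as sm
import matplotlib.pyplot as plt

rng = np.random.default_rng(5)
x = rng.uniform(0, 10, size=200)
y = 3.0 + 1.5 * x + rng.normal(size=200)

fit = sm.OLS(y, sm.add_constant(x)).fit()

plt.scatter(fit.fittedvalues, fit.resid, s=10)   # residuals vs. fitted values
plt.axhline(0, color="grey", linestyle="--")
plt.xlabel("Fitted values")
plt.ylabel("Residuals")
plt.show()
```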
regression - Converting standardized betas back to original variables ...
I have a problem where I need to standardize the variables and run the ridge regression to calculate the ridge estimates of the betas. I then need to convert these back to the scale of the original variables.
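The back-transformation is purely algebraic, so it applies to ridge estimates just as it does to OLS. A sketch assuming both the predictors and the response were z-scored before fitting (if only X was standardized, drop the $s_y$ factor):

```python
import numpy as np

def unstandardize(beta_std, X, y):
    """Map coefficients fitted on z-scored X and y back to the raw scale:
    b_j = b_j_std * s_y / s_xj, intercept = ybar - sum_j b_j * xbar_j."""
    x_mean, x_sd = X.mean(axis=0), X.std(axis=0, ddof=1)
    y_mean, y_sd = y.mean(), y.std(ddof=1)
    beta = beta_std * y_sd / x_sd
    intercept = y_mean - np.dot(beta, x_mean)
    return intercept, beta
```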
How does the correlation coefficient differ from regression slope?
The regression slope measures the "steepness" of the linear relationship between two variables and can take any value from $-\infty$ to $+\infty$. Slopes near zero mean that the response (Y) variable changes slowly as the predictor (X) variable changes.
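In the simple-regression case the two quantities are linked by a rescaling: with $r_{xy}$ the correlation and $s_x$, $s_y$ the sample standard deviations,
$$\hat{\beta}_{y \text{ on } x} = r_{xy}\,\frac{s_y}{s_x},$$
so the slope is the bounded, unitless correlation stretched by the ratio of spreads, which is why the slope can take any value in $(-\infty, +\infty)$ while $r_{xy}$ stays in $[-1, 1]$.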
regression - Difference between forecast and prediction ... - Cross Validated
I was wondering what the difference and the relation are between forecast and prediction, especially in time series and regression. For example, am I correct that: In time series, forecasting seems to mea...