autocorrelation
|
Autocorrelation is present when the value of one error term helps predict the value of another error term, i.e., when the error terms have non-zero covariances. The standard no-autocorrelation assumption requires these covariances to be zero.
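Writing \(\varepsilon_i\) for the error term of observation \(i\), the no-autocorrelation assumption can be stated as
\[
\mathrm{Cov}(\varepsilon_i, \varepsilon_j) = 0 \quad \text{for all } i \neq j ,
\]
and autocorrelation is present whenever this covariance is non-zero for some pair of observations.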
|
collinearity
|
If two variables are functionally dependent on each other, we call them collinear. Should this apply to multiple variables at the same time, we call them multicollinear. (Multi-)collinearity exists if and only if the values follow this function exactly. Such a situation is rare, as variables are usually more loosely related to one another.
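For illustration, exact collinearity between two variables means that one can be written as a precise function of the other, for example
\[
X_2 = a + b X_1
\]
for some constants \(a\) and \(b\); if the relationship holds only approximately, the variables are correlated but not perfectly collinear.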
|
heteroscedasticity
|
The variability of the error terms changes across observations, i.e., for different values of \(X\).
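In symbols, writing \(\varepsilon_i\) for the error term of observation \(i\), heteroscedasticity means that the error variance carries an observation index,
\[
\mathrm{Var}(\varepsilon_i \mid X_i) = \sigma_i^2 ,
\]
so that \(\sigma_i^2\) may differ from one observation to the next.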
|
homoscedasticity
|
Refers to a constant variance of the error terms across all observations.
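In symbols,
\[
\mathrm{Var}(\varepsilon_i \mid X_i) = \sigma^2 \quad \text{for all } i ,
\]
i.e., the same error variance \(\sigma^2\) applies to every observation.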
|
multicollinearity
|
A situation in which an independent variable can be expressed as a function of multiple other independent variables in the regression model.
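A classic example is the dummy variable trap: if \(X_1\) and \(X_2\) are indicator variables that always sum to one, then
\[
X_2 = 1 - X_1 ,
\]
and including both of them together with an intercept produces perfect multicollinearity.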
|
omitted variable bias
|
If you omit an important variable from a regression model, the estimated coefficients of the included variables will be biased whenever the omitted variable is correlated with them.
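A standard two-regressor illustration: suppose the true model is
\[
y = \beta_0 + \beta_1 X_1 + \beta_2 X_2 + \varepsilon
\]
but \(X_2\) is omitted from the estimated model. The coefficient on \(X_1\) then has expectation
\[
E[\hat{\beta}_1] = \beta_1 + \beta_2 \delta_1 ,
\]
where \(\delta_1\) is the slope from regressing \(X_2\) on \(X_1\); the bias term \(\beta_2 \delta_1\) vanishes only if \(X_2\) is irrelevant (\(\beta_2 = 0\)) or uncorrelated with \(X_1\) (\(\delta_1 = 0\)).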
|
robust standard errors
|
Also known as Huber-White standard errors, these correct for heteroscedasticity by adjusting the model-based standard errors using the empirical variability of the model residuals.
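A minimal sketch of how this looks in Python with the statsmodels library; the simulated data, variable names, and the choice of the HC1 variant are illustrative assumptions rather than part of the definition:

import numpy as np
import statsmodels.api as sm

# Simulated data with heteroscedastic errors (illustrative only)
rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = 1.0 + 2.0 * x + rng.normal(scale=0.5 + np.abs(x), size=200)

X = sm.add_constant(x)
classical = sm.OLS(y, X).fit()              # model-based standard errors
robust = sm.OLS(y, X).fit(cov_type="HC1")   # Huber-White (robust) standard errors

print(classical.bse)  # classical standard errors
print(robust.bse)     # heteroscedasticity-robust standard errors

The coefficient estimates are identical in both fits; only the standard errors (and hence the test statistics) change.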
|
time-series data
|
Time-series data track a certain characteristic over time \(t\), where \(t\) runs from 1 to \(T\).
|
Variance Inflation Factor (VIF)
|
A measure that quantifies how much larger the observed variance of a coefficient is compared to a scenario in which the variable is completely functionally independent of the other independent variables in the model.
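For predictor \(X_j\), the VIF is computed as \(\mathrm{VIF}_j = 1 / (1 - R_j^2)\), where \(R_j^2\) is the \(R^2\) from regressing \(X_j\) on the other independent variables. A minimal sketch in Python with statsmodels; the simulated predictors and column names are illustrative assumptions:

import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

# Simulated predictors; x2 is loosely related to x1 (illustrative only)
rng = np.random.default_rng(1)
x1 = rng.normal(size=200)
x2 = 0.8 * x1 + rng.normal(scale=0.5, size=200)
x3 = rng.normal(size=200)
X = sm.add_constant(pd.DataFrame({"x1": x1, "x2": x2, "x3": x3}))

# VIF_j = 1 / (1 - R_j^2), computed column by column
vif = {col: variance_inflation_factor(X.values, i) for i, col in enumerate(X.columns)}
print(vif)

Values near 1 indicate little overlap with the other predictors; large values (a common rule of thumb is above 10) signal problematic multicollinearity.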
|