Glossary

Table 21: Glossary Week 10
Term Description
autocorrelation Autocorrelation exists when the value of one error term allows us to predict the value of another error term. Under the no-autocorrelation assumption, the covariances between the error terms must therefore be zero
collinearity If one variable is functionally dependent on another, we call the two collinear. Should this apply to several variables at the same time, we call them multicollinear. Perfect (multi)collinearity exists if, and only if, the values follow this function exactly. Such a situation is rare, as variables are usually related to one another more loosely
heteroscedasticity The variance of the error terms changes across different observations of \(X\)
homoscedasticity Refers to a constant variance of the error terms
law of iterated expectations According to this law, “the expected value of X is equal to the expectation over the conditional expectation of X given Y” (Robinson, 2022); formally, \(\mathrm{E}[X] = \mathrm{E}[\mathrm{E}[X \mid Y]]\)
multicollinearity A situation in which an independent variable is a function of multiple other independent variables in the regression model
omitted variable bias If you omit a relevant variable from a regression model, the estimated coefficients of the included variables will be biased if the omitted variable is correlated with them.
robust standard errors Also known as Huber-White standard errors, correct for heteroscedasticity by “[adjusting] the model-based standard errors using the empirical variability of the model residuals” (Mansournia et al., 2020, p. 347)
time-series data Time-series data record a certain characteristic over time \(t\), where \(t\) runs from 1 to \(T\)
variance-covariance matrix A square matrix that holds the variances of the individual variables on its diagonal and the covariances between pairs of variables in its off-diagonal elements.
variance inflation factor (VIF) A measure quantifying how much larger the observed variance of a coefficient is compared to a scenario in which the variable were completely functionally independent of the other independent variables in the model. Formally, \(\mathrm{VIF}_j = 1/(1 - R_j^2)\), where \(R_j^2\) is the \(R^2\) from regressing variable \(j\) on the remaining regressors
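The variance-covariance matrix entry can be made concrete with NumPy (a minimal sketch, not taken from the course material; note that `np.cov` uses the sample estimator with `ddof=1` by default):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=500)
y = 2 * x + rng.normal(size=500)  # y is positively correlated with x

# np.cov expects rows to be variables and columns to be observations.
V = np.cov(np.vstack([x, y]))

# Diagonal entries are the variances of x and y;
# the off-diagonal entries are their (symmetric) covariance.
print(V)
```

The symmetry of the matrix follows directly from \(\mathrm{Cov}(x, y) = \mathrm{Cov}(y, x)\).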