SEMAINE D'ÉTUDE SUR LE RÔLE DE L'ANALYSE ÉCONOMÉTRIQUE ETC. 397
rem of Wold (12). That theorem states that the inconsistency
of least squares will be small the smaller are the correlations
between the explanatory variables in the equation to be estim-
ated and the disturbance from that equation and also the smaller
is the variance of that disturbance. A perhaps more illuminat-
ing way of looking at the same thing is to consider the disturb-
ance as made up of a linear combination of omitted variables.
The inconsistencies in the parameter estimates can then be
shown to be equal to the coefficients of the multiple regression
of the disturbance term on the explanatory variables (13).
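This identity can be checked numerically. The following sketch is ours, not the paper's: the model, coefficients, and sample size are chosen for illustration only. It verifies that the (large-sample) OLS inconsistency equals the regression coefficient of the disturbance on the explanatory variable.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000  # large sample, so sample moments approximate probability limits

# One explanatory variable x, mildly correlated with the disturbance u
# (small correlation and small disturbance variance, as in the theorem).
x = rng.normal(size=n)
u = 0.1 * x + rng.normal(scale=0.5, size=n)

beta = 2.0          # true structural coefficient (illustrative)
y = beta * x + u

# OLS slope of y on x: b = (x'x)^{-1} x'y = beta + (x'x)^{-1} x'u
b = (x @ y) / (x @ x)

# Regression coefficient of the disturbance on x
gamma = (x @ u) / (x @ x)

# The OLS inconsistency b - beta coincides with gamma (here about 0.1),
# and it is small precisely because corr(x, u) and Var(u) are small.
print(b - beta, gamma)
```

Algebraically the two quantities are identical term by term, so the match is exact up to floating-point error; the point of the simulation is that both are small when the correlation and the disturbance variance are small.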
For our purposes, the Proximity Theorem shows that if
(R.1), (R.2), and (R.3) or (R.3*) hold approximately, the
inconsistency of ordinary least squares will be small. Indeed,
that inconsistency will be small in a given equation if the
appropriate columns of (2.7), premultiplied by the inverse of
the variance-covariance matrix of the variables appearing on
the right of that equation, are small.
Since that inverse enters the ordinary least squares parameter
estimates in precisely the same way, we may say that (roughly)
relative inconsistencies will be small provided that W(o) and
W(x) are small. Thus, if all terms above the diagonal in A
are nearly zero; if cross-equation covariance between contem-
porary disturbances is small; and if there is little serial cor
relation, ordinary least squares will not do too badly.
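In compact notation (the symbols below are ours, not the paper's), the inconsistency just described can be written as

\[
\operatorname{plim}\bigl(\hat{\beta}_{\mathrm{OLS}} - \beta\bigr)
  \;=\; M_{xx}^{-1}\, m_{xu},
\]

where \(M_{xx}\) is the probability limit of the second-moment matrix of the variables on the right of the equation and \(m_{xu}\) is the vector of their covariances with the disturbance. When those covariances and the disturbance variance are small, \(m_{xu}\) is small and hence so is the inconsistency, which is the content of the Proximity Theorem.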
Unfortunately, there is reason to believe that this will not
generally be the case. The arguments given above for the
(12) WOLD and JURÉEN [34, p. 189 and pp. 37-38]. The Proximity
Theorem as stated by WOLD is one concerning bias; we discuss inconsistency
since unbiasedness is not in any case a property of least squares in models
with lagged endogenous variables. See HURWICZ [14].
(13) See FISHER [8] and THEIL [31].