To this point we have completed our minimal specifications for our
example; now we proceed to analyse these specifications. We have
uncertain quantities $Y_1$ and $Y_2$, with educated guesses for
their locations: expectations $\mathrm{E}(Y_1)$ and $\mathrm{E}(Y_2)$; and for the accuracy
of these guesses: their variances $\mathrm{Var}(Y_1)$ and $\mathrm{Var}(Y_2)$. We have also
the observable quantities $X_1$ and $X_2$; expectations and
variances for them; and covariances linking $Y_1$ and $Y_2$ with
$X_1$ and $X_2$. Additionally we have some data on
$X_1$ and $X_2$. The learning process essentially consists of modifying our
expectations for $Y_1$ and $Y_2$, and of improving the accuracy
of these expectations in the sense of reducing variances, in the light
of the information contained in $X_1$ and $X_2$. The terms we use for such
modified expectations and variances are adjusted expectations and
adjusted variances, and inter alia we obtain them by
adjusting the belief structure $B$, whose base is $(Y_1, Y_2)$,
by the belief structure $D$, whose base is $(X_1, X_2)$.
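As a preview, stated in standard Bayes linear notation (the symbols $\mathrm{E}_D(B)$ and $\mathrm{Var}_D(B)$ for the adjusted expectation and adjusted variance are assumed here, ahead of their formal definition), the adjustment of $B$ by $D$ amounts to evaluating
\[
\mathrm{E}_D(B) = \mathrm{E}(B) + \mathrm{Cov}(B,D)\,\mathrm{Var}(D)^{-1}\bigl(D - \mathrm{E}(D)\bigr),
\]
\[
\mathrm{Var}_D(B) = \mathrm{Var}(B) - \mathrm{Cov}(B,D)\,\mathrm{Var}(D)^{-1}\,\mathrm{Cov}(D,B),
\]
where $\mathrm{Var}(D)$ is taken to be invertible; otherwise a generalized inverse is used.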
Recall that by belief structure we mean the entirety of specifications
over a particular base: essentially the covariance matrix and the
expectations for the quantities in the base. One belief structure is
adjusted by another via covariances specified between the two underlying
bases.
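Concretely for the example, and as a sketch using the symbols above, the specifications collected so far assemble into the vectors and matrices
\[
\mathrm{E}(B) = \begin{pmatrix} \mathrm{E}(Y_1) \\ \mathrm{E}(Y_2) \end{pmatrix}, \qquad
\mathrm{Var}(B) = \begin{pmatrix} \mathrm{Var}(Y_1) & \mathrm{Cov}(Y_1,Y_2) \\ \mathrm{Cov}(Y_2,Y_1) & \mathrm{Var}(Y_2) \end{pmatrix},
\]
\[
\mathrm{Cov}(B,D) = \begin{pmatrix} \mathrm{Cov}(Y_1,X_1) & \mathrm{Cov}(Y_1,X_2) \\ \mathrm{Cov}(Y_2,X_1) & \mathrm{Cov}(Y_2,X_2) \end{pmatrix},
\]
with $\mathrm{E}(D)$ and $\mathrm{Var}(D)$ formed analogously over the base $(X_1, X_2)$. It is the cross-covariance matrix $\mathrm{Cov}(B,D)$ that links the two structures and drives the adjustment.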
Whilst one of the aims of the analysis is to modify expectations and
variances for $B$ in the light of data, let us remember that before
we see any data, part of our learning process is to assess exactly
how the data will be used when it comes. To use the analogy of a
traditional statistical estimation procedure, we usually wish to examine
not only the ``estimate'' but also the ``estimator''
and its properties. Following such examination, when the data arrives we
obtain estimates and then check for consistency between what we expected
to happen, and what actually happened.
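A brief sketch of what such a pre-data examination involves, in the same notation as above: viewed before $D$ is observed, the adjusted expectation $\mathrm{E}_D(B)$ is itself a random quantity, with
\[
\mathrm{E}\bigl[\mathrm{E}_D(B)\bigr] = \mathrm{E}(B), \qquad
\mathrm{Var}\bigl[\mathrm{E}_D(B)\bigr] = \mathrm{Cov}(B,D)\,\mathrm{Var}(D)^{-1}\,\mathrm{Cov}(D,B) = \mathrm{Var}(B) - \mathrm{Var}_D(B),
\]
so that the prior variance of $B$ separates into a portion we expect the data to resolve and a portion that will remain after adjustment. Comparing the observed change in expectation with this anticipated resolution is one natural consistency check of the kind just described.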