Definitions of the terms used in the Uncertainty in AI class

Calibration of a measuring instrument: comparison of the results of using this instrument with the results of measuring the same quantity by a more accurate measuring instrument; it is used to determine the probability distribution of the measurement errors.

Cdf: an abbreviation for Cumulative distribution function (see).

Cumulative distribution function: the function F(X) that describes, for each possible value X of a quantity, the probability that the actual value x of this quantity does not exceed X: F(X) = Prob(x ≤ X).

Entropy: average number of binary (yes-no) questions that we need to ask to determine the actual value (or, in the continuous case, to determine the actual value with a given accuracy ε). When we have n alternatives with probabilities p1, ..., pn, this average number is given by Shannon's formula −p1 * log2(p1) − ... − pn * log2(pn).
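As an illustration of Shannon's formula (a sketch, not part of the class materials), the helper below computes the entropy of a discrete distribution in bits:

```python
import math

def entropy(probs):
    """Shannon entropy, in bits: -p1*log2(p1) - ... - pn*log2(pn).
    Terms with p = 0 contribute nothing, so they are skipped."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Two equally likely alternatives: one yes-no question suffices.
print(entropy([0.5, 0.5]))              # 1.0
# Four equally likely alternatives: two questions.
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
```

Note that for n equally likely alternatives the entropy is log2(n), matching the intuition of halving the possibilities with each question.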

Expected utility of an action: if this action has several possible outcomes a1, ..., an with probabilities p1, ..., pn and utilities u1, ..., un, then the utility of this action -- known as the expected utility -- is equal to p1 * u1 + ... + pn * un; a rational person should select the action with the largest expected utility.
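A minimal sketch of this formula; the two actions and their probabilities/utilities below are made-up illustrations, not from the class:

```python
def expected_utility(probs, utils):
    """Expected utility p1*u1 + ... + pn*un of an action."""
    return sum(p * u for p, u in zip(probs, utils))

# Hypothetical choice: a risky action (50% chance of utility 100, else 0)
# versus a safe action (utility 40 for sure).
risky = expected_utility([0.5, 0.5], [100.0, 0.0])  # 50.0
safe = expected_utility([1.0], [40.0])              # 40.0
# A rational agent here prefers the risky action, since 50.0 > 40.0.
```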

Expected value of a quantity: if we have several values v1, ..., vn with probabilities p1, ..., pn, then the expected value is equal to p1 * v1 + ... + pn * vn.
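The same sum can be computed directly; the fair-die example below is an illustration of the formula, not part of the class materials:

```python
def expected_value(probs, values):
    """Expected value p1*v1 + ... + pn*vn of a discrete quantity."""
    return sum(p * v for p, v in zip(probs, values))

# A fair six-sided die: values 1..6, each with probability 1/6.
print(expected_value([1 / 6] * 6, [1, 2, 3, 4, 5, 6]))  # ≈ 3.5
```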

Fuzzy techniques: techniques for translating expert knowledge -- which is described by using imprecise words from natural language, such as "approximately" or "small" -- into computer-understandable numbers.

Fuzzy uncertainty: when we only know the expert's estimate described by using imprecise words from natural language -- such as "approximately" or "small".

Interval uncertainty: when for some quantity, we only know the interval [L,U] of its possible values -- but we have no information about the probability of different values within this interval.

Least squares approach: we want to describe how the quantity y depends on a quantity x; to determine this dependence, we can use several situations k = 1, ..., N in which we know both the value xk of x and the value yk of y; out of several possible models f(x) of this dependence, we then select the one for which the following sum of the squares is the smallest: (f(x1) − y1)^2 + ... + (f(xN) − yN)^2.
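For the special case of a linear model f(x) = a + b*x, the least-squares criterion has a well-known closed-form solution; the sketch below uses the standard simple-linear-regression formulas (an illustration, not the class's prescribed implementation):

```python
def fit_line(xs, ys):
    """Least-squares fit of f(x) = a + b*x to points (xk, yk),
    using the closed-form formulas for simple linear regression."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x
    return a, b

# Points that lie exactly on y = 1 + 2*x, so the fit recovers a = 1, b = 2:
a, b = fit_line([0, 1, 2, 3], [1, 3, 5, 7])
print(a, b)  # 1.0 2.0
```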

Mathematical expectation: same as expected value (see).

Maximum Likelihood: if we have several possible models for a process, then we select the model for which the probability of observed events is the largest.
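A small sketch of this idea, with made-up data: among several candidate models of a coin's bias p, we pick the one under which the observed flips are most probable (log-probabilities are compared, which preserves the ordering):

```python
import math

def log_likelihood(p, heads, tails):
    """Log-probability of observing the given coin-flip counts
    if each flip lands heads with probability p."""
    return heads * math.log(p) + tails * math.log(1 - p)

# Observed: 7 heads and 3 tails. Candidate models for the bias p:
candidates = [0.3, 0.5, 0.7, 0.9]
best = max(candidates, key=lambda p: log_likelihood(p, 7, 3))
print(best)  # 0.7
```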

Measurement error: the difference X − x between the result X of measuring a quantity and the actual value x of this quantity.

Moment of a random quantity: if we have several values v1, ..., vn with probabilities p1, ..., pn, then the k-th order moment Mk is equal to p1 * v1^k + ... + pn * vn^k.
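A sketch of this formula (not part of the class materials); note that the first moment is the expected value, and the second moment minus the square of the first is the variance:

```python
def moment(probs, values, k):
    """k-th order moment Mk = p1*v1^k + ... + pn*vn^k."""
    return sum(p * v ** k for p, v in zip(probs, values))

# Values 0 and 2, each with probability 0.5:
probs, values = [0.5, 0.5], [0.0, 2.0]
m1 = moment(probs, values, 1)  # 1.0  (the expected value)
m2 = moment(probs, values, 2)  # 2.0
variance = m2 - m1 ** 2        # 1.0
```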

P-box: same as probability box (see).

Partial derivative: if we have a function of several variables, then to get its partial derivative with respect to each variable, we consider all other variables to be constants; for example, for f(x,y) = x^2 − 2x*y + y^2, the partial derivative with respect to x is equal to 2x − 2y + 0 = 2x − 2y.
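The example above can be checked numerically (a sketch using a central difference, with y held constant; not a class-prescribed method):

```python
def partial_x(f, x, y, h=1e-6):
    """Numerical partial derivative of f with respect to x:
    a central difference with y treated as a constant."""
    return (f(x + h, y) - f(x - h, y)) / (2 * h)

f = lambda x, y: x ** 2 - 2 * x * y + y ** 2
# Analytically, the partial derivative is 2x - 2y; at (3, 1) this is 4.
print(partial_x(f, 3.0, 1.0))  # ≈ 4.0
```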

Probability box: when we do not know the exact values of the cdf F(x) (see), but we know bounds L(x) and U(x) on the cdf, i.e., we know that L(x) ≤ F(x) ≤ U(x) for all x.

Probabilistic uncertainty: when we know which values of the quantity are possible, and we know the probability of each possible value.

Standard measuring instrument: a measuring instrument that is much more accurate than the instrument that we use for our measurements; it is used to calibrate the measuring instrument that we use.

Subjective probability of an event E: we select some possible gain G (e.g., $100); the subjective probability is then the probability p for which the expert cannot decide which of the following two alternatives is better -- since they are of equal value to him/her:
- receiving the gain G if the event E occurs, or
- receiving the gain G with probability p (e.g., in a lottery).

Taylor series: a representation of a function as the sum of powers of the unknown(s): f(x) = a0 + a1 * x + a2 * x^2 + ...; this is how computers compute all special functions like exp(x), sin(x), etc.: by computing the sum of several terms in the Taylor series.
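For example, exp(x) = 1 + x + x^2/2! + x^3/3! + ...; the sketch below (an illustration, not how any particular math library is actually implemented) sums the first terms of this series:

```python
import math

def exp_taylor(x, n_terms=20):
    """Approximate exp(x) by summing the first n_terms of its
    Taylor series 1 + x + x^2/2! + x^3/3! + ..."""
    total, term = 0.0, 1.0  # term starts at x^0/0! = 1
    for k in range(n_terms):
        total += term
        term *= x / (k + 1)  # next term: x^(k+1)/(k+1)!
    return total

print(exp_taylor(1.0))  # ≈ 2.718281828..., i.e., e
print(abs(exp_taylor(1.0) - math.e) < 1e-9)  # True
```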

Uncertainty means that we do not have full information about the situation.

Utility of an alternative A: we select two extreme alternatives:
- a very bad alternative A− (worse than anything we expect to encounter), and
- a very good alternative A+ (better than anything we expect to encounter).
The utility of the alternative A is the probability p for which the expert cannot decide which of the following two alternatives is better -- since they are of equal value to him/her:
- the alternative A itself, or
- a lottery in which we get A+ with probability p and A− with probability 1 − p.