Publication Date
3-2017
Abstract
Most data processing techniques traditionally used in scientific and engineering practice are statistical. These techniques are based on the assumption that we know the probability distributions of measurement errors, etc.
In practice, we often do not know the distributions; we only know a bound D on the measurement accuracy -- hence, after we get the measurement result X, the only information that we have about the actual (unknown) value x of the measured quantity is that x belongs to the interval [X − D, X + D]. Techniques for data processing under such interval uncertainty are called interval computations; these techniques have been developed since the 1950s.
In many practical problems, we have a combination of different types of uncertainty, where we know the probability distribution for some quantities, intervals for other quantities, and expert information for yet other quantities. The purpose of this paper is to describe the theoretical background for interval and combined techniques and to briefly describe the existing practical applications.
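As a brief illustration of the interval-computation setting described in the abstract, the following sketch (a hypothetical example, not code from the paper) shows how the interval [X − D, X + D] obtained from a measurement can be propagated through simple arithmetic operations:

```python
# Minimal sketch of interval arithmetic (illustrative only).
# A measurement result X with accuracy bound D tells us only that the actual
# value lies in [X - D, X + D]; data processing then tracks guaranteed ranges.

class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    @classmethod
    def from_measurement(cls, X, D):
        # Measurement X with accuracy bound D -> interval [X - D, X + D]
        return cls(X - D, X + D)

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __sub__(self, other):
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __mul__(self, other):
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Interval(min(products), max(products))

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"


if __name__ == "__main__":
    # Two measured quantities, each known only up to its accuracy bound:
    x1 = Interval.from_measurement(X=10.0, D=0.1)   # x1 in [9.9, 10.1]
    x2 = Interval.from_measurement(X=3.0, D=0.05)   # x2 in [2.95, 3.05]

    # Interval computations give guaranteed bounds on derived quantities:
    print("x1 + x2 =", x1 + x2)   # [12.85, 13.15]
    print("x1 * x2 =", x1 * x2)   # [29.205, 30.805]
```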
Comments
Technical Report: UTEP-CS-17-22
To appear in Proceedings of the Eighth International Conference on the Applications of Digital Information and Web Technologies ICADIWT'2017, Ciudad Juarez, Chihuahua, Mexico, March 29-31, 2017