Computer Science Department

Abstracts of 1998 Reports

Technical Report UTEP-CS-98-35, December 1998

Towards Intelligent Virtual Environment for Teaching Telemanipulation Operators: Virtual Tool Approach and its Interval-Based Justification

L. Olac Fuentes and Vladik Kreinovich

Published in:
*Proceedings of the
Second International Workshop on Intelligent Virtual Environments*,
Xalapa, Veracruz, Mexico, September 11-12, 1998, pp. 8-11.

Technical Report UTEP-CS-98-34, December 1998

Interval Image Classification is NP-Hard

Alejandro E. Brito and Vladik Kreinovich

Published in:
*Proceedings of the
Second International Workshop on Intelligent Virtual Environments*,
Xalapa, Veracruz, Mexico, September 11-12, 1998, pp. 12-15.

Feature extraction from images for object classification is a
very hard problem in general. We prove that, under
interval uncertainty, even *linear classification* is NP-hard.

Technical Report UTEP-CS-98-33, December 1998

Towards Optimal Pain Relief: Acupuncture and Spinal Cord Stimulation

Richard Alo, Kenneth Alo, Obinna Ilochonwu, Vladik Kreinovich, and Hoang Phuong Nguyen

Published in:
*Proceedings of the
Second International Workshop on Intelligent Virtual Environments*,
Xalapa, Veracruz, Mexico, September 11-12, 1998, pp. 16-24.

One of the important potential areas of application of intelligent virtual environments is the training of medical doctors. One of the main problems in designing the corresponding intelligent system is the complexity of the underlying computational problems. This complexity is especially high when the corresponding optimization is a discrete optimization problem, e.g., for pain relief methodologies such as acupuncture and spinal cord stimulation. In this paper, we show how to efficiently solve the corresponding discrete optimization problems. As a result, we get, e.g., a theoretical justification for the heuristic "guarded cathode" method.

Technical Report UTEP-CS-98-32, December 1998

Interval Computations, Soft Computing, and Aerospace Applications

Vladik Kreinovich

Published in:
*Proceedings of the
Second International Workshop on Intelligent Virtual Environments*,
Xalapa, Veracruz, Mexico, September 11-12, 1998, pp. 25-41.

Technical Report UTEP-CS-98-31, December 1998

Justification of Heuristic Methods in Data Processing Using Fuzzy Theory, with Applications to Detection of Business Cycles from Fuzzy Data

Vladik Kreinovich, Hung T. Nguyen, and Berlin Wu

Published in *Proceedings of FUZZ-IEEE'99*, Seoul, Korea, August
22-25, 1999, Vol. 2, pp. 1131-1136.
Full version published
in *East-West Journal of Mathematics*, 1999, Vol. 1,
No. 2, pp. 147-157.

In this paper, we show that fuzzy theory can explain heuristic methods in inverse problems and numerical computations. As an example of application of these results, we analyze the problem of detecting business cycles from fuzzy data.

Technical Report UTEP-CS-98-30, October 1998

Towards Foundations for Traditional Oriental Medicine

Hoang Phuong Nguyen, Scott A. Starks, and Vladik Kreinovich

In: Nguyen Hoang Phuong and Ario Ohsato (eds.),
*Proceedings of the Vietnam-Japan Bilateral Symposium
on Fuzzy Systems and Applications VJFUZZY'98*,
HaLong Bay, Vietnam, 30th September-2nd October, 1998, pp. 704-708.

Technical Report UTEP-CS-98-29, September 1998

NP-Hardness in Geometric Construction Problems with One Interval Parameter

Nuria Mata and Vladik Kreinovich

Published as *Universitat Politecnica de Catalunya,
Departament de Llenguatges i Sistemes Informatics
Barcelona, Spain*, Technical Report No. LSI-98-55-R, November 1998;
final version appeared in:
*Proceedings of the Workshop
on Applications of Interval Analysis to Systems and Control
with special emphasis on recent advances in Modal Interval Analysis
MISC'99*, Girona, Spain, February 24-26, 1999, pp. 85-98.

Technical Report UTEP-CS-98-28, September 1998

Fuzzy Justification of Heuristic Methods in Inverse Problems and in Numerical Computations, with Applications to Detection of Business Cycles from Fuzzy and Intuitionistic Fuzzy Data

Vladik Kreinovich, Hung T. Nguyen, Berlin Wu, and Krassimir T. Atanassov

Published in *Notes on Intuitionistic Fuzzy Sets (NIFS)*,
1998, Vol. 4, No. 2, pp. 47-56.

Technical Report UTEP-CS-98-27, August 1998

From Semi-Heuristic Fuzzy Techniques to Optimal Fuzzy Methods: Mathematical Foundations and Applications

Vladik Kreinovich

In: Basil Papadopoulos and Apostolos Syropoulos (eds.),
*Current Trends and Developments in Fuzzy Logic,
Proceedings of the First International Workshop,
Thessaloniki, Greece, October 16-20, 1998,* pp. 1-62.

Fuzzy techniques have been successfully used in various application areas ranging from control to image processing to decision making. In all these applications, there is usually:

- a general idea, and
- several possible implementations of this idea; e.g., we can use:
  - different membership functions,
  - different "and" and "or" operations,
  - different defuzzifications, etc.

If we want to further improve the semi-heuristic "good enough" control or image processing techniques, we must actually make the selection that leads to an optimal fuzzy method.

Even if we know the exact optimality criterion, the resulting optimization problem is so highly non-linear that, often, no known optimization techniques can be used. In many real-life situations, we do not even know the exact criterion: the relative quality of different controls or image processing techniques may be determined not by exact numerical criteria, but rather by expert opinions which are, themselves, fuzzy.
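The "different 'and' and 'or' operations" mentioned above can be made concrete with a small sketch. The three t-norms below are standard; the input degrees are purely illustrative:

```python
# Three common fuzzy "and" operations (t-norms).  Each is a legitimate
# implementation of the same general idea, yet they produce different
# degrees -- this is exactly the selection problem discussed above.

def t_min(a, b):          # Zadeh's original "and"
    return min(a, b)

def t_product(a, b):      # probabilistic "and"
    return a * b

def t_lukasiewicz(a, b):  # Lukasiewicz "and"
    return max(0.0, a + b - 1.0)

a, b = 0.7, 0.6           # illustrative membership degrees
for name, t in [("min", t_min), ("product", t_product),
                ("Lukasiewicz", t_lukasiewicz)]:
    print(f"{name}: {t(a, b):.2f}")
```

Which of the three is "best" depends on the application, which is why a principled optimality criterion is needed.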

In this talk, we describe a general mathematical technique for solving such optimization problems. This technique is based on the group-theoretic (symmetry) methodology, a methodology which is one of the main successful tools of modern physics. This methodology has led us to the justification of many heuristic formulas in fuzzy logic, fuzzy control, neural networks, etc. (many examples are presented in our 1997 Kluwer book with H.T. Nguyen, "Applications of continuous mathematics to computer science").

We also present new applications of this technique, including applications:

- to optimal fuzzy control, and
- to optimal fuzzy image processing.

Technical Report UTEP-CS-98-26, August 1998

Towards Combining Fuzzy and Logic Programming Techniques

Hung T. Nguyen, Vladik Kreinovich, Daniel E. Cooke, Luqi, and Olga Kosheleva

In: Nguyen Hoang Phuong and Ario Ohsato (eds.),
*Proceedings of the Vietnam-Japan Bilateral Symposium
on Fuzzy Systems and Applications VJFUZZY'98*,
HaLong Bay, Vietnam, 30th September-2nd October, 1998, pp. 482-489.

Technical Report UTEP-CS-98-25, August 1998

Towards Formalization of Feasibility, Randomness, and Commonsense Implication: Kolmogorov Complexity, and the Necessity of Considering (Fuzzy) Degrees

Vladik Kreinovich, Luc Longpre, and Hung T. Nguyen

In: Nguyen Hoang Phuong and Ario Ohsato (eds.),
*Proceedings of the Vietnam-Japan Bilateral Symposium
on Fuzzy Systems and Applications VJFUZZY'98*,
HaLong Bay, Vietnam, 30th September-2nd October, 1998, pp. 294-302.

Technical Report UTEP-CS-98-24, August 1998

Cooperative Learning is Better: Explanation Using Dynamical Systems, Fuzzy Logic, and Geometric Symmetries

Vladik Kreinovich, Edye Johnson-Holubec, Leonid K. Reznik, and Misha Koshelev

In: Nguyen Hoang Phuong and Ario Ohsato (eds.),
*Proceedings of the Vietnam-Japan Bilateral Symposium
on Fuzzy Systems and Applications VJFUZZY'98*,
HaLong Bay, Vietnam, 30th September-2nd October, 1998, pp. 154-160.

Technical Report UTEP-CS-98-23, August 1998

How to Describe Partially Ordered Preferences: Mathematical Foundations

Olga Kosheleva, Vladik Kreinovich, Hung T. Nguyen, and Bernadette Bouchon-Meunier

In: Nguyen Hoang Phuong and Ario Ohsato (eds.),
*Proceedings of the Vietnam-Japan Bilateral Symposium
on Fuzzy Systems and Applications VJFUZZY'98*,
HaLong Bay, Vietnam, 30th September-2nd October, 1998, pp. 269-278.

Technical Report UTEP-CS-98-22, August 1998

Uncertainty Representation Explains and Helps Methodology of Physics and Science in General

Misha Koshelev, Vladik Kreinovich, Hung T. Nguyen, and Bernadette Bouchon-Meunier

In: Nguyen Hoang Phuong and Ario Ohsato (eds.),
*Proceedings of the Vietnam-Japan Bilateral Symposium
on Fuzzy Systems and Applications VJFUZZY'98*,
HaLong Bay, Vietnam, 30th September-2nd October, 1998, pp. 577-585.

Technical Report UTEP-CS-98-21, August 1998

Complex Problems: Granularity is Necessary, Granularity Helps

Oscar N. Garcia, Vladik Kreinovich, Luc Longpre, and Hung T. Nguyen

In: Nguyen Hoang Phuong and Ario Ohsato (eds.),
*Proceedings of the Vietnam-Japan Bilateral Symposium
on Fuzzy Systems and Applications VJFUZZY'98*,
HaLong Bay, Vietnam, 30th September-2nd October, 1998, pp. 449-455.

Technical Report UTEP-CS-98-20, August 1998

Possible New Directions in Mathematical Foundations of Fuzzy Technology: A Contribution to the Mathematics of Fuzzy Theory

Hung T. Nguyen and Vladik Kreinovich

In: Nguyen Hoang Phuong and Ario Ohsato (eds.),
*Proceedings of the Vietnam-Japan Bilateral Symposium
on Fuzzy Systems and Applications VJFUZZY'98*,
HaLong Bay, Vietnam, 30th September-2nd October, 1998, pp. 9-32.

Technical Report UTEP-CS-98-19, July 1998

Updated version UTEP-98-19a, June 1999

Final version UTEP-98-19b, October 1999

Interval Methods in Non-Destructive Testing of Material Structures

Keith Worden, Roberto Osegueda, Carlos Ferregut, Soheil Nazarian, Debra L. George, Mary J. George, Vladik Kreinovich, Olga Kosheleva, and Sergio Cabrera

Published in *Reliable Computing*, 2001, Vol. 7, No. 4,
pp. 341-352.

In many practical situations, e.g., in aerospace applications and in mammography, it is important to test the structural integrity of material structures. We show that interval methods can help.

Technical Report UTEP-CS-98-18, July 1998

Beyond Interval Systems: What is Feasible and What is Algorithmically Solvable?

Vladik Kreinovich

Published in: Panos M. Pardalos (ed.),
*Approximation and Complexity in Numerical Optimization:
Continuous and Discrete Problems*, Kluwer, Dordrecht, 1999,
pp. 364-379.

In many real-life applications of interval computations, the desired quantities appear (in a good approximation to reality) as a solution to a system of interval linear equations. It is known that such systems are difficult to solve (NP-hard) but still algorithmically solvable. If instead of the (approximate) interval linear systems, we consider more realistic (and more general) formulations, will the corresponding problems still be algorithmically solvable?

We consider three natural generalizations of interval linear systems: to conditions which are more general than linear systems, to multi-intervals instead of intervals, and to dynamics (differential and difference equations) instead of statics (linear and algebraic equations).

We show that the problem is still algorithmically solvable for non-linear systems and even for more general conditions, and it is still solvable if we consider linear or non-linear systems with multi-intervals instead of intervals. However, generalized conditions with multi-interval uncertainty are already algorithmically unsolvable. For dynamics: difference equations are still algorithmically solvable, differential equations are, in general, unsolvable.

Technical Report UTEP-CS-98-17, June 1998

A Simplified Version of the Tomography Problem Can Help to Estimate the Errors of Indirect Measurements

V. Kreinovich

In: Ali Mohamad-Djafari (ed.),
*Bayesian Inference for Inverse Problems*,
Proceedings of the SPIE/International Society for Optical
Engineering, Vol. 3459, San Diego, CA, 1998, pp. 106-115.

In many real-life situations, it is very difficult or even impossible
to directly measure the quantity *y* in which we are interested: e.g., we
cannot directly measure a distance to a distant galaxy or the amount
of oil in a given well. Since we cannot measure such quantities
*directly*, we can measure them *indirectly*: by first
measuring some relating quantities *x1,...,xn*, and then by
using the known relation between *xi* and *y* to reconstruct the
value of the desired quantity *y*.

In practice, it is often very important to estimate the error of the resulting indirect measurement. In this paper, we show that in a natural statistical setting, the problem of estimating the error of indirect measurement can be formulated as a simplified (finite-dimensional) version of a tomography problem (reconstructing an image from its projections).

In this paper, we use the ideas of invariance (natural in a statistical setting) to find the optimal algorithm for solving this simplified tomography problem and, thus, for solving the statistical problem of error estimation for indirect measurements.
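As a baseline for the error-estimation problem discussed above, here is a hedged sketch of the standard linearized error-propagation estimate for an indirect measurement y = f(x1,...,xn). The function f and the numbers below are made up for illustration; this is the textbook starting point, not the paper's tomography-based method:

```python
import math

def partial(f, xs, i, h=1e-6):
    """Numerical partial derivative of f at xs w.r.t. component i."""
    xp = list(xs); xp[i] += h
    xm = list(xs); xm[i] -= h
    return (f(xp) - f(xm)) / (2 * h)

def indirect_error(f, xs, sigmas):
    """Std. deviation of y = f(x) for independent errors sigma_i,
    via the standard first-order (linearized) propagation formula."""
    return math.sqrt(sum((partial(f, xs, i) * s) ** 2
                         for i, s in enumerate(sigmas)))

# Illustrative example: density from separately measured mass and volume.
f = lambda x: x[0] / x[1]          # y = m / V
xs = [10.0, 2.0]                   # measured values (made up)
sigmas = [0.1, 0.05]               # measurement errors (made up)
print(indirect_error(f, xs, sigmas))
```

The formula assumes small, independent errors; the paper's point is precisely how to go beyond such simple settings.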

Technical Report UTEP-CS-98-16, June 1998

Multi-Spectral Inverse Problems in Satellite Image Processing

S. A. Starks and V. Kreinovich

In: Ali Mohamad-Djafari (ed.),
*Bayesian Inference for Inverse Problems*,
Proceedings of the SPIE/International Society for Optical
Engineering, Vol. 3459, San Diego, CA, 1998, pp. 138-146.

Satellite imaging is nowadays one of the main sources of geophysical and environmental information. It is, therefore, extremely important to be able to solve the corresponding inverse problem: reconstruct the actual geophysics- or environment-related image from the observed noisy data.

Traditional image reconstruction techniques have been developed for the case when we have a single observed image. This case corresponds to a single satellite photo. Existing satellites (e.g., Landsat) take photos in several (up to 7) wavelengths. To process this multi-spectral information, we can use known reasonable multi-image modifications of the existing single-image reconstruction techniques. These modifications, basically, handle each image separately and then try to merge the resulting information.

Currently, a new generation of imaging satellites (Lewis) is being launched that will enable us to collect visual images for about 500 different wavelengths. This two-orders-of-magnitude increase in the amount of data should lead to a similar increase in processing time, but, surprisingly, it does not. An analysis and explanation of this paradoxical simplicity is given in the paper.

Technical Report UTEP-CS-98-15, June 1998

Case Study of Non-Linear Inverse Problems: Mammography and Non-Destructive Evaluation

O. Kosheleva, S. Cabrera, R. Osegueda, C. Ferregut, S. Nazarian, D. L. George, M. J. George, V. Kreinovich, and K. Worden

In: Ali Mohamad-Djafari (ed.),
*Bayesian Inference for Inverse Problems*,
Proceedings of the SPIE/International Society for Optical
Engineering, Vol. 3459, San Diego, CA, 1998, pp. 128-135.

The inverse problem is usually difficult because the signal (image) that we want to reconstruct is weak. Since it is weak, we can usually neglect quadratic and higher-order terms and consider the problem to be linear. Since the problem is linear, methods of solving this problem are also, mainly, linear (with the notable exception of the need to take into account, e.g., that the actual image is non-negative).

In most real-life problems, this linear description works pretty well. However, at some point, when we start looking for better accuracy, we must take non-linear terms into consideration. This may be a minor improvement for normal image processing, but these non-linear terms may lead to a major improvement if we are interested in outliers such as faults in non-destructive evaluation or bumps in mammography. Non-linear terms (quadratic or cubic) give a great relative push to large outliers, and thus, in these non-linear terms, the effect of irregularities dominates. The presence of non-linear terms can therefore serve as a good indication of the presence of irregularities.

We describe the results of experiments in which these non-linear terms are indeed helpful.
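The effect described above can be seen in a toy numeric sketch (not the paper's actual experiments; all data are made up): a cubic statistic reacts far more strongly to a single large outlier than a linear one does.

```python
# Higher-order terms give a much larger relative push to one large
# outlier than linear terms do, so they flag irregularities such as
# faults or bumps against a regular background.

def moments(signal):
    n = len(signal)
    lin = sum(abs(x) for x in signal) / n        # linear statistic
    cub = sum(abs(x) ** 3 for x in signal) / n   # cubic statistic
    return lin, cub

smooth = [0.1] * 100                 # regular background response
faulty = [0.1] * 99 + [1.0]          # same background + one outlier

lin_s, cub_s = moments(smooth)
lin_f, cub_f = moments(faulty)
print("linear term grows by ", lin_f / lin_s)    # modest growth
print("cubic term grows by  ", cub_f / cub_s)    # dramatic growth
```

The linear statistic barely changes, while the cubic one grows by an order of magnitude, which is the detection effect exploited above.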

Technical Report UTEP-CS-98-14, June 1998

Kolmogorov Complexity, Statistical Regularization of Inverse Problems, and Birkhoff's Formalization Of Beauty

V. Kreinovich, L. Longpre, and M. Koshelev

In: Ali Mohamad-Djafari (ed.),
*Bayesian Inference for Inverse Problems*,
Proceedings of the SPIE/International Society for Optical
Engineering, Vol. 3459, San Diego, CA, 1998, pp. 159-170.

Most practical applications of statistical methods are based on the implicit assumption that if an event has a very small probability, then it cannot occur. For example, the probability that a kettle placed on a cold stove would start boiling by itself is not 0, it is positive, but it is so small, that physicists conclude that such an event is simply impossible.

This assumption is difficult to formalize in traditional probability theory, because this theory only describes measures on sets (e.g., for an inverse problem, on the set of all functions) and does not allow us to divide functions into "random" (possible) and non-random ("impossible") ones. This distinction was made possible by the idea of algorithmic randomness, introduced by Kolmogorov and his student Martin-Lof in the 1960s.

We show that this idea can also be used for inverse problems. In particular, we prove that for every probability measure, the corresponding set of random functions is compact, and, therefore, the correspondingly restricted inverse problem is well-defined.

The resulting technique turns out to be interestingly related to the qualitative esthetic measure introduced by G. Birkhoff as order/complexity.

Technical Report UTEP-CS-98-13, May 1998

Updated version UTEP-CS-98-13a, June 1998

Fair Division under Interval Uncertainty

Ronald R. Yager and Vladik Kreinovich

Published in *International Journal of Uncertainty, Fuzziness, and
Knowledge-Based Systems (IJUFKS)*, 2000, Vol. 8, No. 5, pp. 611-618.

It is often necessary to divide a certain amount of money among n
participants, i.e., to assign, to each participant, a certain
*portion* w(i)>=0 of the whole sum (so that w(1)+...+w(n)=1).
In some situations, from the fairness
requirements, we can uniquely determine these "weights" w(i).
However, in
some other situations, general considerations do not allow us to
uniquely determine these weights; we only know the
*intervals* [w-(i),w+(i)] of possible fair weights. We show that
natural fairness requirements enable us to choose unique weights from
these intervals; as a result, we present an algorithm for fair
division under interval uncertainty.
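One natural interval fair-division rule, sketched below purely as an illustration (the paper's axiomatic construction is not reproduced here and may differ), moves the same fraction k into every interval so that the weights sum to 1:

```python
# Illustrative rule: give every participant the same fraction k of
# their interval's slack, with k chosen so the weights sum to 1.
# Assumes sum(lo) <= 1 <= sum(hi) and at least one non-degenerate
# interval (so the slack is positive).

def fair_weights(lo, hi):
    slack = sum(h - l for l, h in zip(lo, hi))
    k = (1.0 - sum(lo)) / slack      # same fraction for everyone
    return [l + k * (h - l) for l, h in zip(lo, hi)]

lo = [0.1, 0.2, 0.3]                 # lower fair weights (made up)
hi = [0.3, 0.4, 0.5]                 # upper fair weights (made up)
w = fair_weights(lo, hi)
print(w, sum(w))
```

By construction each w(i) lies in its interval [w-(i), w+(i)] and the weights sum to 1, so the rule always produces a feasible division.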

Technical Report UTEP-CS-98-12, May 1998

Updated version UTEP-CS-98-12a, June 1999

Decision Making under Interval Probabilities

Ronald R. Yager and Vladik Kreinovich

Published in *International Journal of Approximate Reasoning*,
Vol. 22, No. 3, pp. 195-215.

If we know the probabilities p(1),...,p(n) of different situations
s1,...,sn, then we can choose a decision Ai for which the
expected benefit C(i)=p(1)*c(i,1)+...+p(n)*c(i,n) takes
the largest possible value, where c(i,j) denotes the benefit of
decision Ai in situation sj. In many real-life situations,
however, we do not know the *exact* values of the probabilities p(j);
we only know the *intervals* [p-(j),p+(j)] of possible
values of these probabilities. In order to make decisions under such
interval probabilities, we would like to generalize the notion of
expected benefits to interval probabilities. In this
paper, we show that natural requirements lead to a unique (and easily
computable) generalization. Thus, we have a natural way of decision
making under interval probabilities.
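As a first step toward such a generalization, one can compute the exact range [C-, C+] of the expected benefit over all probability vectors consistent with the intervals. The greedy sketch below is illustrative only and is not claimed to be the paper's unique generalization:

```python
def expected_range(c, lo, hi):
    """Range [C-, C+] of sum_j p(j)*c(j) over probability vectors with
    p(j) in [lo(j), hi(j)] and sum_j p(j) = 1.
    Assumes sum(lo) <= 1 <= sum(hi).  Greedy: start every p(j) at its
    lower bound, then pour the remaining mass onto the best (or worst)
    benefits first."""
    def extreme(best_first):
        order = sorted(range(len(c)), key=lambda j: c[j],
                       reverse=best_first)
        p = list(lo)
        mass = 1.0 - sum(lo)
        for j in order:
            add = min(mass, hi[j] - lo[j])
            p[j] += add
            mass -= add
        return sum(pj * cj for pj, cj in zip(p, c))
    return extreme(False), extreme(True)

# Illustrative two-situation example (all numbers made up).
lo_c, hi_c = expected_range([10.0, 0.0], [0.2, 0.2], [0.6, 0.6])
print(lo_c, hi_c)    # range of expected benefit, about 4.0 .. 6.0
```

A decision rule then still has to pick a single value from this range, which is exactly where the paper's natural requirements come in.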

Technical Report UTEP-CS-98-11, March 1998

Towards the Use of Aesthetics in Decision Making: Kolmogorov Complexity Formalizes Birkhoff's Idea

Misha Koshelev, Vladik Kreinovich, and Yeung Yam

Decision making is traditionally based on utilitarian
criteria such as cost, efficiency, time, etc. These criteria are
reasonably easy to formalize; hence, for such criteria,
we can select the best decision by solving the corresponding
well-defined optimization problem.
In many engineering projects, however, e.g., in designing cars,
buildings, airplanes, etc., an important additional
criterion which needs to be
satisfied is that the designed object should be good-looking. This
additional criterion is difficult to formalize and, because of that, it is
rarely taken into consideration in formal decision making.
In the 1930s, the famous mathematician
G. D. Birkhoff proposed a formula that describes
beauty in terms of "order" and "complexity". In the simplest
cases, he formalized these notions and showed that his formula
indeed works. However, since there was no general notion of
complexity, he was unable to formalize his idea in the general
case. In this paper, we show that the existing computer-based general
notion of object complexity (called *Kolmogorov complexity*) leads to a
reasonable formalization of Birkhoff's idea and thus makes it
possible to take aesthetic criteria into consideration in decision
making.

Technical Report UTEP-CS-98-10, February 1998

An Operationalistic Reformulation of Einstein's Equivalence Principle

V. Ya. Kreinovich and R. R. Zapatrine

Published in *Gravitacija*, 1997, Vol. 3, No. 2, pp. 51-59 (in
Russian).

Technical Report UTEP-CS-98-9, February 1998

Encryption Algorithms Made (Somewhat) More Natural (A Pedagogical Remark)

Misha Koshelev, Vladik Kreinovich, and Luc Longpre

Short version published in
*Bulletin of the European Association for Theoretical
Computer Science (EATCS)*, 1999, Vol. 67, pp. 153-156;
full version published in
*Inroads: ACM SIGCSE Bulletin*, 1999, Vol. 31, No. 4,
pp. 50-51.

Modern cryptographic algorithms, such as DES, IDEA, etc., are
very complex and therefore difficult to learn.
Textbooks explain in detail *how* these algorithms work,
but they usually do not explain *why* these algorithms were
designed as they were. In this paper,
we explain why, and thus, hopefully, make cryptographic algorithms
easier to learn.
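One example of the kind of "why" such an explanation can provide: DES-style ciphers are built from Feistel rounds, which are invertible even when the round function F is not, so designers are free to use an arbitrary mixing function. A toy sketch (this is not DES itself; the round function and constants are made up for illustration):

```python
# Toy 8-bit Feistel network.  The point: decryption works by replaying
# the rounds in reverse order, with no need to invert F -- which is
# why Feistel designs can use arbitrary, non-invertible mixing.

def F(half, key):                     # deliberately non-invertible
    return (half * key + 0x9E) & 0xFF

def feistel_encrypt(left, right, keys):
    for k in keys:
        left, right = right, left ^ F(right, k)
    return left, right

def feistel_decrypt(left, right, keys):
    for k in reversed(keys):
        left, right = right ^ F(left, k), left
    return left, right

keys = [0x13, 0x57, 0x9B]             # illustrative round keys
ct = feistel_encrypt(0xAB, 0xCD, keys)
print(ct, feistel_decrypt(ct[0], ct[1], keys) == (0xAB, 0xCD))
```

Nothing about F has to be undone during decryption; each round only needs the same F value again, which the swapped halves make available.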

Technical Report UTEP-CS-98-8, January 1998

Application of Kolmogorov Complexity to Image Compression: It is Possible to Have a Better Compression, but It is Not Possible to Have the Best One

Sanjeev Subbaramu, Ann Q. Gates, and Vladik Kreinovich

Published in *Bulletin of the European Association for Theoretical
Computer Science (EATCS)*, 1999, Vol. 69, pp. 145-150.

Technical Report UTEP-CS-98-7, January 1998

Telemanipulation: The Virtual Tool Approach and its Interval-Based Justification

Vladik Kreinovich and L. Olac Fuentes

In: Goetz Alefeld and
Raul A. Trejo (eds.), *Interval Computations and its Applications to
Reasoning Under Uncertainty, Knowledge Representation,
and Control Theory. Proceedings of MEXICON'98, Workshop on Interval
Computations, 4th World Congress
on Expert Systems*, Mexico City, Mexico, 1998.

Technical Report UTEP-CS-98-6, January 1998

Run-Time Correctness Checking is Algorithmically Undecidable for Pointer Data Structures

Mikhail Auguston, Vladik Kreinovich, and Luc Longpre

Programs routinely use complicated pointer-based data structures such as linked lists, doubly linked lists, different types of trees, etc. These data structures are usually defined *inductively*: e.g., a tree can be defined as a structure that results from the empty tree by an arbitrary sequence of additions and deletions of elements.

When the program runs, these data structures
take dynamically changing shapes.
To test program correctness, it is important to check, at run-time,
whether a current shape is a correct implementation of the
corresponding structure.
Algorithms are known for checking the "shape correctness" of *basic*
pointer-based data structures such as linked lists, binary trees, etc.

In this paper, we show that the *general* problem -
verifying that a given shape is an
instance of an inductively defined data structure -
is algorithmically undecidable.
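The decidable base case mentioned above can be sketched directly. Here is an illustrative run-time shape check for a doubly linked list (class and function names are made up); the paper's result is that no such checker exists for arbitrary inductively defined pointer structures:

```python
# Run-time shape check for a doubly linked list: the traversal from
# the head must be acyclic, and every node's back-pointer must point
# at the node we just came from.

class Node:
    def __init__(self, value):
        self.value = value
        self.prev = None
        self.next = None

def is_well_formed_dll(head):
    seen = set()
    prev, node = None, head
    while node is not None:
        if id(node) in seen:          # revisited a node => cycle
            return False
        seen.add(id(node))
        if node.prev is not prev:     # back-pointer inconsistency
            return False
        prev, node = node, node.next
    return True

a, b = Node(1), Node(2)
a.next, b.prev = b, a
print(is_well_formed_dll(a))          # well-formed two-node list
b.next = a                            # introduce a cycle
print(is_well_formed_dll(a))          # shape check now fails
```

The check runs in linear time and space; the undecidability result says this success cannot be extended to every inductively defined structure.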

Technical Report UTEP-CS-98-5, January 1998

Ordinal Explanation of the Periodic System of Chemical Elements

Eric R. Scerri, Vladik Kreinovich, Piotr Wojciechowski, and Ronald R. Yager

Published in the *International Journal of Uncertainty, Fuzziness, and
Knowledge-Based Systems (IJUFKS)*, 1998, Vol. 6, No. 4, pp. 387-399.

Textbooks often claim that quantum mechanics explained the periodic system: namely, the actual configuration of electronic orbits that is responsible for the element's chemical properties can be described as the one that minimizes the total energy, and the energy of each configuration can be computed by using quantum mechanics.

However, a careful analysis of this explanation reveals that,
in addition to the basic equations of quantum mechanics, we need some
*heuristic* rules that do not directly follow from quantum
physics. One reason why additional heuristics are necessary is that
the corresponding numerical equations are extremely difficult to
solve, and as we move to atoms with larger and larger atomic numbers Z,
they become even more difficult. Moreover, as Z grows,
we must take relativistic effects into consideration, and this means
going from partial differential equations to even more mathematically
difficult operator equations.

In this paper, we show that if instead of the (often impossible)
*numerical* optimization, we consider the (available)
*ordinal* information, we can then explain the observed periodic
system.
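As an illustration of what ordinal (order-only) information about orbital energies looks like, the sketch below implements the empirical Madelung (n+l) filling rule: orbitals fill in order of increasing n+l, ties broken by smaller n. This is a standard textbook rule, shown here only to illustrate ordinal reasoning; it is not claimed to be the paper's exact construction:

```python
# Order orbitals (n, l) by the Madelung rule: increasing n + l,
# ties broken by smaller n.  This ordinal rule already reproduces,
# e.g., the famous fact that 4s fills before 3d.

L_NAMES = "spdf"

def filling_order(max_n=4):
    orbitals = [(n, l) for n in range(1, max_n + 1) for l in range(n)]
    orbitals.sort(key=lambda nl: (nl[0] + nl[1], nl[0]))
    return [f"{n}{L_NAMES[l]}" for n, l in orbitals]

print(filling_order())
# ['1s', '2s', '2p', '3s', '3p', '4s', '3d', '4p', '4d', '4f']
```

Note that no orbital energies are computed; only their *order* is used, which is the kind of ordinal information the paper exploits.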

Technical Report UTEP-CS-98-4, January 1998

Human Visual Perception and Kolmogorov Complexity: Revisited

Vladik Kreinovich and Luc Longpre

Published in the *Bulletin of the European Association for
Theoretical Computer Science (EATCS),* 1998, Vol. 64, pp. 155-158.

Experiments have shown
that we can only memorize images up to a
certain complexity level, after which, instead of memorizing the image
itself, we, sort of, memorize a probability distribution in terms of
which this image is "random" (in the intuitive sense of this word),
and next time, we reproduce a "random" *sample* from
this distribution. This random sample may be *different* from the
original image, but since it belongs to the same distribution, it,
hopefully, *correctly reproduces the statistical characteristics* of
the original image.

The reason why a complex image cannot be accurately memorized is, probably, that our memory is limited. If storing the image itself exhausts this memory, we store its probability distribution instead. With this limitation in mind, we conclude that we cannot store arbitrary probability distributions either, only sufficiently simple ones.

In this paper, we show that an arbitrary image is indeed either itself simple, or it can be generated by a simple probability distribution. This result provides a mathematical foundation for the above theoretical explanation of the randomized-memorization phenomenon.

Technical Report UTEP-CS-98-3, January 1998

On Geometry of Radio Antenna Placements

Olga Kosheleva, Vladik Kreinovich, Andrei M. Finkelstein, and Steven Chan

Published in *Geombinatorics*, 1998, Vol. 8, No. 2, pp. 46-49.

Technical Report UTEP-CS-98-2b, March 1998

Decision Making Based on Satellite Images: Optimal Fuzzy Clustering Approach

Vladik Kreinovich, Hung T. Nguyen, Scott A. Starks, and Yeung Yam

Published in *Proceedings of the 37th
IEEE Conference on Decision and Control CDC'98*,
Tampa, Florida, December 16-18, 1998, pp. 4246-4251.

In many real-life decision-making
situations, in particular, in
processing satellite images, we have an enormous amount of information
to process. To speed up the information processing, it is reasonable
to first classify the situations into a few meaningful classes
(clusters), find the best decision for each class, and then, for each
new situation, to apply the decision which is the best for the
corresponding class. One of the most efficient clustering
methodologies is *fuzzy clustering*, which is based on the use of
fuzzy logic. Usually, *heuristic* clusterings are used, i.e.,
methods which are selected based on their
empirical efficiency rather than on their proven optimality. Because
of the importance of the corresponding decision-making situations, it is
desirable to theoretically analyze these empirical choices.
In this paper, we formulate the problem of choosing the *optimal* fuzzy
clustering as a precise mathematical problem, and we show that in the
simplest cases, the empirically best fuzzy clustering methods are
indeed optimal.
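For concreteness, here is a minimal sketch of the standard fuzzy c-means heuristic, the kind of empirically chosen clustering method whose optimality the paper analyzes; the 1-D data, deterministic initialization, and parameters are illustrative:

```python
# Minimal fuzzy c-means: alternate between computing fuzzy membership
# degrees and recomputing centers as membership-weighted means.

def fcm(data, c=2, m=2.0, iters=50):
    span = max(data) - min(data)
    # deterministic init: spread centers across the data range
    centers = [min(data) + (i + 0.5) * span / c for i in range(c)]
    e = 2.0 / (m - 1.0)
    for _ in range(iters):
        u = []                         # u[k][i]: membership of point k
        for x in data:                 # in cluster i
            d = [abs(x - v) + 1e-12 for v in centers]
            u.append([1.0 / sum((d[i] / d[j]) ** e for j in range(c))
                      for i in range(c)])
        centers = [sum(u[k][i] ** m * data[k] for k in range(len(data)))
                   / sum(u[k][i] ** m for k in range(len(data)))
                   for i in range(c)]
    return sorted(centers)

data = [0.9, 1.0, 1.1, 4.9, 5.0, 5.1]  # two obvious groups (made up)
print(fcm(data))                        # two centers, near 1.0 and 5.0
```

The fuzziness exponent m and the distance-based membership formula are exactly the kind of heuristic choices whose optimality the paper examines.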

Technical Report UTEP-CS-98-2, January 1998

Optimal Choices of Potential Functions in Fuzzy Clustering

Vladik Kreinovich, Hung T. Nguyen, and Yeung Yam

Published by The Chinese University of Hong Kong, Department of Mechanical and Automation Engineering, as Technical Report CUHK-MAE-98-001, January 1998.

Fuzzy logic-based clustering techniques are widely used in situations where statistical assumptions are not valid. Whether in estimating cluster centers for model identification purposes or in determining clusters, the existing techniques are essentially based on the choice of some potential function. As in any design problem of this kind, the choice of such a function has to be justified on a theoretical basis. In this work, we set up a decision framework and show that the optimal potential functions are exactly the ones used in the current techniques.

Technical Report UTEP-CS-98-1, January 1998

On How to Merge Sorted Lists Coming from Different Web Search Tools

Ronald R. Yager and Vladik Kreinovich

Published in *Soft Computing*, 1999, Vol. 3, pp. 83-88.

Different web search tools often complement each other. So, if we want to have good coverage of all relevant web items, a reasonable strategy is to use several different search tools and then merge the resulting sorted lists. But how should we merge them? In this paper, we formulate reasonable axioms for the merging procedure and describe all mergings that satisfy these axioms.
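One reasonable merging procedure, shown purely as an illustration (the paper characterizes all procedures satisfying its axioms, which need not single out this one), is Borda-style averaging of ranks:

```python
# Merge ranked result lists by average rank (Borda-style).  An item
# missing from a list is ranked just past that list's end; ties are
# broken alphabetically so the result is deterministic.  All item
# names are made up.

def merge_ranked(lists):
    items = {x for lst in lists for x in lst}
    def avg_rank(x):
        return sum(lst.index(x) if x in lst else len(lst)
                   for lst in lists) / len(lists)
    return sorted(items, key=lambda x: (avg_rank(x), x))

engine_a = ["p1", "p2", "p3"]      # results from one search tool
engine_b = ["p2", "p1", "p4"]      # results from another
print(merge_ranked([engine_a, engine_b]))
# ['p1', 'p2', 'p3', 'p4']
```

Whether this particular rule satisfies a given set of merging axioms is exactly the kind of question the paper addresses.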