Publication Date
9-2003
Abstract
Often, we need to divide n objects into clusters based on the value of a certain quantity x. For example, we can classify insects in a cotton field into groups based on their size and other geometric characteristics. Within each cluster, we usually have a unimodal distribution of x, with a probability density d(x) that increases up to a certain value x0 and then decreases. It is therefore natural, based on d(x), to define a cluster as the interval between two local minima, i.e., as a union of adjacent increasing and decreasing segments. In this paper, we describe a feasible algorithm for solving this problem.
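To make the clustering criterion concrete, the following is a minimal illustrative sketch, not the paper's own algorithm: it estimates d(x) with a histogram (the bin count and the helper name cluster_by_density_minima are assumptions made here for illustration), finds local minima of the estimate, and uses them as cluster boundaries.

```python
import numpy as np

def cluster_by_density_minima(x, bins=20):
    """Split 1-D data into clusters at the local minima of a
    histogram-based estimate of the density d(x).

    Illustrative sketch only: clusters are taken to be the intervals
    between adjacent local minima of the estimated density.
    """
    x = np.asarray(x, dtype=float)
    counts, edges = np.histogram(x, bins=bins)

    # A bin is a local minimum if its count is strictly smaller than
    # both neighboring bins; its center becomes a cluster boundary.
    boundaries = []
    for i in range(1, len(counts) - 1):
        if counts[i] < counts[i - 1] and counts[i] < counts[i + 1]:
            boundaries.append(0.5 * (edges[i] + edges[i + 1]))

    # Assign each point to the interval between adjacent boundaries.
    labels = np.searchsorted(boundaries, x)
    return labels, boundaries

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two unimodal groups, e.g. "small" and "large" insects.
    x = np.concatenate([rng.normal(1.0, 0.2, 300),
                        rng.normal(3.0, 0.3, 300)])
    labels, boundaries = cluster_by_density_minima(x)
    print("cluster boundaries:", boundaries)
    print("cluster sizes:", np.bincount(labels))
```

A histogram is only one possible density estimate; the same boundary-at-local-minima idea applies to any estimate of d(x).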
Comments
Technical Report: UTEP-CS-03-02a (original file: UTEP-CS-03-02)
Published in Numerical Algorithms, 2004, Vol. 37, pp. 225-232.