A HIERARCHICAL CLUSTERING BASED ON MUTUAL INFORMATION MAXIMIZATION
| Content Provider | CiteSeerX |
|---|---|
| Author | M. Aghagolzadeh, H. Soltanian-Zadeh, B. Araabi, A. Aghagolzadeh |
| Abstract | Mutual information has been used in many clustering algorithms to measure general dependencies between random data variables, but the difficulty of computing it for small datasets has limited its usefulness for clustering in many applications. A novel clustering method is proposed that estimates mutual information from the information potential computed pair-wise between data points, without any prior assumption about the cluster density function. The proposed algorithm increases the mutual information at each step of an agglomerative hierarchical scheme. We show experimentally that maximizing the mutual information between data points and their class labels leads to efficient clustering. Experiments on a variety of artificial and real datasets show the superiority of this algorithm, as well as its low computational complexity, in comparison to other information-based clustering methods and some ordinary clustering algorithms. Index Terms — agglomerative hierarchical clustering, information potential, mutual information (MI), Renyi’s entropy |
| File Format | |
| Access Restriction | Open |
| Subject Keyword | Mutual Information, Information Potential, Index Term, Small Size Datasets, Many Application, Agglomerative Hierarchy Scheme, Real Datasets, Hierarchical Clustering, Ordinary Clustering Algorithm, Prior Assumption, Low Computational Complexity, General Dependency, Novel Clustering Method, Random Data Variable, Cluster Density Function, Class Label, Efficient Clustering, Clustering Method |
| Content Type | Text |
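
The abstract describes an agglomerative hierarchy that greedily merges clusters so as to increase a mutual-information objective estimated from pair-wise information potentials. The paper's exact estimator is not reproduced here; the following is a minimal illustrative sketch, assuming a Gaussian-kernel information potential (the pairwise-kernel quantity underlying Renyi's quadratic entropy) and a simplified merge rule that joins the pair of clusters whose union has the highest information potential. The function names and the `sigma` kernel width are hypothetical choices for the sketch, not from the paper.

```python
import numpy as np

def information_potential(X, sigma=1.0):
    """Mean pairwise Gaussian kernel exp(-||xi - xj||^2 / (2 sigma^2)).

    This is the empirical 'information potential' associated with Renyi's
    quadratic entropy: higher values mean the points are more concentrated.
    """
    X = np.asarray(X, dtype=float)
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2)).mean()

def agglomerate(X, n_clusters, sigma=1.0):
    """Simplified agglomerative scheme (illustrative, not the paper's method).

    Start from singleton clusters and repeatedly merge the pair whose union
    has the highest information potential, i.e. the most mutually
    concentrated pair, until n_clusters remain.
    """
    X = np.asarray(X, dtype=float)
    clusters = [[i] for i in range(len(X))]
    while len(clusters) > n_clusters:
        best_score, best_pair = -np.inf, None
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                score = information_potential(X[clusters[a] + clusters[b]], sigma)
                if score > best_score:
                    best_score, best_pair = score, (a, b)
        a, b = best_pair
        clusters[a] += clusters.pop(b)  # merge cluster b into cluster a
    return clusters
```

On two well-separated groups of points, the greedy merges stay within each group, so asking for two clusters recovers the groups; the O(n^3)-per-merge pairwise scan is kept deliberately naive for clarity.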