On metric entropy, Vapnik-Chervonenkis dimension, and learnability for a class of distributions (1989)
| Content Provider | CiteSeerX |
|---|---|
| Author | Kulkarni, Sanjeev R. |
| Abstract | In [23], Valiant proposed a formal framework for distribution-free concept learning which has generated a great deal of interest. A fundamental result regarding this framework was proved by Blumer et al. [6] characterizing those concept classes which are learnable in terms of their Vapnik-Chervonenkis (VC) dimension. More recently, Benedek and Itai [4] studied learnability with respect to a fixed probability distribution (a variant of the original distribution-free framework) and proved an analogous result characterizing learnability in this case. They also stated a conjecture regarding learnability for a class of distributions. In this report, we first point out that the condition for learnability obtained in [4] is equivalent to the notion of finite metric entropy (which has been studied in other contexts). Some relationships, in addition to those shown in [4], between the VC dimension of a concept class and its metric entropy with respect to various distributions are then discussed. Finally, we prove some partial results regarding learnability for a class of distributions. |
| File Format | |
| Language | English |
| Publisher Date | 1989-01-01 |
| Access Restriction | Open |
| Subject Keyword | Metric Entropy, Vapnik-Chervonenkis Dimension, Concept Class, Analogous Result, Great Deal, Partial Result, VC Dimension, Distribution-free Concept, Formal Framework, Fixed Probability Distribution, Various Distribution, Original Distribution-free Framework, Finite Metric Entropy, Fundamental Result |
| Content Type | Text |
| Resource Type | Technical Report |
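
The abstract's central notion, the Vapnik-Chervonenkis (VC) dimension of a concept class, can be illustrated concretely. The following sketch (not from the report; the function names and the brute-force approach are illustrative assumptions) checks which finite point sets a concept class shatters, i.e. on which sets it realizes all 2^n labelings, and reports the largest such set size over a small finite domain:

```python
from itertools import combinations

def labelings(concepts, points):
    """Set of distinct label vectors the concept class induces on the points."""
    return {tuple(c(x) for x in points) for c in concepts}

def shatters(concepts, points):
    """True if the class realizes all 2^n possible labelings on the points."""
    return len(labelings(concepts, points)) == 2 ** len(points)

def vc_dimension(concepts, domain, max_d=5):
    """Largest d (up to max_d) such that some d-point subset is shattered."""
    d = 0
    for k in range(1, max_d + 1):
        if any(shatters(concepts, subset)
               for subset in combinations(domain, k)):
            d = k
    return d

# Example concept class: intervals [a, b] over a small finite domain.
# Intervals shatter any 2 points but no 3 (the labeling 1,0,1 on
# ordered points is unrealizable), so their VC dimension is 2.
domain = range(6)
intervals = [lambda x, a=a, b=b: a <= x <= b
             for a in domain for b in domain]

print(vc_dimension(intervals, domain))  # prints 2
```

Blumer et al.'s characterization, cited in the abstract, says that a concept class is distribution-free learnable exactly when this quantity is finite.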