Blog Archives

Topic Archive: VC-dimension

Friday, March 4, 2016, 2:00 pm, GC 4102 (Science Center)

Vapnik-Chervonenkis dimension and the mind’s eye

Brown University -- Division of Applied Mathematics

Google engineers routinely train query classifiers, for ranking advertisements or search results, on more examples than any human being hears in a lifetime. A human being who sees a meaningfully new image every second for one hundred years will not see as many images as Google has in its libraries for training object detectors and image classifiers. Children learn more efficiently, achieving nearly perfect competence on about 30,000 categories in their first eight years. Upper bounds on the number of training samples needed to learn a classifier with similar competence can be derived using the Vapnik-Chervonenkis dimension, or the metric entropy, but these bounds suggest that not only Google but all of evolution might fall short of enough examples.
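For reference (this gloss is not part of the abstract): the sample-complexity bound alluded to above is the standard one from the realizable PAC model, in which a hypothesis class of VC-dimension $d$ can be learned to error at most $\epsilon$ with probability at least $1-\delta$ from

$$ m \;=\; O\!\left(\frac{1}{\epsilon}\left(d \log \frac{1}{\epsilon} + \log \frac{1}{\delta}\right)\right) $$

labelled examples, while $\Omega\!\left(\frac{d + \log(1/\delta)}{\epsilon}\right)$ examples are necessary. For human-scale values of $d$ and $\epsilon$, $m$ dwarfs any realistic training set, which is the tension the talk addresses.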

I will discuss machine learning and human learning, with a focus on representation. I will argue that brains simulate rather than classify, and I will suggest that the rich relational information available in an imagined world argues against abstraction and in favor of topological, almost literal, representation. I will speculate about physiological mechanisms that would support topologically organized neuronal activity patterns.

This talk is sponsored by the Initiative for the Theoretical Sciences at the CUNY Graduate Center.

CUNY Logic Workshop, Friday, May 1, 2015, 2:00 pm, GC 6417

Uri Andrews

A Local Characterization of VC-minimality

University of Wisconsin

(Joint work with Vincent Guingona.) I’ll talk about a problem in the intersection of computable model theory and classical model theory. The notion of VC-minimality, though intriguing, has proven very difficult to work with. It has even been difficult to check whether familiar examples are VC-minimal. This has led model theorists (chiefly my co-author) to ask whether there was a “local” (read “simpler”) characterization of VC-minimality. I suggested a computable model-theoretic version of this question in terms of the index set of VC-minimality. We answered this question by giving a local characterization of VC-minimality, and we showed that VC-minimality is a $\Pi^0_4$-complete notion.

Model theory seminar, Friday, May 9, 2014, 12:30 pm, GC 6417

Artem Chernikov

VC-dimension in model theory and other subjects

Université Paris Diderot - Paris 7

Finite VC-dimension, a combinatorial property of families of sets, was discovered simultaneously by Vapnik and Chervonenkis in the context of probabilistic learning theory, and by Shelah in model theory in the context of the classification of unstable first-order theories (where it is called NIP). From the model-theoretic point of view it is a very attractive setting, generalizing stability and o-minimality and admitting a deep theory which has recently been used to study ordered and valued fields. I will give an overview of some results around NIP related to set theory (counting Dedekind cuts in infinite linear orders), topological dynamics, and compression schemes in computational learning theory.
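For readers unfamiliar with the terminology (this note is not part of the abstract): a family $\mathcal{F}$ of subsets of a set $X$ shatters $A \subseteq X$ if $\{A \cap F : F \in \mathcal{F}\} = \mathcal{P}(A)$, and the VC-dimension of $\mathcal{F}$ is the largest size of a shattered subset (infinite if arbitrarily large sets are shattered). In model-theoretic terms, a formula $\varphi(x, y)$ is NIP exactly when, in every model $M$, the family of definable sets $\{\varphi(M, b) : b \in M\}$ has finite VC-dimension.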

Model theory seminar, Friday, December 6, 2013, 12:30 pm, GC 6417

VC Density and Breadth on Modules

CUNY Grad Center

I will review the notions of VC-dimension, VC-density, and breadth on modules, before describing and motivating some partial results on open questions from a paper by Aschenbrenner, Dolich, Haskell, Macpherson, and Starchenko.
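As a point of reference (not part of the abstract): in the paper cited above, the VC-density of a formula $\varphi(x; y)$ is the infimum of reals $r \geq 0$ such that the number of $\varphi$-types over any finite parameter set $B$ is $O(|B|^{r})$; it refines the VC-dimension by measuring the polynomial growth rate of the definable family rather than only whether the dimension is finite.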

Model theory seminar, Friday, November 1, 2013, 12:30 pm, GC 6417

David Lippel

Seminar Cancelled

Haverford College

Seminar cancelled. Lippel will give his talk at 2:00 in the Workshop.