ALADDIN
Carnegie Mellon University, School of Computer Science, Computer Science Department
The Joint ALADDIN/Theory/Operations Research Seminar

Agnostically Learning Halfspaces

Adam Klivans
University of Texas at Austin

Friday, September 22nd, 2006, 3:30 pm
Wean Hall 7220

Abstract

We give the first algorithm that efficiently learns halfspaces (under distributional assumptions) in the notoriously difficult agnostic framework of Kearns, Schapire, and Sellie. In this model, a learner is given arbitrarily labeled examples from a fixed distribution and must output a hypothesis competitive with the optimal halfspace hypothesis.

Our algorithm constructs a hypothesis whose error rate on future examples is within an additive $\epsilon$ of the optimal halfspace in time poly($n$) for any constant $\epsilon > 0$ under the uniform distribution over $\{0,1\}^n$ or the unit sphere in $R^n$, as well as under any log-concave distribution over $R^n$. It also agnostically learns Boolean disjunctions in time $2^{\tilde{O}(\sqrt{n})}$ with respect to any distribution. The new algorithm, essentially $L_1$ polynomial regression, is a noise-tolerant, arbitrary-distribution generalization of the "low-degree" Fourier algorithm of Linial, Mansour, and Nisan. Our Fourier-type algorithm over the unit sphere makes use of approximation properties of various classes of orthogonal polynomials.
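To make the core idea concrete, here is a minimal sketch of $L_1$ polynomial regression as the abstract describes it: fit a low-degree polynomial to the (possibly adversarially labeled) examples by minimizing the sum of absolute errors, then threshold its sign to get a halfspace-competitive hypothesis. This is an illustrative toy, not the authors' implementation; the function names, the choice of degree, and the use of an LP solver are assumptions for the sketch. The $L_1$ fit is cast as a linear program with one slack variable per example.

```python
import itertools
import numpy as np
from scipy.optimize import linprog  # assumed solver choice for the L1 fit

def poly_features(X, degree):
    # All multilinear monomials of size <= degree (sufficient over {-1,1}^n).
    n = X.shape[1]
    cols = [np.ones(len(X))]
    for d in range(1, degree + 1):
        for S in itertools.combinations(range(n), d):
            cols.append(np.prod(X[:, S], axis=1))
    return np.column_stack(cols)

def l1_poly_regression(X, y, degree=1):
    # Minimize sum_i |p(x_i) - y_i| over degree-d polynomials p.
    # LP variables: coefficients c (k of them) and slacks u_i >= |(Phi c - y)_i|.
    Phi = poly_features(X, degree)
    m, k = Phi.shape
    cost = np.concatenate([np.zeros(k), np.ones(m)])  # minimize sum of slacks
    # Encode  Phi c - y <= u  and  -(Phi c - y) <= u  as A_ub x <= b_ub.
    A_ub = np.block([[Phi, -np.eye(m)], [-Phi, -np.eye(m)]])
    b_ub = np.concatenate([y, -y])
    bounds = [(None, None)] * k + [(0, None)] * m
    res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    c = res.x[:k]
    # Final hypothesis: the sign of the fitted polynomial.
    return lambda Z: np.sign(poly_features(Z, degree) @ c)
```

As a usage sketch, one can draw labeled examples from a majority-vote halfspace over $\{-1,1\}^n$, flip a fraction of the labels to simulate agnostic noise, and check that the thresholded $L_1$ fit still tracks the underlying halfspace. The noise tolerance comes from the $L_1$ objective, which (unlike $L_2$) is not dominated by a small fraction of adversarial labels.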

This is joint work with A. Kalai, Y. Mansour, and R. Servedio.

This material is based upon work supported by the National Science Foundation under Grant No. 0122581.
Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.