Classification with Margin Constraints: A Unification with Applications to Optimization
File(s)
OPT2015_paper_48(1).pdf (213.36 KB)
Published version
Author(s)
Joulani, P
György, A
Szepesvári, C
Type
Conference Paper
Abstract
This paper introduces Classification with Margin Constraints (CMC), a simple generalization of cost-sensitive classification that unifies several learning settings. In particular, we show that a CMC classifier can be used, out of the box, to solve regression, quantile estimation, and several anomaly detection formulations. On the one hand, our reductions to CMC are at the loss level: the optimization problem to solve under the equivalent CMC setting is exactly the same as the optimization problem under the original (e.g., regression) setting. On the other hand, due to the close relationship between CMC and standard binary classification, the ideas proposed for efficient optimization in binary classification naturally extend to CMC. As such, any improvement in CMC optimization immediately transfers to the domains reduced to CMC, without the need for new derivations or programs. To our knowledge, this unified view has been overlooked by the existing practice in the literature, where an optimization technique (such as SMO or PEGASOS) is first developed for binary classification and then extended to other problem domains on a case-by-case basis. We demonstrate the flexibility of CMC by reducing two recent anomaly detection and quantile learning methods to CMC.
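For orientation, here is a minimal sketch of the loss-level reduction the abstract describes. The notation is ours, and it assumes the CMC loss is a cost-scaled hinge with a per-example margin target; this is our reading of the abstract, not a formula quoted from the paper:

\[
\ell_{\mathrm{CMC}}\big(f(x);\, z, c, b\big) \;=\; c \,\max\!\big(0,\; b - z\, f(x)\big),
\qquad z \in \{-1, +1\},\; c > 0.
\]

Under this assumption, two CMC terms recover the $\varepsilon$-insensitive regression loss,
\[
\max\!\big(0,\; |y - f(x)| - \varepsilon\big)
\;=\; \underbrace{\max\!\big(0,\; (y-\varepsilon) - f(x)\big)}_{z=+1,\; b = y-\varepsilon}
\;+\; \underbrace{\max\!\big(0,\; f(x) - (y+\varepsilon)\big)}_{z=-1,\; b = -(y+\varepsilon)},
\]
and asymmetric costs give the pinball loss for the $\tau$-quantile,
\[
\rho_\tau\big(y - f(x)\big)
\;=\; \underbrace{\tau \,\max\!\big(0,\; y - f(x)\big)}_{z=+1,\; c=\tau,\; b=y}
\;+\; \underbrace{(1-\tau)\,\max\!\big(0,\; f(x) - y\big)}_{z=-1,\; c=1-\tau,\; b=-y}.
\]

In each case an original training example splits into (at most) two cost-weighted margin constraints, so the regression or quantile objective is, term by term, a CMC objective; this is the sense in which any CMC solver (e.g., an SMO- or Pegasos-style routine) would apply to the reduced problems without new derivations.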
Date Issued
2015-12-11
Date Acceptance
2015-11-02
Citation
Joulani P, György A, Szepesvári C. Classification with Margin Constraints: A Unification with Applications to Optimization. In: 8th NIPS Workshop on Optimization for Machine Learning, Montreal, Quebec, Canada; 2015.
Copyright Statement
© 2015 The Authors
Identifier
http://opt-ml.org/oldopt/opt15/papers.html
Source
8th NIPS Workshop on Optimization for Machine Learning
Publication Status
Published
Start Date
2015-12-11
Finish Date
2015-12-11
Coverage Spatial
Montreal, Quebec, Canada