ORIE Colloquium
We propose and analyze methods for logistic regression that induce sparse solutions. We introduce a novel condition number, associated with the data, that measures the degree of non-separability of the data and that enters naturally into both the analysis and the computational behavior of our methods. Taking advantage of very recent results in first-order methods for convex optimization due to Bach and others, we present new computational guarantees for the performance of these methods. In the high-dimensional regime in particular, these guarantees precisely characterize how the methods impart both data fidelity and implicit regularization, for any dataset. Time permitting, we will also discuss related results for AdaBoost. This is joint work with Paul Grigas and Rahul Mazumder.
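To make the sparsity mechanism concrete, here is a minimal sketch of one standard way a first-order method can induce sparse logistic-regression solutions: greedy coordinate descent on the logistic loss, where each iteration updates only the coordinate with the largest gradient magnitude, so early stopping caps the number of nonzero coefficients. This is an illustration of the general idea, not the specific method or analysis from the talk; the function names, step size, and data are assumptions for the sketch.

```python
import numpy as np

def logistic_loss(X, y, beta):
    """Average logistic loss with labels y in {-1, +1}."""
    margins = y * (X @ beta)
    return np.mean(np.log1p(np.exp(-margins)))

def greedy_coordinate_descent(X, y, steps=30, step_size=1.0):
    """Greedy coordinate descent on the logistic loss.

    Each iteration updates only the coordinate with the largest
    absolute partial derivative, so after t iterations at most t
    coefficients are nonzero: early stopping acts as implicit
    regularization (a sketch of the general mechanism, not the
    talk's specific algorithm)."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(steps):
        margins = y * (X @ beta)
        probs = 1.0 / (1.0 + np.exp(margins))      # sigmoid(-margin)
        grad = -(X.T @ (y * probs)) / n            # gradient of the loss
        j = np.argmax(np.abs(grad))                # steepest coordinate
        beta[j] -= step_size * grad[j]
    return beta

# Illustrative data: 3 informative features out of 50 (hypothetical).
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50))
w_true = np.zeros(50)
w_true[:3] = 2.0
y = np.sign(X @ w_true + 0.1 * rng.standard_normal(200))

beta = greedy_coordinate_descent(X, y, steps=30)
print("nonzeros:", np.count_nonzero(beta), "of", beta.size)
print("loss:", logistic_loss(X, y, beta))
```

With 30 iterations, at most 30 of the 50 coefficients can be nonzero, and in practice far fewer are, since the informative coordinates are selected repeatedly. As the data approach separability, the loss can be driven toward zero only by letting coefficients grow without bound, which is one way to see why a non-separability measure naturally enters computational guarantees.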