
The Generalization Ability of Online Algorithms for Dependent Data - 2013


PROJECT TITLE :

The Generalization Ability of Online Algorithms for Dependent Data - 2013

ABSTRACT:

We study the generalization performance of online learning algorithms trained on samples coming from a dependent source of data. We show that the generalization error of any stable online algorithm concentrates around its regret, an easily computable statistic of the online performance of the algorithm, when the underlying ergodic process is β- or φ-mixing. We show high-probability error bounds assuming the loss function is convex, and we also establish sharp convergence rates and deviation bounds for strongly convex losses and several linear prediction problems such as linear and logistic regression, least-squares SVM, and boosting on dependent data. In addition, our results have straightforward applications to stochastic optimization with dependent data, and our analysis requires only martingale convergence arguments; we need not rely on more powerful statistical tools such as empirical process theory.
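The regret mentioned in the abstract is indeed easy to compute from a single run: it is the cumulative loss of the online algorithm minus the cumulative loss of the best fixed predictor in hindsight. The sketch below illustrates this for online gradient descent on squared loss, with covariates drawn from an AR(1) process as a simple stand-in for a β-mixing dependent source. All specifics (the AR coefficient, step-size schedule, and dimensions) are illustrative assumptions, not choices from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
T, d = 2000, 3

# Dependent covariates from an AR(1) process -- an illustrative example
# of a mixing data source (assumption, not the paper's construction).
X = np.zeros((T, d))
for t in range(1, T):
    X[t] = 0.8 * X[t - 1] + rng.normal(size=d)

w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=T)

# Online gradient descent on squared loss; record each round's loss
# before the update (the standard online-learning protocol).
w = np.zeros(d)
online_loss = 0.0
for t in range(T):
    pred = X[t] @ w
    online_loss += 0.5 * (pred - y[t]) ** 2
    grad = (pred - y[t]) * X[t]
    w -= grad / np.sqrt(t + 1)          # decaying step size

# Best fixed predictor in hindsight: closed-form least squares.
w_star = np.linalg.lstsq(X, y, rcond=None)[0]
hindsight_loss = 0.5 * np.sum((X @ w_star - y) ** 2)

regret = online_loss - hindsight_loss
print(f"average regret: {regret / T:.4f}")
```

Because `w_star` minimizes the cumulative loss, the regret is nonnegative; the paper's result says that (for stable algorithms on mixing data) the average regret also tracks the generalization error with high probability.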




