A Novel Sequential Coreset Method for Gradient Descent Algorithms
Proceedings of the 38th International Conference on Machine Learning, Proceedings of Machine Learning Research

A wide range of optimization problems arising in machine learning can be solved by gradient descent algorithms, and a central question in this area is how to efficiently compress a large-scale dataset so as to reduce the computational complexity. Coreset is a popular data compression technique that has been extensively studied before. However, most existing coreset methods are problem-dependent and cannot be used as a general tool for a broader range of applications. A key obstacle is that they often rely on the pseudo-dimension and total sensitivity bound, which can be very high or hard to obtain. In this paper, based on the "locality" property of gradient descent algorithms, we propose a new framework, termed "sequential coreset", which effectively avoids these obstacles. Moreover, our method is particularly suitable for sparse optimization, where the coreset size can be further reduced to be only poly-logarithmically dependent on the dimension. In practice, the experimental results suggest that our method can save a large amount of running time compared with the baseline algorithms.
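To make the compression idea concrete, the following is a minimal sketch of the generic coreset workflow, assuming a weighted least-squares objective: draw a reweighted subset of the data and run gradient descent on that subset in place of the full dataset. This illustrates only the baseline notion of a coreset, not the sequential coreset construction of this paper; the names uniform_coreset and weighted_least_squares_gd are hypothetical.

import numpy as np

def uniform_coreset(X, y, m, rng):
    # Simplest coreset baseline: a uniform sample where each of the
    # m sampled points carries weight n/m, so the weighted sum of
    # per-point losses is an unbiased estimate of the full objective.
    n = X.shape[0]
    idx = rng.choice(n, size=m, replace=False)
    w = np.full(m, n / m)
    return X[idx], y[idx], w

def weighted_least_squares_gd(X, y, w, lr=0.05, steps=2000):
    # Gradient descent on f(theta) = (1/W) * sum_i w_i (x_i^T theta - y_i)^2,
    # where W = sum_i w_i.
    theta = np.zeros(X.shape[1])
    W = w.sum()
    for _ in range(steps):
        residual = X @ theta - y
        grad = 2.0 * (X.T @ (w * residual)) / W
        theta -= lr * grad
    return theta

rng = np.random.default_rng(0)
n, d, m = 10000, 5, 200
X = rng.normal(size=(n, d))
theta_true = rng.normal(size=d)
y = X @ theta_true + 0.1 * rng.normal(size=n)

# Optimize over the m-point coreset instead of all n points.
Xc, yc, wc = uniform_coreset(X, y, m, rng)
theta_hat = weighted_least_squares_gd(Xc, yc, wc)
print("parameter error:", np.linalg.norm(theta_hat - theta_true))

With m = 200 out of n = 10000 points, each gradient step touches 2% of the data while the weighted objective still approximates the full one; problem-aware constructions such as the sequential coreset aim to give such approximations with guarantees.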