Effortless optimization through gradient flows – Machine Learning Research Blog

Optimization algorithms often rely on simple, intuitive principles, but their analysis quickly leads to a lot of algebra in which the original idea is no longer transparent. In last month's post, Adrien Taylor explained how convergence proofs can be automated. This month, I will show how proof sketches can be obtained easily for algorithms based on gradient descent, using vanishing step-sizes that lead to gradient flows. The intuitive principle behind gradient descent is the quest for…
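
To make the vanishing-step-size idea concrete: gradient descent x_{k+1} = x_k − γ ∇f(x_k) is the explicit Euler discretization of the gradient flow x'(t) = −∇f(x(t)), and its iterates trace the flow as γ → 0. Below is a minimal numerical sketch of that limit (not from the post; the quadratic objective and all variable names are illustrative), exploiting the fact that for f(x) = ½ xᵀAx the flow has the closed form x(t) = exp(−tA) x0.

```python
import numpy as np
from scipy.linalg import expm

# Quadratic objective f(x) = 1/2 x^T A x with A symmetric positive definite.
# Its gradient flow x'(t) = -A x(t) has the closed form x(t) = expm(-t A) x0.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T
A = A / np.linalg.norm(A, 2) + np.eye(5)  # eigenvalues in [1, 2]
x0 = rng.standard_normal(5)

T = 1.0                      # time horizon of the flow
x_flow = expm(-T * A) @ x0   # exact gradient-flow solution x(T)

# Gradient descent x_{k+1} = x_k - gamma * grad f(x_k) is the explicit Euler
# discretization of the flow; with k * gamma = T fixed and gamma -> 0,
# the final iterate converges to x(T).
for gamma in [0.1, 0.01, 0.001]:
    x = x0.copy()
    for _ in range(int(round(T / gamma))):
        x = x - gamma * (A @ x)
    print(f"gamma={gamma:g}: ||x_k - x(T)|| = {np.linalg.norm(x - x_flow):.2e}")
```

The gap to x(T) should shrink roughly linearly in γ, the first-order accuracy of explicit Euler, which is why vanishing step-sizes let the discrete iterates inherit the qualitative behavior of the flow.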

6 mentions: @BachFrancis
Date: 2020/05/01 14:21

Referring Tweets

@BachFrancis No muguet (lily of the valley) to start this month of May, but a new blog post on gradient flows. t.co/6NSFT6DoC1
