Smarter training of neural networks | MIT News

An MIT CSAIL project shows that the neural nets we typically train contain smaller “subnetworks” that can learn just as well, and often faster.

19 mentions: @KirkDBorne, @dnlcrl, @waynerad, @SigOpt, @jreuben1, @THEAdamGabriel, @iamChuckRussell, @sharon_smith_1
Date: 2019/05/20 14:18

Referring Tweets

@KirkDBorne #MachineLearning Research Scientists Have Figured Out How to Make #NeuralNetworks 90 Percent Smaller—but Just as Smart! by @MIT_CSAIL #BigData #DataScientists #DataScience #AI
@dnlcrl Neural nets typically contain smaller “subnetworks” that can often learn faster
@waynerad "The team likens traditional deep learning methods to a lottery. Training large neural networks is kind of like trying to guarantee you will win the lottery by blindly buying every possible ticket."
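The finding the tweets describe is the "lottery ticket" procedure: train the full network, prune away the smallest-magnitude weights, then rewind the surviving weights to their original initialization and retrain only that subnetwork. Below is a minimal sketch of one such pruning round; the function name, array shapes, and the 20% prune fraction are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def lottery_ticket_round(w_init, w_trained, mask, prune_frac=0.2):
    """One round of iterative magnitude pruning (a sketch of the
    'lottery ticket' idea): drop the smallest-magnitude surviving
    trained weights, then rewind the survivors to their values at
    initialization. All names here are illustrative assumptions."""
    # Magnitudes of weights that are still alive under the current mask.
    alive = np.abs(w_trained[mask])
    # Threshold at the prune_frac quantile of surviving magnitudes.
    k = int(prune_frac * alive.size)
    threshold = np.sort(alive)[k] if k > 0 else 0.0
    # Keep only weights at or above the threshold.
    new_mask = mask & (np.abs(w_trained) >= threshold)
    # The "winning ticket": original init values under the new mask.
    ticket = np.where(new_mask, w_init, 0.0)
    return ticket, new_mask
```

Repeating this round several times, retraining the masked subnetwork between rounds, is what shrinks the network by roughly 90 percent while keeping accuracy, per the article's claim.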