Dive into Deep Learning is a book which goes deep into deep learning. It goes all the way from simple multi-layer perceptrons (MLPs) to generative adversarial networks (GANs). The book is divided into sections, and every section has multiple chapters. There are really a lot of exercises (about 3-4 per chapter). I have also read the Appendix: Mathematics for Deep Learning (I read it before reading the other sections).

I think this is a really good book for anyone wanting to dive into deep learning. It has a good balance of theory and practice: the authors don't go too heavy on the math, and they always provide the implementation of the math concepts. There were some chapters where I found it hard to connect the theory and practice (such as RNNs and GANs), but maybe this was just because I was not focused.

You can find my exercise solutions in this GitHub repository. In some of the exercises I left TODOs and notes to myself so that I know what to revisit when I come back to what I've done.