Content provided by Brian Carter. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by Brian Carter or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://ms.player.fm/legal.

Where'd My Gradient Go? It Vanished!

8:39
Manage episode 446714678 series 3605861

This video discusses the vanishing gradient problem, a significant challenge in training deep neural networks. The speaker explains how, as a neural network becomes deeper, gradients—measures of how changes in network parameters affect the loss function—can decrease exponentially, leading to a situation where early layers of the network are effectively frozen and unable to learn. This problem arises because common activation functions like the sigmoid function can produce very small derivatives, which compound during backpropagation. The video then explores solutions like using different activation functions (like ReLU) and architectural changes (like residual networks and LSTMs) to mitigate this issue.

Watch the video: https://www.youtube.com/watch?v=ncTHBi8a9uA&pp=ygUSdmFuaXNoaW5nIGdyYWRpZW50
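The compounding effect described above is easy to see numerically. The sketch below (an illustration, not from the video) pushes a gradient backward through a stack of randomly initialized layers and compares sigmoid against ReLU: the sigmoid derivative never exceeds 0.25, so the gradient norm collapses with depth, while ReLU's derivative is exactly 1 on active units. The random pre-activations here stand in for a real forward pass.

```python
import numpy as np

rng = np.random.default_rng(0)

def grad_norm_through_depth(activation_deriv, depth=50, width=32):
    """Propagate a gradient backward through `depth` layers and
    return its final norm, using randomly initialized weights."""
    grad = np.ones(width)
    for _ in range(depth):
        # Weights scaled so W alone roughly preserves the gradient norm.
        W = rng.normal(0, 1.0 / np.sqrt(width), size=(width, width))
        # Random pre-activations stand in for a real forward pass.
        z = rng.normal(size=width)
        # One step of backprop: multiply by the local activation
        # derivative, then by the transposed weight matrix.
        grad = W.T @ (activation_deriv(z) * grad)
    return np.linalg.norm(grad)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_deriv(z):
    return sigmoid(z) * (1.0 - sigmoid(z))   # at most 0.25

def relu_deriv(z):
    return (z > 0).astype(float)             # exactly 0 or 1

print("sigmoid:", grad_norm_through_depth(sigmoid_deriv))
print("relu:   ", grad_norm_through_depth(relu_deriv))
```

With 50 layers, the sigmoid gradient shrinks by roughly a factor of 0.2 per layer and is numerically negligible by the time it reaches the first layer, while the ReLU gradient stays many orders of magnitude larger. This is the "frozen early layers" effect the video describes, and it is why ReLU and architectural fixes like residual connections help.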


71 episodes

