Sequence-to-Sequence (seq2seq) Encoder-Decoder Neural Networks, Clearly Explained!!!
StatQuest with Josh Starmer
1.17M subscribers
152,420 views

Published on May 7, 2023

In this video, we introduce the basics of how Neural Networks translate one language, like English, to another, like Spanish. The idea is to convert one sequence of things into another sequence of things, and thus this type of neural network can be applied to all sorts of problems, including translating amino acids into 3-dimensional structures.
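
If you'd like a code-level picture of the encoder-decoder idea before (or after) watching, here is a minimal PyTorch sketch in the spirit of Ben Trevett's tutorials linked below. The vocabulary sizes, dimensions, and token ids are made-up placeholders for illustration, not the exact model built in the video:

import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, vocab_size, embed_dim, hidden_dim):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)  # word embeddings
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)

    def forward(self, src):
        # src: (batch, src_len) of input token ids
        embedded = self.embedding(src)
        _, (hidden, cell) = self.lstm(embedded)
        return hidden, cell  # the "context" handed to the decoder

class Decoder(nn.Module):
    def __init__(self, vocab_size, embed_dim, hidden_dim):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, vocab_size)  # scores for the next output token

    def forward(self, token, hidden, cell):
        # token: (batch, 1) -- the decoder generates one output token at a time
        embedded = self.embedding(token)
        output, (hidden, cell) = self.lstm(embedded, (hidden, cell))
        return self.fc(output.squeeze(1)), hidden, cell

# Tiny usage example with hypothetical vocabularies and token ids
encoder = Encoder(vocab_size=4, embed_dim=2, hidden_dim=2)
decoder = Decoder(vocab_size=4, embed_dim=2, hidden_dim=2)
src = torch.tensor([[0, 1, 2]])                # e.g. an input sentence ending in <EOS>
hidden, cell = encoder(src)                    # encode the whole input sequence
logits, hidden, cell = decoder(torch.tensor([[3]]), hidden, cell)  # start decoding from <SOS>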

NOTE: This StatQuest assumes that you are already familiar with...
Long Short-Term Memory (LSTM):    • Long Short-Term Memory (LSTM), Clearl...  
...and...
Word Embedding:    • Word Embedding and Word2Vec, Clearly ...  

Also, if you'd like to go through Ben Trevett's tutorials, see: https://github.com/bentrevett/pytorch...

If you'd like to support StatQuest, please consider...
Patreon:   / statquest  
...or...
YouTube Membership:    / @statquest  

...buying my book, a study guide, a t-shirt or hoodie, or a song from the StatQuest store...
https://statquest.org/statquest-store/

...or just donating to StatQuest!
https://www.paypal.me/statquest

Lastly, if you want to keep up with me as I research and create new StatQuests, follow me on Twitter:
  / joshuastarmer  

0:00 Awesome song and introduction
3:43 Building the Encoder
8:27 Building the Decoder
12:58 Training The Encoder-Decoder Model
14:40 My model vs the model from the original manuscript

#StatQuest #seq2seq #neuralnetwork
