Seq2seq Model with Attention


Skills You'll Learn

Reformer Models, Neural Machine Translation, Chatterbot, T5+BERT Models, Attention Models

Reviews

4.3 (834 ratings)

  • 5 stars
    66.18%
  • 4 stars
    15.10%
  • 3 stars
    9.11%
  • 2 stars
    5.27%
  • 1 star
    4.31%

SB

Nov 20, 2020

The course is a very comprehensive one and covers all state-of-the-art techniques used in NLP. It's quite an advanced-level course, and good Python coding skills are a must.

DB

Jan 24, 2023

I learned a lot from this course, and the ungraded and graded problems are relevant to understanding and knowing how to build a transformer or a reformer from scratch.

From the lesson

Neural Machine Translation

Discover some of the shortcomings of a traditional seq2seq model and how to address them by adding an attention mechanism, then build a neural machine translation model with attention that translates English sentences into German.
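At the core of the attention mechanism this lesson covers is a weighted sum: each decoder step scores the encoder states, softmaxes the scores into weights, and uses them to blend the encoder outputs into a context vector. A minimal NumPy sketch of scaled dot-product attention is below; the function name and toy shapes are illustrative, not the course's actual implementation:

```python
import numpy as np

def scaled_dot_product_attention(queries, keys, values):
    """Return context vectors and attention weights.

    queries: (n_q, d)   decoder states asking "what should I look at?"
    keys:    (n_k, d)   encoder states being scored
    values:  (n_k, d_v) encoder states being blended
    """
    d = queries.shape[-1]
    # Similarity of each query to each key, scaled by sqrt(d) for stability
    scores = queries @ keys.T / np.sqrt(d)
    # Softmax over the keys axis turns scores into attention weights
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Context vectors: attention-weighted sum of the values
    return weights @ values, weights

# Toy example: 2 decoder steps attending over 3 encoder states
rng = np.random.default_rng(0)
q = rng.normal(size=(2, 4))
k = rng.normal(size=(3, 4))
v = rng.normal(size=(3, 4))
context, attn = scaled_dot_product_attention(q, k, v)
print(context.shape, attn.shape)  # (2, 4) and (2, 3)
```

Because the context vector is recomputed at every decoder step, the model is no longer forced to squeeze the whole source sentence into one fixed vector, which is the seq2seq shortcoming the lesson addresses.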

Instructors

  • Younes Bensouda Mourri

    Instructor

  • Łukasz Kaiser

    Instructor

  • Eddy Shyu

    Curriculum Architect
