Natural Language Processing with Attention Models
Proud to share my latest achievement, the final course of deeplearning.ai's #NLP specialization on Coursera:
"Natural Language with Attention Models"
I must say this course was the most challenging of the whole specialization, but it gave me an amazing overview of the state of the art in NLP, covering models released as recently as this year. I learned about:
- Neural #machinetranslation with the #attention mechanism for alignment, which overcomes the fixed-context bottleneck of classic #seq2seq models, plus sampling and decoding strategies.
- Machine translation evaluation metrics: BLEU and ROUGE (a quick BLEU sketch follows this list).
- Text #summarization.
- The base #transformer architecture.
- Dot-product attention, self-attention, and multi-head attention (also sketched after this list).
- State-of-the-art pre-trained architectures such as ELMo, GPT, BERT, and T5, along with fine-tuning and evaluation.
- Question answering (see the pipeline sketch below).
- LSH attention, reversible residual layers, and the state-of-the-art Reformer architecture that combines them.
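
For anyone curious what BLEU looks like in practice, here is a minimal sketch using NLTK's sentence_bleu. The toy sentences are my own illustration, not course material:

```python
# Minimal sentence-level BLEU sketch with NLTK (toy example).
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

reference = [["the", "cat", "sat", "on", "the", "mat"]]  # list of reference token lists
candidate = ["the", "cat", "is", "on", "the", "mat"]     # system output tokens

# Smoothing avoids a zero score when some n-gram orders have no matches.
smooth = SmoothingFunction().method1
score = sentence_bleu(reference, candidate, smoothing_function=smooth)
print(f"BLEU: {score:.3f}")
```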
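And here is scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V, as a tiny NumPy sketch of my own (arbitrary toy shapes, not the course's implementation):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # (num_q, num_k) similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the keys
    return weights @ V                              # weighted sum of values

rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 8))  # 3 queries of dimension 8 (toy numbers)
K = rng.normal(size=(4, 8))  # 4 keys
V = rng.normal(size=(4, 8))  # 4 values
print(scaled_dot_product_attention(Q, K, V).shape)  # -> (3, 8)
```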
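And for question answering, a quick sketch of how you might try it with Hugging Face's transformers pipeline. The library and model choice are my assumptions; the course itself uses its own tooling:

```python
# Extractive QA sketch; library and model are my assumptions, not the course's.
from transformers import pipeline

qa = pipeline("question-answering", model="distilbert-base-cased-distilled-squad")
result = qa(
    question="What does the attention mechanism help with?",
    context="The attention mechanism lets a decoder focus on relevant "
            "parts of the input sequence, improving alignment in translation.",
)
print(result["answer"], round(result["score"], 3))
```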
This is only the beginning; I can't wait to dig deeper into these models and use them in real-world applications.
#deeplearning #machinelearning #datascience #ai #nlp