Natural Language Processing with Probabilistic Models
My second accomplishment in the new #NLP specialization by deeplearning.ai on Coursera:
"Natural Language Processing with Probabilistic Models"Certificate: https://lnkd.in/dwSxEpN
I must say, this course was particularly exciting, as it gave me the chance to dig deeper into theoretical, technical, and implementation details, some covered in the course itself and others it prompted me to explore on my own. The topics covered were:
- #autocorrect, minimum edit distance (Levenshtein distance), #dynamicprogramming, and backtracking (see the first sketch after this list).
- Corpus preprocessing and preparation, out-of-vocabulary (OOV) words, Laplace and add-k smoothing, backoff, and interpolation (smoothing sketch below).
- #POS (Part of Speech) tagging with Markov chains and the #Viterbi algorithm (decoder sketch below).
- Unigram, bigram, and N-gram #language #models, sequence probabilities, and count and probability matrices.
- The concept of #generative #languagemodels.
- The #Perplexity metric for evaluating language models (sketch below).
- #wordembeddings: word representations such as Word2Vec (CBOW and Skip-gram), with an in-depth focus on #CBOW (forward-pass sketch below).
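
To make the autocorrect part concrete, here is a minimal sketch of minimum edit distance with dynamic programming. The function name, default costs, and example words are my own illustration (one common convention charges 2 for a substitution and 1 for an insert or delete), not the course's assignment code:

```python
def min_edit_distance(source: str, target: str,
                      ins_cost: int = 1, del_cost: int = 1, sub_cost: int = 2) -> int:
    """Cheapest way to turn `source` into `target`, filled bottom-up in a DP table."""
    m, n = len(source), len(target)
    # D[i][j] = minimum cost to transform source[:i] into target[:j]
    D = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):              # delete all of source[:i]
        D[i][0] = i * del_cost
    for j in range(1, n + 1):              # insert all of target[:j]
        D[0][j] = j * ins_cost
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            replace = 0 if source[i - 1] == target[j - 1] else sub_cost
            D[i][j] = min(D[i - 1][j] + del_cost,      # delete source[i-1]
                          D[i][j - 1] + ins_cost,      # insert target[j-1]
                          D[i - 1][j - 1] + replace)   # match or substitute
    return D[m][n]

print(min_edit_distance("play", "stay"))  # 4: two substitutions at cost 2 each
```

Backtracking through the same table (following which of the three moves won at each cell) recovers the actual edit sequence.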
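The smoothing ideas fit in a few lines, too. A toy bigram model with add-k smoothing (Laplace when k = 1); the corpus and function names below are made up for illustration:

```python
from collections import Counter

def bigram_prob(bigrams: Counter, unigrams: Counter, vocab_size: int,
                w_prev: str, w: str, k: float = 1.0) -> float:
    """Add-k smoothed P(w | w_prev) = (C(w_prev, w) + k) / (C(w_prev) + k * V)."""
    return (bigrams[(w_prev, w)] + k) / (unigrams[w_prev] + k * vocab_size)

corpus = "i like tea </s> i like coffee </s>".split()
unigrams = Counter(corpus)
bigrams = Counter(zip(corpus, corpus[1:]))
V = len(unigrams)
print(bigram_prob(bigrams, unigrams, V, "i", "like"))  # (2 + 1) / (2 + 5) ≈ 0.43
print(bigram_prob(bigrams, unigrams, V, "i", "tea"))   # unseen bigram, still > 0
```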
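For POS tagging, a small log-space Viterbi decoder over a two-tag HMM. The tags, transition, and emission probabilities are invented toy numbers, not trained values:

```python
import numpy as np

def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely state sequence for `obs` (all probabilities in log space)."""
    T, S = len(obs), len(states)
    best = np.full((S, T), -np.inf)        # best log-prob ending in state s at step t
    back = np.zeros((S, T), dtype=int)     # argmax predecessor for backtracking
    for s in range(S):
        best[s, 0] = start_p[s] + emit_p[s][obs[0]]
    for t in range(1, T):
        for s in range(S):
            scores = best[:, t - 1] + trans_p[:, s] + emit_p[s][obs[t]]
            back[s, t] = int(np.argmax(scores))
            best[s, t] = scores[back[s, t]]
    path = [int(np.argmax(best[:, -1]))]   # best final state, then walk back
    for t in range(T - 1, 0, -1):
        path.append(back[path[-1], t])
    return [states[s] for s in reversed(path)]

states = ["NN", "VB"]
start = np.log([0.6, 0.4])
trans = np.log([[0.7, 0.3],                # rows: from-state, cols: to-state
                [0.4, 0.6]])
emit = [{"time": np.log(0.5), "flies": np.log(0.5)},
        {"time": np.log(0.2), "flies": np.log(0.8)}]
print(viterbi(["time", "flies"], states, start, trans, emit))  # ['NN', 'NN'] here
```

Working in log space avoids the numerical underflow you get from multiplying many small probabilities.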
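Perplexity then falls straight out of the per-token log-probabilities a model assigns to held-out text. A quick sketch with made-up probabilities:

```python
import math

def perplexity(token_log_probs):
    """PP(W) = exp(-(1/N) * sum_i log P(w_i | history)); lower is better."""
    return math.exp(-sum(token_log_probs) / len(token_log_probs))

log_probs = [math.log(p) for p in (0.2, 0.1, 0.25, 0.3)]
print(round(perplexity(log_probs), 2))  # ≈ 5.08, i.e. ~5 equally likely choices per token
```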
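Finally, a toy #CBOW forward pass: average the context word embeddings, then softmax over the vocabulary to predict the center word. The vocabulary size, embedding dimension, and random weights are illustrative only (no training loop shown):

```python
import numpy as np

V, N = 5, 3                           # vocab size, embedding dimension (toy values)
rng = np.random.default_rng(0)
W1 = rng.normal(size=(N, V))          # input embeddings, one column per word
W2 = rng.normal(size=(V, N))          # output (hidden-to-vocab) weights

def cbow_forward(context_ids):
    h = W1[:, context_ids].mean(axis=1)    # hidden layer = mean of context embeddings
    z = W2 @ h
    y = np.exp(z - z.max())                # numerically stable softmax
    return y / y.sum()                     # predicted distribution over the vocab

print(cbow_forward([0, 2, 3, 4]).round(3))  # context word ids -> center-word probs
```

After training (cross-entropy loss plus backprop), the columns of W1 serve as the word embeddings.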
This took me longer than expected, but that only motivates me to continue further and work harder. Learning never ends!
#machinelearning #nlp #coursera #deeplearningai #ai #datascience #datascientist