We have already looked at the TOP 100 Coursera Specializations, and today we will check out the Natural Language Processing Specialization from deeplearning.ai.
A Coursera Specialization is a series of courses that helps you master a skill. To begin, you can enroll in the Specialization directly, or review its courses and choose the one you’d like to start with. When you subscribe to a course that is part of a Specialization, you’re automatically subscribed to the full Specialization. You can complete just one course, and you can pause your learning or end your subscription at any time.
This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and a co-author of TensorFlow, the Tensor2Tensor and Trax libraries, and the Transformer paper.
By the end of this Specialization, you will be ready to design NLP applications that perform question-answering and sentiment analysis, create tools to translate languages and summarize text, and even build chatbots. These and other NLP applications are going to be at the forefront of the coming transformation to an AI-powered future.
This Specialization will equip you with the state-of-the-art deep learning techniques needed to build cutting-edge NLP systems:
• Use logistic regression, naïve Bayes, and word vectors to implement sentiment analysis, complete analogies, and translate words, and use locality sensitive hashing for approximate nearest neighbors.
• Use dynamic programming, hidden Markov models, and word embeddings to autocorrect misspelled words, autocomplete partial sentences, and identify part-of-speech tags for words.
• Use dense and recurrent neural networks, LSTMs, GRUs, and Siamese networks in TensorFlow and Trax to perform advanced sentiment analysis, text generation, named entity recognition, and to identify duplicate questions.
• Use encoder-decoder, causal, and self-attention to perform advanced machine translation of complete sentences, text summarization, question-answering and to build chatbots. Models covered include T5, BERT, transformer, reformer, and more!
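To give a flavour of the last bullet point, here is a minimal numpy sketch of scaled dot-product attention, the building block behind the self-attention and encoder-decoder models the courses cover. It is an illustration only, not code from the Specialization, and the matrix shapes are arbitrary:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                    # how strongly each query attends to each key
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # row-wise softmax
    return weights @ V                                 # weighted mix of the value vectors

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))   # 4 query positions, d_k = 8
K = rng.normal(size=(6, 8))   # 6 key positions
V = rng.normal(size=(6, 8))   # one value vector per key
print(scaled_dot_product_attention(Q, K, V).shape)     # (4, 8)
```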
There are 4 Courses in this Specialization
In Course 1 of the Natural Language Processing Specialization, offered by deeplearning.ai, you will:
a) Perform sentiment analysis of tweets using logistic regression and then naïve Bayes,
b) Use vector space models to discover relationships between words and use PCA to reduce the dimensionality of the vector space and visualize those relationships, and
c) Write a simple English-to-French translation algorithm using pre-computed word embeddings and locality sensitive hashing to relate words via approximate k-nearest-neighbor search.
Please make sure that you’re comfortable programming in Python and have a basic knowledge of machine learning, matrix multiplications, and conditional probability.
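As a taste of item (c) above, here is a minimal numpy sketch of locality sensitive hashing with random hyperplanes, the trick that makes approximate nearest-neighbor search over word embeddings fast. It is an illustration only, not the course notebook, and the tiny random "embeddings" stand in for the pre-computed vectors the course provides:

```python
import numpy as np

rng = np.random.default_rng(0)

def hash_vector(v, planes):
    """Bucket id built from the sign of v against each random hyperplane."""
    signs = (planes @ v >= 0).astype(int)
    return int(sum(s << i for i, s in enumerate(signs)))

DIM, N_PLANES = 50, 10
planes = rng.normal(size=(N_PLANES, DIM))   # random hyperplanes through the origin

# Toy stand-ins for pre-computed word embeddings.
vocab = {word: rng.normal(size=DIM) for word in ["cat", "chat", "dog", "chien", "car"]}

buckets = {}
for word, vec in vocab.items():
    buckets.setdefault(hash_vector(vec, planes), []).append(word)

# An approximate nearest-neighbor search only compares a query against words in its own bucket.
print(buckets)
```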
In Course 2 of the Natural Language Processing Specialization, offered by deeplearning.ai, you will:
a) Create a simple auto-correct algorithm using minimum edit distance and dynamic programming,
b) Apply the Viterbi Algorithm for part-of-speech (POS) tagging, which is important for computational linguistics,
c) Write a better auto-complete algorithm using an N-gram language model, and
d) Write your own Word2Vec model that uses a neural network to compute word embeddings using a continuous bag-of-words model.
Please make sure that you’re comfortable programming in Python and have a basic knowledge of machine learning, matrix multiplications, and conditional probability.
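To illustrate item (c) above, here is a toy bigram version of an N-gram language model for auto-complete. It is a sketch only (the course builds full N-gram models with smoothing), and the tiny corpus is made up:

```python
from collections import Counter, defaultdict

corpus = "the cat sat on the mat . the dog sat on the rug .".split()

# Count how often each word follows each other word.
bigram_counts = defaultdict(Counter)
for w1, w2 in zip(corpus, corpus[1:]):
    bigram_counts[w1][w2] += 1

def autocomplete(previous_word):
    """Suggest the most probable next word given the previous one."""
    followers = bigram_counts[previous_word]
    return followers.most_common(1)[0][0] if followers else None

print(autocomplete("sat"))   # -> 'on', the word most often seen after 'sat'
print(autocomplete("the"))   # one of the equally frequent followers of 'the'
```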
In Course 3 of the Natural Language Processing Specialization, offered by deeplearning.ai, you will:
a) Train a neural network with GloVe word embeddings to perform sentiment analysis of tweets,
b) Generate synthetic Shakespeare text using a Gated Recurrent Unit (GRU) language model,
c) Train a recurrent neural network to perform named entity recognition (NER) using LSTMs with linear layers, and
d) Use so-called ‘Siamese’ LSTM models to compare questions in a corpus and identify those that are worded differently but have the same meaning.
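Item (d) is easier to picture with a small sketch. Below is a rough tf.keras version of the "Siamese" idea (the course itself builds this in Trax, and the vocabulary and layer sizes here are arbitrary assumptions): two questions pass through the same shared LSTM encoder, and the cosine similarity of the resulting vectors indicates whether they mean the same thing.

```python
import tensorflow as tf

VOCAB_SIZE = 10_000   # assumed vocabulary size for the sketch
EMBED_DIM = 128
LSTM_UNITS = 64

# One encoder shared by both branches -- this weight sharing is what makes the model 'Siamese'.
encoder = tf.keras.Sequential([
    tf.keras.layers.Embedding(VOCAB_SIZE, EMBED_DIM),
    tf.keras.layers.LSTM(LSTM_UNITS),
])

q1 = tf.keras.Input(shape=(None,), dtype="int32")   # first question, as token ids
q2 = tf.keras.Input(shape=(None,), dtype="int32")   # second question
v1, v2 = encoder(q1), encoder(q2)

# Cosine similarity between the two question encodings; close to 1 means duplicates.
similarity = tf.keras.layers.Dot(axes=1, normalize=True)([v1, v2])

model = tf.keras.Model(inputs=[q1, q2], outputs=similarity)
model.summary()
```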
In Course 4 of the Natural Language Processing Specialization, offered by deeplearning.ai, you will:
a) Translate complete English sentences into French using an encoder-decoder attention model,
b) Build a Transformer model to summarize text,
c) Use T5 and BERT models to perform question-answering, and
d) Build a chatbot using a Reformer model.
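For a feel of the question-answering task in item (c), here is a short example using the Hugging Face transformers library with a pretrained extractive QA model. This illustrates the task only; it is not the course's Trax implementation of T5 or BERT:

```python
from transformers import pipeline

# Downloads a default pretrained extractive question-answering model on first use.
qa = pipeline("question-answering")

result = qa(
    question="Who co-authored the Transformer paper?",
    context="Łukasz Kaiser is a research scientist who co-authored the Transformer paper.",
)
print(result["answer"])   # expected: something like 'Łukasz Kaiser'
```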
Like this post? Don’t forget to share it!