You are familiar with regular Feed Forward Neural Networks (FFNNs). The data flows from the input layer to the output layer.

Source: https://medium.com/@sprhlabs/understanding-deep-learning-dnn-rnn-lstm-cnn-and-r-cnn-6602ed94dbff

There are several limitations to typical feed-forward networks: they require fixed-size inputs, and they have no memory, so each input is processed independently of the ones that came before it.

This means that FFNNs are unsuitable for sequential data (like text)!

For example, suppose you want to create a model that predicts someone's diet from their physical characteristics (like their weight). It is essential to take the person's history into account, not only their current weight.

While some dense neural networks or SVMs can be used with word embeddings, they ignore the order of the words!
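To see why order matters, consider a bag-of-words representation that averages word embeddings. A minimal sketch, using made-up 2-dimensional embeddings (illustrative values, not from any trained model):

```python
# Hypothetical 2-dimensional word embeddings (illustrative values only).
embeddings = {
    "man":   (0.2, 0.9),
    "bites": (0.7, 0.1),
    "dog":   (0.5, 0.4),
}

def average_embedding(sentence):
    """Bag-of-words style representation: average the word vectors."""
    vectors = [embeddings[w] for w in sentence.split()]
    n = len(vectors)
    return tuple(sum(dims) / n for dims in zip(*vectors))

v1 = average_embedding("man bites dog")
v2 = average_embedding("dog bites man")
print(v1 == v2)  # True: opposite meanings, identical representation
```

Because averaging discards position, any model fed this representation cannot distinguish the two sentences, no matter how it is trained.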

Traditional recurrent neural networks (RNN)
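The key idea of an RNN is a hidden state that is carried from one time step to the next, so earlier inputs influence later outputs. A minimal scalar sketch of one vanilla RNN step (the weights here are arbitrary illustrative values, not a trained model; real layers use weight matrices instead of scalars):

```python
import math

def rnn_step(x_t, h_prev, w_x, w_h, b):
    """One step of a vanilla RNN: h_t = tanh(w_x * x_t + w_h * h_prev + b).
    Scalar version for illustration; real layers use matrices and vectors."""
    return math.tanh(w_x * x_t + w_h * h_prev + b)

# Process a short input sequence, carrying the hidden state forward.
h = 0.0
for x in [1.0, 0.5, -0.3]:
    h = rnn_step(x, h, w_x=0.8, w_h=0.5, b=0.1)
print(h)  # the final hidden state summarizes the whole sequence
```

Feeding the same values in a different order produces a different final hidden state, which is exactly the order sensitivity that bag-of-words models lack.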

Long short-term memory (LSTM)
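An LSTM extends the vanilla RNN with a cell state and gates that control what is kept, written, and exposed at each step. A minimal scalar sketch of one LSTM step (gate weights are arbitrary illustrative values, not a trained model):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def lstm_step(x_t, h_prev, c_prev, w):
    """One scalar LSTM step. `w` maps a gate name to (w_x, w_h, b).
    Gates: forget (f), input (i), output (o), candidate cell (g)."""
    def gate(name, act):
        w_x, w_h, b = w[name]
        return act(w_x * x_t + w_h * h_prev + b)
    f = gate("f", sigmoid)      # how much of the old cell state to keep
    i = gate("i", sigmoid)      # how much new information to write
    o = gate("o", sigmoid)      # how much of the cell state to expose
    g = gate("g", math.tanh)    # candidate values for the cell state
    c_t = f * c_prev + i * g    # updated long-term memory
    h_t = o * math.tanh(c_t)    # updated hidden state / output
    return h_t, c_t

# Illustrative, untrained weights: every gate uses (w_x=0.5, w_h=0.5, b=0.0).
w = {name: (0.5, 0.5, 0.0) for name in ("f", "i", "o", "g")}
h, c = 0.0, 0.0
for x in [1.0, 0.5, -0.3]:
    h, c = lstm_step(x, h, c, w)
print(h, c)
```

The gated cell state is what lets LSTMs carry information across many time steps without it vanishing, which is the main weakness of the traditional RNN above.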

