Emojifier
🎯 BRIEF
This project uses an LSTM (Long Short-Term Memory) network to analyze a dataset of sentences and assign an appropriate emoji to each one, taking word order into account. The model is trained to capture the context and sentiment of a sentence and then output the emoji that best reflects its meaning. The project demonstrates how effectively LSTMs capture long-term dependencies in input sequences.
🔧 TOOLS
Python, TensorFlow, Keras, Matplotlib
🤝 CONTRIBUTION
Developed a 2-layer LSTM sequence classifier in Python using TensorFlow to identify the most appropriate emoji for a given sentence, taking word order into account.
Utilized pre-trained 50-dimensional GloVe embeddings to convert each word into its corresponding vector.
Experimented with different model architectures and hyperparameters, reaching 92% accuracy on the test set.
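The architecture described above can be sketched in Keras as follows. The vocabulary size, maximum sentence length, number of emoji classes, hidden size, and the random stand-in for the GloVe matrix are all illustrative assumptions, not values from the project:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models, initializers

# Illustrative sizes -- the real values depend on the dataset (assumptions).
vocab_size = 400       # hypothetical vocabulary size
embed_dim = 50         # 50-dimensional GloVe vectors, as in the project
max_len = 10           # hypothetical maximum sentence length
num_classes = 5        # hypothetical number of candidate emojis

# Stand-in for the matrix of pre-trained GloVe vectors (one row per word).
embedding_matrix = np.random.rand(vocab_size, embed_dim).astype("float32")

model = models.Sequential([
    layers.Input(shape=(max_len,)),
    # Embedding layer initialized with the GloVe vectors and kept frozen.
    layers.Embedding(
        vocab_size, embed_dim,
        embeddings_initializer=initializers.Constant(embedding_matrix),
        trainable=False),
    # First LSTM returns the full sequence so a second LSTM can stack on top.
    layers.LSTM(128, return_sequences=True),
    layers.Dropout(0.5),
    # Second LSTM returns only its final hidden state.
    layers.LSTM(128),
    layers.Dropout(0.5),
    # Softmax over the candidate emoji classes.
    layers.Dense(num_classes, activation="softmax"),
])
model.compile(optimizer="adam", loss="categorical_crossentropy",
              metrics=["accuracy"])
```

Freezing the embedding layer keeps the pre-trained word vectors intact, which helps when the training set is small.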
🏆 TAKEAWAYS
It is important to consider the order in which the words appear, as well as the context in which they are used.
LSTMs achieve this with a memory cell that stores information about earlier words in the sequence and updates it as each new word is encountered.
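As a minimal sketch of that mechanism, here is a single LSTM time step written in plain NumPy using the standard gate equations. All weights are random stand-ins, not trained values, and the sizes are chosen only for illustration:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step: gates decide what to forget, write, and output."""
    n = h_prev.shape[0]
    z = W @ x + U @ h_prev + b          # pre-activations for all four gates
    f = sigmoid(z[:n])                  # forget gate: keep parts of old cell state
    i = sigmoid(z[n:2 * n])             # input gate: admit new information
    o = sigmoid(z[2 * n:3 * n])         # output gate: expose parts of cell state
    c_tilde = np.tanh(z[3 * n:])        # candidate values for the cell state
    c = f * c_prev + i * c_tilde        # memory cell is updated, not overwritten
    h = o * np.tanh(c)                  # hidden state read out of the cell
    return h, c

rng = np.random.default_rng(0)
embed_dim, hidden = 50, 8               # 50-d word vector, small hidden size
W = rng.normal(size=(4 * hidden, embed_dim)) * 0.1
U = rng.normal(size=(4 * hidden, hidden)) * 0.1
b = np.zeros(4 * hidden)

h = np.zeros(hidden)
c = np.zeros(hidden)
for word_vec in rng.normal(size=(4, embed_dim)):  # a 4-word "sentence"
    h, c = lstm_step(word_vec, h, c, W, U, b)     # cell state carries context forward
```

Because the cell state `c` is updated additively (old state scaled by the forget gate plus new candidate values), information from early words can persist across many steps, which is what lets the classifier use word order and long-range context.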
📷 SCREENSHOT
Final Output