Bot essentials 16: NLP implementations in chatbots - 2

Anwesh Roy | 3 min read

There are many out-of-the-box NLP models available today. These can take a sentence, match it against a list of candidate sentences, and pick the top matches. However, they make basic assumptions about sentence structure or the words used, and those assumptions may not necessarily hold for a chatbot conversation. Let's find out more about the implementations.

Keyword-based matching is a popular implementation technique

The technique can help to some extent but fails when users express the same intent using different words with similar meanings.
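For illustration, here is a minimal keyword-overlap matcher, a sketch rather than any particular product's implementation, that scores candidate intents by the keywords they share with the user message and shows how a synonym breaks the match:

```python
# A toy keyword-overlap matcher: score each intent by the fraction of its
# keywords that appear in the user message.
def keyword_score(message, keywords):
    words = set(message.lower().split())
    return len(words & keywords) / len(keywords)

intents = {
    "refund":   {"refund", "money", "back"},
    "shipping": {"track", "order", "delivery"},
}

# Works when the user happens to repeat the expected keywords...
message = "i want my money back"
print(max(intents, key=lambda name: keyword_score(message, intents[name])))  # -> refund

# ...but a synonym such as "reimbursement" shares no keywords, so every score is zero.
print({name: keyword_score("i need a reimbursement", kw) for name, kw in intents.items()})
```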

Applying POS tags and using NER to identify entities does not work well for lower-case sentences and produces many false positives. Most existing NLP models are trained on properly cased sentences, where capitalisation carries context about the entities mentioned, so they struggle when chat users type everything in lower case.
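As a quick check of the casing effect, the sketch below runs the same sentence through spaCy in proper case and in lower case, assuming spaCy and its small English model (en_core_web_sm) are installed; the exact entities found will vary by model version:

```python
# Compare NER output on proper-case vs lower-case input.
# Assumes: pip install spacy && python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")

proper = nlp("Book a flight from New Delhi to London on Friday")
lower = nlp("book a flight from new delhi to london on friday")

print([(ent.text, ent.label_) for ent in proper.ents])
print([(ent.text, ent.label_) for ent in lower.ents])  # typically fewer or misidentified entities
```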

Matching exact phrases or relying on the co-occurrence of words doesn't work either, since users frequently interchange words while meaning the same thing.

Grammatical constructs are flexible in chatbot conversations

Deep learning-based NLP algorithms are the latest promise for understanding the meaning of words and sentences. Using word vectors to derive semantics is now a widely used practice, and numerous algorithms, e.g. Word2Vec and GloVe, are available to learn word vectors from the contexts in which words appear in training sentences.

These word vectors are case-insensitive and can handle lower-case sentences. However, they do not handle out-of-vocabulary (OOV) words or polysemy: rather than generating contextual vectors at run time for a given input sentence, they provide a single fixed vector per word, learned from the training data set.
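A minimal sketch with gensim (assumed installed) illustrates both points: words used in similar contexts end up with related vectors, but each word gets a single static vector and unseen words have no vector at all. The toy corpus and scores are purely illustrative:

```python
# Train Word2Vec on a toy corpus; real systems use far more text or
# pre-trained vectors. Assumes gensim 4.x is installed.
from gensim.models import Word2Vec

corpus = [
    ["i", "want", "to", "cancel", "my", "order"],
    ["please", "cancel", "my", "order"],
    ["i", "want", "a", "refund", "for", "my", "order"],
    ["how", "do", "i", "return", "my", "order"],
]

model = Word2Vec(corpus, vector_size=50, window=3, min_count=1, epochs=200, seed=1)

# Words used in similar contexts get related vectors.
print(model.wv.similarity("cancel", "refund"))

# But each word has one fixed vector regardless of context (no polysemy handling)...
vector_for_order = model.wv["order"]

# ...and a word never seen in training has no vector at all (the OOV problem).
try:
    model.wv["chargeback"]
except KeyError:
    print("'chargeback' is out of vocabulary")
```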

ELMo is a newer algorithm that can generate run-time word vectors based on the input sentence
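The sketch below, assuming AllenNLP is installed and the published pre-trained ELMo options/weights files are available locally (the paths shown are placeholders), illustrates how the same word receives different vectors in different sentences:

```python
# Contextual word vectors with AllenNLP's ELMo module. The two file paths
# below are placeholders for the published pre-trained ELMo files.
import torch
from allennlp.modules.elmo import Elmo, batch_to_ids

options_file = "elmo_options.json"   # placeholder path
weight_file = "elmo_weights.hdf5"    # placeholder path
elmo = Elmo(options_file, weight_file, num_output_representations=1, dropout=0)

# The surface word "bank" appears in two different contexts.
sentences = [
    ["i", "deposited", "cash", "at", "the", "bank"],
    ["we", "walked", "along", "the", "river", "bank"],
]
embeddings = elmo(batch_to_ids(sentences))["elmo_representations"][0]

# Unlike static Word2Vec/GloVe vectors, the two "bank" vectors differ because
# they are computed at run time from each whole sentence.
bank_in_finance = embeddings[0, 5]
bank_in_river = embeddings[1, 5]
print(torch.cosine_similarity(bank_in_finance, bank_in_river, dim=0))
```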

Creating sentence-level vectors and matching them is another technique. Some NLP implementations represent a sentence vector as the average of the vectors of all the words in the sentence. It is difficult to see how a mathematical average can capture the aggregate meaning of a sentence, since the meaning of a sentence is an abstraction, not just the sum of its words.
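A minimal sketch of average-of-word-vectors sentence matching, using tiny hand-made vectors purely for illustration (real systems would use Word2Vec/GloVe vectors), shows both the appeal and the weakness of the approach:

```python
# Sentence vector = average of word vectors, compared with cosine similarity.
# The tiny hand-made vectors are purely illustrative.
import numpy as np

word_vectors = {
    "i":      np.array([0.2, 0.7, 0.2]),
    "want":   np.array([0.1, 0.9, 0.1]),
    "a":      np.array([0.1, 0.6, 0.4]),
    "my":     np.array([0.1, 0.8, 0.3]),
    "refund": np.array([0.9, 0.1, 0.0]),
    "money":  np.array([0.8, 0.2, 0.1]),
    "back":   np.array([0.7, 0.1, 0.2]),
}

def sentence_vector(sentence):
    # Average the vectors of known words; silently drop OOV words.
    vectors = [word_vectors[w] for w in sentence.lower().split() if w in word_vectors]
    return np.mean(vectors, axis=0)

def cosine(u, v):
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

s1 = sentence_vector("i want a refund")
s2 = sentence_vector("i want my money back")
print(cosine(s1, s2))  # high score, but word order, negation and emphasis are lost in the average
```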

Using deep learning to create sentence representations and compare them for similarity is promising. Many deep learning models can generate sentence representations and determine their similarity from supervised data, but there are no good unsupervised techniques yet.
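As a rough illustration of the supervised route (not Engati's implementation), the sketch below trains a simple pair classifier on labelled sentence-vector pairs; the random data stands in for a real labelled corpus, which, as noted below, is the hard part to obtain:

```python
# A simple supervised similarity classifier over sentence-vector pairs.
# The random vectors and labels below stand in for a real labelled corpus.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical training set: 200 sentence-vector pairs with a 0/1 "same meaning" label.
v1 = rng.normal(size=(200, 50))
v2 = rng.normal(size=(200, 50))
labels = rng.integers(0, 2, size=200)

# A common feature scheme for pair classification: element-wise |difference| and product.
features = np.hstack([np.abs(v1 - v2), v1 * v2])
clf = LogisticRegression(max_iter=1000).fit(features, labels)

# Score a new pair of sentence vectors the same way.
a, b = rng.normal(size=50), rng.normal(size=50)
pair = np.hstack([np.abs(a - b), a * b]).reshape(1, -1)
print(clf.predict_proba(pair)[0, 1])  # estimated probability that the pair matches
```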

A large supervised (labelled) dataset is a must-have success factor for supervised techniques, and such data is difficult to obtain.

Many research groups are addressing the challenges described above, and the hope is that better NLP models in the future will enable us to understand human conversations much more accurately.

This concludes the NLP implementation series. We hope you’ve enjoyed it. These practices are implemented in Engati’s chatbots to create meaningful relationships between your customers and your business.

Register with Engati today to free your business from all the mundane day-to-day responsibilities with the power of conversational automation.
