<!-- JSON-LD markup generated by Google Structured Data Markup Helper. -->
<script type="application/ld+json">
{
 "@context" : "http://schema.org",
 "@type" : "Article",
 "name" : "Bot essentials 8: Chatbots NLP aspects - The deep dive 2",
 "author" : {
   "@type" : "Person",
   "name" : "Deepak Nachnani"
 },
 "image" : "https://global-uploads.webflow.com/5ef788f07804fb7d78a4127a/5ef788f17804fb78cfa41b73_hatbots-nlp-aspects-deep-dive-2.jpg",
 "articleSection" : "Another use of NLP",
 "articleBody" : "sentiment analysis. Figuring out the sentiment of the text that the user has typed. This is quite useful in determining an empathetic response for the user. Platforms that build bots allow you to incorporate sentiment analysis. With the intelligent path flows set the emotive and empathy context of the response that the user generates.",
 "url" : "https://www.engati.com/blog/chatbots-nlp-aspects-deep-dive-2",
 "publisher" : {
   "@type" : "Organization",
   "name" : "Engati"
 }
}
</script>

Conversational Automation

Bot essentials 8: Chatbots NLP aspects - The deep dive 2

Deepak Nachnani
Nov 23


In the previous blog, we looked at the advances in algorithmic science that improve the accuracy of Natural Language Processing (NLP) constructs, and at how vector scoring could be improved. In this blog, we trace the evolution path further: the base elements of the network, and how adding a memory for long-term dependencies optimises it further.

Evolution 4

A further evolution was the use of semantic matching through a global word-to-word co-occurrence matrix. The base algorithm relied on the difference vectors between words: when multiplied by a context word's vector, that difference should match the ratio of the words' co-occurrence probabilities.
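The co-occurrence idea above (popularized by algorithms such as GloVe) can be sketched on a toy corpus. The corpus, window size, and probe words below are purely illustrative:

```python
from collections import defaultdict

# Toy corpus; window size 2. All words here are illustrative.
corpus = ("ice is cold ice is solid steam is hot "
          "steam is gas water is cold water is hot").split()
window = 2

# Count word-word co-occurrences within the window.
cooc = defaultdict(lambda: defaultdict(int))
for i, w in enumerate(corpus):
    for j in range(max(0, i - window), min(len(corpus), i + window + 1)):
        if j != i:
            cooc[w][corpus[j]] += 1

def p(word, context):
    """P(context | word): co-occurrence count normalized by the word's total."""
    total = sum(cooc[word].values())
    return cooc[word][context] / total if total else 0.0

# The key signal is the RATIO of co-occurrence probabilities:
# "cold" should co-occur with "ice" far more than with "steam".
print(p("ice", "cold"), p("steam", "cold"))
```

The ratio of these two probabilities is what the word vectors are trained to reproduce; words that share contexts end up with similar vectors.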

Understanding Natural Language Processing (NLP)

Evolution 5

Though the use of the global word-to-word co-occurrence matrix in Evolution 4 increased the accuracy of base model predictions, the determination of long- and short-term dependencies was still found to be lacking.

This evolution called for the application of Recurrent Neural Networks (RNNs), which feed text data into the network as a sequence. The RNN was an improvement since it handled local temporal dependencies quite effectively; the problem arose in figuring out long sentence forms. To address long-form sentences, researchers designed an architecture called Long Short-Term Memory (LSTM), which introduces a memory cell that stores the long-term dependencies of the sentence being processed.

Think of it as an indicator score that updates on every word's context to maintain the overall vector for a long-form sentence. Gates, whose values run between 0 and 1, let the network either forget a piece of information or weigh it higher for relevance, making the RNN further optimized for relevance.
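The gated memory cell can be sketched as a single LSTM step in plain Python. This is a minimal illustration with random (untrained) weights and made-up dimensions, not a production model:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM step: gates decide what the memory cell forgets and keeps.

    x: input word vector; h_prev, c_prev: previous hidden and cell state;
    W, b: stacked weights for the four gates (illustrative shapes).
    """
    z = W @ np.concatenate([x, h_prev]) + b
    n = len(h_prev)
    f = sigmoid(z[0:n])        # forget gate: how much old memory to keep
    i = sigmoid(z[n:2*n])      # input gate: how much new info to write
    o = sigmoid(z[2*n:3*n])    # output gate: how much memory to expose
    g = np.tanh(z[3*n:4*n])    # candidate memory content
    c = f * c_prev + i * g     # updated long-term memory cell
    h = o * np.tanh(c)         # updated hidden state
    return h, c

# Run a short "sentence" of 5 random word vectors through the cell.
rng = np.random.default_rng(0)
dim_in, dim_h = 4, 3
W = rng.normal(scale=0.1, size=(4 * dim_h, dim_in + dim_h))
b = np.zeros(4 * dim_h)
h, c = np.zeros(dim_h), np.zeros(dim_h)
for _ in range(5):
    h, c = lstm_step(rng.normal(size=dim_in), h, c, W, b)
print(h.shape, c.shape)
```

The cell state `c` is the "memory" carried across the whole sentence, while the forget gate `f` is what lets the network drop information that is no longer relevant.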

Evolution 6

Researchers were advancing the use of RNN and LSTM techniques for word context and intent when another parallel approach began producing results: Convolutional Neural Networks, or CNNs. Would it be possible to use a CNN in an NLP solution framework to further enhance accuracy? It sure did seem so.

Consider the 2D models that image applications use, which solve for a small segment of an image at a time with a sliding filter. Could we use the same technique, with a one-dimensional (1D) filter, on linear sentences of words to predict context and intent?

Read about Bot Essentials 9: The NLU Deepdive

It sure did seem so. 1D CNNs were more accurate than RNNs. Multiple CNN filters working over a sentence create a feature map and learn from the features that appear most often. CNNs still remain a work in progress, with promising results on the default Q&A mechanisms used in bots for lightweight matching of questions to the answers that fit the search intent and context.
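The 1D-convolution-over-words idea can be sketched as follows. The embeddings and the filter here are random stand-ins for learned values; a real model would train many such filters:

```python
import numpy as np

# A sentence as a sequence of word vectors (rows), embedding dim 4.
rng = np.random.default_rng(1)
sentence = rng.normal(size=(7, 4))   # 7 words

# One 1D convolutional filter spanning 3 consecutive words.
kernel = rng.normal(size=(3, 4))

# Slide the filter over the sentence: each position scores one trigram,
# producing the feature map the text above describes.
feature_map = np.array([
    np.sum(sentence[i:i+3] * kernel) for i in range(len(sentence) - 2)
])

# Max-over-time pooling keeps the strongest feature, wherever it occurred.
strongest = feature_map.max()
print(feature_map.shape, strongest)
```

Because the filter spans a fixed window of words, it detects the same n-gram pattern anywhere in the sentence, which is what makes CNNs fast and effective for lightweight intent matching.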

Another use of NLP...

... is sentiment analysis: figuring out the sentiment of the text the user has typed. This is quite useful in determining an empathetic response for the user. Platforms that build bots allow you to incorporate sentiment analysis; with intelligent path flows, they set the emotive and empathetic context of the response the user receives. For further use of bots utilising an NLP engine for better accuracy, please sign up with Engati to start your bot journey.
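At its simplest, sentiment scoring can be sketched with a hand-written word list. This toy lexicon approach is only an illustration; bot platforms use trained models rather than a list like this:

```python
# Toy sentiment lexicons (illustrative, not from any real platform).
POSITIVE = {"great", "love", "helpful", "thanks", "good"}
NEGATIVE = {"bad", "angry", "broken", "terrible", "refund"}

def sentiment(text: str) -> str:
    """Label text by counting positive vs negative words."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("This bot is great, thanks!"))   # -> positive
```

A bot flow would branch on this label, e.g. routing "negative" users to a human agent or softening the reply's tone.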

Our chatbot can change the trajectory of your business. Register with Engati today for your free chatbot!

Deepak Nachnani



About Engati

Engati powers 45,000+ chatbot & live chat solutions in 50+ languages across the world.

We aim to empower you to create the best customer experiences you could imagine. 

So, are you ready to create unbelievably smooth experiences?

Check us out!