Enabling cognitive search
We have previously discussed the potential of Cognitive Search. In this article we're delving into a neural approach to enabling cognitive search.
In simple terms, we're trying to find the most relevant documents and passages in those documents that have a high contextual and semantic similarity to an input search query.
Below is a reference architecture for a typical question-answering system that can be used to build a Cognitive Search system.
The starting point is to set up a document store containing the searchable documents, with some form of indexing to speed up retrieval. If the collection runs to a million documents or more, it makes sense to use a text-based search engine like Elasticsearch or Solr to generate word-level inverted indexes and store the passages as searchable documents.
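As a minimal illustration of the idea, the sketch below builds a word-level inverted index over a toy in-memory corpus. The passages and the `lookup` helper are made up for illustration, not part of any particular search engine:

```python
from collections import defaultdict

# Toy corpus standing in for the document store (contents are illustrative).
passages = {
    0: "cognitive search finds relevant passages",
    1: "inverted indexes speed up word level search",
    2: "neural readers score passages by semantic similarity",
}

# Build a word-level inverted index: term -> set of passage ids.
index = defaultdict(set)
for pid, text in passages.items():
    for term in text.lower().split():
        index[term].add(pid)

def lookup(query):
    # Return the ids of passages containing every query term.
    ids = [index[t] for t in query.lower().split() if t in index]
    return set.intersection(*ids) if ids else set()

print(lookup("passages search"))  # -> {0}
```

Because the index maps each term straight to the passages containing it, a query touches only the postings for its own terms instead of scanning every document.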
The retriever is responsible for using the query to search for similar documents in the document store. It then retrieves the top K document passages based on a predefined cutoff threshold.
For getting the top K matches from a text indexing system, algorithms like TF-IDF or BM25 are used. These algorithms focus on exact word or phrase matches, with weighting factors that favor terms that are rare across the collection.
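A minimal, self-contained sketch of BM25 scoring over a toy corpus follows. The document texts, the parameter defaults (k1=1.5, b=0.75, which are common choices), and the `bm25` helper are illustrative assumptions, not a production implementation:

```python
import math
from collections import Counter

# Toy corpus of tokenized passages (contents are illustrative).
docs = [
    "cognitive search finds relevant passages".split(),
    "bm25 weights rare terms more heavily".split(),
    "search engines rank passages by term weights".split(),
]
N = len(docs)
avgdl = sum(len(d) for d in docs) / N            # average document length
df = Counter(t for d in docs for t in set(d))    # document frequency per term

def bm25(query, doc, k1=1.5, b=0.75):
    # Sum, over query terms present in the doc, of IDF x saturated term frequency.
    tf = Counter(doc)
    score = 0.0
    for t in query.split():
        if t not in tf:
            continue
        idf = math.log((N - df[t] + 0.5) / (df[t] + 0.5) + 1)
        score += idf * tf[t] * (k1 + 1) / (tf[t] + k1 * (1 - b + b * len(doc) / avgdl))
    return score

# Top-K retrieval: score every passage and keep the best K.
K = 2
ranked = sorted(range(N), key=lambda i: bm25("search passages", docs[i]), reverse=True)[:K]
print(ranked)  # -> [0, 2]
```

Note how the length normalization term (`b`) penalizes long documents and `k1` caps the benefit of repeating a term, which is what distinguishes BM25 from raw TF-IDF.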
The shortcoming of this approach is that word context, the meaning of the query or passage, and polysemy are not handled. This can cause some relevant documents to be missed in the initial shortlisting.
If the goal is to get to candidate documents quickly, however, this is one of the best ways of doing so.
The reader is a neural model that reads through the top K candidate passages and shortlists the most relevant ones by contextual and semantic similarity, typically measured as the cosine similarity between the query vector and the passage or sentence vectors.
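The similarity step can be sketched as plain cosine similarity between a query vector and passage vectors. The vectors below are made-up stand-ins for what a sentence encoder would actually produce:

```python
import math

# Hypothetical dense vectors, e.g. from a sentence encoder (values illustrative).
query_vec = [0.2, 0.7, 0.1]
passage_vecs = {
    "passage A": [0.1, 0.8, 0.0],
    "passage B": [0.9, 0.1, 0.3],
}

def cosine(u, v):
    # Cosine similarity: dot product divided by the product of the norms.
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Shortlist the passages most similar to the query, best first.
shortlist = sorted(passage_vecs, key=lambda p: cosine(query_vec, passage_vecs[p]), reverse=True)
print(shortlist)  # -> ['passage A', 'passage B']
```

Because cosine similarity only depends on the angle between vectors, passages are matched on the direction their embeddings point in (their meaning), not on raw magnitudes or exact word overlap.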
Nowadays, most reader models are large pre-trained language models like BERT, RoBERTa, XLNet, ALBERT or DistilBERT. All of these are transformer-based language models trained on objectives such as masked-word and next-sentence prediction.
Using transfer learning techniques, these pre-trained models can also be fine-tuned to obtain domain-specific encoders.
A ranking algorithm can then be used to order the reader's output so that the most relevant documents and their passages appear first.
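One simple heuristic ranker blends the retriever's lexical score with the reader's semantic score. The candidates, their scores, and the 0.7 weight below are all illustrative assumptions:

```python
# Hypothetical candidates carrying a retriever (bm25) and a reader (semantic) score.
candidates = [
    {"doc": "doc1", "passage": "p3", "bm25": 7.1, "semantic": 0.42},
    {"doc": "doc2", "passage": "p1", "bm25": 5.4, "semantic": 0.91},
    {"doc": "doc1", "passage": "p7", "bm25": 6.0, "semantic": 0.77},
]

def rank(cands, weight=0.7):
    # Normalize bm25 into [0, 1], then take a weighted blend with the semantic score.
    top = max(c["bm25"] for c in cands) or 1.0
    blended = lambda c: weight * c["semantic"] + (1 - weight) * c["bm25"] / top
    return sorted(cands, key=blended, reverse=True)

ranked = rank(candidates)
print([c["passage"] for c in ranked])  # -> ['p1', 'p7', 'p3']
```

Weighting the semantic score more heavily lets a contextually strong match (p1) outrank a passage that merely shares more exact words with the query (p3); tuning that weight, or replacing the blend with learned rules, is where the heuristics mentioned above come in.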
The above mechanism for building a cognitive search engine is an unsupervised technique. It will require heuristic rules in the ranker to prune out false positives or perform re-ranking for better results.
A supervised technique can take the form of a Learning to Rank (LTR) algorithm that personalizes the ranking based on user feedback.
The reader is computationally intensive and will need a GPU server for larger models or multi-core CPUs for smaller models.
The scope for Cognitive Search
Cognitive Search will open up access to the thousands of documents available in an enterprise, allowing users to look up answers in these documents within minutes rather than reading through them to answer their question or enhance their learning.
Any use case that needs sifting through thousands of voluminous text documents in an intelligent way, mimicking how human beings read and comprehend information, is a prime candidate for Cognitive Search.
Cognitive Search can also be used to enhance results generated by current text-based search engines like Elasticsearch or Solr. By encoding context and intent into search, e-commerce sites, news websites, search engines, and research portals can employ cognitive search to improve the quality of search results and user engagement.
The future of search is going to be exciting, so stay tuned! Until then, start adopting modern AI solutions with Engati's chatbot solutions.
Engati is a one-stop platform for delighted customers. With our intelligent bots, we help you create the smoothest of customer experiences. And now, we're helping you find those customers too. The award-winning marketing automation platform LeadMi received some major upgrades and joined our family as Engati Acquire. So, let's get started!