The Zenith of Natural Language Technologies: Conversational AI

Conversational AI is the layman’s term for Natural Language Interaction (NLI), a subset of Natural Language Processing (NLP) that draws on almost every natural language technology.

NLI synthesizes aspects of NLP, Natural Language Understanding (NLU), Natural Language Generation (NLG), and Natural Language Querying (NLQ) to facilitate the rapid, conversational exchanges of popular platforms such as Amazon Alexa.

Although each of these natural language technologies is rooted in NLP, one can argue the most vital to conversational AI is NLG, which produces linguistic summaries or explanations of what is often quantified data.

When coupled with NLQ, this capacity enables users to swiftly ask questions (and receive answers), an exchange that forms the bulk of conversational AI.

NLP’s role is to accurately convert data (the questions asked) into text according to conventions for parts of speech and grammar.

NLU specializes in contextualizing that data while facilitating a greater semantic understanding of the intention of the language, the question, or the speaker.
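
To make these roles concrete, below is a minimal sketch of how NLP and NLU might decompose a user’s question into a machine-usable intent and entities. The intent label, the regular expressions, and the parse_question() helper are hypothetical illustrations, not any particular platform’s API.

    import re

    def parse_question(text: str) -> dict:
        """Derive a rough intent and entities from a natural-language question."""
        lowered = text.lower()
        intent = "unknown"
        if re.search(r"\bhow (are|is)\b.*\bsales\b", lowered):
            intent = "sales_inquiry"
        entities = {}
        region = re.search(r"\bin (?:the )?(?P<region>[a-z ]+?) region\b", lowered)
        if region:
            entities["region"] = region.group("region").strip()
        if "today" in lowered:
            entities["period"] = "today"
        return {"intent": intent, "entities": entities}

    print(parse_question("How are my sales in the northeast region today?"))
    # {'intent': 'sales_inquiry', 'entities': {'region': 'northeast', 'period': 'today'}}

Real NLU stacks replace the regular expressions with statistical or neural models, but the division of labor is the same: NLP normalizes the language, and NLU extracts what is actually being asked.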

The junction of these natural language capabilities enables what Arria NLG CEO Sharon Daniels termed “answers on demand”, which is pivotal for interacting with analytics, Business Intelligence, and workflows in a rapid, conversational manner that’s influential for setting the pace of business today.

Speech to Text, Text to Speech

Although the speech recognition elements characterizing spoken interactions are frequently desired with conversational AI, this technology also involves written exchanges between end users and analytics systems.

In fact, text is likely the centerpiece of NLI in that even when language is spoken, it’s actually converted to text prior to the generation of responses.

NLP is critical for implementing this phase of conversational AI, particularly as it relates to NLG.

“We have to know what’s being asked and convert the spoken word into text that can then be analyzed,” Daniels mentioned. “So that’s speech to text.”

Once the content of that language has been parsed and understood (the latter of which is aided by NLU), there’s an analytics component integral to culling the appropriate data for a relevant response.

In some instances, these analytics may involve NLG options.

“We analyze data specifically in preparation for turning it into language,” Daniels revealed.

“So then, we’re doing the analysis, the computational linguistics, and then we’re doing what I like to call the communication layer where we’re then communicating that information in the form of written summaries and written reports.”

For spoken responses, the final step is converting that information into speech.
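
A minimal sketch of that flow, with every stage reduced to a stand-in function, might look as follows; the function names and hard-coded values are purely illustrative, and a real deployment would call a speech-recognition service, an analytics engine, and an NLG platform at the corresponding steps.

    def speech_to_text(audio: bytes) -> str:
        # stand-in for a speech recognizer
        return "how are my sales today"

    def analyze(question: str) -> dict:
        # stand-in for the analytics that cull data for a relevant response
        return {"metric": "sales", "value": 1_200_000, "vs_last_year": 0.08}

    def generate_summary(result: dict) -> str:
        # the NLG "communication layer": quantified data rendered as language
        return (f"Today's {result['metric']} total ${result['value']:,}, "
                f"up {result['vs_last_year']:.0%} over last year.")

    def text_to_speech(summary: str) -> bytes:
        # stand-in for a speech synthesizer
        return summary.encode()

    def answer(audio: bytes) -> bytes:
        return text_to_speech(generate_summary(analyze(speech_to_text(audio))))

    print(generate_summary(analyze(speech_to_text(b""))))
    # Today's sales total $1,200,000, up 8% over last year.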

Episodic Memory

Whether applied to speech recognition or not, a particularly fascinating aspect of conversational AI is its ability to leverage episodic memory.

Full-fledged episodic memory enables NLI systems to know what specific references refer to, without the user having to repeat their antecedents verbatim each time they’re mentioned.

Classic examples include demonstrative pronouns (“these”, “those”, “this”, etc.) that conversational AI mechanisms are tasked with recognizing as correlating to an earlier reference, such as a previously mentioned set of marketing reports.

Daniels alluded to this capability when mentioning that top platforms in this space “remember the question that you asked so you don’t have to repeat the question again.”

For example, “if you say how are my sales today, and you get an answer about a particular sales category in a region, you don’t want to have to say all over again how are my sales in this region compared to last year,” Daniels explained.

“You can just say ‘how does that compare to last year’.”

Quintessential episodic memory enables systems to recollect what those references are for specific subjects, callers, or interactions, before applying the same word (‘that’, in the use case Daniels articulated) to other referents during subsequent interactions.
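
As a rough sketch of that behavior (assuming a hypothetical ConversationMemory class and toy entity extraction, not any vendor’s implementation), the system can hold onto the entities from the previous question and merge them into a follow-up that uses a demonstrative pronoun:

    def extract_entities(question: str) -> dict:
        """Toy entity extraction; a real system would use an NLU model."""
        q = question.lower()
        entities = {}
        if "sales" in q:
            entities["metric"] = "sales"
        if "northeast" in q:
            entities["region"] = "northeast"
        if "last year" in q:
            entities["compare_to"] = "last_year"
        return entities

    class ConversationMemory:
        """Keeps the entities of the previous question so follow-ups can omit them."""
        def __init__(self):
            self.last_entities = {}

        def resolve(self, question: str) -> dict:
            entities = extract_entities(question)
            # A demonstrative pronoun signals that prior context should carry over.
            if {"that", "this", "those"} & set(question.lower().split()):
                entities = {**self.last_entities, **entities}
            self.last_entities = entities
            return entities

    memory = ConversationMemory()
    print(memory.resolve("How are my sales in the northeast region today?"))
    # {'metric': 'sales', 'region': 'northeast'}
    print(memory.resolve("How does that compare to last year?"))
    # {'metric': 'sales', 'region': 'northeast', 'compare_to': 'last_year'}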

Natural Language Querying

One of the limitations ascribed to traditional BI (largely bereft of cognitive computing endowments) is that questions must be predetermined.

Spontaneous, ad-hoc questions that weren’t defined in advance required laborious, IT-intensive remodeling.

Although there are plentiful means of overcoming this barrier, Daniels implied that when driven by a robust NLG implementation, conversational AI is one of them.

“It goes beyond just your typical templated-question answer,” she indicated.

Alternative approaches with graph mechanisms support exploratory analytics and ad-hoc questioning based on naturally expanding ontologies.

With conversational AI, “you may not know what to specifically ask, but [this] technology will tell you what’s relevant in the context of the question,” Daniels commented.

The natural language underpinnings of this approach enable users to phrase questions in any number of ways to get germane answers.

Additionally, “it goes beyond just knowing what to ask specifically,” Daniels remarked.

“You might ask a general question but get much more detailed information that you may not have thought to ask. For example, it might be sales in this region exceeded the previous year’s, and you may want to consider looking into a particular product line because it’s doing much better.”
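
A minimal sketch of that behavior follows; the sales figures, product growth rates, and the find_notable() helper are invented for illustration, standing in for the analytics an NLG platform would actually run.

    REGION_SALES = {"northeast": {"2023": 950_000, "2024": 1_030_000}}
    PRODUCT_GROWTH = {"widgets": 0.21, "gadgets": 0.03}  # year-over-year growth

    def find_notable(threshold: float = 0.15) -> list:
        """Surface product lines growing faster than the threshold, unprompted."""
        return [f"Consider looking into {name}: up {growth:.0%} year over year."
                for name, growth in PRODUCT_GROWTH.items() if growth > threshold]

    def answer_sales_question(region: str) -> str:
        current, prior = REGION_SALES[region]["2024"], REGION_SALES[region]["2023"]
        direct = (f"Sales in the {region} region exceeded the previous year's "
                  f"({current:,} vs. {prior:,}).")
        # The direct answer is followed by relevant detail the user didn't ask for.
        return " ".join([direct, *find_notable()])

    print(answer_sales_question("northeast"))
    # Sales in the northeast region exceeded the previous year's (1,030,000 vs.
    # 950,000). Consider looking into widgets: up 21% year over year.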

Predictive Model Underpinnings

One of the most significant developments to impact the natural language technologies underpinning conversational AI is the incorporation of machine learning models, rather than rules-based methods, to provide their knowledge of terms.

Oftentimes, these approaches involve neural networks, although a diversity of other techniques is used as well.

According to Lore IO CEO Digvijay Lamba, “Machine learning is used in all these things to understand the natural language.”

Moreover, the fundamentals of NLI are applicable to use cases outside of NLG.

Natural language search, for example, enables one to apply everyday language “to do search and extract information from search,” Lamba divulged.

NLP and subsets like NLU are also instrumental in enabling users to interact with sophisticated data management systems with simple language, as opposed to arcane command line scripts.

These capabilities allow end users to create business rules for data quality or data modeling “where you describe the rules in your own language; you don’t worry about the underlying data,” Lamba said.
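
As a minimal sketch of the machine-learning approach Lamba describes (assuming scikit-learn and a tiny, invented training set; production systems typically use far more data and, often, neural models), intents can be learned from example phrasings rather than hand-written rules:

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Invented example phrasings paired with the intents they express.
    questions = [
        "how are my sales today",
        "show me revenue for the northeast region",
        "which products are underperforming",
        "what product lines are doing poorly",
        "compare this quarter to last year",
        "how does that compare to last year",
    ]
    intents = [
        "sales_inquiry", "sales_inquiry",
        "product_performance", "product_performance",
        "comparison", "comparison",
    ]

    # A bag-of-words model learns the mapping from wording to intent.
    model = make_pipeline(TfidfVectorizer(), LogisticRegression())
    model.fit(questions, intents)

    print(model.predict(["how did sales do this week"]))  # likely ['sales_inquiry']

Swapping the linear model for a neural network changes the internals, not the interface: the system still maps a user’s own wording to an intent it can act on.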

Intelligent Interfaces

Regardless of the use case, the individual and collective technologies involved in NLI achieve the same objective.

They function as a means of simplifying the interface between humans and data.

Conversational AI is the acme of these capabilities in that it supports speaking to data systems with everyday terms to extract analytics results in equally quotidian language.

The deployments for these capabilities will only continue to expand as reliance on data-driven processes grows, solidifying their worth across the IT landscape as a whole.

About the Author

Jelani Harper is an editorial consultant servicing the information technology market.

He specializes in data-driven applications focused on semantic technologies, data governance and analytics.
