Deconstructing Natural Language Generation

Natural Language Generation (NLG) is rapidly becoming one of the most desired capabilities in the Natural Language Processing stack.

Prudent enterprises—and the most sophisticated use cases—deploy this dimension of natural language technology alongside Natural Language Understanding (NLU) and Natural Language Querying (NLQ).

Nevertheless, there’s no mistaking the business value of NLG’s core proposition, which Arria NLG CEO Sharon Daniels characterized this way: “The value of this AI capability of turning data into text speaks to a better understanding of the data and speaks to delivering information across the enterprise. It’s still the most widely adopted form of communication, natural language speaking.”

Consequently, NLG empowers a host of contemporary use cases, from automatically producing reports with Business Intelligence tools to generating narratives about data from sources as varied as Excel spreadsheets and Industrial Internet sensor data.

There are varying types of NLG predicated on a multiplicity of technological approaches including rules, templates, models, and algorithms.
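The template- and rule-based end of that spectrum can be sketched in a few lines. The function, data, and wording below are illustrative assumptions, not any vendor's implementation:

```python
# Minimal sketch of template-based NLG: a rule selects a template from
# the data, then slots values into it. All names and wording are
# illustrative, not taken from any real product.

def describe_sales(region: str, current: float, previous: float) -> str:
    """Turn two data points into a one-sentence narrative."""
    change = (current - previous) / previous * 100
    if change > 0:
        template = "Sales in {region} rose {pct:.1f}% to ${curr:,.0f}."
    elif change < 0:
        template = "Sales in {region} fell {pct:.1f}% to ${curr:,.0f}."
    else:
        template = "Sales in {region} were flat at ${curr:,.0f}."
    return template.format(region=region, pct=abs(change), curr=current)

print(describe_sales("EMEA", 120_000, 100_000))
# → Sales in EMEA rose 20.0% to $120,000.
```

Model- and algorithm-driven approaches replace the hand-written rules and templates with learned components, but the data-in, sentence-out contract is the same.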

By using an artful admixture of techniques involving what Daniels termed “language analytics and computational linguistics,” as well as a heaping layer of machine learning, organizations can access “true operational efficiencies,” Daniels reflected.

“What used to take a team of people days or weeks can now be done with technology literally in seconds.”

Computational Linguistics

The foundation of creditable NLG solutions is computational linguistics, focused on the science of language itself.

By relying on various static AI algorithms and models, this facet of NLG enables the language analytics component, allowing NLG to transmute data (or numbers) into fluent natural language.

The most popular NLG deployments provide summaries—what Daniels described as “narratives”—of the significance of data of almost any variety.

In this process, computational linguistics supports “an entire morphology of the English language and how to use language in sentence form in a way that is very natural,” Daniels explained.

With computational linguistics, NLG systems can state the same thing using different terms, distinguish antecedents for pronouns, understand where to punctuate a sentence, and provide other functions traditionally associated with NLP.

Moreover, they accomplish this objective with such proficiency that “you cannot decipher if it was written by an expert or written by…NLG,” Daniels added.
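A toy sketch of two of those surface-realization behaviors, varying word choice and pronominalizing a repeated antecedent, might look like the following; the rules and vocabulary are invented for illustration, not drawn from any real NLG product:

```python
from itertools import cycle

# Toy realizer illustrating two behaviors the article describes: stating
# the same fact in different terms, and using a pronoun once the
# antecedent has already been introduced. All rules are invented.

class Realizer:
    def __init__(self):
        self._verbs = cycle(["rose", "climbed", "increased"])
        self._mentioned = set()

    def sentence(self, subject: str, amount: str) -> str:
        # After the first mention, refer back to the subject with "It".
        ref = "It" if subject in self._mentioned else subject
        self._mentioned.add(subject)
        return f"{ref} {next(self._verbs)} {amount}."

r = Realizer()
print(r.sentence("Revenue", "8%"))  # → Revenue rose 8%.
print(r.sentence("Revenue", "3%"))  # → It climbed 3%.
```

Production systems make these choices with grammar-aware models rather than a fixed rotation, but the effect is the same: varied, natural-sounding prose from identical inputs.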

Language Analytics

Language analytics is associated more with the data-analysis side of the NLG process of transforming data to text.

This capability leverages the computational linguistics framework “to know how to analyze the data [in order] to know how to speak about it,” Daniels revealed.

The pairing of these capabilities supports a burgeoning number of use cases for swiftly understanding data’s significance—oftentimes in relation to business objectives—to underpin a “computer’s ability to generate words and sentences on the fly,” Daniels mentioned.
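That pairing, an analysis step that decides what is worth saying followed by a realization step that says it, can be sketched minimally as follows; the 20% threshold and all wording are illustrative assumptions:

```python
from statistics import mean

# Minimal data-to-text sketch: an analysis step decides what is worth
# saying about the numbers, and a realization step phrases the finding.
# The 20% threshold and the wording are illustrative assumptions.

def narrate(series: dict[str, float]) -> str:
    mu = mean(series.values())
    # Analysis: flag any month deviating more than 20% from the mean.
    unusual = [m for m, v in series.items() if abs(v - mu) > 0.2 * mu]
    if unusual:
        return f"Output averaged {mu:.0f} units; {', '.join(unusual)} stood out."
    return f"Output held steady, averaging {mu:.0f} units per month."

print(narrate({"Jan": 100, "Feb": 102, "Mar": 98, "Apr": 101}))
# → Output held steady, averaging 100 units per month.
```

The analysis half grows arbitrarily sophisticated in real deployments (trends, seasonality, business targets), but it always feeds findings, not raw numbers, to the language layer.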

Financial services was among the earliest adopters of NLG, which assists analysts by rapidly summarizing enormous quantities of source material related to investment opportunities, for example.

The larger play may be for explicating Industrial Internet data.

Daniels referenced a use case in which NLG was applied to understand the massive amounts of sensor data in the oil and gas industry. That application may ultimately eclipse conventional BI use cases, because “the future as industrial control systems or sensor data is probably 10 times the size of what’s going on with BI right now,” Daniels admitted.

Machine Learning

According to Daniels, the core functionality of the computational linguistics and language analytics approach is largely bereft of machine learning, which is instead leveraged as a means of “enhancing” NLG deployments.

In this respect, machine learning is vital to curating the narratives generated by NLG, whereas less mutable algorithms are necessary for the detailed exactness upon which the accuracy of NLG hinges.

For example, when narrating healthcare data in situations of critical importance, “You’re not going to want to have any kind of potential variation that’s not fact based,” Daniels noted.

Instead, machine learning is applied by creditable NLG solutions as a means of smoothing out various facets of nuance and jargon inherent in different types of NLG use cases.

With this approach, machine learning is essential for “detecting the way people are talking about things and when adding a level of richness to the narrative, it does not happen at the actual conversion of data to language level,” Daniels remarked.

“It actually enhances the understanding of how to make the narrative richer over time.” For instance, machine learning can perceive features related to how people write reports or query data.
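One hedged way to picture that enhancement role: a learning component tallies which phrasings appear in human-written reports and feeds the preferred variant back to a rule-based realizer. The class, corpus, and phrase variants below are invented for illustration:

```python
from collections import Counter

# Sketch of the "enhancement" role described above: a learning component
# tallies which phrasing humans actually use in reports, so the
# rule-based realizer can prefer the most common variant. The corpus
# and candidate phrases are invented for illustration.

class PhrasePreference:
    def __init__(self):
        self.counts = Counter()

    def observe(self, report: str, variants: list[str]) -> None:
        # Count occurrences of each candidate phrasing in a human report.
        for v in variants:
            self.counts[v] += report.count(v)

    def preferred(self, variants: list[str]) -> str:
        # Return the variant seen most often in the observed corpus.
        return max(variants, key=lambda v: self.counts[v])

pref = PhrasePreference()
variants = ["went up", "increased", "trended upward"]
for report in ["Sales increased in Q1.", "Costs increased slightly."]:
    pref.observe(report, variants)

print(pref.preferred(variants))  # → increased
```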

“This is teaching the machine to act like a human: to analyze like a human, to speak like a human,” Daniels commented.

“The findings from that are then brought into the tool so it’s enhancing the NLG tool at all times.”

Natural Language Interaction

As previously noted, optimal NLG use cases leverage the different elements Daniels described to work alongside the full natural language technology suite.

The synthesis of these technologies reinforces Natural Language Interaction methods in which people are able to issue verbal queries of their data for real-time natural language responses with striking degrees of deftness and insight.

The result is that “you can ask your data a question and we’re using Business Intelligence with Amazon Alexa to bring together NLG with NLP and NLU and NLQ,” Daniels said.

“Really, the future with automation is the capability of bringing all of these technologies together.”

About the Author

Jelani Harper is an editorial consultant servicing the information technology market.

He specializes in data-driven applications focused on semantic technologies, data governance and analytics.
