How NLP and BERT Will Change the Language Game

One Word: BERT.

A range of techniques has been applied to NLP, but deep learning in particular is showing exceptional results.

Deep learning has been applied to NLP for several years, and research and development is breaking new ground so quickly that new methods and increasingly capable models appear at a rapid pace.

For example, FastText, an extension of Word2Vec, significantly reduces the training load, which makes it much easier to apply to a specific data set and language context.
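To make that concrete, here is a minimal sketch of training subword-aware FastText embeddings, assuming the gensim library and a toy in-memory corpus (neither is from the article); in practice you would stream sentences from your own domain-specific data set.

```python
# A minimal sketch, assuming gensim and a toy corpus (illustrative only).
from gensim.models import FastText

# Toy corpus: each sentence is a list of tokens.
sentences = [
    ["deep", "learning", "reshapes", "language", "processing"],
    ["fasttext", "extends", "word2vec", "with", "subword", "information"],
]

# Train subword-aware embeddings; because FastText builds word vectors from
# character n-grams, it can produce a vector even for out-of-vocabulary words.
model = FastText(sentences, vector_size=100, window=3, min_count=1, epochs=10)

# Vector for a word never seen in training, composed from its n-grams.
vector = model.wv["languages"]
print(vector.shape)  # (100,)
```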

Source: https://peltarion.com/webinars/nlp-and-bert

But a recent model open-sourced by Google in October 2018, BERT (Bidirectional Encoder Representations from Transformers), is now reshaping the NLP landscape.

BERT is significantly more evolved in its understanding of word semantics in context, and it can process large amounts of text and language.
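A minimal sketch of what that contextual understanding looks like in practice, assuming the Hugging Face transformers library and the bert-base-uncased checkpoint (both assumptions, not from the article): the same word receives different vectors in different contexts, unlike static Word2Vec or FastText embeddings.

```python
# A sketch of BERT's contextual word vectors (transformers and the
# bert-base-uncased checkpoint are assumptions, not the article's own code).
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embed(sentence, word):
    """Return the contextual vector BERT produces for `word` in `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (seq_len, 768)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    return hidden[tokens.index(word)]

a = embed("the bank raised interest rates", "bank")
b = embed("we walked along the river bank", "bank")
# Well below 1.0: the same surface word gets context-dependent vectors.
print(torch.cosine_similarity(a, b, dim=0))
```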

BERT also makes it easier to reuse a pretrained model (transfer learning) and then fine-tune it on your own data for the specific language situation and problem you face.
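A minimal fine-tuning sketch under stated assumptions: it uses the Hugging Face transformers and datasets libraries, the bert-base-uncased checkpoint, and the public IMDB sentiment data set as a stand-in for your own data; the hyperparameters are illustrative only.

```python
# A fine-tuning sketch (libraries, data set, and settings are assumptions).
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
# A pretrained BERT body with a fresh two-class classification head.
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)

dataset = load_dataset("imdb")  # illustrative sentiment data set

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length")

dataset = dataset.map(tokenize, batched=True)

args = TrainingArguments(output_dir="bert-finetuned", num_train_epochs=1,
                         per_device_train_batch_size=16)
# Fine-tune on a small shuffled subset to keep the sketch quick to run.
Trainer(model=model, args=args,
        train_dataset=dataset["train"].shuffle(seed=42).select(range(2000))
        ).train()
```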

Changing the NLP Benchmarks

On many measures, BERT has improved language-processing performance by a significant margin.

In some circumstances, BERT can be applied directly to the data and problem with no further training (zero-shot learning) and still deliver a high-performing model.
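As one hedged illustration of using a pretrained model with no further training, the sketch below applies BERT's pretrained masked-language-model head to new text via the Hugging Face fill-mask pipeline (the library and checkpoint are assumptions, not the article's own code).

```python
# A sketch of applying pretrained BERT directly, with no further training:
# the masked-language-model head predicts the hidden word as-is.
from transformers import pipeline

fill = pipeline("fill-mask", model="bert-base-uncased")
for candidate in fill("Deep learning has transformed natural language [MASK]."):
    print(candidate["token_str"], round(candidate["score"], 3))
```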

Source: https://peltarion.com/webinars/nlp-and-bert

Since BERT was published in October 2018, there has been a wave of transformer-based methods (GPT-2, XLNet, RoBERTa) that keep raising the bar by demonstrating better performance, easier training, or some other specific benefit, such as text/language generation.

The performance level of BERT is being likened to an “ImageNet 2012 moment” for natural language processing, when a deep learning model demonstrated a big uplift in performance and set off a wave of image-based deep learning applications and research.

We know we are onto a good thing when the standard performance measures for language need to be radically changed and uplifted because the new methods simply outperform the old benchmarks!

But while BERT is powerful and has reduced certain barriers to use, it, like other deep learning approaches, still needs to be fitted into a framework where the models can be more easily used in real-world circumstances and deployed by a wider group of organizations.

This is a critical part of the deep learning adoption cycle.

The true benefits of NLP will only be realized through broader adoption of these powerful models, operating in and improving live scenarios and supporting a wide range of applications across organizations and users.

You can check out this webinar to find out how BERT will power a wave of language-based applications.

About the Author

Rob Dalgety, industry expert at Peltarion, has extensive experience in commercializing and positioning software for enterprises and other organizations in areas including mobility, big data and analytics, collaboration, and digital.

Sign up for the free insideBIGDATA newsletter.
