Fix your text through attention before NLP tasks

Ji et al. proposed a nested attention mechanism for correcting word usage and spelling. After reading this post, you will understand:

- Nested Attention Neural Hybrid Model Design
- Architecture
- Implementation
- Take Away

Nested Attention Neural Hybrid Model Design

The input of the sequence-to-sequence (S2S) model is a sequence of words, which is transformed into a sequence of vectors. The model applies attention at both the word level and the character level, and the two levels serve different purposes:

- Word level: corrects global grammar and fluency errors
- Character level: corrects spelling and inflected forms

Architecture of Nested Attention Hybrid Model (Ji et al., 2017)

Word-based sequence-to-sequence model

First of all, the sequence of text is transformed into a sequence of vectors by the encoder. A GRU is used in the decoder as well, and it outputs a sequence of vectors based on the word-level attention input.

Hybrid encoder

One limitation of word embeddings is out-of-vocabulary (OOV) words, and the hybrid encoder is designed to tackle this: when a word is not in the vocabulary, the model falls back to a character-level representation of it.
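To make the word-level attention step concrete, here is a minimal sketch (not the paper's implementation) of dot-product attention: the current decoder state scores each encoder state, the scores are normalized into a distribution, and the weighted sum of encoder states becomes the context vector fed back to the decoder.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def word_attention(decoder_state, encoder_states):
    """Dot-product attention: score each encoder state against the
    current decoder state, normalize, and return the context vector."""
    scores = encoder_states @ decoder_state   # one score per source word, (T,)
    weights = softmax(scores)                 # attention distribution over T words
    context = weights @ encoder_states        # weighted sum of states, (H,)
    return context, weights

# Toy example: 4 encoder states, hidden size 3 (random stand-ins for GRU outputs).
rng = np.random.default_rng(0)
enc = rng.normal(size=(4, 3))
dec = rng.normal(size=(3,))
context, weights = word_attention(dec, enc)
```

The attention weights always sum to one, so the context vector stays on the same scale as the encoder states regardless of sentence length.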
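The hybrid encoder's OOV handling can be sketched as a simple fallback rule. This is an illustrative toy, not the paper's model: the tiny vocabularies and the mean of character vectors are placeholders for learned embeddings and the character-level GRU.

```python
import numpy as np

EMB_DIM = 4
rng = np.random.default_rng(1)

# Hypothetical tiny vocabularies; a real model learns these embeddings.
word_emb = {w: rng.normal(size=EMB_DIM) for w in ["the", "cat", "sat"]}
char_emb = {c: rng.normal(size=EMB_DIM) for c in "abcdefghijklmnopqrstuvwxyz"}

def char_encode(word):
    """Character-level fallback: a mean of character vectors stands in
    here for the character-level encoder used in the paper."""
    vecs = [char_emb[c] for c in word if c in char_emb]
    return np.mean(vecs, axis=0)

def hybrid_encode(word):
    """Use the word embedding when the word is in vocabulary,
    otherwise back off to the character-level representation."""
    if word in word_emb:
        return word_emb[word]
    return char_encode(word)

vec_known = hybrid_encode("cat")   # in-vocabulary: word embedding is used
vec_oov = hybrid_encode("catz")    # OOV: composed from character vectors
```

Either path yields a vector of the same dimensionality, so downstream layers never see a special "unknown" token.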
