ANALYSIS OF EXISTING NEURAL NETWORK ARCHITECTURES FOR THE GENERATION OF NATURAL LANGUAGE TEXTS FOR THE PURPOSE OF RESEARCHING CURRENT TECHNIQUES IN CREATING NEURAL NETWORK MODELS
Annotation: This article discusses existing neural network models designed for natural language generation. Their architectures are presented, ranging from the classical Seq2Seq model to natural language generation with a generative adversarial network; the differences between them are outlined, and their advantages and disadvantages are analyzed in terms of the accuracy of the natural language they generate.
Keywords: neural network, learning, neural network model, machine translation, Seq2Seq, GAN, natural language generation, VAE, embeddings
For citation: Poltorak A.V., Nabatov S.I. Analysis of existing neural network architectures for the generation of natural language texts for the purpose of researching current techniques in creating neural network models // Electronic Scientific Journal IT-Standard. – 2020. – No. 3. – pp. .