Title:
ELECTRONIC DEVICE FOR ANALYZING MEANING OF SPEECH, AND OPERATION METHOD THEREFOR
Document Type and Number:
WIPO Patent Application WO/2019/117466
Kind Code:
A1
Abstract:
An electronic device using an artificial neural network model including an attention mechanism, according to various embodiments, may comprise: a memory configured to store information including a plurality of recurrent neural network (RNN) layers; and at least one processor connected with the memory and configured to set, as a first key and a value, at least one first hidden representation acquired through at least one first layer among the plurality of RNN layers, to set, as a second key, at least one second hidden representation acquired through at least one second layer among the plurality of RNN layers, and to acquire an attention included in an attention structure on the basis of at least one of data on the first key, data on the second key, or data on the value.
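
Illustrative sketch (not part of the official record): the abstract describes an attention in which one RNN layer supplies both the first key and the value, while a second layer supplies a second key. The Python sketch below shows one plausible reading of that arrangement, assuming scaled dot-product attention with the second key acting as the query side; the function and variable names (layer_wise_attention, h_first, h_second) are illustrative, not taken from the patent.

import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def layer_wise_attention(h_first, h_second):
    # h_first:  (T, d) hidden representations from a first RNN layer;
    #           per the abstract, these serve as both the first key and the value.
    # h_second: (T, d) hidden representations from a second RNN layer;
    #           per the abstract, these serve as the second key (assumed here
    #           to play the query role; that role is an assumption, not stated
    #           in the record).
    d = h_first.shape[-1]
    key, value, query = h_first, h_first, h_second
    scores = query @ key.T / np.sqrt(d)   # (T, T) similarity between the two keys
    weights = softmax(scores, axis=-1)    # attention distribution per position
    return weights @ value                # attended context vectors, shape (T, d)

# Toy usage: 5 time steps, 8-dimensional hidden states.
rng = np.random.default_rng(0)
context = layer_wise_attention(rng.standard_normal((5, 8)),
                               rng.standard_normal((5, 8)))
print(context.shape)  # (5, 8)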

Inventors:
KIM JUN SEONG (KR)
Application Number:
PCT/KR2018/013416
Publication Date:
June 20, 2019
Filing Date:
November 07, 2018
Assignee:
SAMSUNG ELECTRONICS CO LTD (KR)
International Classes:
G10L15/16; G10L15/04; G10L15/18
Foreign References:
US 20170148431 A1 (2017-05-25)
KR 20170050029 A (2017-05-11)
US 20150364128 A1 (2015-12-17)
US 9263036 B1 (2016-02-16)
KR 20170095582 A (2017-08-23)
Other References:
Ankur Bapna, Gokhan Tur, Dilek Hakkani-Tur, Larry Heck: "Towards Zero-Shot Frame Semantic Parsing for Domain Scaling", Proc. Interspeech, 2017
Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser, Illia Polosukhin: "Attention Is All You Need", 2017
Baolin Peng, Kaisheng Yao, Li Jing, Kam-Fai Wong: "Recurrent Neural Networks with External Memory for Spoken Language Understanding", Natural Language Processing and Chinese Computing, 2015, pages 25-35
Bing Liu, Ian Lane: "Recurrent Neural Network Structured Output Prediction for Spoken Language Understanding", Proc. NIPS, 2015
Bing Liu, Ian Lane: "Attention-Based Recurrent Neural Network Models for Joint Intent Detection and Slot Filling", Proc. Interspeech, 2016
Charles T. Hemphill, John J. Godfrey, George R. Doddington: "The ATIS Spoken Language Systems Pilot Corpus", Proc. DARPA Speech and Natural Language Workshop, 1990
Dilek Hakkani-Tur, Gokhan Tur, Asli Celikyilmaz, Yun-Nung Chen, Jianfeng Gao, Li Deng, Ye-Yi Wang: "Multi-Domain Joint Semantic Frame Parsing using Bi-directional RNN-LSTM", Proc. Interspeech, 2016
Dzmitry Bahdanau, Kyunghyun Cho, Yoshua Bengio: "Neural Machine Translation by Jointly Learning to Align and Translate", ICLR, 2015
Felix A. Gers, Jurgen Schmidhuber: "Recurrent Nets that Time and Count", Proc. IJCNN, 2000
Gabor Melis, Chris Dyer, Phil Blunsom: "On the State of the Art of Evaluation in Neural Language Models", 2017
Gakuto Kurata, Bing Xiang, Bowen Zhou, Mo Yu: "Leveraging Sentence-Level Information with Encoder LSTM for Natural Language Understanding", 2016
Gregoire Mesnil, Yann Dauphin, Kaisheng Yao, Yoshua Bengio, Li Deng, Dilek Hakkani-Tur, Xiaodong He, Larry Heck, Gokhan Tur, Dong Yu: "Using Recurrent Neural Networks for Slot Filling in Spoken Language Understanding", IEEE/ACM, 2015
Jacob Andreas, Dan Klein: "When and why are log-linear models self-normalizing?", Proc. NAACL, 2014
Jason P. C. Chiu, Eric Nichols: "Named Entity Recognition with Bidirectional LSTM-CNNs", 2015
Jonas Gehring, Michael Auli, David Grangier, Yann N. Dauphin: "A Convolutional Encoder Model for Neural Machine Translation", arXiv, 2016
Jonas Gehring, Michael Auli, David Grangier, Denis Yarats, Yann N. Dauphin: "Convolutional Sequence to Sequence Learning", arXiv, 2017
Joo-Kyung Kim, Gokhan Tur, Asli Celikyilmaz, Bin Cao, Ye-Yi Wang: "Intent detection using semantically enriched word embeddings", Proc. IEEE SLT, 2016
Junyoung Chung, Caglar Gulcehre, Kyunghyun Cho, Yoshua Bengio: "Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling", Proc. NIPS, 2014
Kaisheng Yao, Baolin Peng, Yu Zhang, Dong Yu, Geoffrey Zweig, Yangyang Shi: "Spoken Language Understanding using Long Short-Term Memory Neural Networks", Proc. IEEE SLT, 2014
Lyan Verwimp, Joris Pelemans, Hugo Van hamme, Patrick Wambacq: "Character-Word LSTM Language Models", Proc. EACL, 2017
Matthew D. Zeiler: "ADADELTA: An Adaptive Learning Rate Method", 2012
Nitish Srivastava, Geoffrey Hinton, Alex Krizhevsky, Ilya Sutskever, Ruslan Salakhutdinov: "Dropout: A Simple Way to Prevent Neural Networks from Overfitting", JMLR, vol. 15, 2014, pages 1929-1958, XP055193568
Orhan Firat, Kyunghyun Cho, Yoshua Bengio: "Multi-Way, Multilingual Neural Machine Translation with a Shared Attention Mechanism", Proc. NAACL, 2016
Sergey Ioffe, Christian Szegedy: "Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift", Proc. ICML, 2015
Stephen Merity, Nitish Shirish Keskar, Richard Socher: "Regularizing and Optimizing LSTM Language Models", 2017
Suman Ravuri, Andreas Stolcke: "A comparative study of neural network models for lexical intent classification", Proc. IEEE ASRU, 2015
Suman Ravuri, Andreas Stolcke: "Recurrent Neural Network and LSTM Models for Lexical Utterance Classification", Proc. Interspeech, 2015
Volodymyr Mnih, Nicolas Heess, Alex Graves, Koray Kavukcuoglu: "Recurrent Models of Visual Attention", Proc. NIPS, 2014
Yangyang Shi, Kaisheng Yao, Hu Chen, Yi-Cheng Pan, Mei-Yuh Hwang, Baolin Peng: "Contextual spoken language understanding using recurrent neural networks", Proc. ICASSP, 2015
Yarin Gal, Zoubin Ghahramani: "A Theoretically Grounded Application of Dropout in Recurrent Neural Networks", Proc. NIPS, 2016
Yoon Kim, Yacine Jernite, David Sontag, Alexander M. Rush: "Character-Aware Neural Language Models", Proc. AAAI, 2016
Zhiheng Huang, Wei Xu, Kai Yu: "Bidirectional LSTM-CRF Models for Sequence Tagging", 2015
See also references of EP 3726525A4
Attorney, Agent or Firm:
KWON, Hyuk-Rok et al. (KR)