Title:
ATTENTION-BASED SEQUENCE TRANSDUCTION NEURAL NETWORK
Document Type and Number:
Japanese Patent JP2023052483
Kind Code:
A
Abstract:
To provide a neural network system, and computer storage media, for transducing sequences using neural networks.
SOLUTION: The invention is directed to a neural network system having one or more encoder neural networks 110 that receive an input sequence and generate encoded representations of the network inputs in the input sequence, and a decoder neural network 150 that receives the encoded representations and generates an output sequence. An encoder subnetwork 130, which receives an encoder subnetwork input for each of a plurality of input positions and generates a subnetwork output for each of those positions, includes an encoder self-attention sublayer 132 that receives the subnetwork input for each of the input positions.
SELECTED DRAWING: Figure 1
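
For orientation, the sketch below illustrates the kind of encoder subnetwork the abstract describes: a self-attention sublayer in which every input position attends over all input positions, followed by a position-wise feed-forward sublayer, each with a residual connection and layer normalization. The PyTorch framing, class name, and dimensions (d_model, num_heads, d_ff) are illustrative assumptions and are not taken from the patent text.

```python
# Minimal sketch of an encoder subnetwork with an encoder self-attention
# sublayer (cf. encoder subnetwork 130 and self-attention sublayer 132).
# Names, sizes, and the use of PyTorch are assumptions for illustration.
import torch
import torch.nn as nn


class EncoderSubnetwork(nn.Module):
    def __init__(self, d_model: int = 512, num_heads: int = 8, d_ff: int = 2048):
        super().__init__()
        # Self-attention sublayer: each input position attends over all
        # input positions of the same sequence.
        self.self_attention = nn.MultiheadAttention(d_model, num_heads, batch_first=True)
        # Position-wise feed-forward sublayer.
        self.feed_forward = nn.Sequential(
            nn.Linear(d_model, d_ff),
            nn.ReLU(),
            nn.Linear(d_ff, d_model),
        )
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, input_positions, d_model) -- the subnetwork input
        # for each of the plurality of input positions.
        attended, _ = self.self_attention(x, x, x)
        x = self.norm1(x + attended)                  # residual + layer norm
        x = self.norm2(x + self.feed_forward(x))      # residual + layer norm
        return x                                      # subnetwork output per position


if __name__ == "__main__":
    inputs = torch.randn(2, 10, 512)                  # batch of 2, 10 input positions
    outputs = EncoderSubnetwork()(inputs)
    print(outputs.shape)                              # torch.Size([2, 10, 512])
```

In the described system, several such subnetworks would be stacked to form the encoder, and the final encoded representations would be consumed by the decoder neural network.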

Inventors:
Noam M. Shazeer
Aidan Nicholas Gomez
Lukasz Mieczyslaw Kaiser
Jakob D. Uszkoreit
Llion Owen Jones
Niki J. Parmar
Illia Polosukhin
Ashish Teku Vaswani
Application Number:
JP2023006053A
Publication Date:
April 11, 2023
Filing Date:
January 18, 2023
Assignee:
Google LLC
International Classes:
G06N3/0455
Attorney, Agent or Firm:
Murayama Yasuhiko
Shinya Mihiro
Tatsuhiko Abe