


Title:
LANGUAGE GENERATION METHOD, DEVICE AND ELECTRONIC APPARATUS
Document Type and Number:
Japanese Patent JP2021117989
Kind Code:
A
Abstract:
To provide a language generation method capable of improving the overall semantic learning effect of an input sequence.

SOLUTION: A method comprises: encoding an input sequence using a preset encoder to generate a hidden state vector corresponding to the input; when the granularity category of a second target fragment to be predicted is a phrase, using N decoders to decode a first target fragment vector, the hidden state vector, and a position vector corresponding to the second target fragment, thereby generating N second target fragments; determining a loss value based on the difference between each of the N second target fragments and a second target tagging fragment; and updating the parameters of the preset encoder, a preset classifier, and the N decoders based on the loss value, then using the updated language generation model to generate language.

SELECTED DRAWING: Figure 1
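The training step described in the abstract can be sketched as follows. This is a minimal toy illustration, not the patent's implementation: the layer sizes, the use of single linear layers for the encoder, classifier, and decoders, the mean-pooled hidden state, and the mean-squared-error loss are all assumptions made for brevity; the patent does not specify these details.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy dimensions -- not taken from the patent.
VOCAB, HIDDEN, N_DECODERS = 32, 16, 3

class Linear:
    """A single linear layer standing in for each module in the sketch."""
    def __init__(self, d_in, d_out):
        self.W = rng.normal(0, 0.1, (d_in, d_out))
    def __call__(self, x):
        return x @ self.W

encoder = Linear(VOCAB, HIDDEN)             # "preset encoder"
classifier = Linear(HIDDEN, 2)              # "preset classifier": word vs. phrase
decoders = [Linear(HIDDEN, VOCAB) for _ in range(N_DECODERS)]  # N decoders

def one_hot(i, n):
    v = np.zeros(n)
    v[i] = 1.0
    return v

def train_step(token_ids, target_id, lr=0.1):
    # 1. Encode the input sequence into a hidden state vector (mean-pooled here).
    h = np.stack([encoder(one_hot(t, VOCAB)) for t in token_ids]).mean(axis=0)
    # 2. Classify the granularity category of the fragment to be predicted.
    is_phrase = classifier(h).argmax() == 1
    # 3. If it is a phrase, each of the N decoders predicts a fragment;
    #    otherwise a single decoder is used (an assumption of this sketch).
    preds = [dec(h) for dec in decoders] if is_phrase else [decoders[0](h)]
    # 4. Loss value: difference between each prediction and the tagged target.
    target = one_hot(target_id, VOCAB)
    loss = np.mean([np.mean((p - target) ** 2) for p in preds])
    # 5. Update each active decoder by gradient descent on its own MSE.
    for dec, p in zip(decoders, preds):
        grad = np.outer(h, 2 * (p - target) / VOCAB)
        dec.W -= lr * grad
    return loss

losses = [train_step([1, 2, 3], target_id=5) for _ in range(50)]
```

Repeating the step drives the loss down, which stands in for the patent's "updating the parameters based on the loss value"; a real system would backpropagate through the encoder and classifier as well.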

Inventors:
ZHANG HAN
XIAO DONGLING
LI YUKUN
SUN YU
TIAN HAO
WU HUA
WANG HAIFENG
Application Number:
JP2020215548A
Publication Date:
August 10, 2021
Filing Date:
December 24, 2020
Assignee:
BEIJING BAIDU NETCOM SCI & TECH CO LTD
International Classes:
G06F40/44; G06F16/35; G06F40/56
Domestic Patent References:
JP2018132969A2018-08-23
JP2019537096A2019-12-19
Other References:
Alammar, Jay, "The Illustrated GPT-2 (Visualizing Transformer Language Models)", 12 August 2019, US, JPN6021049440, ISSN: 0004716652
Attorney, Agent or Firm:
Hideno Kono
Nobuo Kono