

Title:
DISTRIBUTED DEEP LEARNING SYSTEM
Document Type and Number:
WIPO Patent Application WO/2019/159783
Kind Code:
A1
Abstract:
The present invention performs distributed deep learning at high speed. Learning nodes (2-0)-(2-3) each calculate the gradient of a loss function from the output obtained by inputting learning data into the neural network being trained, packetize the calculation result, and transmit the packets to a computing interconnect device (1). The computing interconnect device (1) receives the packets transmitted from the learning nodes (2-0)-(2-3), extracts the gradient values stored in these packets, calculates the sum of the gradients, packetizes the result, and transmits the packets to the learning nodes (2-0)-(2-3). The learning nodes (2-0)-(2-3) receive the packets transmitted from the computing interconnect device (1) and update the configuration parameters of the neural network on the basis of the values stored in these packets.
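The aggregation flow described in the abstract can be sketched as follows. This is a minimal illustrative model, not the patent's implementation: the class names (`LearningNode`, `ComputingInterconnect`), the toy scalar parameter, the quadratic loss, and the learning rate are all assumptions introduced for clarity; packetization is abstracted away as plain Python values.

```python
# Hypothetical sketch of the gradient-aggregation protocol: each learning
# node computes a local gradient, the computing interconnect device sums
# the gradients, and every node applies the same aggregated gradient to
# its copy of the configuration parameter.

class LearningNode:
    def __init__(self, param):
        self.param = param  # neural-network configuration parameter (toy scalar)

    def compute_gradient(self, data):
        # Stand-in for the gradient of a loss function; here the gradient
        # of the loss 0.5 * (param - data)^2 is simply (param - data).
        return self.param - data

    def update(self, summed_gradient, lr=0.1):
        # Apply the aggregated gradient received back from the interconnect.
        self.param -= lr * summed_gradient


class ComputingInterconnect:
    def aggregate(self, gradients):
        # Sum the gradient values extracted from each node's packet.
        return sum(gradients)


# Four nodes, matching the labels (2-0)-(2-3) in the abstract.
nodes = [LearningNode(param=1.0) for _ in range(4)]
interconnect = ComputingInterconnect()

grads = [n.compute_gradient(data=0.0) for n in nodes]  # local gradients
total = interconnect.aggregate(grads)                  # summed at device (1)
for n in nodes:
    n.update(total)                                    # synchronized update

# Because every node applies the same summed gradient, all nodes end up
# holding the same updated parameter value.
```

The key property this models is that aggregation happens in the network device rather than on any learning node, so all nodes stay synchronized after each update.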

Inventors:
KATO JUNICHI (JP)
KAWAI KENJI (JP)
NGO HUYCU (JP)
ARIKAWA YUKI (JP)
ITO TSUYOSHI (JP)
SAKAMOTO TAKESHI (JP)
Application Number:
PCT/JP2019/004213
Publication Date:
August 22, 2019
Filing Date:
February 06, 2019
Export Citation:
Assignee:
NIPPON TELEGRAPH & TELEPHONE (JP)
International Classes:
G06N3/08
Other References:
HIDAKA, MASATOSHI ET AL.: "Development of environment-independent and distributed computable deep convolutional neural network framework by JavaScript", IPSJ SIG Technical Reports: Graphics and CAD (CG), vol. 2015-CG-161, no. 3, 2015, pages 1-9
DEAN, JEFFREY ET AL.: "Large Scale Distributed Deep Networks", Proceedings of Advances in Neural Information Processing Systems 25 (NIPS 2012), 2012, pages 1-9, XP055534679, Retrieved from the Internet [retrieved on 2019-04-23]
Attorney, Agent or Firm:
YAMAKAWA, Shigeki et al. (JP)