

Title:
METHOD AND DEVICE FOR FUSION AND CLASSIFICATION OF REMOTE SENSING HYPERSPECTRAL IMAGE AND LASER RADAR IMAGE
Document Type and Number:
WIPO Patent Application WO/2024/040828
Kind Code:
A1
Abstract:
The present invention relates to a method for fusion and classification of a remote sensing hyperspectral image and a laser radar image, comprising: acquiring a hyperspectral image and a laser radar image; performing intrinsic image decomposition on the hyperspectral image to obtain an intrinsic image and an illumination image, and selecting a neighborhood block for each hyperspectral intrinsic pixel, each hyperspectral illumination pixel, and each laser radar pixel; training a plurality of deep network branches using the neighborhood blocks; concatenating the outputs of the deep network branches in pairs by means of a splicing layer; and performing multi-modal fusion on the concatenated outputs to obtain a final output class. The method fully fuses the important discriminative information contained in the multi-source remote sensing images, thereby achieving high-precision classification of target pixels, and avoids the omission and loss of important information during fusion, mitigating problems such as reduced classification accuracy caused by information loss.
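The pipeline in the abstract (per-pixel neighborhood blocks from three modalities, one branch per modality, pairwise concatenation of branch outputs, then fused classification) can be sketched as below. This is a minimal illustration only: the patent does not disclose its branch architectures here, so the feature extractors are stand-in random projections, and all function names and dimensions are hypothetical.

```python
import numpy as np
from itertools import combinations

def neighborhood_block(image, row, col, size=3):
    """Extract a size x size neighborhood block centred on (row, col),
    reflecting the image at the borders so edge pixels also get full blocks."""
    pad = size // 2
    padded = np.pad(image, ((pad, pad), (pad, pad), (0, 0)), mode="reflect")
    return padded[row:row + size, col:col + size, :]

def branch_features(block, dim=8, seed=0):
    """Stand-in for one deep network branch: flatten the neighborhood
    block and project it to a fixed-length feature vector."""
    rng = np.random.default_rng(seed)
    flat = block.reshape(-1)
    W = rng.standard_normal((dim, flat.size))
    return np.tanh(W @ flat)

def fuse_and_classify(blocks, n_classes=4, dim=8, seed=0):
    """Concatenate branch outputs in pairs (the 'splicing layer'),
    fuse all pairwise concatenations, and return the argmax class index."""
    feats = [branch_features(b, dim=dim, seed=seed + i)
             for i, b in enumerate(blocks)]
    pairs = [np.concatenate([feats[i], feats[j]])
             for i, j in combinations(range(len(feats)), 2)]
    fused = np.concatenate(pairs)
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((n_classes, fused.size))
    return int(np.argmax(W @ fused))

# Three modalities for one target pixel: the hyperspectral intrinsic image,
# the hyperspectral illumination image, and the laser radar (LiDAR) image.
rng = np.random.default_rng(1)
hsi_intrinsic = rng.standard_normal((5, 5, 6))
hsi_illumination = rng.standard_normal((5, 5, 6))
lidar = rng.standard_normal((5, 5, 1))
blocks = [neighborhood_block(im, 2, 2) for im in
          (hsi_intrinsic, hsi_illumination, lidar)]
label = fuse_and_classify(blocks)
```

In the patent's method the branches would be trained networks and the fusion layer learned; the sketch only mirrors the data flow (blocks → branches → pairwise splicing → fused class output).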

Inventors:
YU WENBO (CN)
HUANG HE (CN)
SHEN GANGXIANG (CN)
Application Number:
PCT/CN2022/142160
Publication Date:
February 29, 2024
Filing Date:
December 27, 2022
Assignee:
UNIV SOOCHOW (CN)
International Classes:
G06V20/10
Foreign References:
CN115331110A (2022-11-11)
CN109993220A (2019-07-09)
CN112967350A (2021-06-15)
CN114742985A (2022-07-12)
US20090318815A1 (2009-12-24)
Attorney, Agent or Firm:
CENTRAL SOUTH WELL INTELLECTUAL PROPERTY OFFICE (CN)