Title:
AUTOMATIC DETECTION AND DIFFERENTIATION/CLASSIFICATION OF THE ESOPHAGUS, STOMACH, SMALL BOWEL AND COLON LESIONS IN DEVICE-ASSISTED ENTEROSCOPY USING A CONVOLUTIONAL NEURONAL NETWORK
Document Type and Number:
WIPO Patent Application WO/2023/018344
Kind Code:
A4
Abstract:
The present invention relates to a computer-implemented method capable of automatically detecting esophageal, stomach, small bowel and colon lesions, such as blood or hematic residues, small bowel ulcers and erosions, and small bowel protruding and vascular lesions, in device-assisted enteroscopy image/video data, by classifying pixels as lesion or non-lesion using a convolutional image feature extraction step followed by a classification step, and indexing such lesions into one or more classes.

Inventors:
SOUSA FERREIRA JOÃO PEDRO (PT)
DA QUINTA E COSTA DE MASCARENHAS SARAIVA MIGUEL JOSÉ (PT)
CASAL CARDOSO HÉLDER MANUEL (PT)
GONÇALVES DE MACEDO MANUEL GUILHERME (PT)
RIBEIRO ANDRADE ANA PATRÍCIA (PT)
LAGES PARENTE MARCO PAULO (PT)
NATAL JORGE RENATO MANUEL (PT)
LIMA AFONSO JOÃO PEDRO (PT)
CARNEIRO RIBEIRO TIAGO FILIPE (PT)
MOREIRA SÁ CARDOSO PEDRO MARÍLIO (PT)
Application Number:
PCT/PT2022/050024
Publication Date:
April 06, 2023
Filing Date:
August 08, 2022
Assignee:
DIGESTAID ARTIFICIAL INTELLIGENCE DEV LDA (PT)
International Classes:
G06T7/00; G06N3/04
Attorney, Agent or Firm:
DE NOVAES, Francisco (PT)
Claims:
AMENDED CLAIMS received by the International Bureau on 02 FEB 2023 (02.02.2023)

CLAIMS

1. A computer-implemented method capable of automatically detecting and differentiating esophageal, stomach, small bowel and colon lesions in device-assisted enteroscopy images by classifying the pixels as esophageal, stomach, small bowel and colon lesions, comprising selecting the architecture combination and fully training such architecture for esophageal, stomach, small bowel and colon lesions with means of output validation and storage capabilities, wherein the method:

- selects a number of subsets of all endoscopic ultrasonography images/videos, each of said subsets considering only images from the same patient;

- selects another subset as a validation set, wherein the subset does not overlap the images chosen in the previously selected subsets;

- pre-trains (8000) each of the chosen subsets with one of a plurality of combinations of an image feature extraction component followed by a subsequent classification neural network component for pixel classification as esophageal, stomach, small bowel and colon lesions, wherein said pre-training:

- early stops when the scores do not improve over a given number of epochs, namely three;

- evaluates the performance of each of the combinations;

- is repeated on new, different subsets, with another network combination and training hyperparameters, wherein such new combination considers a higher number of dense layers if the f1-metric is low and fewer dense layers if the f1-metric suggests overfitting;

- selects (400) the architecture combination that performs best during pre-training;

- fully trains and validates (9000) the selected architecture combination using the entire set of device-assisted enteroscopy images to obtain an optimized architecture combination for each classification;

- predicts (6000) esophageal, stomach, small bowel and colon lesions using said optimized architecture combination for classification;

- receives the classification output (270) of the prediction module (6000) by an output collect module (7000) with means of communication to a third-party capable of performing validation of said network classification;

- stores the corrected prediction into the storage component (2000).
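For illustration only, the pre-training, early-stopping and selection steps of claim 1 could be sketched in Keras as follows; the three-epoch patience comes from the claim, while names such as combinations, train_ds and val_ds are assumptions, not the patented implementation:

```python
# Illustrative sketch of the pre-training (8000) and selection (400) steps
# of claim 1. Assumes each candidate is a compiled tf.keras.Model with an
# accuracy metric; dataset handling and hyperparameter search are elided.
import tensorflow as tf

def pretrain_and_select(combinations, train_ds, val_ds):
    """combinations: dict mapping a name to a compiled tf.keras.Model."""
    # Claim 1: early-stop when scores do not improve over three epochs.
    early_stop = tf.keras.callbacks.EarlyStopping(
        monitor="val_loss", patience=3, restore_best_weights=True)
    scores = {}
    for name, model in combinations.items():
        model.fit(train_ds, validation_data=val_ds, epochs=50,
                  callbacks=[early_stop], verbose=0)
        # Evaluate each combination; per claim 6 the choice rests on
        # overall accuracy and the f1-metrics.
        _, accuracy = model.evaluate(val_ds, verbose=0)
        scores[name] = accuracy
    # The best performer proceeds to full training and validation (9000).
    return max(scores, key=scores.get)
```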

2. The method of claim 1, wherein the pre-training (8000) is performed by dividing the entire device-assisted enteroscopy image cohort into a predefined number of folds, wherein the images specific to a given patient are contained in one and only one fold.
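A minimal sketch of the patient-exclusive folds of claim 2, assuming scikit-learn is available; patient_ids and n_folds are illustrative names:

```python
# Patient-exclusive folds as in claim 2: every image of a given patient
# falls into exactly one fold. GroupKFold enforces this by construction.
from sklearn.model_selection import GroupKFold

def patient_folds(images, labels, patient_ids, n_folds=5):
    gkf = GroupKFold(n_splits=n_folds)
    # Each (train_idx, test_idx) pair keeps all of a patient's images
    # together, so no two folds ever share a patient.
    return list(gkf.split(images, labels, groups=patient_ids))
```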

3. The method of claim 1, wherein the classification network architecture comprises at least two blocks, each having a Dense layer followed by a Dropout layer.

4. The method of claims 1 and 3, wherein the last block of the classification component includes a BatchNormalization layer followed by a Dense layer whose depth is equal to the number of lesion types one desires to classify.
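A hedged Keras sketch of the classification component of claims 3 and 4: repeated Dense + Dropout blocks, closed by a BatchNormalization layer and a Dense layer sized to the number of lesion classes. Widths, activations and the drop rate are placeholders here (claims 7 and 8 pin some of them down):

```python
# Illustrative classification head per claims 3 and 4; one possible way
# to express the claimed structure in Keras, not the patented code.
from tensorflow.keras import layers

def build_head(features, block_widths, num_classes, drop_rate=0.5):
    x = features
    for width in block_widths:
        x = layers.Dense(width, activation="relu")(x)  # claim 3: Dense...
        x = layers.Dropout(drop_rate)(x)               # ...then Dropout
    x = layers.BatchNormalization()(x)                 # claim 4: last block
    # Output depth equals the number of lesion types to classify.
    return layers.Dense(num_classes, activation="softmax")(x)
```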

5. The method of claim 1, wherein the set of pre-trained neuronal networks is the best performing among the following:

VGG16, InceptionV3, Xception, EfficientNetB5, EfficientNetB7, ResNet50 and ResNet125.
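The backbones named in claim 5 all ship with tf.keras.applications except ResNet125, which has no stock Keras counterpart; in the sketch below ResNet101 stands in purely for illustration:

```python
# Candidate feature extractors from claim 5, loaded with ImageNet weights
# and without the top classifier (the head of claims 3-4 is added separately).
import tensorflow as tf

apps = tf.keras.applications
CANDIDATES = {
    "VGG16": apps.VGG16,
    "InceptionV3": apps.InceptionV3,
    "Xception": apps.Xception,
    "EfficientNetB5": apps.EfficientNetB5,
    "EfficientNetB7": apps.EfficientNetB7,
    "ResNet50": apps.ResNet50,
    "ResNet101": apps.ResNet101,  # stand-in: ResNet125 is not in Keras
}

def make_backbone(name, input_shape=(224, 224, 3)):
    return CANDIDATES[name](include_top=False, weights="imagenet",
                            pooling="avg", input_shape=input_shape)
```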

6. The method of claims 1 and 5, wherein the best performing combination is chosen based on the overall accuracy and on the f1-metrics.

7. The method of claims 1 and 5, wherein the training of the best performing combination comprises two to four dense layers in sequence, starting at 4096 and halving down to 512.

8. The method of claims 1, 5 and 7, wherein between the final two layers of the best performing combination there is a dropout layer with a 0.1 drop rate.
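Under one reading of claims 7 and 8, the head of the best performing combination would look like this sketch: dense widths halving from 4096 to 512, and a 0.1-rate dropout between the final two layers. The activation choices are assumptions:

```python
# One possible rendering of claims 7 and 8; "between the final two layers"
# is read here as between the last hidden layer and the output layer.
from tensorflow.keras import layers

def build_claimed_head(features, num_classes):
    x = features
    for width in (4096, 2048, 1024, 512):  # claim 7: 4096 halving to 512
        x = layers.Dense(width, activation="relu")(x)
    x = layers.Dropout(0.1)(x)             # claim 8: 0.1 drop rate
    return layers.Dense(num_classes, activation="softmax")(x)
```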

9. The method of claim 1, wherein the training of the samples includes a ratio of training-to-validation of 10%-90%.
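Taken literally, claim 9's 10%-90% training-to-validation ratio corresponds to a split like the following scikit-learn sketch; variable names are illustrative:

```python
from sklearn.model_selection import train_test_split

# Claim 9 as written: 10% of the samples train, 90% validate.
train_imgs, val_imgs, train_lbls, val_lbls = train_test_split(
    images, labels, train_size=0.10, stratify=labels, random_state=0)
```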

10. The method of claim 1, wherein the validation of the network output is achieved by interpreting the accuracy of the classification output and by correcting a wrong prediction, wherein the third-party comprises at least one of: another neuronal network, any other computational system adapted to perform the validation task or, optionally, a physician expert in gastroenterological imagery.

11. The method of claim 10, wherein the third-party validation is done by user input.
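A minimal, purely illustrative sketch of the user-input validation path of claims 10 and 11; the prompt format and the class_names mapping are assumptions:

```python
# Third-party validation by user input (claims 10-11): a reviewer accepts
# or corrects each predicted label; the corrected pairs can then be stored
# in the storage component (2000) per claim 12.
def validate_by_user(predictions, class_names):
    corrected = []
    for image_id, class_idx in predictions:
        answer = input(f"{image_id}: predicted '{class_names[class_idx]}'. "
                       "Press Enter to accept, or type the correct class: ")
        corrected.append((image_id, answer.strip() or class_names[class_idx]))
    return corrected
```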

12. The method of claim 1, wherein the training dataset includes images in the storage component (2000) that were predicted by sequentially performing the steps of such method.

13. A portable endoscopic device comprising instructions which, when executed by a processor, cause the computer to carry out the steps of the method of claims 1-12.
