Title:
METHOD AND SYSTEM TO OBTAIN SUFFICIENT CYTOLOGY IMAGE IN CYTOPATHOLOGY
Document Type and Number:
WIPO Patent Application WO/2024/049498
Kind Code:
A1
Abstract:
Example methods and systems to identify a suspicious cytology specimen including target cells have been disclosed. One example method includes obtaining a first image associated with a first region of the cytology specimen through a first object lens, obtaining a second set of one or more images associated with a first subregion of the first region through a second object lens, identifying a first number of target cells and determining whether the first number is greater than a threshold number. In response to determining that the first number is greater than the threshold number, the example method prompts an alert. Otherwise, the example method further includes obtaining a third set of one or more images associated with a second subregion through the second object lens and identifying a second number of the target cells associated with the second subregion.

Inventors:
LIU TIEN-JEN (US)
CHEN SHIH-YU (US)
CHEN SAMUEL (US)
Application Number:
PCT/US2023/013547
Publication Date:
March 07, 2024
Filing Date:
February 22, 2023
Assignee:
AIXMED INC (US)
International Classes:
G06T7/00; G02B21/00
Foreign References:
US20220237784A12022-07-28
US20210209753A12021-07-08
Attorney, Agent or Firm:
SU, Gene (TW)
Claims:
We Claim:

1. A method to identify a suspicious cytology specimen including target cells, comprising: obtaining a first image associated with a first region of the cytology specimen through a first object lens having a first field of view, a first magnification and a first depth of field in the first field of view; obtaining a second set of one or more images associated with a first subregion of the first region through a second object lens having a second field of view, a second magnification and a second depth of field in the second field of view; identifying a first number of the target cells distributed in a space of the cytology specimen; determining whether the first number is greater than a threshold number of target cells; in response to determining the first number greater than the threshold number, prompting an alert to indicate that the cytology specimen is suspicious; and in response to determining the first number not greater than the threshold number, obtaining a third set of one or more images associated with a second subregion of the first region through the second object lens and identifying a second number of the target cells distributed in a second space defined by the second subregion and the first depth of field.

2. The method of claim 1, wherein the space is defined by the first subregion and the first depth of field and the first number of the target cells distributed in the space is identified from the second set of one or more images.

3. The method of claim 2, wherein the first region corresponds to the first field of view.

4. The method of claim 2, wherein the second field of view corresponds to one or more parts of the first subregion.

5. The method of claim 2, wherein the second magnification is greater than the first magnification and the second depth of field is less than the first depth of field.

6. The method of claim 2, prior to obtaining the first image, further comprising: obtaining images associated with other regions of the cytology specimen through the first object lens; determining a first layer of the cytology specimen that includes the target cells based on the images associated with the other regions, wherein the first layer corresponds to the first depth of field; and obtaining images of the first layer through the first object lens, wherein one of the images of the first layer is the first image.

7. The method of claim 2, wherein the image characteristics associated with the target cells include a contrast, a lightness, a shape and color information associated with the target cells.

8. The method of claim 7, wherein the color information corresponds to lights with a first range of wavelengths from about 530 nm to about 630 nm, a second range of wavelengths from about 450 nm to about 560 nm or a third range of wavelengths from about 450 nm to 530 nm.

9. The method of claim 7, wherein the color information includes a range of R, G and B values in an RGB (Red, Green, Blue) domain or a range of Hue value in a HSV (Hue, Saturation, Value) domain.

10. The method of claim 2, further comprising determining whether a sum of the first number and the second number is greater than the threshold number.

11. The method of claim 10, in response to determining the sum greater than the threshold number, prompting the alert; and in response to determining the sum not greater than the threshold number, obtaining a fourth set of one or more images associated with a third subregion of the first region through the second object lens or a fifth set of one or more images associated with a fourth subregion of a second region of the cytology specimen.

12. The method of claim 2, wherein the obtaining the second set of one or more images further includes driving the second object lens to focus on a first space of the cytology specimen defined by the first subregion and the second depth of field and obtaining an image of the first space to be one of the second set of one or more images, wherein the first space is associated with a second layer of the cytology specimen.

13. The method of claim 1, further comprising obtaining additional images associated with other regions of the cytology specimen through the first object lens before obtaining the second set of one or more images, wherein the identifying the first number of the target cells is before obtaining the second set of one or more images but after obtaining the additional images.

14. The method of claim 13, wherein the first number of the target cells distributed in the space is identified from the first image and the additional images, and the space is defined by the first region and the first depth of field and the other regions and the first depth of field.

15. The method of claim 14, wherein the obtaining the second set of one or more images further includes obtaining one single image associated with the first subregion.

16. The method of claim 13, wherein the prompting the alert is before obtaining the second set of one or more images.

17. A non-transitory computer-readable medium to identify a suspicious cytology specimen having instructions stored thereon, which in response to execution by one or more processors, cause the one or more processors to perform operations according to the method of any one of claims 1 to 16.

18. A system to identify a suspicious cytology specimen, the system comprising: one or more processors; and a non-transitory computer-readable medium coupled to the one or more processors and having instructions stored thereon, which in response to execution by the one or more processors, cause the one or more processors to perform operations according to the method of any one of claims 1 to 16.

Description:
METHOD AND SYSTEM TO OBTAIN SUFFICIENT CYTOLOGY IMAGE IN CYTOPATHOLOGY

CROSS-REFERENCE TO RELATED APPLICATION

[0001] This application claims the benefit of U.S. Provisional Application No. 63/403,660, filed September 2, 2022, which is incorporated by reference in its entirety.

BACKGROUND OF THE INVENTION

Field of the Invention

[0002] Embodiments of the present invention generally relate to methods and systems to obtain cytology images in cytopathology.

Description of the Related Art

[0003] Unless otherwise indicated herein, the approaches described in this section are not prior art to the claims in this application and are not admitted to be prior art by inclusion in this section.

[0004] Cytopathology is a branch of pathology that studies and diagnoses diseases on a cellular level and generally involves obtaining cytology images on the cellular level. Obtaining cytology images of a cytology specimen includes an image digitalization of a glass slide on which the cytology specimen is distributed. The image digitalization may generally include scanning the glass slide for the cytology specimen to generate an image digital slide for the glass slide. The scanning may be performed by a whole-slide imaging scanner. The image digital slide can be viewed on a display (e.g., computer monitor) instead of a microscope.

[0005] Performing image digitalization of a glass slide efficiently and precisely can be challenging. One reason is that the cytology specimen on the glass slide may contain single cells and cell groups distributed in a three-dimensional space. The three-dimensional distributions of the cells and cell groups in the cytology specimen cause focusing difficulties. One conventional way to address such focusing difficulties is to obtain focused images of the entire glass slide, which can be very time-consuming. Another conventional way to address the focusing difficulties is to manually mark regions of the glass slide to be focused on and then obtain focused images of the marked regions. However, the manual approach may mark incorrect regions and as a result, the obtained focused images may not include many target cells in the cytology specimen.

BRIEF DESCRIPTION OF THE DRAWINGS

[0006] Fig. 1 is an example figure showing a cytology specimen distributed in a three-dimensional space of the cytology specimen on a glass slide;

Fig. 2 illustrates how an example multiple object lens module obtains images associated with a layer of a cytology specimen by a first object lens;

Fig. 3A is an example figure illustrating a part of a cytology specimen;

Fig. 3B illustrates an image associated with a part of a cytology specimen obtained by a first object lens;

Fig. 4A illustrates how an example multiple object lens module obtains images associated with a secondary subpart of a cytology specimen by a second object lens;

Fig. 4B illustrates an image associated with a secondary subpart obtained by a second object lens;

Fig. 4C illustrates how an example multiple object lens module obtains images associated with a plurality of secondary subparts of a cytology specimen by a second object lens;

Fig. 4D illustrates an image associated with a secondary subpart obtained by the second object lens;

Fig. 4E illustrates how an example multiple object lens module obtains images associated with a secondary subpart of a cytology specimen by a second object lens;

Fig. 4F illustrates an image associated with a secondary subpart obtained by the second object lens;

Fig. 5 illustrates an example system to obtain images associated with a cytology specimen;

Fig. 6 is a flow diagram illustrating an example process to obtain images associated with target cells distributed in a cytology specimen; and

Fig. 7 is a flow diagram illustrating an example process to obtain images associated with target cells distributed in a cytology specimen, all arranged in accordance with some embodiments of the present disclosure.

DETAILED DESCRIPTION

[0007] In the following detailed description, reference is made to the accompanying drawings, which form a part hereof. In the drawings, similar symbols typically identify similar components, unless context dictates otherwise. The illustrative embodiments described in the detailed description, drawings, and claims are not meant to be limiting. Other embodiments may be utilized, and other changes may be made, without departing from the spirit or scope of the subject matter presented here. It will be readily understood that the aspects of the present disclosure, as generally described herein, and illustrated in the Figures, can be arranged, substituted, combined, and designed in a wide variety of different configurations, all of which are explicitly contemplated herein. In the description, a cytology specimen is suspicious when target cells distributed in a space of the cytology specimen include cells at risk of a disease.

[0008] Fig. 1 is an example figure showing cytology specimen 110 distributed in a three-dimensional space of cytology specimen 110 on glass slide 120, arranged in accordance with some embodiments of the present disclosure. Cytology specimen 110 may include multiple cells and impurities. For example, cytology specimen 110 may include dust or mark 131, target cells 141 (e.g., the Epithelial cells at malignant risk), non-target cells 151, 153, 155, 157 and 159 (e.g., red blood cells, normal Epithelial cells and other cells). Dust or mark 131, target cells 141 and non-target cells 151, 153, 155, 157 and 159 are distributed in a three-dimensional space (i.e., at different depths within the volume of cytology specimen 110) on glass slide 120.

[0009] Fig. 2 illustrates how an example multiple object lens module 200 obtains images associated with a layer of cytology specimen 210 by first object lens 220, arranged in accordance with some embodiments of the present disclosure. In conjunction with Fig. 1, in Fig. 2, cytology specimen 210 may correspond to cytology specimen 110. In Fig. 2, multiple object lens module 200 includes first object lens 220 configured to obtain images associated with cytology specimen 210. More specifically, first object lens 220 is configured to obtain images associated with one or more layers of cytology specimen 210. In some embodiments, each layer may correspond to a depth of field associated with a field of view of first object lens 220.

[0010] In some embodiments, first object lens 220 has a first field of view, a first depth of field and a first magnification. The first field of view may correspond to any of region A, B, C, D, E, F, G, H, I and J of cytology specimen 210.

[0011] In some embodiments, in the first field of view, a distance between a sharp and focused object nearest to first object lens 220 and a sharp and focused object furthest to first object lens 220 is referred to as the first depth of field. Some example distances may be distance 1, distance 2 and distance 3 illustrated in Fig. 2. The first depth of field may also correspond to a layer of cytology specimen 210. Therefore, distance 1 may correspond to the first layer of cytology specimen 210; distance 2 may correspond to the second layer of cytology specimen 210; and distance 3 may correspond to the third layer of cytology specimen 210.

[0012] The first layer may be a layer furthest away from a glass slide (e.g., glass slide 120) and the third layer may be a layer adjacent to the glass slide (e.g., glass slide 120). Note that although Fig. 2 illustrates three layers, cytology specimen 210 may include more or fewer layers according to the first depth of field of first object lens 220. In these embodiments, a region and a distance may define a part of cytology specimen 210. For example, J3 corresponds to a front bottom right part of cytology specimen 210. Therefore, in one embodiment illustrated in Fig. 2, cytology specimen 210 may be defined by parts A1, A2, A3, B1, B2, B3, C1, C2, C3, D1, D2, D3, E1, E2, E3, F1, F2, F3, G1, G2, G3, H1, H2, H3, I1, I2, I3, J1, J2 and J3.
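For illustration only, the region/layer addressing described above can be sketched in Python as follows; the region letters and layer numbers follow Fig. 2, and the code is an editorial illustration rather than part of the disclosed system.

from itertools import product

REGIONS = list("ABCDEFGHIJ")  # regions A to J of cytology specimen 210 (Fig. 2)
LAYERS = [1, 2, 3]            # layers defined by the first depth of field

# Each part of cytology specimen 210 is addressed by a (region, layer) pair,
# e.g. "H2" is the part of region H lying in the second layer.
parts = [f"{region}{layer}" for region, layer in product(REGIONS, LAYERS)]

print(parts[:6])   # ['A1', 'A2', 'A3', 'B1', 'B2', 'B3']
print(len(parts))  # 30 parts for 10 regions and 3 layers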

[0013] In some embodiments, cytology specimen 210 may include, but is not limited to, dust 211, mark 212, first type of cells 213 (e.g., the Epithelial cell at malignant risk), second type of cells 214 (e.g., red blood cell), third type of cells 215 (e.g., normal Epithelial cell) and fourth type of cells 216 (e.g., other cells). Dust 211, first type of cells 213, second type of cells 214, third type of cells 215 and fourth type of cells 216 may be distributed in cytology specimen 210 based on their specific densities. Mark 212 may be manually marked by a user on a top surface of cytology specimen 210. Dust 211 and cells 213, 214, 215 and 216 have significant differences in their specific densities and may be distributed in different layers of cytology specimen 210. For example, dust 211 has a smaller specific density than cells 213, 214, 215 and 216 and may be distributed in the first layer of cytology specimen 210. Cells 213, 214, 215 and 216 have specific densities that are larger than that of dust 211 and may be distributed in the second layer of cytology specimen 210.

[0014] In some embodiments, cells 213, 214, 215 and 216 in cytology specimen 210 usually have sizes in a range of around 5 micrometres to around 20 micrometres. It should be noted that a selection of first object lens 220 may be based on the first depth of field of first object lens 220. The first depth of field of first object lens 220 may be much larger than the sizes of cells 213, 214, 215 and 216 in cytology specimen 210 so that the images of cells 213, 214, 215 and 216 are likely to be obtained in the same first depth of field (e.g., distance 1, distance 2 or distance 3).

[0015] In some embodiments, prior to obtaining images of an entire layer of cytology specimen 210 through first object lens 220, first object lens 220 is configured to obtain images of each layer (e.g., first layer, second layer and third layer) of several regions randomly selected from all regions A, B, C, D, E, F, G, H, I and J. For example, first object lens 220 may be configured to obtain images of parts H1, H2 and H3 in the randomly selected region H. Here, region H corresponds to the first field of view of first object lens 220, and distance 1, 2 or 3 corresponds to the first depth of field of first object lens 220. Several regions other than region H may also be randomly selected, for example, regions F, I and J. Accordingly, first object lens 220 may also obtain images of parts F1, F2, F3, I1, I2, I3, J1, J2 and J3 of cytology specimen 210.

[0016] In some embodiments, images of parts F1, F2, F3, H1, H2, H3, I1, I2, I3, J1, J2 and J3 are processed by a processor with an artificial intelligence engine to identify whether the images of parts F1, F2, F3, H1, H2, H3, I1, I2, I3, J1, J2 and J3 include images of cells 213, 214, 215 and 216. The artificial intelligence engine may include machine learning capabilities. The artificial intelligence engine may be trained based on sample images of known cells corresponding to cells 213, 214, 215 and 216 having various contrasts, lightness, shapes and other image characteristics. In some embodiments, for illustration purposes only, the artificial intelligence engine may identify images of cells 213, 214, 215 and 216 that are mostly included in images of parts F2, H2, I2 and J2, instead of being included in images of parts F1, F3, H1, H3, I1, I3, J1 and J3. Accordingly, the artificial intelligence engine may conclude that cells 213, 214, 215 and 216 are distributed in the second layer of cytology specimen 210. After reaching such a conclusion, first object lens 220 is configured to obtain images of the entire second layer of cytology specimen 210 (i.e., images of parts A2, B2, C2, D2, E2, F2, G2, H2, I2 and J2).
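For illustration only, the prescan logic of paragraphs [0015] and [0016] may be sketched as follows; capture() and detect_cells() are hypothetical placeholders standing in for first object lens 220 with camera 520 and for the artificial intelligence engine, and the dummy return values are arbitrary.

import random
from collections import Counter

REGIONS = list("ABCDEFGHIJ")
LAYERS = [1, 2, 3]

def capture(region, layer):
    """Placeholder for driving first object lens 220 and camera 520 to obtain
    an image of part <region><layer> (e.g. 'H2')."""
    return f"image_{region}{layer}"   # dummy image handle

def detect_cells(image):
    """Placeholder for the artificial intelligence engine; returns the number
    of cell images found in one part image."""
    return random.randint(0, 5)       # dummy count for illustration

def find_cell_layer(num_sample_regions=4):
    """Randomly sample a few regions (e.g. F, H, I and J), image every layer of
    each sampled region, and return the layer in which most cells are detected."""
    sampled = random.sample(REGIONS, num_sample_regions)
    cells_per_layer = Counter()
    for region in sampled:
        for layer in LAYERS:
            cells_per_layer[layer] += detect_cells(capture(region, layer))
    return cells_per_layer.most_common(1)[0][0]

def scan_layer(layer):
    """Obtain images of the chosen layer across all regions (e.g. A2 to J2)."""
    return {region: capture(region, layer) for region in REGIONS}

chosen_layer = find_cell_layer()      # e.g. the second layer
layer_images = scan_layer(chosen_layer)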

[0017] In conjunction with Fig. 2, Fig. 3A is an example figure illustrating a part 310 of cytology specimen 210, and Fig. 3B illustrates an image 330 associated with part 310 obtained by first object lens 220, arranged in accordance with some embodiments of the present disclosure. In some embodiments, part 310 may correspond to part H2 of cytology specimen 210. Cells 321, 322, 323, 324, 325 and 326 may be distributed in part 310. In some embodiments, cells 324 may be the Epithelial cells at malignant risk and cells 321, 322, 323, 325 and 326 may include normal Epithelial cells, red blood cells and other cells.

[0018] In some embodiments, image 330 may be a top view of part 310 because first object lens 220 is disposed at the top of part 310. Image 330 in Fig. 3B has an image region H’ corresponding to region H associated with part 310 in Fig. 3A. The correspondence between region H and image region H’ may include one or more factors associated with enlargement, shrinkage, rotation or twisting.

[0019] In some embodiments, the artificial intelligence engine set forth above is used to process image 330 based on contrasts, lightness, shapes and other image characteristics associated with cells 321 to 326 to identify images 331, 333 and 335 of image 330 which include images of cells 321 to 326. For example, images associated with cells 321 and 322 are identified to be included in image 331, images associated with cells 323, 324 and 326 are identified to be included in image 333, and images associated with cells 321, 322 and 325 are identified to be included in image 335. Each of images 331, 333 and 335 has a spatial relationship within image region H’ of image 330. Based on the spatial relationships between images 331, 333, 335 and image region H’, and the correspondence between image region H’ and region H, referring back to Fig. 3A, subregions 341, 343 and 345 in region H may be identified. In some embodiments, subregions associated with cells respectively distributed in any of parts A2, B2, C2, D2, E2, F2, G2, I2 and J2 may be identified in a similar manner.
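For illustration only, the mapping from image region H’ back to subregions of region H may be sketched as follows, assuming the correspondence reduces to a uniform scale and an offset (rotation and twisting terms are omitted); all numeric values and names are editorial assumptions.

from dataclasses import dataclass

@dataclass
class Box:
    x: float  # left edge
    y: float  # top edge
    w: float  # width
    h: float  # height

def image_box_to_subregion(box_px, px_per_um, region_origin_um):
    """Map a bounding box found in the image region (pixels) back to a physical
    subregion of the corresponding region on the slide (micrometres).  A fuller
    correspondence could also include rotation or distortion terms."""
    ox, oy = region_origin_um
    return Box(x=ox + box_px.x / px_per_um,
               y=oy + box_px.y / px_per_um,
               w=box_px.w / px_per_um,
               h=box_px.h / px_per_um)

# Illustrative values only: a detection corresponding to image 333 located at
# (1200, 800) px, with 0.5 px per micrometre and region H starting at
# (7000, 0) micrometres on the slide.
subregion_343 = image_box_to_subregion(Box(1200, 800, 400, 300),
                                       px_per_um=0.5,
                                       region_origin_um=(7000.0, 0.0))
print(subregion_343)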

[0020] In some embodiments, subparts 351, 353 and 355 of part 310 are defined by distance 2 and subregions 341, 343 and 345, respectively. In some embodiments, as set forth above, because cells 324 may be the Epithelial cells at malignant risk (i.e., target cells) and cells 321, 322, 323, 325 and 326 may include normal Epithelial cells, red blood cells and other cells (i.e., non-target cells), subparts associated with the target cells (i.e., subparts 351 and 353) may be examined in more detail than subparts associated with the non-target cells (i.e., subpart 355). Subparts 351 and 353 are further examined with a second object lens to obtain more detailed images associated with target cells 324. For example, the second object lens is configured to obtain more images associated with subparts 351 and 353 than subpart 355. For example, the second object lens may obtain images associated with subparts 351 and 353 but does not obtain images associated with subpart 355.

[0021] In some embodiments, as set forth above, subparts of parts A2, B2, C2, D2, E2, F2, G2, I2 and J2 may also be further examined with the second object lens to obtain more detailed images associated with cells 321 to 326.

[0022] Fig. 4A illustrates how an example multiple object lens module obtains images associated with a secondary subpart of a cytology specimen by a second object lens, arranged in accordance with some embodiments of the present disclosure. In Fig. 4A, second object lens 410 of the multiple object lens module is configured to obtain one or more images associated with subpart 420. In conjunction with Fig. 3A, subpart 420 may correspond to subpart 353 defined by distance 2 and subregion 343.

[0023] In some embodiments, second object lens 410 has a second field of view, a second depth of field and a second magnification. In some embodiments, in conjunction with Fig. 3A, the second field of view may correspond to subregion 343. In some other embodiments, in conjunction with Fig. 3A, the second field of view may correspond to only a part of subregion 343. For illustration only, Fig. 4A illustrates that the second field of view corresponds to subregion 343.

[0024] In some embodiments, the second magnification is higher than the first magnification. For illustration only, the first magnification and the second magnification may be, but are not limited to, 4X and 20X, respectively.

[0025] In some embodiments, the second depth of field is less than the first depth of field. For illustration only, the first depth of field and the second depth of field may be, but are not limited to, around 50 micrometres and around 1 micrometre, respectively. In some embodiments, in the second field of view, a distance between a sharp and focused object nearest to second object lens 410 and a sharp and focused object furthest to second object lens 410 is referred to as the second depth of field. Some example distances may be distance 21, distance 22, distance 23 or distance 24 within distance 2 illustrated in Fig. 4A. Distance 2 illustrated in Fig. 4A corresponds to the same distance 2 illustrated in Fig. 2 and Fig. 3A.

[0026] In some embodiments, the second depth of field may also correspond to a sublayer of a layer (e.g., the second layer defined by distance 2 illustrated in Fig. 2) of cytology specimen 210. Therefore, distance 21 may correspond to a first sublayer of the second layer of cytology specimen 210, distance 22 may correspond to a second sublayer of the second layer of cytology specimen 210, distance 23 may correspond to a third sublayer of the second layer of cytology specimen 210 and distance 24 may correspond to a fourth sublayer of the second layer of cytology specimen 210. Note that the number of sublayers (four) is only for illustration; the second layer may include more or fewer sublayers according to the second depth of field of second object lens 410.

[0027] In some embodiments, second object lens 410 is configured to obtain images associated with subpart 420. More specifically, second object lens 410 is configured to obtain images of secondary subparts 421, 422, 423 and/or 424 of subpart 420. In some embodiments, second object lens 410 is configured to focus on secondary subpart 421, subpart 422, subpart 423 and/or subpart 424 with the second depth of field.

[0028] In some embodiments, the focus may be based on a determination of an artificial intelligence engine. The artificial intelligence engine may include machine learning capabilities. For example, the artificial intelligence engine may be trained based on sample images of target cells (e.g., the Epithelial cells at malignant risk) having various contrasts, lightness, shapes and other image characteristics. Accordingly, the artificial intelligence engine may identify images of the target cells and determine to focus on these identified images. For example, among images of secondary subparts 421, 422, 423 and 424 obtained by second object lens 410, the artificial intelligence engine may identify that images of secondary subparts 422 and 423 include images of the target cells. Based on the determination made by the artificial intelligence engine, second object lens 410 is then driven to focus on secondary subparts 422 and 423 and save images of secondary subparts 422 and 423.
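For illustration only, this determination may be sketched as follows; target_cell_score() is a hypothetical placeholder for the trained artificial intelligence engine and the score threshold is arbitrary.

import random

def target_cell_score(image):
    """Placeholder for the trained artificial intelligence engine's confidence
    that an image of a secondary subpart contains target cells; a real engine
    would be trained on labelled cell images (contrast, lightness, shape, color)."""
    return random.random()   # dummy score for illustration

def select_subparts_to_focus(subpart_images, score_threshold=0.5):
    """Return the secondary subparts (e.g. 422 and 423 in Fig. 4A) whose images
    the engine flags as containing target cells; second object lens 410 is then
    driven to focus on these subparts and their images are saved."""
    return [name for name, image in subpart_images.items()
            if target_cell_score(image) >= score_threshold]

# Illustrative use with the four secondary subparts of subpart 420.
images = {421: "img_421", 422: "img_422", 423: "img_423", 424: "img_424"}
print(select_subparts_to_focus(images))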

[0029] In alternative embodiments, the focus may be based on image characteristics of target cells (e.g., the Epithelial cells at malignant risk). Such image characteristics may include, but are not limited to, color information associated with the target cells. For example, the nuclei of an Epithelial cell may be stained blue by hematoxylin while red blood cells remain red because red blood cells have no nuclei to be stained. Accordingly, focusing on blue parts, instead of red parts, of the image has a higher chance of obtaining images of the target cells.

[0030] In some embodiments, images of target cells may include a first specific range in the RGB (Red, Green, Blue) domain or in the HSV (Hue, Saturation, Value) domain. For example, the first specific range may include a range of R, G and B values, respectively. Alternatively, the first specific range may include a range of Hue value. On the other hand, images of non-target cells (e.g., normal Epithelial cells, red blood cells or other cells) may include a second specific range in the RGB domain or in the HSV domain, which is different from the first specific range. For example, the second specific range may include another range of R, G and B values. Alternatively, the second specific range may include another range of Hue value. For example, among images of secondary subparts 421, 422, 423 and 424 obtained by the second object lens 410, images of secondary subparts 422 and 423 may include the first specific range. Accordingly, the second object lens 410 is then configured to focus on secondary subparts 422 and 423 and save images of secondary subparts 422 and 423 and not to focus on secondary subparts 421 and 424.
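For illustration only, a color-information test in the HSV domain may be sketched as follows, assuming a bluish Hue window for hematoxylin-stained nuclei; the numeric ranges are editorial assumptions, since the description only states that target and non-target cells fall in different ranges.

import colorsys

# Assumed Hue window (degrees) for stained (bluish) nuclei and minimum fraction
# of such pixels required before focusing; both values are illustrative only.
TARGET_HUE_RANGE_DEG = (190.0, 260.0)
MIN_TARGET_FRACTION = 0.02

def pixel_is_target_colored(r, g, b):
    """True when one RGB pixel falls inside the assumed target-cell Hue range."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    hue_deg = h * 360.0
    return TARGET_HUE_RANGE_DEG[0] <= hue_deg <= TARGET_HUE_RANGE_DEG[1] and s > 0.2

def should_focus(pixels):
    """Decide whether to focus on a secondary subpart: focus when enough of its
    pixels look like stained (blue) nuclei rather than red blood cells."""
    hits = sum(pixel_is_target_colored(r, g, b) for (r, g, b) in pixels)
    return hits / max(len(pixels), 1) >= MIN_TARGET_FRACTION

# Illustrative three-pixel "image": one bluish pixel (stained nucleus) and two
# reddish pixels (red blood cells).
print(should_focus([(60, 70, 180), (200, 40, 40), (190, 60, 60)]))  # True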

[0031] Fig. 4B illustrates an image 430 associated with secondary subpart 422 obtained by second object lens 410, arranged in accordance with some embodiments of the present disclosure. Image 430 includes images of target cell 441 and non-target cells 443 and 445. The image of target cell 441 includes the first specific range in the RGB domain or in the HSV domain and the images of non-target cells 443 and 445 include the second specific range in the RGB domain or in the HSV domain. The first specific range in the RGB domain or in the HSV domain associated with the image of target cell 441 causes second object lens 410 to focus on secondary subpart 422.

[0032] In some embodiments, the artificial intelligence engine is configured to determine whether a first number of target cells in image 430 is greater than a threshold number of target cells. In some embodiments, in response to determining that the first number is greater than the threshold, in conjunction with Fig. 2, the artificial intelligence engine is configured to prompt an alert to indicate that cytology specimen 210 is suspicious. For example, the alert may alert a physician that a patient associated with cytology specimen 210 may have a disease.

[0033] In some embodiments, in response to determining that the first number is not greater than the threshold, the second object lens 410 is further configured to focus on another secondary subpart (e.g., secondary subpart 423) adjacent to secondary subpart 422 and obtain an image associated with secondary subpart 423.

[0034] Fig. 4C illustrates how an example multiple object lens module obtains images associated with a plurality of secondary subparts (e.g., secondary subparts 422 and 423) of a cytology specimen by a second object lens and Fig. 4D illustrates an image 450 associated with secondary subpart 423 obtained by second object lens 410, arranged in accordance with some embodiments of the present disclosure. Image 450 includes an image of target cell 451. The image of target cell 451 may include the first specific range in the RGB domain or in the HSV domain.

[0035] In some embodiments, the artificial intelligence engine is configured to determine whether the first number of target cells, counted in images 430 and 450, is greater than the threshold number of target cells. In some embodiments, in response to determining that the first number is greater than the threshold, in conjunction with Fig. 2, the artificial intelligence engine is configured to prompt an alert to indicate that cytology specimen 210 is suspicious. For example, the alert may alert a physician that a patient associated with cytology specimen 210 may have a disease.

[0036] In some embodiments, in conjunction with Fig. 3A, in response to determining that the first number of target cells in images 430 and 450 is not greater than the threshold, the second object lens 410 is further configured to obtain one or more images associated with another subpart having the target cell (e.g., subpart 351 defined by distance 2 and subregion 341). This other subpart is different from subpart 420.

[0037] Fig. 4E illustrates how an example multiple object lens module obtains images associated with a secondary subpart (e.g., secondary subpart 462) of a cytology specimen by a second object lens, arranged in accordance with some embodiments of the present disclosure. In Fig. 4E, second object lens 410 is configured to obtain one or more images associated with subpart 460. In conjunction with Fig. 3A, subpart 460 may correspond to subpart 351 defined by distance 2 and subregion 341. Fig. 4F illustrates an image 470 associated with secondary subpart 462 obtained by second object lens 410, arranged in accordance with some embodiments of the present disclosure. Image 470 includes an image of target cell 471. The image of target cell 471 may include the first specific range in the RGB domain or in the HSV domain.

[0038] In some embodiments, in conjunction with Fig. 4B and Fig. 4D, the artificial intelligence engine is configured to determine whether the first number of target cells, counted in images 430, 450 and 470, is greater than the threshold number of target cells. In some embodiments, in response to determining that the first number is greater than the threshold, in conjunction with Fig. 2, the artificial intelligence engine is configured to prompt an alert to indicate that cytology specimen 210 is suspicious. For example, the alert may alert a physician that a patient associated with cytology specimen 210 may have a disease.

[0039] In some embodiments, in conjunction with Fig. 4B and Fig. 4D, in response to determining that the first number of target cells in images 430, 450 and 470 is not greater than the threshold, the second object lens 410 is further configured to obtain one or more images associated with another subpart having the target cell. This other subpart is different from subparts 420 and 460.

[0040] The process above may be repeated until the first number of target cells is greater than the threshold or, if the first number of target cells remains not greater than the threshold, until second object lens 410 has obtained images from all subparts of the cytology specimen that include the target cells.
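For illustration only, the repeat-until logic of paragraphs [0032] to [0040] may be sketched as follows; the capture and counting callables are placeholders for second object lens 410 and the artificial intelligence engine.

def scan_until_suspicious(subparts_with_targets, capture_images, count_target_cells, threshold):
    """Accumulate target-cell counts over subparts imaged by the second object
    lens (e.g. subpart 420, then subpart 460, ...) and stop as soon as the
    running total exceeds the threshold; otherwise continue until every subpart
    that includes the target cells has been imaged."""
    total = 0
    for subpart in subparts_with_targets:
        for image in capture_images(subpart):    # second-lens images of the subpart
            total += count_target_cells(image)
        if total > threshold:
            return True                          # prompt the alert: specimen is suspicious
    return False                                 # all candidate subparts imaged; not suspicious

# Illustrative run: per-image counts stand in for the images themselves.
dummy_images = {420: [3, 2], 460: [4]}
print(scan_until_suspicious([420, 460],
                            capture_images=lambda s: dummy_images[s],
                            count_target_cells=lambda c: c,
                            threshold=5))        # True: 3 + 2 = 5, then 5 + 4 = 9 > 5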

[0041] Fig. 5 illustrates an example system 500 to obtain images associated with a cytology specimen, arranged in accordance with some embodiments of the present disclosure. System 500 includes, but is not limited to, computing device 510, camera 520, multiple object lens module 530, stage 550 and light source 560. In some embodiments, stage 550 is configured to carry and move a glass slide. Cytology specimen 540 is distributed on the glass slide.

[0042] In some embodiments, computing device 510 includes a processor, a memory subsystem, and a communication subsystem. Artificial intelligence engines discussed above may be implemented as a set of executable instructions stored in the memory subsystem to be executed by the processor.

[0043] In some embodiments, computing device 510 is configured to generate control signals and transmit the control signals to camera 520, multiple object lens module 530, stage 550 and light source 560 via the communication subsystem. For example, computing device 510 is configured to control light source 560 to generate lights with a specific range of wavelengths associated with image characteristics of target cells distributed in cytology specimen 540. For example, the specific range of wavelengths may correspond to the color information associated with the target cells as set forth above. An example specific range of wavelengths may be about 530 nm to about 630 nm. Alternatively, another example specific range of wavelengths may be about 450 nm to about 560 nm, and preferably about 450 nm to about 530 nm.
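For illustration only, the wavelength bands mentioned above can be represented as a simple lookup used when building a control signal for light source 560; the payload format and function name are hypothetical.

# Wavelength bands taken from the description (nanometres).
WAVELENGTH_BANDS_NM = {
    "band_1": (530, 630),
    "band_2": (450, 560),
    "band_3": (450, 530),   # described as preferable within the 450-560 nm band
}

def light_source_command(band):
    """Build a hypothetical control-signal payload for light source 560."""
    low, high = WAVELENGTH_BANDS_NM[band]
    return {"device": "light_source_560", "min_nm": low, "max_nm": high}

print(light_source_command("band_3"))  # {'device': 'light_source_560', 'min_nm': 450, 'max_nm': 530}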

[0044] In some embodiments, computing device 510 is configured to control a movement of stage 550. When stage 550 carries the glass slide and cytology specimen 540, the movement of stage 550 may align a region (e.g., region A, B, C, D, E, F, G, H, I or J illustrated in Fig. 2 or subregion 341, 343 or 345 in Fig. 3A) of cytology specimen 540 and a field of view of object lens 533 or 535 with a light path of the light generated by light source 560 and illustrated in Fig. 5. Therefore, the light generated by light source 560 may pass through the region and object lens 533 or 535 and reach camera 520.

[0045] In some embodiments, computing device 510 is configured to control lens switching module 531 of multiple object lens module 530 to switch between different object lenses (e.g., object lenses 533 and 535). In some other embodiments, computing device 510 is configured to control camera 520, multiple object lens module 530 and/or stage 550 to move so that object lens 533 or object lens 535 focuses on parts, subparts and/or secondary subparts of cytology specimen 540. In conjunction with Fig. 2, object lens 533 may correspond to first object lens 220. In conjunction with Fig. 4A, object lens 535 may correspond to second object lens 410.

[0046] In some embodiments, computing device 510 is configured to control camera 520 to obtain an image of a focused part of cytology specimen 540. Computing device 510 is also configured to control camera 520 to send the obtained image to computing device 510 for further processing. Such processing includes, but is not limited to, identifying images associated with target cells or non-target cells in cytology specimen 540, and comparing one or more images obtained by second object lens 535 to an image obtained by first object lens 533 to determine whether a ratio of a number of target cells in the images obtained by second object lens 535 to a number of target cells in the image obtained by first object lens 533 is in a predetermined range.
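For illustration only, the ratio comparison mentioned above might look like the following; the range endpoints are hypothetical, since the description does not give the predetermined range.

def ratio_in_range(count_second_lens, count_first_lens, low, high):
    """Check whether the ratio of target cells counted in the second-lens images
    to target cells counted in the first-lens image falls within a predetermined range."""
    if count_first_lens == 0:
        return False
    ratio = count_second_lens / count_first_lens
    return low <= ratio <= high

# Illustrative values only.
print(ratio_in_range(count_second_lens=18, count_first_lens=25, low=0.6, high=1.0))  # True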

[0047] Fig. 6 is a flow diagram illustrating an example process 600 to obtain images associated with target cells distributed in a cytology specimen, arranged in accordance with some embodiments of the present disclosure. Process 600 may include one or more operations, functions, or actions as illustrated by blocks 610, 620, 630, 640, 650 and/or 660 which may be performed by hardware, software and/or firmware. The various blocks are not intended to be limiting to the described embodiments. The outlined steps and operations are only provided as examples, and some of the steps and operations may be optional, combined into fewer steps and operations, or expanded into additional steps and operations without detracting from the essence of the disclosed embodiments. Although the blocks are illustrated in a sequential order, these blocks may also be performed in parallel, and/or in a different order than those described herein. In some embodiments, process 600 may be applied to various scanning approaches to scan a cytology specimen with corresponding different whole-slide imaging scanners. Such scanning approaches may include, but are not limited to, area scanning or line scanning approaches.

[0048] Process 600 may begin at block 610, “obtain first image associated with first region of cytology specimen.” In some embodiments, in conjunction with Fig. 2 and Fig. 5, processor 510 is configured to control a movement of stage 550 so that a first region (e.g., region H in Fig. 2) of cytology specimen 210 is aligned with a first field of view of first object lens 220/533. Processor 510 is configured to control light source 560 to generate lights with a specific range of wavelengths corresponding to the color information associated with target cells in cytology specimen 210 as set forth above. An example specific range of wavelengths may be about 530 nm to about 630 nm. Alternatively, another example specific range of wavelengths may be about 450 nm to about 560 nm, and preferably about 450 nm to about 530 nm. The lights may pass through the first region and first object lens 220/533. Processor 510 is configured to control camera 520 to obtain images associated with the first region through first object lens 220/533.

[0049] In some embodiments, first object lens 220/533 has a first depth of field in the first field of view. The first depth of field may be less than a thickness of cytology specimen 210. Therefore, to obtain sharp and focused images associated with the entire first region, camera 520 is configured to sequentially obtain images of different layers in the first region of cytology specimen 210 according to the first depth of field. For example, in conjunction with Fig. 2, camera 520 is configured to sequentially obtain images of parts H1, H2 and H3. In some embodiments, processor 510 is configured to identify target cells from images of parts H1, H2 and H3 based on contrasts, lightness, shapes and other image characteristics associated with the target cells in cytology specimen 210.

[0050] Block 610 may be repeated for different regions (e.g., regions F, I and J in Fig. 2) of the cytology specimen. For example, processor 510 is configured to additionally obtain images of parts F1, F2, F3, I1, I2, I3, J1, J2 and J3 of cytology specimen 210 and identify target cells from images of F1, F2, F3, I1, I2, I3, J1, J2 and J3 based on contrasts, lightness, shapes and other image characteristics associated with the target cells in cytology specimen 210.

[0051] In some embodiments, in response to the target cells being identified as mostly dispersed in a specific layer (e.g., the second layer) in the regions (e.g., regions F, H, I and J) of cytology specimen 210, processor 510 is configured to obtain a first image (e.g., the image of part H2) associated with the respective region (e.g., region H) of the cytology specimen. Eventually, for example, processor 510 is configured to obtain images of A2, B2, C2, D2, E2, F2, G2, H2, I2 and J2 associated with regions A, B, C, D, E, F, G, H, I and J, respectively.

[0052] Block 610 may be followed by block 620, “obtain second set of images associated with first subregion of cytology specimen.” In some embodiments, processor 510 is configured to sequentially identify images associated with the target cells from images of A2, B2, C2, D2, E2, F2, G2, H2, I2 or J2. For example, in conjunction with Fig. 3A and Fig. 3B, processor 510 is configured to identify image 333 which is associated with the target cells. As previously discussed, based on the spatial relationship between image 333 and image region H’, and a correspondence between image region H’ and region H, subregion 343 in region H may be identified. Similarly, processor 510 may also be configured to identify image 331 which is associated with the target cells and identify subregion 341 in region H.

[0053] In some embodiments, in conjunction with Fig. 3A, as previously discussed, a second object lens is configured to obtain more detailed images of target cells 324 in first subregion 343. In some embodiments, in conjunction with Fig. 4A and Fig. 5, processor 510 is configured to control a movement of stage 550 so that a first subregion (e.g., subregion 343 in Fig. 4A) of the cytology specimen is aligned with a second field of view of second object lens 410/535. Processor 510 is configured to control light source 560 to generate lights with a specific range of wavelengths corresponding to the color information associated with target cells in the cytology specimen as set forth above. An example specific range of wavelengths may be about 530 nm to about 630 nm. Alternatively, another example specific range of wavelengths may be about 450 nm to about 560 nm, and preferably about 450 nm to about 530 nm. The lights may pass through the first subregion and second object lens 410/535. Processor 510 is configured to control camera 520 to obtain a second set of images associated with the first subregion through second object lens 410/535. The second set of images may include one or more images of secondary subparts 421, 422, 423 and 424 of subpart 420.

[0054] In some embodiments, in conjunction with Fig. 4A, 4B, 4C, 4D and Fig. 5, processor 510 is configured to control second object lens 410/535 to focus and obtain the second set of images of secondary subparts 422 and 423 based on image characteristics of target cells, for example, color information associated with the target cells. For example, the second set of images (e.g., images 430 and 450) includes the color information associated with the target cells but images associated with first subregion 343 other than the second set of images (e.g., images of secondary subparts 421 and 424) do not include the color information associated with the target cells.

[0055] Block 620 may be followed by block 630, “identify first number of target cells.” In some embodiments, a first number of target cells in images associated with the first subregion 343 is identified. More specifically, the number of target cells in images 430 and 450 is identified in block 630 as the first number of target cells.

[0056] Block 630 may be followed by block 640, “determine first number greater than threshold.” In response to the first number of target cells identified in block 630 being determined to be greater than a threshold number of target cells, block 640 is followed by block 650, “prompt alert.” In some embodiments, in conjunction with Fig. 2, in block 650, an alert is prompted to indicate that cytology specimen 210 is suspicious. Alternatively, in response to the first number of target cells identified in block 630 being determined to be not greater than the threshold number of target cells, block 640 is followed by block 660, “obtain third set of images associated with second subregion of cytology specimen.”

[0057] In block 660, in some embodiments, in conjunction with Fig. 3A, 4E and 4F, as previously discussed, the second object lens is configured to obtain more detailed images of target cells 324 in second subregion 341. In some embodiments, in conjunction with Fig. 4A and Fig. 5, processor 510 is configured to control a movement of stage 550 so that a second subregion (e.g., subregion 341 in Fig. 4E) of the cytology specimen is aligned with a second field of view of second object lens 410/535. Processor 510 is configured to control light source 560 to generate lights with a specific range of wavelengths corresponding to the color information associated with target cells in the cytology specimen as set forth above. An example specific range of wavelengths may be about 530 nm to about 630 nm. Alternatively, another example specific range of wavelengths may be about 450 nm to about 560 nm, and preferably about 450 nm to about 530 nm. The lights may pass through the second subregion and second object lens 410/535. Processor 510 is configured to control second object lens 410/535 to focus and obtain a third set of images of secondary subpart 462 based on image characteristics of target cells, for example, color information associated with the target cells. In addition, in some embodiments, a second number of target cells in images associated with the second subregion 341 is identified. More specifically, the number of target cells in image 470 is identified in block 660 as the second number of target cells.

[0058] In some embodiments, processor 510 is configured to determine whether a sum of the first number of target cells associated with the first subregion 343 and the second number of target cells associated with the second subregion 341 is greater than the threshold number of target cells specified in block 640. In response to the sum being greater than the threshold number, in conjunction with Fig. 2, processor 510 is configured to prompt an alert to indicate that cytology specimen 210 is suspicious.

[0059] In some embodiments, in response to the sum not being greater than the threshold number, processor 510 is configured to control second object lens 410/535 to focus and obtain images associated with another subregion based on image characteristics of target cells, for example, color information associated with the target cells. In conjunction with Fig. 3A, the another subregion may be a subregion of the first region H. Alternatively, in conjunction with Fig. 2, the another subregion may be a subregion of another region (e.g., region A, B, C, D, E, F, G, I or J). The sum of the target cells obtained by the second object lens from various subregions is continuously compared to the threshold number of the target cells specified in block 640. Once the sum is greater than the threshold number, in conjunction with Fig. 5, processor 510 is configured to prompt the alert and process 600 ends. Otherwise, in conjunction with Fig. 5, process 600 will repeat until processor 510 controls second object lens 535 to obtain images of all subregions on the cytology specimen associated with the target cells.
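For illustration only, blocks 610 to 660 of process 600 can be tied together in the following sketch; every callable is a placeholder for the components described above (the first and second object lenses, camera 520 and the artificial intelligence engine), not an API of the disclosed system.

def process_600(regions, threshold, capture_first_lens, find_subregions,
                capture_second_lens, count_target_cells):
    """Image each region with the first object lens (block 610), find subregions
    that appear to contain target cells, image those subregions with the second
    object lens (blocks 620 and 660) and accumulate the target-cell count
    (block 630) until it exceeds the threshold (blocks 640 and 650) or the
    candidate subregions are exhausted."""
    total = 0
    for region in regions:
        first_image = capture_first_lens(region)
        for subregion in find_subregions(first_image):   # e.g. subregions 343 and 341
            for image in capture_second_lens(subregion):
                total += count_target_cells(image)
            if total > threshold:
                print("ALERT: cytology specimen is suspicious")
                return True
    return False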

[0060] Fig. 7 is a flow diagram illustrating an example process 700 to obtain images associated with target cells distributed in a cytology specimen, arranged in accordance with some embodiments of the present disclosure. Process 700 may include one or more operations, functions, or actions as illustrated by blocks 710, 720, 730 and/or 740 which may be performed by hardware, software and/or firmware. The various blocks are not intended to be limiting to the described embodiments. The outlined steps and operations are only provided as examples, and some of the steps and operations may be optional, combined into fewer steps and operations, or expanded into additional steps and operations without detracting from the essence of the disclosed embodiments. Although the blocks are illustrated in a sequential order, these blocks may also be performed in parallel, and/or in a different order than those described herein. In some embodiments, process 700 may be applied to various scanning approaches to scan a cytology specimen with corresponding different whole-slide imaging scanners. Such scanning approaches may include, but are not limited to, area scanning or line scanning approaches.

[0061] Process 700 may begin at block 710, “obtain one or more first images associated with first region of cytology specimen.” In some embodiments, in conjunction with Fig. 2 and Fig. 5, processor 510 is configured to control a movement of stage 550 so that a first region (e.g., region H in Fig. 2) of cytology specimen 210 is aligned with a first field of view of first object lens 220/533. Processor 510 is configured to control light source 560 to generate lights with a specific range of wavelengths corresponding to the color information associated with target cells in cytology specimen 210 as set forth above. An example specific range of wavelengths may be about 530 nm to about 630 nm. Alternatively, another example specific range of wavelengths may be about 450 nm to about 560 nm, and preferably about 450 nm to about 530 nm. The lights may pass through the first region and first object lens 220/533. Processor 510 is configured to control camera 520 to obtain images associated with the first region through first object lens 220/533.

[0062] In some embodiments, first object lens 220/533 has a first depth of field in the first field of view. The first depth of field may be less than a thickness of cytology specimen 210. Therefore, to obtain sharp and focused images associated with the entire first region, camera 520 is configured to sequentially obtain images of different layers in the first region of cytology specimen 210 according to the first depth of field. For example, in conjunction with Fig. 2, camera 520 is configured to sequentially obtain images of parts H1, H2 and H3. In some embodiments, processor 510 is configured to identify target cells from images of parts H1, H2 and H3 based on contrasts, lightness, shapes and other image characteristics associated with the target cells in cytology specimen 210.

[0063] Block 710 may be followed by block 720, “obtain additional images associated with other regions of cytology specimen.” In some embodiments, in block 720, operations performed in block 710 may be repeated for different regions (e.g., regions F, I and J in Fig. 2) of the cytology specimen. For example, processor 510 is configured to additionally obtain images of parts F1, F2, F3, I1, I2, I3, J1, J2 and J3 of cytology specimen 210 and identify target cells from images of F1, F2, F3, I1, I2, I3, J1, J2 and J3 based on contrasts, lightness, shapes and other image characteristics associated with the target cells in cytology specimen 210.

[0064] Block 720 may be followed by block 730, “identify first number of target cells and first number greater than threshold.” In some embodiments, a first number of target cells is identified from the first images obtained in block 710 and the additional images obtained in block 720. In some embodiments, the first number of target cells may be greater than a threshold number of target cells. Assuming the target cells are the Epithelial cells at malignant risk and the threshold number is a number that physicians use to diagnose a cancer, process 700 can now conclude that a patient providing the cytology specimen may be a cancer patient. In conjunction with Fig. 5, it is important to note that this conclusion does not need to use second object lens 535. In block 730, in conjunction with Fig. 2 and Fig. 5, processor 510 is configured to prompt an alert to indicate that cytology specimen 210 is suspicious.
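For illustration only, the decision made in blocks 710 to 730 may be sketched as follows; the counting callable is a placeholder for the artificial intelligence engine and the numbers are arbitrary.

def process_700_decision(first_lens_images, count_target_cells, threshold):
    """Count target cells across the first-lens images of the sampled regions
    (blocks 710 to 730); if the count already exceeds the threshold, flag the
    specimen as suspicious without using the second object lens."""
    first_number = sum(count_target_cells(image) for image in first_lens_images)
    if first_number > threshold:
        print("ALERT: cytology specimen is suspicious")
        return True
    return False

# Illustrative run: per-image counts stand in for the images themselves.
print(process_700_decision([4, 7, 3], count_target_cells=lambda c: c, threshold=10))  # True: 14 > 10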

[0065] Block 730 may be followed by block 740, “obtain second set of images associated with subregions of cytology specimen.” Given that the patient is very likely to be a cancer patient because the first number of target cells identified in block 730 is greater than the threshold number, in conjunction with Fig. 5, images obtained by second object lens 535 may be used to understand sizes and/or morphologies of some of the most significant target cells in the cytology specimen. Therefore, in some embodiments, in conjunction with Fig. 5, processor 510 is configured to control a movement of stage 550 so that second object lens 535 obtains a single image from each subregion of the cytology specimen to shorten the scanning time of the cytology specimen and obtain images of most significant target cells on the cytology specimen.

[0066] The above examples can be implemented by hardware (including hardware logic circuitry), software or firmware or a combination thereof. The above examples may be implemented by any suitable computing device, computer system, wearables, etc. The computing device may include processor(s), memory unit(s) and physical NIC(s) that may communicate with each other via a communication bus, etc. The computing device may include a non-transitory computer-readable medium having stored thereon instructions or program code that, in response to execution by the processor, cause the processor to perform processes described herein with reference to Figs. 6 and 7. The computing device may communicate with a wearable and/or one or more sensors.

[0067] The techniques introduced above can be implemented in special-purpose hardwired circuitry, in software and/or firmware in conjunction with programmable circuitry, or in a combination thereof. Special-purpose hardwired circuitry may be in the form of, for example, one or more application-specific integrated circuits (ASICs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), and others. The term ‘processor’ is to be interpreted broadly to include a processing unit, ASIC, logic unit, or programmable gate array etc.

[0068] Some aspects of the embodiments disclosed herein, in whole or in part, can be equivalently implemented in integrated circuits, as one or more computer programs running on one or more computers (e.g., as one or more programs running on one or more computing systems), as one or more programs running on one or more processors (e.g., as one or more programs running on one or more microprocessors), as firmware, or as virtually any combination thereof; designing the circuitry and/or writing the code for the software and/or firmware are possible in light of this disclosure.

[0069] Software and/or other instructions to implement the techniques introduced here may be stored on a non-transitory computer-readable storage medium and may be executed by one or more general-purpose or special-purpose programmable microprocessors. A “computer-readable storage medium”, as the term is used herein, includes any mechanism that provides (i.e., stores and/or transmits) information in a form accessible by a machine (e.g., a computer, network device, personal digital assistant (PDA), mobile device, manufacturing tool, any device with a set of one or more processors, etc.). A computer-readable storage medium may include recordable/non-recordable media (e.g., read-only memory (ROM), random access memory (RAM), magnetic disk or optical storage media, flash memory devices, etc.).

[0070] From the foregoing, it will be appreciated that various embodiments of the present disclosure have been described herein for purposes of illustration, and that various modifications may be made without departing from the scope and spirit of the present disclosure. Accordingly, the various embodiments disclosed herein are not intended to be limiting.