
Title:
SYSTEMS, DEVICES, SUBSYSTEMS AND METHODS FOR ORAL CAVITY INSPECTION
Document Type and Number:
WIPO Patent Application WO/2021/149050
Kind Code:
A1
Abstract:
Systems, subsystems, devices and methods for oral cavity inspection may be based on acquisition of images of different oral cavity areas of a subject, and may comprise using at least one image acquisition device; determining a relative field of view (FOV) of the at least one image acquisition device, for each of the acquired images; and construction of display data, based on the acquired images, for visually representing one or more oral cavity areas of the respective subject. The constructed display data of the respective subject may be used for remote oral cavity inspection of the respective subject.

Inventors:
HAJ YAHYA BAHA AL DIN (IL)
SHVALB NIR (IL)
HAMZANI PINHAS YAFIT (IL)
MEDINA ODED (IL)
HACOHEN SHLOMO (IL)
Application Number:
PCT/IL2021/050061
Publication Date:
July 29, 2021
Filing Date:
January 20, 2021
Assignee:
ARIEL SCIENTIFIC INNOVATIONS LTD. (IL)
MOR RESEARCH APPLICATIONS LTD. (IL)
International Classes:
G06T7/70; A61B1/247; A61C1/08; A61C9/00
Domestic Patent References:
WO2019102480A1 (2019-05-31)
Foreign References:
US9939714B1 (2018-04-10)
US20060154198A1 (2006-07-13)
Attorney, Agent or Firm:
KESTEN, Dov et al. (IL)
Claims:
CLAIMS

What is claimed is:

1. A system for oral cavity inspection, comprising: at least one image acquisition device, configured to acquire images of different oral cavity areas of a subject; a positioning subsystem, configured to determine a relative field of view (FOV) of the at least one image acquisition device, for each acquired image; a construction subsystem configured to generate, based on the acquired images of the respective subject, display data for visually representing one or more oral cavity areas of the respective subject, wherein the construction of the display data is carried out at least by organizing at least some of the acquired images, and/or portions thereof, according to their relative FOVs; and a remote oral inspection subsystem configured to controllably display the constructed display data, wherein the remote oral inspection subsystem is remotely located from the at least one image acquisition device, for remote medical inspection of the oral cavity areas of the respective subject.

2. The system for oral cavity inspection of claim 1, wherein the organizing of at least some of the acquired images comprises stitching at least some of the acquired images and/or portions thereof to one another according to their respective FOVs to generate a panoramic oral cavity image.

3. The system for oral cavity inspection of any one of claims 1 to 2, wherein the construction subsystem is further configured to: normalize the acquired images to one another, in order to form a coherent geometric and/or coloration scaling of the acquired images, outputting corresponding normalized images; and organize the normalized images, according to their respective FOVs, to construct the display data.

4. The system for oral cavity inspection of any one of claims 1 to 3, wherein the display data is descriptive of information about: a panoramic view of the one or more oral cavity areas; a pictures atlas; and/or a three-dimensional (3D) model of the one or more oral cavity areas.

5. The system for oral cavity inspection of any one of claims 1 to 4, further comprising at least one light source, for illuminating the oral cavity of the respective subject while acquiring images by the image acquisition device, and wherein the at least one light source comprises one or more of: a light emitting diode (LED); a flashlight; a white light lamp; an ultraviolet (UV) light source; an array of LED light sources, each LED light source emitting light of a different wavelength band.

6. The system for oral cavity inspection of any one of claims 1 to 5, wherein at least the positioning subsystem is operable via a designated application operable via a mobile communication device, the designated application being configured to receive and transmit one or more of: the display data, the acquired images, and/or the relative FOV of each of the acquired images, to the remote oral inspection subsystem.

7. The system for oral cavity inspection of any one of claims 1 to 6, wherein the at least one image acquisition device and the positioning subsystem are comprised in a mobile communication device, wherein the positioning subsystem is comprised in a designated application operable via the mobile communication device, and wherein the positioning subsystem determines the relative FOV of each acquired image, based on sensor data originating from one or more movement and/or orientation sensors of the mobile communication device.

8. The system for oral cavity inspection of claim 7, further comprising at least one connector, configured to removably and rigidly connect the at least one image acquisition device to the mobile device, for using one or more of the mobile device's sensors and/or image processing programs configured for determining orientation of the mobile device, wherein the relative FOV of each acquired image is determined based on the orientation of the mobile device to which the at least one image acquisition device connects via the at least one connector.

9. The system for oral cavity inspection of claim 8, wherein the remote oral inspection subsystem is operable via a computerized data storage, processing, communication, input and display machine.

10. The system for oral cavity inspection of claim 9, wherein the remote oral inspection subsystem further comprises an analysis engine, configured to analyze the acquired images and/or display data associated therewith, for determining a medical condition of the subject and, optionally, to output an analysis result, wherein the analysis engine comprises one or more analysis programs, configured to detect one or more medical oral cavity abnormalities, based on image analysis of the display data and/or of the acquired images, and wherein the image analysis-based detection of the one or more medical abnormalities is carried out by detecting oral cavity areas showing one or more of: topographic abnormality; coloring abnormality; morphological abnormality; vasculature abnormality; and/or changes over time in any one or more of the above oral cavity abnormalities.

11. The system for oral cavity inspection of claim 10, wherein the detectable medical abnormalities are associated with one or more of the following medical oral conditions: oral cavity wound and/or lesion; oral cavity ulcer; oral cavity inflammation; dental condition; oral cavity benign, malignant and/or premalignant lesion and/or tissue.

12. The system for oral cavity inspection of any one of claims 10 to 11, wherein the analysis engine is further configured to determine physical characteristics of medical abnormalities, and to output inspection data indicative of detected abnormalities and their physical characteristics.

13. The system for oral cavity inspection of claim 12, wherein the physical characteristics, detectable by the analysis engine and indicatable over the display of the display data, comprise, for each or some of the detected abnormalities, one or more of: peripheral boundaries of the identified abnormality; topography of the detected abnormality; morphology of the abnormality; location indication of the abnormality in the oral cavity; coloration of the abnormality; abnormality specification.

14. The system for oral cavity inspection of any one of claims 1 to 13, further comprising one or more positioning sensors, each being configured to detect one or more positioning parameter values indicative of the respective positioning of the at least one image acquisition device, outputting FOV data indicative of the positioning parameter values, wherein the positioning subsystem is configured to determine the relative FOV of each acquired image, based on the outputted FOV data.

15. The system for oral cavity inspection of any one of claims 1 to 14, further comprising an orthodontic fixture device, configured to fixate the subject's mandible and/or jaw at a fixed and stable posture, while acquiring images of the respective subject's oral cavity, and wherein the orthodontic fixture device comprises one or more physical markers for FOV determination.

16. The system for oral cavity inspection of any one of claims 1 to 15, wherein the remote oral inspection subsystem further comprises a user interface (UI), configured for user input-based control over the display of the display data.

17. The system for oral cavity inspection of any one of claims 1 to 16 further comprising a digital image acquisition guide (DIAG), configured for auditory and/or visual display of guidance instructions for guiding the subject or a caretaker thereof as to how to position the at least one image acquisition device, when acquiring the images of the oral cavity of the respective subject.

18. A system for oral cavity inspection, comprising: a data acquisition subsystem (DAS) comprising: at least one image acquisition device, configured to acquire images of different oral cavity areas of a subject and output image data representative thereof; a DAS positioning subsystem, configured to determine a relative field of view (FOV) of the at least one image acquisition device, for each acquired image in the image data; and a DAS communication subsystem configured to transmit the image data and the FOV associated with each acquired image thereof; and a remote oral inspection subsystem (ROIS) comprising: a ROIS communication subsystem, configured to receive image data and FOVs associated therewith from multiple subjects, from mobile communication devices of subjects; a ROIS construction subsystem, configured to construct display data, for each subject, based on the image data of the respective subject, the display data visually representing one or more oral cavity areas of the respective subject, wherein the construction of the display data is carried out at least by organizing at least some of the acquired images, from the image data, and/or portions thereof, according to their respective FOVs; a ROIS storage unit configured to retrievably store therein received image data, associated FOVs and/or display data for each respective subject; and a ROIS display subsystem, configured to controllably display the constructed display data or part thereof, wherein the ROIS is remotely located from the DAS, for remote medical inspection of the oral cavity areas of subjects.

19. A method for oral cavity inspection, comprising: acquiring images of different oral cavity areas of a subject, using at least one image acquisition device; determining a relative field of view (FOV) of the at least one image acquisition device, for each of the acquired images; and constructing display data, based on the acquired images, for visually representing one or more oral cavity areas of the respective subject, wherein the construction of the display data is carried out at least by organizing at least some of the acquired images, and/or portions thereof, according to their relative FOVs.

20. The method for oral cavity inspection of claim 19, further comprising displaying the constructed display data, using one or more display devices of a remote oral inspection subsystem, located remotely from the at least one image acquisition device, for remote medical inspection of the oral cavity areas of the respective subject.

21. The method for oral cavity inspection of any one of claims 19 or 20, wherein the organizing of at least some of the acquired images or portions thereof, comprises stitching at least some of the acquired images or portions thereof to one another, according to their respective FOVs, to generate a panoramic oral cavity image as the display data.

22. The method for oral cavity inspection of any one of claims 19 to 21, wherein the step of constructing the display data comprises: normalizing the acquired images to one another, to form a coherent scaling and/or coloration of the acquired images; and organizing the normalized images, according to their relative FOVs.

23. The method for oral cavity inspection of any one of claims 19 to 22, wherein the display data is descriptive of information about: a panoramic view of the one or more oral cavity areas; and/or a three-dimensional (3D) model of the one or more oral cavity areas.

24. The method for oral cavity inspection of any one of claims 19 to 23, further comprising illuminating the oral cavity of the respective subject while acquiring images thereof, using at least one light source.

25. The method for oral cavity inspection of any one of claims 19 to 24, wherein the steps of acquiring the images and determining their relative FOVs are carried out using a designated application operable via a mobile communication device, the designated application being configured to transmit the acquired images and their relative FOVs to the remote oral inspection subsystem.

26. The method for oral cavity inspection of claim 25, wherein the construction of the display data is carried out at the remote oral inspection subsystem.

27. The method for oral cavity inspection of any one of claims 19 to 26, wherein the steps of acquiring the images, determining their relative FOVs and constructing display data based thereon are carried out using a designated application operable via a mobile communication device, the designated application being configured to transmit the constructed display data to the remote oral inspection subsystem for controllable display thereof.

28. The method for oral cavity inspection of any one of claims 19 to 27, further comprising analyzing the acquired images and/or the display data for determining a clinical condition of the subject, wherein the analysis of the acquired images and/or of the display data is carried out by using one or more analysis programs, configured to detect one or more medical abnormalities and/or to distinguish normal oral cavity areas from abnormal oral cavity areas.

29. The method for oral cavity inspection of claim 28, wherein the image analysis-based detection of the one or more medical abnormalities is carried out by identifying oral cavity areas showing one or more of: topographic abnormality; coloring abnormality; morphological abnormality; vasculature abnormality; and/or changes over time in any one or more of the above oral cavity abnormalities.

30. The method for oral cavity inspection of any one of claims 28 or 29, wherein the detected medical abnormalities are associated with one or more of the following medical oral conditions: oral cavity wounds; oral cavity ulcers; oral cavity inflammation; dental condition; oral cavity benign, malignant and/or premalignant ulcers and/or tissue.

31. The method for oral cavity inspection of any one of claims 29 to 30 further comprising identifying physical characteristics of identified abnormalities, and visually displaying information associated with the identified abnormalities, and wherein the physical characteristics of identified abnormalities comprise one or more of: peripheral boundaries of the identified abnormality; topography of the identified abnormality; morphology of the abnormality; location indication of the abnormality in the oral cavity; coloration of the abnormality.

32. The method for oral cavity inspection of any one of claims 19 to 31 further comprising: comparing the display data of the respective subject acquired at a specific acquisition time, to one or more previously constructed display data, for tracking oral cavity medical condition of the respective subject, over time.

33. The method for oral cavity inspection of any one of claims 19 to 32 further comprising fixating the subject's mandible and/or jaw at a fixed and stable posture, while acquiring images, using an orthodontic fixture device.

34. The method for oral cavity inspection of any one of claims 19 to 33 further comprising controlling the displaying of the display data, using a user interface (UI) operable via the remote oral inspection subsystem.

35. The method for oral cavity inspection of any one of claims 19 to 34 further comprising audible and/or visual display of guidance instructions to the subject or a caretaker thereof, using a digital image acquisition guide (DIAG).

36. An intra-oral dental imaging system comprising: a housing comprising at least an insertion portion at a distal end thereof for inserting into an oral cavity, the housing comprising: a light source generating light rays which are transmitted through a light transmission path to illuminate at least a portion of said oral cavity, an imaging device for receiving said light rays when said light rays are reflected from said at least a portion of said oral cavity and transmitted through an optical path to said imaging device, a mirror configured to be translatable in one or more degrees of freedom to one or more predetermined positions, wherein each of said predetermined positions directs said light rays from said light source through said light transmission path to a desired region of interest (ROI) in said oral cavity, and directs said reflected light from said ROI through said optical path to said imaging device, and a scanning means comprising at least one actuator, for translating said mirror in said one or more degrees of freedom.

37. The system of claim 36, wherein said mirror is disposed internally of said insertion portion.

38. The system of claim 36, wherein said mirror is disposed externally and distally to said insertion portion, and wherein said housing further comprises a mirror extension support for coupling said mirror to said housing.

39. The system of any one of claims 36 to 38, wherein said mirror is disposed at a tilt angle relative to said light transmission path, and wherein said tilt angle is between 20 and 60 degrees.

40. The system of claim 39, wherein said tilt angle is 35 degrees.

41. The system of any one of claims 36 to 40, wherein said scanning means is configured for translating said mirror by at least one of: moving said mirror axially along a longitudinal axis of said housing, rotating said mirror about a longitudinal axis of said housing, and modifying a tilt angle of said mirror relative to said light transmission path.

42. The system of any one of claims 36 to 41, wherein said scanning means is configured for translating said mirror by simultaneously (i) moving said mirror axially along said longitudinal axis of said housing and (ii) rotating said mirror about said longitudinal axis of said housing.

43. The system of any one of claims 36 to 42, further comprising a control unit comprising at least a positioning module, wherein said positioning module is configured to operate said scanning means to translate said mirror into said one or more predetermined positions.

44. The system of claim 43, wherein said control unit is located remotely to said housing, and is in data communication with said housing.

45. The system of any one of claims 43 to 44, wherein said positioning module is configured to operate said scanning means to translate said mirror into a sequence of said predetermined positions, and wherein each predetermined position in said sequence of predetermined positions (i) directs said light rays from said light source through said light transmission path to a specified ROI in a sequence of ROIs, and (ii) directs said reflected light from said specified ROI through said optical path to said imaging device.

46. The system of claim 45, wherein at least some of said ROIs in said sequence of ROIs are adjoining ROIs.

47. The system of any one of claims 45 to 46, wherein at least some of said ROIs in said sequence of ROIs are at least partially overlapping ROIs.

48. The system of any one of claims 45 to 47, wherein said control unit is further configured to operate said imaging device to acquire an image at each of said predetermined positions in said sequence of predetermined positions.

49. The system of claim 48, wherein said control unit is further configured to combine all of said images to generate a combined image of at least a portion of said oral cavity.

50. The system of claim 49, wherein said control unit is further configured to display said combined image on a display.

Description:
SYSTEMS, DEVICES, SUBSYSTEMS AND METHODS FOR ORAL CAVITY INSPECTION

CROSS-REFERENCE TO RELATED APPLICATION

[0001] This application claims the benefit of U.S. Provisional Application Ser. No. 62/963,356, filed January 20, 2020, the content of which is incorporated herein by reference in its entirety.

FIELD OF THE INVENTION

[0002] The present disclosure relates in general to oral cavity inspection.

BACKGROUND

[0003] Oral cavity inspection is used for a variety of purposes, such as for medical or clinical diagnostic purposes (e.g., for detection of medical abnormal tissue, wounds and/or their characteristics), or for tracking healing processes, over time, of oral cavity tissue or wounds, referred to herein interchangeably as "medical abnormalities" and/or "oral cavity medical abnormalities."

[0004] Oral cavity wounds or lesions may be divided into four main groups: (a) mechanical - surgical and traumatic wounds; (b) chronic wounds such as ulcers or oral Lichen Planus (OLP); (c) burns - chemical or thermal injuries; and (d) benign, malignant or premalignant lesions such as epithelial dysplasia, melanomas and carcinomas. Most chronic wounds are ulcers associated with age-related diseases and conditions such as ischemia, diabetes mellitus, venous stasis disease, or pressure.

[0005] Some oral cavity medical conditions require close tracking of the disease progression and/or healing progression, such as in cases in which premalignant or malignant lesions are detected and treated, in order to see if the lesion grows or reduces in size, in response to treatment, and adapt treatment according to disease/healing progression.

[0006] In many cases, subjects are required to visit a medical professional for oral inspection to determine their medical oral cavity condition. The healing process of a wound located in the oral cavity may be influenced by the nature of the tissue disruption and the circumstances surrounding the wound's closure. The wound healing response is a natural, essential phylogenetic defense mechanism aimed at restoring tissue integrity. A variety of local and systemic factors can impede healing. Factors affecting cutaneous and oral wound healing include, for example, age and sex hormones, stress, diabetes, obesity, medications and nutrition.

[0007] The foregoing examples of the related art and limitations related therewith are intended to be illustrative and not exclusive. Other limitations of the related art will become apparent to those of skill in the art upon a reading of the specification and a study of the figures.

BRIEF DESCRIPTION OF THE FIGURES

[0008] The figures illustrate generally, by way of example, but not by way of limitation, various embodiments discussed in the present document.

[0009] For simplicity and clarity of illustration, elements shown in the figures have not necessarily been drawn to scale. For example, the dimensions of some of the elements may be exaggerated relative to other elements for clarity of presentation. Furthermore, reference numerals may be repeated among the figures to indicate corresponding or analogous elements. References to previously presented elements are implied without necessarily further citing the drawing or description in which they appear. The figures are listed below.

[0010] Fig. 1 is a block diagram, schematically illustrating a system for oral cavity inspection, according to some embodiments;

[0011] Fig. 2 shows a fixture device for fixating a subject's jaw, configured also for holding an image acquisition device, according to some embodiments;

[0012] Fig. 3 shows average optic flow vectors using optical flow technique for determining and mitigating relative movements between the image acquisition device and the subject, according to some embodiments;

[0013] Fig. 4 is a block diagram, schematically illustrating a system for oral cavity inspection, according to some embodiments;

[0014] Fig. 5A is a block diagram, schematically illustrating a construction subsystem, according to some embodiments;

[0015] Fig. 5B is a block diagram, schematically illustrating a positioning subsystem, according to some embodiments;

[0016] Fig. 5C is a block diagram, schematically illustrating a display subsystem of a remote oral inspection subsystem, according to some embodiments;

[0017] Fig. 6A shows a sequence of acquired images of an oral cavity of a subject;

[0018] Fig. 6B shows a 3D model associated with the images shown in Fig. 6A;

[0019] Fig. 7A shows an oral atlas display of a subject's oral cavity area, according to some embodiments;

[0020] Fig. 7B shows a generic oral cavity display, used for indicating the subject's oral cavity area being displayed in the oral atlas, according to some embodiments;

[0021] Fig. 8 shows a flowchart, illustrating an image acquisition and oral cavity inspection process, according to some embodiments;

[0022] Fig. 9 shows a flowchart, illustrating an image acquisition and oral cavity inspection process, involving automatic analysis of the acquired images, according to some embodiments;

[0023] Fig. 10 shows a flowchart, illustrating a process for constructing a panoramic view image display data for a subject, according to some embodiments;

[0024] Fig. 11 shows a flowchart, illustrating a process for constructing a 3D model display data for a subject, according to some embodiments;

[0025] Fig. 12 schematically illustrates a system for oral cavity inspection, having multiple remote oral cavity inspection subsystems, according to some embodiments;

[0026] Fig. 13 schematically illustrates a system for medical internal inspection of subjects, according to some embodiments;

[0027] Fig. 14 illustrates a system for oral cavity inspection having an image acquisition device rigidly and removably connectable to a mobile device via a removable connector, for determining image acquisition device field of view for each image, acquired thereby, based on orientation sensor(s) of the mobile device rigidly connected thereto, according to some embodiments;

[0028] Fig. 15A shows a connector for removably and rigidly connecting a mobile device and a camera to each other, for using the mobile device's orientation and/or movement sensor(s) for determining the field of view of the camera for images acquired thereby, according to some embodiments;

[0029] Fig. 15B shows a zoomed-in image of the connector of Fig. 15A for removably and rigidly connecting a mobile device and a camera to each other, for using the mobile device's orientation and/or movement sensor(s) for determining the field of view of the camera for images acquired thereby, according to some embodiments; and

[0030] Figs. 16A-16B show exemplary dental probe imaging systems for intra-oral imaging, according to some embodiments.

DETAILED DESCRIPTION

[0031] Aspects of disclosed embodiments pertain to systems, subsystems, devices and methods for oral cavity inspection, using at least one image acquisition device such as a still and/or video camera, configured to acquire images of different oral cavity areas of a subject and output image data descriptive of the acquired images; a positioning subsystem, configured to determine a relative field of view (FOV) of the at least one image acquisition device, for each acquired image in the respective image data; and a construction subsystem configured to generate, based on the image data of the respective subject, display data for visually representing one or more oral cavity areas of the respective subject. In some embodiments, the construction of the display data may be carried out by organizing at least some of the acquired images from the image data, and/or portions thereof, according to their relative FOVs. Optionally, the acquired FOVs may at least partially overlap. Optionally, a wide FOV may fully cover one or more narrow FOVs.
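The organizing-by-FOV step described above can be illustrated with a minimal sketch. This is a hypothetical stand-in, not code from the application: each acquired image is tagged with the relative FOV reported by the positioning subsystem (here reduced to yaw/pitch angles of the camera's optical axis, an illustrative simplification), and the images are ordered for display accordingly.

```python
from dataclasses import dataclass

# Hypothetical record: one acquired image plus the relative FOV
# (reduced here to a yaw/pitch pair) from the positioning subsystem.
@dataclass
class AcquiredImage:
    image_id: str
    yaw_deg: float    # left/right angle of the optical axis
    pitch_deg: float  # up/down angle of the optical axis

def organize_by_fov(images):
    """Order images top-to-bottom, then left-to-right by relative FOV,
    as a minimal stand-in for the construction subsystem's organizing step."""
    return sorted(images, key=lambda im: (im.pitch_deg, im.yaw_deg))

shots = [
    AcquiredImage("molar_R", yaw_deg=40.0, pitch_deg=0.0),
    AcquiredImage("incisors", yaw_deg=0.0, pitch_deg=0.0),
    AcquiredImage("molar_L", yaw_deg=-40.0, pitch_deg=0.0),
]
ordered = organize_by_fov(shots)
print([im.image_id for im in ordered])  # → ['molar_L', 'incisors', 'molar_R']
```

A real construction subsystem would stitch overlapping images rather than merely sort them, but the FOV-keyed ordering above captures the organizing principle the claims recite.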

[0032] Disclosed embodiments of the systems may be configured to enable a subject or a caretaker thereof, to establish the image data remotely from one or more professionals, where the actual inspection is carried out remotely from the subject, by the one or more professionals, inspecting the display data of the respective subject and of the respective acquisition time. This may enable easy and comfortable tracking and/or diagnosis of medical or clinical condition of the subject's oral cavity, without requiring the subject to physically arrive at the professional's clinic for each required oral cavity inspection.

[0033] The remote inspection may be useful, yet not limited, for tracking abnormalities (e.g., diseases, infections, tumors, etc.) progress or spread, healing processes and/or treatment responsiveness of oral cavity wounds or tissue over time, by, for example, having the professional inspector receive display data of acquired images of the same subject, taken at different times, for determining the healing or disease progress over time, where the subject or a caretaker thereof, simply sends the image data and/or the display data based thereon, for remote inspection at different times (e.g., dates).

[0034] According to some embodiments, the constructed display data may be displayed by at least one remote oral inspection subsystem (ROIS), remotely located from the at least one image acquisition device, for remote medical inspection of the oral cavity areas of the respective subject. According to some embodiments the ROIS may be operable via a computerized data storage, processing, communication, input and display machine.

[0035] It is noted that the term "inspection" or "medical inspection", interchangeably used herein, may relate to, for example, medical or clinical observation, advisory, diagnosis, tracking, follow-up, check, examination and/or the like.

[0036] It is noted that the term "inspector" used herein may refer to any user and/or any person qualified for medically and/or clinically inspecting oral cavity, such as, yet not limited to, a medical expert.

[0037] It is noted that the term "subject", used herein, may relate to any individual being medically or clinically inspected, such as a "patient", a "human", etc.

[0038] It is noted that the term "oral cavity" may relate to any type of organ and/or tissue or object inside a subject's mouth and/or pharynx.

[0039] According to some embodiments, the system, for oral cavity inspection, may include a data acquisition subsystem (DAS) operable via one or more computation and communication devices (CCDs), such as mobile communication devices, laptops, personal computers (PCs), tablet devices, cellphone devices etc., configured for data input, output, computation, programming and communication.

[0040] According to some embodiments, the system, for oral cavity inspection, may include a data acquisition subsystem (DAS) operable via one or more computation and communication devices (CCDs), such as mobile communication devices, laptops, personal computers (PCs), tablet devices, cellphone devices, a computerized end-user device, etc., configured for data input, output, computation, programming and/or communication.

[0041] A computerized end-user device may include a multifunction mobile communication device also known as a "smartphone", a personal computer, a laptop computer, a tablet computer, a server (which may relate to one or more servers or storage systems and/or services associated with a business or corporate entity, including for example, a file hosting service, cloud storage service, online file storage provider, peer-to-peer file storage or hosting service and/or a cyberlocker), a personal digital assistant, a workstation, a wearable device, a handheld computer, a notebook computer, a vehicular device, a stationary device and/or a home appliances control system.

[0042] According to some embodiments, the DAS may be located remotely from the ROIS, for enabling the subject or a caretaker thereof to carry out at least the oral cavity image acquisition at a location remote from the professional's clinic and/or person, for improving the subject's convenience.

[0043] According to some embodiments, the DAS and/or the ROIS may be implemented as a software program, hardware, or a combination thereof.

[0044] According to some embodiments, the DAS may be configured to acquire and/or receive visual images of the subject's oral cavity at each "acquisition session", associated with a specific acquisition time parameter such as an acquisition date; receive or acquire the FOV of each acquired image; output image data including the acquired images, the FOVs thereof and, optionally, the associated timing data; and transmit the related data (e.g., data descriptive of one or more FOV-acquisition time tuples) to the ROIS via one or more communication links.
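By way of a non-limiting sketch, the image data of one acquisition session (acquired images, their FOVs and the associated timing data, as described above) might be modeled as follows. The record and member names (`AcquiredImage`, `AcquisitionSession`, `fov_time_tuples`) are illustrative assumptions and not part of the disclosure:

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List, Tuple

@dataclass
class AcquiredImage:
    """One acquired image together with its relative FOV (hypothetical layout)."""
    pixels: bytes                     # encoded image payload
    fov: Tuple[float, float, float]   # e.g., camera yaw, pitch, roll in degrees

@dataclass
class AcquisitionSession:
    """Image data of one acquisition session, as the DAS might transmit it."""
    subject_id: str
    acquired_on: date
    images: List[AcquiredImage] = field(default_factory=list)

    def fov_time_tuples(self):
        # Data descriptive of FOV-acquisition time tuples, per the text above.
        return [(img.fov, self.acquired_on) for img in self.images]

session = AcquisitionSession("subject-001", date(2021, 1, 20))
session.images.append(AcquiredImage(b"...", (10.0, -5.0, 0.0)))
```

Such a structure would let the ROIS index received data by subject and acquisition time, as the following paragraphs require.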

[0045] According to some embodiments, the construction subsystem, configured to construct display data for each given image data, may be embedded in the DAS and/or in the ROIS.

[0046] For example, the DAS may also include the construction subsystem, carry out the construction of the display data, and transmit the constructed display data to the ROIS, via one or more communication links, where the ROIS may be configured to receive the display data and controllably display thereof via one or more display devices.

[0047] In other examples, the construction subsystem (or parts thereof) may be embedded in the ROIS, where the DAS transmits image data of a respective acquisition session to the ROIS, and both the construction of the display data and the display thereof are carried out by the ROIS.

[0048] According to some embodiments, the display data may include an (e.g., panoramic) two-dimensional (2D) view of the oral cavity or one or more areas thereof, and/or a three-dimensional (3D) model of the oral cavity or one or more areas thereof.

[0049] According to some embodiments, the panoramic view may be established by the construction subsystem, by stitching at least some of the acquired images or portions thereof, to one another, according to their respective FOVs.
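The stitching step described above may be illustrated by a minimal, non-limiting sketch that pastes image tiles onto a shared canvas at pixel offsets assumed to be pre-computed from the per-image relative FOVs; the function name and the overwrite policy for overlapping regions are illustrative assumptions:

```python
def stitch_by_offsets(tiles, canvas_w, canvas_h, background=0):
    """Organize image tiles into a panoramic canvas by their relative offsets.

    tiles: list of (offset_x, offset_y, 2-D list of pixel rows); the offsets
    are assumed to be derived beforehand from each image's relative FOV.
    """
    canvas = [[background] * canvas_w for _ in range(canvas_h)]
    for ox, oy, tile in tiles:
        for y, row in enumerate(tile):
            for x, px in enumerate(row):
                cy, cx = oy + y, ox + x
                if 0 <= cy < canvas_h and 0 <= cx < canvas_w:
                    canvas[cy][cx] = px  # later tiles overwrite overlaps
    return canvas

# Two 2x2 tiles overlapping by one column:
a = [[1, 1], [1, 1]]
b = [[2, 2], [2, 2]]
pano = stitch_by_offsets([(0, 0, a), (1, 0, b)], canvas_w=3, canvas_h=2)
```

A practical embodiment would additionally blend or fuse overlapping regions, as discussed in paragraph [0050].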

[0050] In some embodiments, image data relating to various (e.g., partially overlapping) FOVs may be fused to generate display data.

[0051] In some embodiments, different FOVs may be associated (e.g., tagged) with respective information including, for example, positional information relating to the imaged oral cavity such as image acquisition angle, distance from the subject, FOV subtended by the image acquisition device, clinical information, anatomical information, and/or the like for providing a user, for example, with a positional map of the displayed images, along with clinical, anatomical and/or other information related to the display data.

[0052] According to some embodiments, to determine the relative FOV of each acquired image, one or more relative FOV determination modules may be used, such as software and/or hardware and/or sensor based modules. For example, a software and/or hardware based module, using image processing or image analysis programs and/or circuitry, may be used to determine the relative FOV of each image in respect to one or more other acquired images. Additional or alternative ways to determine the relative FOV of each image may include using one or more sensors attached to the image acquisition device (e.g., camera) and/or to devices in which the image acquisition device is embedded and/or devices connecting the image acquisition device to another computation and communication device, such as one or more gyroscopes, accelerometers, etc.
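As a non-limiting sketch of the sensor-based option, per-image orientations reported by a gyroscope can be reduced to FOVs expressed relative to a reference image; the angle convention (yaw, pitch in degrees) and function name are illustrative assumptions:

```python
def relative_fovs(orientations):
    """Given per-image camera orientations (yaw, pitch) in degrees, e.g. from
    a gyroscope, return each image's FOV relative to the first image."""
    ref_yaw, ref_pitch = orientations[0]
    out = []
    for yaw, pitch in orientations:
        dy = (yaw - ref_yaw + 180.0) % 360.0 - 180.0   # wrap into (-180, 180]
        dp = pitch - ref_pitch
        out.append((dy, dp))
    return out

# Yaw wraps correctly across the 360/0 boundary:
rel = relative_fovs([(350.0, 10.0), (10.0, 12.0)])
```

In an image-processing-based embodiment, the same relative quantities would instead be recovered from the visual content of overlapping images.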

[0053] According to some embodiments the ROIS and/or the DAS may each include a data storage unit, using one or more databases, for retrievably storing data therein, such as image data and/or display data, display commands and/or programs, display data construction commands and/or programs etc.

[0054] According to some embodiments, the construction of the display image, for each display data of each subject may be carried out by first normalizing the acquired images to one another, in order to form a coherent scaling of the acquired images to generate normalized images, and then organizing the normalized images, according to their respective FOVs, e.g., to form a panoramic view display data and/or to form a 3D model of the oral cavity or area(s) thereof as the display data.
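The scaling normalization can be illustrated by a minimal sketch that maps each image's observed reference length (e.g., a known feature projected onto a canonical plane) to a common canonical length; the canonical length of 100 px and the function name are illustrative assumptions:

```python
def normalize_scales(observed_px, canonical_px=100.0):
    """Return per-image scale factors that bring each image's observed
    reference length (in pixels) to a common canonical length, so that all
    acquired images share one coherent scale before being organized."""
    return [canonical_px / px for px in observed_px]

# Three images in which the same reference appears at different pixel sizes:
factors = normalize_scales([50.0, 100.0, 200.0])
```

Applying each factor to its image's dimensions yields the coherent scaling described above, after which the normalized images can be organized by their relative FOVs.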

[0055] According to some embodiments the scaling normalization may be carried out using one or more predefined and/or selectable canonical planes.

[0056] According to some embodiments, the DAS may further include at least one light source, configured to illuminate the oral cavity of the subject. The light source(s) may be configured to emit white light, and/or light of one or more wavelength bands such as in the spectral range of one or more of: ultraviolet (UV), chemofluorescent blue light, toluidine blue light, green amber light, infrared (IR) light etc. The selection of the wavelength band may be done according to lesion type to be detected or tracked.

[0057] According to some embodiments, the one or more light sources may include one or more of: a light emitting diode (LED); a flashlight, a white light lamp, an ultraviolet (UV) light source, an IR light source, an array of LED light sources, each LED light source may be configured to emit light of a different (optionally narrow) wavelength band.

[0058] According to some embodiments, the subject may be required to use one or more dye materials designated for dyeing specific tissue/cell types for improved inspection such as, for example, an acidophilic dye that selectively stains acidic substances such as deoxyribonucleic acid (DNA), for visual identification of oral cavity medical abnormalities such as oral potential malignant disorders (OPMDs), oral dysplasia and/or early oral squamous cell carcinoma (OSCC).

[0059] According to some embodiments, the image acquisition device may include one or more of: a camera, a three-dimensional (3D) sensor, a charge-coupled device (CCD) camera, a narrow spectral optical detector, a complementary metal-oxide-semiconductor (CMOS) camera, a hybrid CCD-CMOS camera, a wideband spectral optical detector, a wide FOV camera, a telephoto camera, a macro camera, a digital camera, a range sensor, a periscope device having a camera and reflective surfaces, a binocular camera, and/or the like. The type of the one or more image acquisition devices may be adapted to the wavelength band(s) emitted by the one or more light sources.

[0060] According to some embodiments, the DAS may be implemented as a designated application operable via mobile communication devices such as mobile phones and/or tablets, etc., enabling, for example, using the mobile communication device's camera(s), light source(s) and/or orientation sensing devices and programs for image acquisition and illumination of the subject's oral cavity, for determining the FOV of each acquired image and for transmitting image data to the ROIS. The designated application may also be configured to carry out the display data construction and send the display data to the ROIS for inspection.

[0061] According to some embodiments, a special designated (e.g., high-definition (HD)) standalone camera (separate from the mobile communication device) may be used as the image acquisition device, for improving image quality and/or for enabling insertion of the camera deeper into the subject's oral cavity. The standalone camera may be connectable to the mobile communication device to enable the designated application to receive images therefrom.

[0062] According to some embodiments, the standalone camera may be fixedly connectable to the mobile communication device, such that the mobile communication device moves in response to the moving of the camera, to enable using orientation sensing devices and/or programs of the respective mobile communication device for determining FOV of each acquired image.

[0063] According to some embodiments, the ROIS may be also configured to enable the professional inspector to write his/her medical opinion, based on the display data of the respective subject and send the written medical opinion to the subject or a caretaker thereof. The written medical opinion may be sent to the subject's mobile communication device and displayed via the DAS designated application.

[0064] According to some embodiments, the ROIS may be further configured to display and/or analyze several files of display data of a respective subject, each based on oral cavity image data files acquired at different acquisition times (e.g., different dates), for tracking disease, abnormalities and/or healing processes and/or progress over time, by comparing oral cavity areas in the display data files.

[0065] According to some embodiments, the system for oral inspection may further include an analysis engine, configured to receive the image data and/or the display data of a subject and operate image analysis thereover to determine the medical condition (normal/abnormal) of area(s) of the subject's oral cavity and output analysis results. For example, the analysis engine may be configured to detect medical abnormalities and their related characteristics, e.g., by identification of one or more visually detectable topographic, coloring, morphological, and/or vasculature abnormalities in the oral cavity surface or tissue and/or changes thereof over time. For example, the analysis engine may be configured to detect specific types of abnormalities such as lesions, ulcers and optionally the characteristics thereof including, for example, size, peripheral borders, topography and/or morphology thereof, abnormality classification (e.g., inflammation, ulcer, autoimmune, viral or bacterial infection related, benign, malignant or premalignant and/or the exact type of malignant or premalignant cells in the identified abnormal lesion or tissue, etc.), location of the abnormality in the oral cavity etc.

[0066] According to some embodiments, the analysis engine may further be configured to apply one or more image processing programs to facilitate diagnosis. Such image processing tools can include filters for automatically or semi-automatically and adaptively changing (e.g., improving) quality of acquired images, to highlight areas of interest in order, for example, to improve medical abnormalities detection and/or distinction between normal to abnormal oral cavity tissue, borders, etc. Such filters can include, for example, high-pass filters, low-pass filters, band-pass filters, etc. For example, the image quality of an ROI may be improved. In some embodiments, portions of the displayed image may be adaptively blurred or deblurred or otherwise processed to enhance or decrease locally the quality of the image, for example, to facilitate diagnosis. For example, the image quality of a selected portion of an image may be reduced (e.g., blurred) to facilitate diagnosis with respect to another (optionally, enhanced or deblurred) portion of the displayed image.
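One of the low-pass filters mentioned above can be sketched, for illustration only, as a 3x3 box blur over a grayscale image held as a list of pixel rows; edge handling by clamping is an illustrative choice:

```python
def box_blur(img):
    """Apply a 3x3 box (low-pass) filter to a 2-D grayscale image
    (list of rows), clamping coordinates at the image border."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0.0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    yy = min(max(y + dy, 0), h - 1)
                    xx = min(max(x + dx, 0), w - 1)
                    acc += img[yy][xx]
            out[y][x] = acc / 9.0
    return out

# A flat image is unchanged by the blur:
flat = box_blur([[9.0] * 3 for _ in range(3)])
```

Selectively applying such a blur to one portion while leaving another untouched realizes the local quality reduction/enhancement contrast described above.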

[0067] According to some embodiments, the detected medical abnormalities may be associated with one or more visually detectable physical characteristics such as: peripheral boundaries of an identified medical abnormality, topography of the identified abnormality, morphology of the identified abnormality, coloration of the identified abnormality, location identification indicators of the identified abnormality, etc. The physical characteristics of each medical abnormality may be determined or identified as part of the medical abnormalities' identification process, and information thereof may be stored and/or displayed in association with the display data.

[0068] According to some embodiments, the analysis engine may be further configured to compare between display data of different acquisition times of the same subject, for oral cavity medical abnormalities' distinction and detection as well as for tracking disease/healing progression for known (e.g., previously detected) medical abnormalities.

[0069] According to some embodiments, the ROIS may also include a user input-based control module, such as a user interface (UI), e.g., a graphical UI (GUI), configured to enable a professional user to carry out one or more of the following operations: control the display of the display data; mark a suspicious medical abnormality's periphery or area; present a chronological display of display data imagery of the subject taken at different times (dates) for visual, chronological comparison-based inspection; input written medical opinions and/or other messages to the subject and control their transmission; etc.

[0070] According to some embodiments, the display data or portions thereof, associated with the respective subject and a specific image acquisition time (e.g., date), may be compared to a generic oral cavity of a healthy person, for detection of medical abnormalities in the subject's oral cavity.

[0071] According to some embodiments, the DAS of the system for oral cavity inspection may further include a digital image acquisition guide (DIAG), configured to provide auditory and/or visual guiding instructions for image acquisition improvement. For example, the subject may be required to acquire several initial images of his/her oral cavity area(s), where the DIAG processes these initial images e.g., in real time or near real time, to determine camera position and/or image quality and calculate a preferable position of the camera, and output guiding instructions to the subject for leading the subject to position the camera to the desired (e.g., optimal) position.
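A minimal illustrative sketch of such a guide might derive a guiding instruction from crude brightness and sharpness proxies computed over an initial frame; the metrics, thresholds and messages below are assumptions and not the disclosed DIAG:

```python
def guidance(img, sharp_thresh=10.0, dark_thresh=40.0):
    """Emit a simple guiding instruction from a 2-D grayscale frame, using
    mean brightness and a crude sharpness proxy (mean absolute horizontal
    gradient). Thresholds and messages are illustrative."""
    h, w = len(img), len(img[0])
    mean = sum(sum(row) for row in img) / (h * w)
    grad = sum(abs(img[y][x + 1] - img[y][x])
               for y in range(h) for x in range(w - 1)) / (h * (w - 1))
    if mean < dark_thresh:
        return "turn on the light source"
    if grad < sharp_thresh:
        return "hold the camera steady and refocus"
    return "position is good, acquiring"

# A dark frame triggers the illumination instruction:
msg = guidance([[0, 0], [0, 0]])
```

In a real-time embodiment, such a check would run per frame and feed the auditory and/or visual instructions described above.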

[0072] According to some embodiments, the construction subsystem may operate an optical flow image processing operator, over at least some of the acquired images, to determine relative movements between the subject's oral cavity and the at least one image acquisition device. The construction of the display data may, accordingly, be further based on resulting relative movements between the subject's oral cavity and the at least one image acquisition device.

[0073] Aspects of disclosed embodiments pertain to systems for oral cavity inspection including at least one memory for storing data and software code, and a processor which, when executing the software code, results in the execution of the following steps:

[0074] retrievably storing acquired images of different oral cavity areas of a subject, acquired by using at least one image acquisition device;

[0075] retrievably storing the relative field of view (FOV) of the at least one image acquisition device, for each of the acquired images; and constructing display data, based on the acquired images, for visually representing one or more oral cavity areas of the respective subject, wherein the construction of the display data may be carried out at least by organizing at least some of the acquired images, and/or portions thereof, according to their relative FOVs.

[0076] Reference is now made to Fig. 1, schematically illustrating a system 1000 for oral cavity inspection, according to some embodiments. The system 1000 may include an image acquisition device such as a camera 1100; a data acquisition subsystem (DAS) 1200 operable via a computer and communication device (CaCD) 100 such as a PC, a laptop, a mobile communication device such as a cellphone, a tablet device etc.; a construction subsystem 1300; and a remote oral inspection subsystem (ROIS) 1400.

[0077] According to some embodiments, the ROIS 1400 may be located remotely from the DAS 1200 and camera 1100, where the DAS 1200 and camera 1100 are used by the subject or a caretaker thereof, to acquire images of oral cavity of the respective subject, determine FOV of each image and send image data including, for example, the acquired images, at least one acquisition time parameter (e.g., acquisition date) and information indicative of FOV of each acquired image. The image data may then be processed by the construction subsystem 1300, for construction of respective display data such as, for example, a 3D model and/or panoramic view of the subject's oral cavity.

[0078] The construction subsystem 1300 may be embedded in the DAS 1200 and/or in the ROIS 1400, and/or in a separate computation and communication device.

[0079] According to some embodiments, the DAS 1200 may include: a DAS image acquisition subsystem 1210, configured to receive acquired images and optionally also additional acquisition information from the camera 1100 such as image acquisition parameters e.g., image resolution, camera orientation, camera focusing parameters, camera FOV parameters, camera aperture, etc.

[0080] According to some embodiments, the DAS image acquisition subsystem 1210 may further be configured to execute an image processing program (e.g., algorithm) for selecting images according to one or more image selection criteria, such as according to the image quality and/or resolution, image content, removing identical or similar regional coverage related images, etc.

[0081] According to some embodiments, the DAS image acquisition subsystem 1210 may be further configured to control the camera 1100 from the CaCD 100, e.g., by including and operating an acquisition UI allowing control of camera parameters such as zooming, FOV, resolution, etc., via the UI.

[0082] According to some embodiments, the DAS image acquisition subsystem 1210 may be further configured to operate a real time (RT) or near RT digital image acquisition guide (DIAG), for guiding the subject or a caretaker thereof while acquiring the oral cavity images, e.g., by analyzing camera 1100 output data in RT or near RT and outputting corresponding directions or instructions for correctly positioning the camera 1100 in respect to the subject's oral cavity.

[0083] According to some embodiments, the DAS positioning subsystem 1220 may be configured to determine relative FOV of each acquired image or of each selected acquired image. The determining of the relative FOV of each image may be carried out in any one or more of the following manners:

(i) receiving relative FOV of each image from the camera;

(ii) image processing of visual content of the acquired images; and/or

(iii) using sensor data from one or more sensors such as an accelerometer or a gyroscope (e.g., comprised by the DAS 1200 or by using CaCD's 100 FOV sensors and/or programs).

[0084] According to some embodiments, the camera 1100 used for image acquisition may be fixedly connectable to a mobile communication device CaCD 100, such as a tablet or cellphone device, to have the CaCD 100 orientation and movement responsive to the camera's 1100 movement, e.g., such that the CaCD 100 moves in coordination with the camera 1100. This may enable determining the relative FOV of each acquired image by using the software, hardware and/or sensor movement/orientation detection mechanisms typically embedded in mobile communication devices.

[0085] According to some embodiments, the system 1000 may use additional or alternative image acquisition device(s) for acquiring images of subjects' oral cavity, such as, for example, a 3D sensor outputting a point cloud, each point indicative of a 3D position (e.g., coordinates' values), one or more cameras embedded in the CaCD 100, an optical sensor or detector and/or the like.

[0086] According to some embodiments, the DAS 1200 may also include a DAS communication subsystem 1230, configured to transmit and receive data via one or more communication links such as communication link 10, and/or via one or more communication technologies and/or protocols.

[0087] According to some embodiments, the DAS 1200 may also include a DAS memory unit or use memory unit of the CaCD 100 to form and manage a database 110, configured to retrievably store image data, programs and/or commands of the DAS subsystems 1210-1230.

[0088] According to some embodiments, the construction subsystem 1300 may receive the image data from the DAS 1200, including the acquired images, the determined relative FOV of each image and, optionally, additional information such as acquisition time, and construct the respective display data, based on the received image data, associated with the respective subject and the respective acquisition time, and on the relative FOVs of the images in the image data.

[0089] For example, the construction of the display data may be carried out by organizing at least some of the acquired images or portions thereof, according to their respective relative FOVs.

[0090] According to some embodiments, the ROIS 1400 may be operable via a computation and communication device such as a PC, a laptop, a tablet or a mobile communication device, and may be configured to receive the constructed display data of each subject, and display thereof, using, for example, one or more display devices, such as display device 1460.

[0091] According to some embodiments, the display device 1460 may include one or more of: a screen and/or a touch screen.

[0092] According to some embodiments, the displaying of the display data of each subject may be used for having an inspector, such as a relevant medical expert, view the display data to medically inspect areas of the oral cavity of each subject, e.g., for identification and/or tracking of healing and/or disease progress of oral cavity abnormalities such as lesions, wounds, abnormal tissue, etc.

[0093] According to some embodiments, the ROIS 1400 may include a ROIS communication subsystem 1410 for receiving and transmitting data via one or more communication links and/or technologies. For example, the ROIS communication subsystem 1410 may receive display data and/or image data, and/or relative FOVs of images from multiple DASs, indicative of display data constructed based on images of multiple subjects acquired on multiple acquisition times.

[0094] According to some embodiments, each received display data associated with a specific subject and a specific acquisition time, may be controllably displayed by the ROIS display subsystem 1430, using the display device 1460 of the ROIS 1400.

[0095] According to some embodiments, the ROIS display subsystem 1430 may be configured to display the display data of subjects, as well as to control the display of each received display data. For example, the ROIS display subsystem 1430 may include or operate a user interface (UI) such as a graphical user interface (GUI), e.g., displayed to users via the display device 1460 and responsive to user input (e.g., using one or more input devices such as a computer mouse, a touch screen, keyboard and/or the like).

[0096] According to some embodiments, the GUI may enable users such as inspectors to select portions of the display data to be displayed, to view several display data displays of a subject acquired on different dates for comparison therebetween (e.g., for tracking healing and/or disease progress of lesions, wounds or oral cavity tissue), to mark or highlight identified medical abnormality oral cavity areas, to improve display quality, to move the display data, e.g., by rotating and/or lateral screen displacement thereof, to upload one or more specific original acquired images of a certain oral cavity area (e.g., to view this area in a higher image resolution), and/or to zoom into or out of certain displayed oral cavity areas.

[0097] According to some embodiments, the ROIS 1400 further includes a ROIS analysis engine 1440, e.g., configured for automatic inspection of oral cavity of each specific subject. The ROIS analysis engine 1440 may include one or more analysis programs, configured to process or analyze the display data and/or the acquired images associated therewith, e.g., using artificial intelligence (AI) based programs, to identify and/or track oral cavity medical abnormalities and optionally also for automatic output of analysis results.

[0098] According to some embodiments the analysis results may be indicated as any one or more of:

• Visual indication of locations, areas and/or boundaries of identified medical abnormalities, optionally over the display data or portion(s) thereof;

• Indications of each identified medical abnormality and physical characteristics thereof;

• Textual medical observation message;

• Textual indication of medical abnormalities;

• Changes over time in previously identified abnormalities and characteristics thereof (e.g., by comparing display data of the same subject associated with different chronologically ordered acquisition dates).

[0099] According to some embodiments, an analysis engine such as described above in respect to the ROIS analysis engine 1440 may be embedded as part of the construction subsystem 1300, such that the constructed display data is transmitted along with analysis results information. In this way, the inspector may receive automatic initial oral cavity inspection results as auxiliary medical information.

[0100] According to some embodiments, analysis results may also be transmitted directly to the respective subject or a caretaker thereof.

[0101] According to some embodiments, the ROIS 1400 further includes a storage unit 1470, configured to retrievably store data such as display data of subjects, image data, FOVs data, instructions, rules, programs and/or commands such as control programs, GUI programs, analysis programs and/or the like.

[0102] According to some embodiments, as shown in Fig. 1, one or more light sources such as light source 1110 may be used for illuminating the subject's oral cavity while images thereof are acquired. The light source 1110 may include, for example, any one or more of: white light emitting light source (e.g., lamp), one or more light emitting diodes (LEDs), each emitting in a different narrow wavelength band and/or the like.

[0103] The ROIS 1400 may further include a ROIS feedback subsystem 1450, configured to enable inspectors to generate, input and transmit inspection related feedback to subjects.

[0104] According to some embodiments, the subject may be required to use one or more mouth-washable substances for enabling or improving visual identification of one or more normal or abnormal oral cavity areas, tissue, lesions, wounds etc. The selected mouth-washable substance may be associated with the specific wavelength band illumination used and/or with the specific type of medical abnormality to be identified.

[0105] For example, an acidophilic dye mouth-washable substance may be used for selective staining of acidic substances such as DNA in the oral cavity tissue, using white and/or blue light emitting light source(s) for visually indicating tissue areas dyed by the acidophilic dye mouth-washable substance, e.g., for malignant or premalignant tissue, lesions and/or afflictions identification.

[0106] According to some embodiments, to avoid or reduce the influence of movements of the subject and/or of the one or more image acquisition devices while acquiring oral cavity images, an orthodontic fixture device 1500 may be used, configured to fixate the subject's mandible and/or jaw in a fixed and stable posture, for preventing the subject from moving one or more of his/her jaws while images of his/her oral cavity are acquired. Optionally, the orthodontic fixture device 1500 may comprise a plurality of articulated elements that may be selectively secured in position relative to each other.

[0107] According to some embodiments, the orthodontic fixture device 1500 may be designed to be inserted into the subject's mouth and comfortably hold the lower and/or upper jaw. The orthodontic fixture device 1500 may further be configured to hold the camera 1100 and/or any other image acquisition device.

[0108] Fig. 2 shows details of the orthodontic fixture device 1500 illustrated in Fig. 1, according to some embodiments. The orthodontic fixture device 1500, according to these embodiments, may include one or more jaw fixating elements such as two side supports 1510a and 1510b, configured to hold the subject's mouth in an open fixed posture; a bridge element 1520 fixedly connecting the side supports 1510a and 1510b; and a holder 1521, configured to fixedly or movably hold an image acquisition device such as camera 1100. The holder 1521 may be fixedly, movably and/or removably connected to the bridge element 1520.

[0109] According to some embodiments, the movable holder 1521 may include one or more pivotal and/or spherical joints for enabling positioning control over the image acquisition device held by the holder 1521, e.g., by controlling moving of the camera for directing thereof to various oral cavity areas.

[0110] According to some embodiments, the holder 1521 and/or the camera 1100 may be rigidly connected to the CaCD 100, e.g., via one or more removable connectors, for enabling relative FOV determination for each acquired image, by causing the CaCD 100 to move responsively to movements of the holder 1521 and/or the camera 1100 held thereby.

[0111] According to some embodiments, one or more physical markers such as markers 1501a, 1501b and 1501c may be positioned over the orthodontic fixture device 1500, for assisting in scaling and calibration of the acquired images, and/or in locating oral cavity areas. The physical markers 1501a-1501c may be used as reference points, e.g., to identify the camera's 1100 positioning in respect to the marked reference points and thereby the relative FOV of each acquired image, as the distances therebetween and their fixed locations in relation to the oral cavity are known.
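The marker-based scaling can be illustrated with the pinhole camera model: given the known physical spacing between two fixture markers and their observed pixel spacing, the camera-to-fixture distance follows as Z = f · D / d, with the focal length f expressed in pixels. All names and values below are illustrative assumptions:

```python
def camera_distance(focal_px, marker_gap_mm, observed_gap_px):
    """Pinhole-model distance estimate from the known physical spacing of
    two markers (marker_gap_mm) and their observed pixel spacing:
        Z = f * D_real / d_px
    where focal_px is the focal length in pixels."""
    return focal_px * marker_gap_mm / observed_gap_px

# Markers 30 mm apart, imaged 240 px apart with an 800 px focal length:
z = camera_distance(focal_px=800.0, marker_gap_mm=30.0, observed_gap_px=240.0)
```

Repeating this per image, together with the markers' known fixed positions, supports the relative FOV identification described above.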

[0112] In order to extract the relative position of the camera 1100 in relation to the oral cavity area, a regular optic flow algorithm may be applied, e.g., via the DAS image acquisition subsystem 1210. Optic flow is the pattern of relative motion of objects, surfaces, and edges in a visual scene caused by the relative motion between an observer and the scene (here, relative motion between the camera 1100 and the oral cavity area it faces). The optical flow technique can be used for high-detail reconstruction of the 3D model geometry when the display data includes a 3D model.

[0113] According to some embodiments, the optical flow technique may be based on averaging the optic flow vector field (see Fig. 3: the movement of each object is indicated by dashed lines 33, and the average linear and non-linear movements are indicated by arrows 31 and 32, respectively). The vector field is indicative of the proportion of linear and/or non-linear movements, while averaging the first moment of the vector field yields an overall estimation of the camera rotation. To eliminate tremor effects of soft tissues within the jaw, the optical flow algorithm may be applied only to the lower teeth set. Occasionally, it may be required to place an orthodontic fixture device 1500 to provide a clearer view of the entire oral cavity region.
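A minimal sketch of averaging the optic-flow vector field: the mean vector estimates the global linear movement, while a curl-like average of the residual tangential components about the field's centroid estimates the in-plane camera rotation. The decomposition below is an illustrative simplification of the averaging described above, not the disclosed algorithm:

```python
def summarize_flow(points, vectors):
    """Average an optic-flow vector field sampled at `points`.

    Returns (mean_translation, rotation_estimate): the mean vector is the
    global linear movement; the mean tangential component of the residual
    flow about the centroid approximates the in-plane rotation (radians)."""
    n = len(points)
    mean_dx = sum(v[0] for v in vectors) / n
    mean_dy = sum(v[1] for v in vectors) / n
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    rot, m = 0.0, 0
    for (x, y), (dx, dy) in zip(points, vectors):
        rx, ry = x - cx, y - cy
        r2 = rx * rx + ry * ry
        if r2 > 0.0:
            # residual flow after removing the mean, projected tangentially
            rot += (rx * (dy - mean_dy) - ry * (dx - mean_dx)) / r2
            m += 1
    return (mean_dx, mean_dy), (rot / m if m else 0.0)

# A pure translation field: every vector identical, so rotation is ~0.
trans, rot = summarize_flow([(0, 0), (2, 0), (0, 2), (2, 2)],
                            [(1.0, 0.5)] * 4)
```

For a small pure rotation by angle θ about the centroid, each flow vector equals θ·(-ry, rx), so the tangential average recovers θ directly.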

[0114] Reference is now made to Fig. 4, schematically illustrating a system 3000 for oral cavity inspection, according to some embodiments. In these embodiments, the CaCD used is a mobile communication device (MCD) 300, such as a cellphone, a laptop or a tablet device, having one or more MCD cameras 301, MCD light sources 302 and MCD movement sensing modules 303, such as a software-based MCD positioning sensor, for identifying the MCD's orientation and/or movements. The positioning sensor may be configured to output FOV data, e.g., including one or more positioning parameter values, that can be used for calculating the relative FOV of each acquired image.

[0115] According to some embodiments, the DAS of the system 3000 may be implemented as a designated application (DA) 3200 operable via the MCD 300, having: a DA image acquisition subsystem 3210, configured to receive acquired images from the MCD camera(s) 301, control camera(s) and/or light source(s) operation, and guide the subject or a caretaker thereof in the image acquisition process; a DA positioning subsystem 3220, configured to determine the relative FOV of each acquired image; and a DA communication subsystem 3230, configured to communicate with a ROIS 3300, remotely located from the MCD 300, via one or more communication links such as via wireless communication link 20.

[0116] According to some embodiments, the ROIS 3300 may be operable via a computation and communication device such as a PC, a laptop, a tablet or a mobile communication device, and may be configured to receive the image data of each subject and the FOVs of each image therein, construct the display data associated with the received image data, based on image analysis and the FOVs of the images, and display the display data of each subject, using, for example, one or more display devices.

[0117] According to some embodiments, the displaying of the display data of each subject may be used for having an inspector, such as a relevant medical expert, view the display data to medically inspect areas of the oral cavity of each subject, e.g., for identification and/or tracking of healing and/or disease progression of oral cavity abnormalities such as lesions, wounds, abnormal tissue, etc.

[0118] According to some embodiments, the ROIS 3300, may include:

• A ROIS communication subsystem 3310 for receiving and transmitting data via one or more communication links and/or technologies, such as image data, FOVs and acquisition time data, associated with a specific subject and a specific acquisition time, from the MCD 300 of the respective subject via the DA 3200 operable thereby;

• A ROIS construction subsystem 3320, configured to construct, for each subject and for each acquisition time associated therewith, display data, based on the subject's image data and the relative FOVs of the acquired images therein;

• A ROIS display subsystem 3330, configured to controllably display subjects' display data;

• A ROIS analysis engine 3340, configured to automatically (e.g., AI based) analyze the display data and/or the image data for automatic inspection of subjects' oral cavities;

• A ROIS feedback subsystem 3350, configured to enable inspectors to generate, input and transmit inspection related feedback to subjects;

• A display device 3360, configured at least for visual display and optionally also for user input (e.g., by using a touch screen);

• A storage unit 3370, configured to retrievably store data such as display data of subjects, image data, FOVs data, instructions, rules, programs and/or commands such as control programs, GUI programs, analysis programs and/or the like.

[0119] The construction of each display data may be carried out by organizing (e.g., stitching) the acquired images or portions thereof based on their relative FOV to form a panoramic view of the oral cavity of the subject for the respective acquisition time, and/or building a 3D model of the subject's oral cavity of the respective acquisition time, based on image processing of the acquired images received in the image data.

[0120] According to some embodiments, each received display data associated with a specific subject and a specific acquisition time may be controllably displayed by the ROIS display subsystem 3330, using the display device 3360 of the ROIS 3300.

[0121] According to some embodiments, the ROIS display subsystem 3330 may be configured to display display data of subjects, as well as to control each received display data. For example, the ROIS display subsystem 3330 may include or operate a user interface (UI) such as a graphical user interface (GUI), e.g., displayed to users via the display device 3360 and responsive to user input (e.g., using one or more input devices such as a computer mouse, a touch screen, a keyboard and/or the like).

[0122] According to some embodiments, the construction subsystem such as construction subsystems 1300 and/or 3320 may be configured to only receive image data including all the acquired images and their associated acquisition time, and determine FOV of each image, using an image analysis module. The construction subsystem may also be configured to normalize the scaling of content of each image for outputting normalized images, each associated with a relative FOV, and then construct the display data based on the normalized images and their relative FOVs.

[0123] Reference is now made to Fig. 5A, schematically illustrating a possible construction subsystem 5000 configuration, according to some embodiments. The construction subsystem 5000 may include:

[0124] An image analysis module 5010, configured to receive, from a DAS, image data including acquired images of a subject's oral cavity and their associated acquisition time, optionally receiving FOV related data (e.g., accelerometer output data for each image), determine relative FOV of each image and normalize at least some of the acquired images, generating a set of normalized images for the respective subject and acquisition time; and one or more of:

• A Panoramic Construction module 5020, configured to stitch the normalized images and/or portions thereof, according to their relative FOVs to form a panoramic view image as at least part of the display data. The display data may also include location information, for enabling display navigation in the panoramic view image;

• A 3D construction module 5030, configured to construct a 3D model of the respective subject's oral cavity or area(s) thereof, based on the normalized images and their relative FOVs.
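The FOV-based organization performed by a module such as the Panoramic Construction module 5020 may be sketched, under simplifying assumptions (relative FOVs already reduced to 2D pixel offsets, grayscale tiles, overlap resolved by averaging; the function name is illustrative):

```python
import numpy as np

def stitch_by_fov(tiles, offsets):
    """Organize normalized image tiles into a single panoramic array.

    tiles:   list of (h, w) grayscale arrays (already scale-normalized).
    offsets: list of (row, col) canvas positions derived from each
             tile's relative FOV (assumed precomputed upstream).
    """
    h = max(off[0] + t.shape[0] for t, off in zip(tiles, offsets))
    w = max(off[1] + t.shape[1] for t, off in zip(tiles, offsets))
    acc = np.zeros((h, w))
    cnt = np.zeros((h, w))
    for t, (r, c) in zip(tiles, offsets):
        acc[r:r + t.shape[0], c:c + t.shape[1]] += t
        cnt[r:r + t.shape[0], c:c + t.shape[1]] += 1
    return acc / np.maximum(cnt, 1)   # average where tiles overlap
```

A production stitcher would additionally blend seams and warp tiles by their full FOV geometry; the averaging here merely illustrates the organizing-by-relative-FOV principle.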

[0125] According to some embodiments, in order to construct the 3D model of the subject's oral cavity, a modeling program or algorithm may be used that uses images of several FOVs of the same oral cavity area and/or one or more reference points (e.g., markers indicated in the images), to construct a 3D model thereof.
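One standard way to recover a 3D point from images of the same oral cavity area taken at several FOVs is linear (DLT) triangulation. A minimal sketch, assuming known 3x4 projection matrices for two of the views (an assumption for illustration; the disclosure does not specify the modeling algorithm):

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Recover a 3D point from its pixel observations x1, x2 in two
    images with known 3x4 projection matrices P1, P2 (linear DLT)."""
    # Each observation contributes two homogeneous linear constraints.
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)   # null-space vector = homogeneous point
    X = vt[-1]
    return X[:3] / X[3]           # dehomogenize
```

Repeating this over many matched points across the normalized images would yield the point cloud underlying a 3D model of the oral cavity area.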

[0126] According to some embodiments, the display data may be displayed alongside a model atlas and/or a picture atlas. The model atlas may be a 3D or 2D illustration of a general oral cavity, where the oral cavity area currently displayed in the display data is indicated over the illustrated oral cavity, the regional indication over the general oral cavity illustration changing, through the display data viewing navigation, in accordance with the currently displayed oral cavity area. The picture atlas may show the actual acquired images associated with the respective displayed oral cavity area.

[0127] Fig. 5B schematically illustrates a positioning subsystem 6000, according to some embodiments. The positioning subsystem 6000, which may be embedded in a DAS, may include: a FOV module 6010, configured to determine the relative FOV of each acquired image, e.g., based on image analysis and/or on FOV related data arriving from the CaCD, which operates the DAS that includes the positioning subsystem 6000; and a mitigation module 6011, configured to mitigate degradation of image quality and/or of relative FOV determination for each acquired image, which may be caused by movements of the subject, reflex movements of oral cavity tissue and/or camera movements, e.g., by using optical flow detection, which detects patterns of relative movement between the at least one image acquisition device and one or more reference locations in the subject's oral cavity.
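A simple mitigation step consistent with the optical-flow-based approach above could discard frames whose average flow magnitude indicates excessive subject, tissue or camera movement. The threshold value and frame representation below are illustrative assumptions only:

```python
import numpy as np

def filter_unstable_frames(frames, flows, max_mean_flow=2.0):
    """Drop frames whose average optical flow magnitude suggests the
    camera or tissue moved too much for a reliable FOV estimate.

    frames: list of acquired frames (any representation).
    flows:  list of (N, 2) flow-vector arrays, one per frame.
    """
    keep = []
    for frame, flow in zip(frames, flows):
        if np.linalg.norm(flow, axis=1).mean() <= max_mean_flow:
            keep.append(frame)
    return keep
```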

[0128] According to some embodiments the mitigation may be carried out by using optical flow techniques.

[0129] Fig. 5C schematically illustrates a ROIS display subsystem 7000, according to some embodiments. The ROIS display subsystem 7000 may include a UI 7010 such as a GUI, and a display control module 7011 operable via the UI 7010.

[0130] According to some embodiments, the control module 7011 may be configured to receive operational input from users via the UI 7010.

[0131] According to some embodiments, the UI 7010 may include one or more of the following display control and graphical tools:

(i) graphical tools e.g., for allowing the user to mark suspected abnormal oral cavity areas;

(ii) comparison tools for displaying display data of the same subject from various acquisition times and/or for automatic comparison and display of changes between areas in the oral cavity of the respective subject over time;

(iii) display data orientation control tools (e.g., for rotating the display data or parts thereof);

(iv) display data zooming control tools;

(v) image quality control tools (e.g., sharpness, color, brightness and/or contrast control tools);

(vi) virtual topography control tools (showing the oral cavity topography);

(vii) coloration control tools (enabling changing coloration of the display data and/or color suspected abnormal features, objects and/or areas);

(viii) image editing tools (copy/paste, image/model cuts, cross sectional viewing etc.);

(ix) inspection opinion drafting and communication tool(s), configured to allow a medical expert user to draft a written medical opinion based on the displayed display data of the respective subject, and send the written medical opinion to the subject;

(x) comparison tools, such as tools enabling the user to place display data images (e.g., panoramic oral cavity images or 3D models) taken at different acquisition times (dates) in an overlapping manner, to view changes over time in the oral cavity or areas thereof (e.g., for tracking healing and/or disease progression over time); and/or

(xi) navigation tools, for enabling a user to navigate through the display data.

[0132] Fig. 6A shows a sequence of acquired images of an oral cavity of a subject. Fig. 6B shows a 3D model associated with the images shown in Fig. 6A. According to some embodiments the 3D model may be displayed alongside with a picture atlas indicating all or some of the images taken to form the displayed 3D model area.

[0133] Figures 7A and 7B illustrate how a user can navigate through oral atlas display data, using a general oral cavity atlas illustration 901, where the exact location of the explored area displayed to the user is indicated in the model atlas illustration 901 (see regional marking 902). The model atlas may also provide an orientational indication 903.

[0134] According to some embodiments, as illustrated in Fig. 7A, the display data includes a regional image of one or more oral cavity regions of the subject, such as regional image 910 showing an oral cavity region marked by regional marking 902 in Fig. 7B, e.g., taken from the acquired images, and a set of zoomed-in and/or higher resolution other images 910a, 910b, 910c and 910d showing specific objects from the regional image. Navigation buttons 911a, 911b, 911c and 911d may be provided, e.g., for Rx, Ry and Rz rotation, where each button rotates the view of the respective selected image 910/910a/910b/910c/910d in a certain rotation direction by a predefined rotation angle Δθ; or for lateral movement, where each button moves the respective selected image 910/910a/910b/910c/910d laterally at a predefined lateral rate ΔL along the selected X or Y axis (e.g., where navigation buttons 911a and 911b are used for lateral left or right movement along the x axis, and navigation buttons 911c and 911d are used for up and down movement along the y axis).

[0135] Fig. 8 shows a flowchart, illustrating an image acquisition and oral cavity inspection process, according to some embodiments. The process may include:

[0136] Acquiring images of a subject's oral cavity and/or one or more area thereof (e.g., using one or more image acquisition devices) (block 51);

[0137] In an embodiment, the process may further include recording acquisition time (e.g., date), for the respective acquisition session (block 52);

[0138] In an embodiment, the process may further include determining relative FOV for each acquired image (block 53), e.g., by using image processing of the acquired images and/or by using FOV related data received, for example, from a device such as a CaCD configured to sense positioning and/or orientation of the one or more image acquisition devices and output FOV related data;

[0139] In an embodiment, the process may further include receiving the acquired images, their relative FOVs and the acquisition time (block 54), e.g., using a construction subsystem operable via a ROIS, remotely located from the one or more image acquisition devices, at a DAS operable via a CaCD and/or at a separate CaCD operating the construction subsystem;

[0140] In an embodiment, the process may further include constructing display data (e.g., in a form of a display data file), based at least on the acquired images and their relative FOVs (block 55), for the specific respective subject and the specific acquisition time, using the construction subsystem;

[0141] In an embodiment, the process may further include controllably displaying the constructed display data (block 56), e.g., using one or more display devices of the ROIS;

[0142] In an embodiment, the process may further include generating medical inspection data (block 57), e.g., by having one or more inspectors inspect the display data and input an inspection opinion, via a designated UI of the ROIS, and/or by automatic inspection based on image analysis of the display data and/or of the acquired images.

[0143] In an embodiment, the process may further include transmitting the medical inspection data to the subject and/or to a caretaker thereof (block 58).

[0144] According to some embodiments, the oral cavity inspection process may further include mitigating one or more of the acquired images, before constructing of the display data based thereon, e.g., by using one or more image filtering tools or models.

[0145] Fig. 9 shows a flowchart, illustrating an image acquisition and oral cavity inspection process involving automatic analysis of the acquired images, according to some embodiments. The process may include:

[0146] In an embodiment, the process may include acquiring images of a subject's oral cavity and/or one or more area thereof (block 61) (e.g., using one or more image acquisition devices);

[0147] In an embodiment, the process may further include recording acquisition time (e.g., date), for the respective acquisition session (block 62);

[0148] In an embodiment, the process may further include determining relative FOV for each acquired image (block 63), e.g., by using image processing of the acquired images and/or by using FOV related data received, for example, from a device such as a CaCD configured to sense positioning and/or orientation of the one or more image acquisition devices and output FOV related data;

[0149] In an embodiment, the process may further include receiving the acquired images, their relative FOVs and the acquisition time (block 64); e.g., using a construction subsystem operable via a ROIS, remotely located from the one or more image acquisition devices, at a DAS operable via a CaCD and/or at a separate CaCD operating the construction subsystem;

[0150] In an embodiment, the process may further include constructing display data (e.g., in a form of a display data file), based at least on the acquired images and their relative FOVs (block 65), for the specific respective subject and the specific acquisition time, using the construction subsystem;

[0151] In an embodiment, the process may further include analyzing the acquired images and/or the constructed display data of the respective subject and acquisition time (block 66), e.g., for detecting oral cavity medical abnormalities and their related one or more characteristics;

[0152] In an embodiment, the process may further include generating medical inspection data, based on the analysis results (block 67);

[0153] In an embodiment, the process may further include controllably displaying the display data and/or the detected medical abnormalities and/or their related characteristics (block 68), e.g., to one or more inspectors and/or to the subject or a caretaker thereof (optionally enabling the inspector(s) to change or edit the automatically generated inspection data and/or input additional inspection information, forming combined expert and automatic medical inspection data);

[0154] In an embodiment, the process may further include transmitting the medical inspection data and/or the display data to the respective subject and/or a caretaker thereof (block 69).

[0155] Fig. 10 shows a flowchart, illustrating a process for constructing a panoramic view image display data for a subject, according to some embodiments. The panoramic view construction process may be carried out using a construction subsystem as described above. The display data construction process may include:

[0156] In an embodiment, the construction process may include receiving acquired images of a subject's oral cavity or area(s) thereof and optionally also their respective FOVs data and acquisition time information (block 71);

[0157] In an embodiment, the construction process may further include determining, for each acquired image, its relative FOV (block 72), e.g., based on image processing of at least the respective acquired image and one or more other acquired images, and/or based on the received FOV data;

[0158] In an embodiment, the construction process may further include normalizing the acquired images, such that content thereof may be scaled in a coherent, unified scaling (block 73), based on analysis of the acquired images and their relative FOVs (e.g., by comparing images of the same objects in the oral cavity and their proportions and/or geometry in each acquired image and then scaling them to the same proportions), outputting corresponding normalized images;
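The scaling step of block 73 may be sketched as follows, assuming a reference object (e.g., a marker or a tooth) whose pixel length has been measured in each image; nearest-neighbour resampling is a simplification chosen to keep the sketch dependency-free, and the function name is illustrative:

```python
import numpy as np

def normalize_scale(img, ref_len_px, target_len_px):
    """Rescale an image so a reference object measured at ref_len_px
    pixels ends up at target_len_px pixels, unifying scale across a
    set of acquired images (nearest-neighbour resampling)."""
    s = target_len_px / ref_len_px
    h, w = img.shape[:2]
    nh, nw = max(1, round(h * s)), max(1, round(w * s))
    rows = (np.arange(nh) / s).astype(int).clip(0, h - 1)
    cols = (np.arange(nw) / s).astype(int).clip(0, w - 1)
    return img[np.ix_(rows, cols)]
```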

[0159] In an embodiment, the construction process may further include constructing at least one panoramic oral cavity image as the respective display data (block 74), e.g., by organizing (stitching) one or more of the normalized images and/or portions thereof to one another, according to their relative FOVs;

[0160] In an embodiment, the construction process may further include outputting the constructed at least one panoramic oral cavity image display data (block 75).

[0161] According to some embodiments, the outputted display data may be controllably displayable in a manner that allows viewing the subject's entire oral cavity area and zooming in to view specific details in the respective oral cavity area, and/or navigating through different oral cavity displayed areas, e.g., using one or more display control tools such as zooming tools, navigation tools, etc.

[0162] According to some embodiments, the normalization may also be based on images of physical markers photographed during the image acquisition, e.g., for scaling objects in each acquired image.

[0163] According to some embodiments, the normalization of the acquired images may also include coloration normalization. Since each image may be exposed to different illumination conditions, e.g., due to changes in shading from oral cavity objects, changes in external illumination, etc., the coloration of corresponding oral cavity objects and/or areas may differ and therefore require coloration normalization. The color normalization may be carried out, for example, using histogram matching, with the color of one or more automatically or manually selected oral cavity objects, such as a tooth image in one of the acquired images, serving as a reference coloration/illumination value.
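Classic histogram matching, as mentioned above, can be sketched for a single channel as follows (the quantile-mapping formulation is standard; the function name is illustrative, and a color image would be processed per channel):

```python
import numpy as np

def match_histogram(source, reference):
    """Map the intensities of `source` so its cumulative histogram
    matches that of `reference` (single channel, CDF matching)."""
    s_vals, s_idx, s_cnt = np.unique(source.ravel(),
                                     return_inverse=True, return_counts=True)
    r_vals, r_cnt = np.unique(reference.ravel(), return_counts=True)
    s_cdf = np.cumsum(s_cnt) / source.size      # source quantiles
    r_cdf = np.cumsum(r_cnt) / reference.size   # reference quantiles
    mapped = np.interp(s_cdf, r_cdf, r_vals)    # quantile-to-quantile lookup
    return mapped[s_idx].reshape(source.shape)
```

Here `reference` could be a patch of the selected reference object (e.g., the tooth image mentioned above) rather than a whole image.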

[0164] Fig. 11 shows a flowchart, illustrating a process for constructing a 3D model display data for a subject, according to some embodiments. The 3D model construction process may be carried out using a construction subsystem as described above. The display data construction process may include:

[0165] In an embodiment, the construction process may include receiving acquired images of a subject's oral cavity or area(s) thereof and optionally also their respective FOVs data and acquisition time information (block 81);

[0166] In an embodiment, the construction process may further include determining, for each acquired image, its relative FOV (block 82), e.g., based on image processing of at least the respective acquired image and one or more other acquired images, and/or based on the received FOV data;

[0167] In an embodiment, the construction process may further include normalizing the acquired images, such that content thereof may be scaled in a coherent, unified scaling and/or colorization (block 83), based on analysis of the acquired images and their relative FOVs (e.g., by comparing images of the same objects in the oral cavity and their proportions and/or geometry in each acquired image and then scaling them to the same proportions), outputting corresponding normalized images;

[0168] In an embodiment, the construction process may further include constructing a 3D model of the subject's oral cavity as the respective display data (block 84), e.g., by using several FOVs of each oral cavity object taken from different normalized images, and/or by using physical markers as perspective scaling and reference points; and

[0169] In an embodiment, the construction process may further include outputting the constructed 3D model display data (block 85).

[0170] According to some embodiments, the outputted 3D model display data may be controllably displayable in a manner that allows various display control options (e.g., using corresponding display control tools) such as rotating and moving of the 3D model, selecting viewing of specific 3D objects or areas thereof, zooming options, coloration options, cross section slicing options, 3D modeling of a specific zoomed-in area, etc.

[0171] Fig. 12 schematically illustrates a system 8000 for oral cavity inspection, having multiple ROISs, for enabling multiple clinics or experts to view oral cavity display data of multiple subjects, according to some embodiments. The system 8000 may include ROISs 8301, 8302 and 8303, each configured to receive image and/or display data from multiple DASs such as DASs 8201, 8202 and 8203, each DAS being operable via a respective CaCD such as CaCDs 801, 802 and 803 respectively.

[0172] According to some embodiments each ROIS 8301/8302/8303 of the system 8000 may be configured to, for each subject: receive subject's acquired images, their associated acquisition time and optionally their related FOVs data, and construct, based thereon, a respective display data for the respective acquisition time and subject; and display the constructed display data via one or more display devices of the respective ROIS 8301/8302/8303.

[0173] According to some embodiments the construction of the display data may be carried out at the DAS of the respective subject or at a separate CaCD located remotely from the respective ROIS and optionally also remotely from the respective DAS.

[0174] Aspects of disclosed embodiments pertain to systems and methods for internal inspection of subjects' internal bodily areas (e.g., internal cavities such as abdominal, oral or intestinal cavities), by acquiring images of a subject's internal area, constructing display data based on the acquired images and displaying thereof for remote inspection of the respective internal area of the subject.

[0175] According to some embodiments, the systems may each include: at least one image acquisition device, configured to acquire images of different internal areas of a subject; a positioning subsystem, configured to determine a relative field of view (FOV) of the at least one image acquisition device, for each acquired image; a construction subsystem configured to generate, based on the acquired images of the respective subject, display data for visually representing one or more internal areas of the respective subject, wherein the construction of the display data is carried out at least by organizing at least some of the acquired images, and/or portions thereof, according to their relative FOVs; and a remote inspection subsystem configured to controllably display the constructed display data, wherein the remote inspection subsystem is remotely located from the at least one image acquisition device, for remote medical inspection of the internal areas of the respective subject.

[0176] Fig. 13 schematically illustrates a system 9000 for medical internal inspection of subjects, according to some embodiments.

[0177] The system 9000 may include an image acquisition subsystem 9100, having one or more image acquisition devices such as a first image acquisition device 9101 and a second image acquisition device 9102, each configured to acquire images of a subject's internal area; a DAS 9200, which may be operable via a CaCD 500, the DAS 9200 being configured to receive acquired images from the image acquisition subsystem 9100, and determine the relative positioning of the image acquisition device being used for each acquired image; a construction subsystem 9300, configured to construct display data of the respective subject's internal area, based on the acquired images and on the relative positioning of each image; and a ROIS 9400, configured to receive the display data of each subject and controllably display thereof.

[0178] Any one or more of the first or second image acquisition devices 9101 and/or 9102 may include: a camera (video and/or still), an endoscopic device having a camera or imagery device embedded therein, or an imagery device such as a computerized tomography (CT) machine, an X-Ray machine/device, an isotopic tomography (IT) machine, an ultrasound device, etc.

[0179] According to some embodiments, the image acquisition subsystem 9100 may further include one or more light sources for illuminating the inspected internal area of the subject.

[0180] According to some embodiments of the systems and methods for internal and/or oral cavity inspection, described above, to determine the relative positioning of the image acquisition device for each acquired image, one or more of the following techniques and/or devices may be used:

• Positioning sensors (such as accelerometer, gyroscope, etc.);

• Image analysis program(s);

• Additional image acquisition device (e.g., additional camera) dedicated for acquiring images of the first image acquisition device for positioning thereof, e.g., based on image analysis of the images acquired by the second image acquisition device;

• Markings placed over the image acquisition device that acquires images of the subject's internal/oral cavity;

• And/or the like.

[0181] Fig. 14 shows a system 4000 for oral inspection, using a removable rigid connector 4101 for rigidly and removably connecting an image acquisition device 4100 (e.g., a camera) of the system 4000 to a mobile device 600 having a designated application (DAS) 4200 operable thereby, according to some embodiments.

[0182] The rigid connection between the mobile device 600 and the image acquisition device 4100 allows using positioning and/or orientation sensor(s) and/or program(s) embedded and/or operable via the mobile device 600 for determining relative FOV of each acquired image, e.g., by having the mobile device 600 rotate/move in coordination with movements of the image acquisition device 4100, since it is rigidly connected thereto via connector 4101.

[0183] According to some embodiments, the designated application 4200 may be configured to retrieve orientation information (e.g., orientation sensor(s) data), and determine, for each acquired image, its relative FOV based on the retrieved orientation information, indicative of the orientation of the image acquisition device at the time the respective image was acquired. This process may be carried out in real time (RT) or near RT.

[0184] According to some embodiments, the acquired images and their determined relative FOVs may be retrieved or received from the designated application 4200, by a construction subsystem 4300 of the system 4000, for constructing display data based on the received/retrieved acquired images and their relative FOVs.
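A simplified model of deriving relative FOV from retrieved orientation information, assuming the sensor reports yaw and pitch angles in radians and representing each image's FOV by a unit viewing direction (both are illustrative assumptions; real orientation sensors typically report full quaternions), might look like:

```python
import numpy as np

def fov_direction(yaw, pitch):
    """Unit viewing-direction vector for a camera whose orientation
    sensor reports yaw (about the vertical axis) and pitch, in radians."""
    return np.array([np.cos(pitch) * np.cos(yaw),
                     np.cos(pitch) * np.sin(yaw),
                     np.sin(pitch)])

def relative_fov_angle(yaw1, pitch1, yaw2, pitch2):
    """Angle between the viewing directions of two acquired images,
    usable as a simple relative-FOV measure between them."""
    d1 = fov_direction(yaw1, pitch1)
    d2 = fov_direction(yaw2, pitch2)
    return np.arccos(np.clip(np.dot(d1, d2), -1.0, 1.0))
```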

[0185] According to some embodiments, the constructed display data may then be received and controllably displayed by a ROIS 4400, remotely located from the image acquisition device 4100 and from the mobile device 600.

[0186] According to some embodiments, the connector 4101 may include one or more fasteners for attaching to the image acquisition device 4100 at one end and to the mobile device 600 at another end thereof. One or more of these fasteners may be configured for removable attachment thereof.

[0187] Figures 15A and 15B show a connector 5000 for removably and rigidly connecting a mobile device 510 and a camera 500 to each other, for using the mobile device's 510 orientation and/or movement sensor(s) for determining the FOV of images acquired by the camera 500, according to some embodiments.

[0188] The connector 5000 may include a rigid connecting element 5001 such as a rod; a first connecting element 5110, which may be an element adhered or welded to one end surface of the camera 500 at a first end thereof and to one end of the connecting element 5001 at a second, opposite end thereof; and a second connecting element 5120, such as a clip fastener, configured to removably connect to the mobile device 510.

[0189] In some embodiments, the present disclosure provides for an intra-oral dental probe imaging system, which includes a probe comprising a probe housing configured for intra-oral insertion for imaging an oral cavity of interest; an image acquisition device; a light source generating light rays for illumination of the oral cavity of interest; and a reflection mirror that directs the light rays from the light source through a light transmission path to a selected region of interest (ROI) within the oral cavity, and for directing the reflected light from the ROI through an optical path leading to the image acquisition device.

[0190] In some embodiments, the present disclosure provides for a scanning means for translating the reflection mirror in one or more translation directions, e.g., one or more degrees of freedom. In some embodiments, the scanning means may comprise one or more linear, rotary, and/or tilt actuators configured to move the mirror, e.g., axially along one or more axes of movement; rotationally, e.g., about one or more axes of rotation; and/or tiltably, e.g., about one or more axes of tilt. According to such a configuration, after the probe is positioned for imaging an oral cavity of interest, the reflection mirror may be positioned at an initial predetermined position by the scanning means, wherein the predetermined position is configured to obtain an image of a region of interest (ROI) within the oral cavity, and wherein the ROI is associated with a specified FOV. In the initial predetermined position, the light passes from the illumination source through any light transmission path, to reach the ROI within the oral cavity, and is reflected by the ROI. Then, the reflected light is directed by the reflection mirror through an optical path and reaches the imaging device, which captures an image of the ROI. In some embodiments, the scanning means then provides for translating the reflection mirror in one or more translation directions to a subsequent predetermined position associated with a subsequent FOV, to similarly illuminate a subsequent ROI within the oral cavity, wherein the reflected light from the subsequent ROI is similarly directed by an optical path to reach the imaging optical system, which captures an image of the subsequent ROI. Thus, after multiple iterations of these steps, a sequence of images depicting a series of ROIs within the oral cavity may be captured, wherein the regions of interest may be determined to cover the entire oral cavity or a specified portion thereof, and may be contiguous, adjoining, overlapping, or any combination thereof.
Finally, the sequence of images may be combined to generate a combined view of the oral cavity or a specified portion thereof.
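The iterative scan described above may be sketched as follows; `MirrorStage` and the `capture` callable are hypothetical stand-ins for the scanning means and image acquisition device, not actual components of the disclosed probe:

```python
class MirrorStage:
    """Hypothetical wrapper around the scanning means: iterates the
    reflection mirror over its predetermined positions."""
    def __init__(self, positions):
        self.positions = positions      # predetermined mirror positions

    def sweep(self):
        for pos in self.positions:
            # In hardware, the actuators would translate/rotate/tilt
            # the mirror to `pos` here before the next capture.
            yield pos

def scan_oral_cavity(mirror, capture):
    """Move the mirror through its predetermined positions and capture
    one ROI image per position; returns (position, image) pairs that a
    construction subsystem could later combine into a single view."""
    return [(pos, capture(pos)) for pos in mirror.sweep()]
```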

[0191] In some embodiments, a control and processing module of the present system may be configured to acquire and/or receive the visual images of the oral cavity at an acquisition session, wherein the acquisition session is associated with specific acquisition parameters such as acquisition date, time, duration, number of images, total area coverage, FOV of each image, etc. In some embodiments, an FOV associated with each image may comprise data including, for example, positional information relating to the imaged ROI, e.g., image acquisition angle, distance from the subject, and the like, for providing a user, for example, with a positional map of the displayed images, along with clinical, anatomical and/or other information related to the display data.

[0192] In some embodiments, the control and processing module may be configured to output image data including the acquired images, FOVs thereof and, optionally, the associated timing data. In some embodiments, the control and processing module of the present system may be configured to construct display data for the given image data. For example, the control and processing module may combine or stitch together some or all of the series of images acquired during an acquisition session, to produce a combined view of the oral cavity or a specified portion thereof.
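The stitching step may be illustrated by a minimal mosaic sketch, in which each image tile carries a top-left offset derived from its FOV and tiles are placed onto a shared canvas. Real stitching would register and blend overlapping regions; here, for simplicity, later tiles overwrite earlier ones. The dict layout is an assumption of this sketch.

```python
def stitch(tiles):
    """Combine image tiles into one mosaic using their FOV-derived offsets.

    Each tile is a dict with a 2D pixel list under "pixels" and a
    top-left offset (row, col) under "offset". Overlaps are resolved by
    letting later tiles win; a production stitcher would instead blend
    and register the overlap.
    """
    h = max(t["offset"][0] + len(t["pixels"]) for t in tiles)
    w = max(t["offset"][1] + len(t["pixels"][0]) for t in tiles)
    canvas = [[0] * w for _ in range(h)]
    for t in tiles:
        r0, c0 = t["offset"]
        for r, row in enumerate(t["pixels"]):
            for c, px in enumerate(row):
                canvas[r0 + r][c0 + c] = px
    return canvas
```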

[0193] According to some embodiments, the control and processing module may be configured to determine an absolute FOV associated with each acquired image in relation to the oral cavity of interest, and/or a relative FOV of each acquired image in relation to one or more other acquired images within the sequence of images. In some embodiments, a system of the present disclosure may comprise for this purpose one or more FOV determination modules, such as software-, hardware-, and/or sensor-based modules. For example, a software and/or hardware based module may be configured to determine an absolute FOV of each image, and by extension, an associated ROI within the oral cavity of interest, based, at least in part, on the predetermined position of the reflection mirror associated with each of the acquired images and a known absolute spatial position of the present probe. In some embodiments, determining a relative FOV of each image may be further based, at least in part, on using image processing or image analysis programs and/or circuitry to determine the relative FOV of each image with respect to one or more other acquired images. Additional or alternative ways to determine absolute and/or relative FOV of each image may include using one or more sensors attached to the image acquisition device (e.g., camera) and/or to devices in which the image acquisition device is embedded and/or devices connecting the image acquisition device to another computation and communication device, such as one or more gyroscopes, accelerometers, etc.
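The mirror-position-based FOV determination may be sketched geometrically, under the simplifying assumption of a mirror at 45 degrees to the probe's longitudinal axis (so the optical axis is deflected perpendicular to the probe at an azimuth equal to the mirror's rotation angle). All names and the working-distance parameter are illustrative assumptions.

```python
import math


def absolute_fov_center(probe_pos, mirror_angle_deg, standoff_mm):
    """Estimate the center of the ROI seen through the mirror.

    Assumes a 45-degree mirror, so the viewing direction is perpendicular
    to the probe axis at an azimuth equal to the mirror rotation angle.
    probe_pos is the known absolute (x, y, z) of the mirror pivot;
    standoff_mm is the assumed working distance to the tissue.
    """
    az = math.radians(mirror_angle_deg)
    x, y, z = probe_pos
    return (x + standoff_mm * math.cos(az),
            y + standoff_mm * math.sin(az),
            z)


def relative_fov(center_a, center_b):
    """Relative FOV of image B with respect to image A, expressed as the
    displacement between their ROI centers."""
    return tuple(b - a for a, b in zip(center_a, center_b))
```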

[0194] According to some embodiments, the construction of a combined display image comprising some or all of the series of images acquired during an acquisition session, may be based, at least in part, on organizing the series of acquired images according to their determined FOVs, e.g., to form a panoramic view display data and/or to form a 3D model of the oral cavity of interest or any portion thereof.

[0195] Fig. 16A shows an exemplary dental probe 6600 for intra-oral inspection, according to some embodiments. In some embodiments, dental probe 6600 comprises housing 6610, comprising an imaging device 6620 and an optional objective lens 6630. In some embodiments, illumination source 6670, e.g., an LED or similar module, is configured to project illumination light to the reflection mirror, to illuminate an intra-oral ROI, e.g., one or more oral cavities. In some embodiments, housing 6610 comprises a scanning module 6640 comprising a mirror extension support 6650 and a reflection mirror 6660 coupled to the mirror extension support 6650.

[0196] In some embodiments, dental probe 6600 may be coupled remotely in data communication with, e.g., a control and processing unit 6690. In some embodiments, control and processing unit 6690 may comprise one or more of an image acquisition module 6691, an image processing module 6692, an imaging positioning module 6693, and/or a communication module 6694. In some embodiments, control and processing unit 6690 may be configured to control operation of dental probe 6600, including with respect to image acquisition, image data processing, and/or determining a positioning of dental probe 6600 relative to an ROI. In some embodiments, control and processing unit 6690 may provide for remote operation of dental probe 6600, e.g., by a dental practitioner.

[0197] In some embodiments, imaging device 6620 may include one or more of: a camera, a three-dimensional (3D) sensor, a charge-coupled device (CCD) camera, a narrow spectral optical detector, a complementary metal-oxide-semiconductor (CMOS) camera, a hybrid CCD-CMOS camera, a wideband spectral optical detector, a wide FOV camera, a telephoto camera, a macro camera, a digital camera, a range sensor, a periscope device having a camera and reflective surfaces, a binocular camera, and/or the like. The type of the one or more image acquisition devices may be adapted to the wavelength band(s) emitted by the one or more light sources.

[0198] In some embodiments, scanning module 6640 may be configured to position reflection mirror 6660 in one or more desired positions and/or a predetermined sequence of desired positions in relation to an oral cavity, so as to enable scanning of a specified portion of an oral cavity, e.g., by acquiring a series of images of one or more specified ROIs within the oral cavities.

[0199] In some embodiments, scanning module 6640 may comprise any one or more linear, rotary, and/or tilt actuators configured to translate reflection mirror 6660, e.g., axially along one or more axes of movement; rotationally, e.g., about one or more axes of rotation; and/or tiltably, e.g., about one or more axes of tilt. For example, with reference to Fig. 16A, scanning module 6640 may be configured to move reflection mirror 6660 along longitudinal axis A-A of housing 6610, and/or rotate reflection mirror 6660 about longitudinal axis A-A. In some embodiments, rotating reflection mirror 6660 about longitudinal axis A-A may be achieved, e.g., by rotating mirror extension support 6650 about housing 6610. In some embodiments, scanning module 6640 may be configured to operate mirror extension support 6650 to modify a tilt angle α between a plane C-C defined by reflection mirror 6660 and a transverse orthogonal plane B-B.
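The scanning module's degrees of freedom may be enumerated, for illustration, as a sampling grid over axial travel, rotation about axis A-A, and tilt angle. The sampling values and names are hypothetical choices for this sketch.

```python
from itertools import product


def scan_positions(axial_mm, rotations_deg, tilts_deg):
    """Enumerate mirror positions over the probe's degrees of freedom:
    axial travel along axis A-A, rotation about A-A, and tilt angle
    between the mirror plane and the transverse plane. Each input is a
    sequence of sample values; the Cartesian product gives one scan
    position per combination."""
    return [
        {"axial_mm": a, "rotation_deg": r, "tilt_deg": t}
        for a, r, t in product(axial_mm, rotations_deg, tilts_deg)
    ]
```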

[0200] Thus, for example, dental probe 6600 may be used for acquiring a series of images of an oral cavity of interest during an acquisition session to scan a desired portion of the oral cavity. In some embodiments, the acquisition session comprises a series or sequence of images of a sequence of ROIs within the oral cavity, e.g., contiguous, adjoining, and/or overlapping ROIs. In some embodiments, an acquisition session may be associated with specific acquisition parameters, and specifically, an FOV of each image. In some embodiments, control and processing unit 6690 may be configured to operate scanning module 6640 of dental probe 6600 to translate reflection mirror 6660 according to a desired FOV associated with each image in the series, using e.g., an imaging positioning module 6693 of control and processing unit 6690. In some embodiments, control and processing unit 6690 may be configured to determine an absolute FOV associated with each acquired image in relation to the oral cavity of interest, and/or a relative FOV of each acquired image in relation to one or more other acquired images within the sequence of images. In some embodiments, control and processing unit 6690 may be configured to determine an absolute FOV of each image, and by extension, an associated ROI within the oral cavity of interest, based, at least in part, on the known position of reflection mirror 6660 associated with each of the acquired images, and a known absolute spatial position of the dental probe 6600. In some embodiments, determining a relative FOV of each image may be further based, at least in part, on using image processing or image analysis programs and or circuitry to determine relative FOV of each image in respect to one or more other acquired images. 
Additional or alternative ways to determine absolute and/or relative FOV of each image may include using one or more sensors attached to the image acquisition device (e.g., camera) and/or to devices in which the image acquisition device is embedded and/or devices connecting the image acquisition device to another computation and communication device, such as one or more gyroscopes, accelerometers, etc.

[0201] In some embodiments, control and processing unit 6690 may be configured to retrieve orientation information (e.g., orientation sensor(s) data), and determine, for each acquired image, its relative FOV based on the retrieved orientation information, indicative of the orientation of the image acquisition device at the time the respective image was acquired. This process may be carried out in real time or near real time. According to some embodiments, the acquired images and their determined relative FOVs may be retrieved or received from control and processing unit 6690, and the display data may be constructed based on the received/retrieved acquired images and their relative FOVs.
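The orientation-based determination of relative FOVs may be sketched as follows: each image's relative FOV is expressed as its angular offset from a reference image, computed from the device orientation recorded at capture time. The (yaw, pitch, roll) representation is a simplifying assumption of this sketch.

```python
def relative_fov_from_orientation(orientations, ref_index=0):
    """Derive a relative FOV for each image from orientation sensor data.

    orientations maps image index -> (yaw_deg, pitch_deg, roll_deg) of the
    image acquisition device at capture time. The relative FOV of each
    image is expressed here as its angular offset from a reference image.
    """
    ref = orientations[ref_index]
    return {
        i: tuple(o_k - r_k for o_k, r_k in zip(o, ref))
        for i, o in orientations.items()
    }
```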

[0202] In some embodiments, an imaging positioning module 6693 of control and processing unit 6690 may be configured to operate dental probe 6600 to provide for translating a position and/or tilt angle and/or orientation of reflection mirror 6660 in one or more degrees of freedom. For example, scanning module 6640 may be configured to move reflection mirror 6660 axially, along longitudinal axis A-A, and/or rotate reflection mirror 6660 about longitudinal axis A-A of housing 6610. In some embodiments, rotating reflection mirror 6660 about longitudinal axis A-A may be achieved, e.g., by rotating mirror extension support 6650 about housing 6610. In some embodiments, scanning module 6640 may be configured, e.g., to move reflection mirror 6660 axially along longitudinal axis A-A while simultaneously rotating reflection mirror 6660 about axis A-A, such that reflection mirror 6660 is advanced or retracted axially and rotationally in relation to a distal face 6612 of housing 6610, to define a spiral or corkscrew path of motion. In some embodiments, scanning module 6640 may be configured to operate mirror extension support 6650 to translate reflection mirror 6660 to modify an orientation of a plane defined by reflection mirror 6660 relative to a plane defined by distal face 6612 of housing 6610. For example, in some embodiments, scanning module 6640 may be configured to operate mirror extension support 6650 to modify a tilt angle α between a plane defined by reflection mirror 6660 and a plane defined by distal face 6612 of housing 6610. In some embodiments, tilt angle α may be within the range of 20-60 degrees, e.g., 20, 25, 30, 35, 40, 45, 50, 55, or 60 degrees.
In some embodiments, scanning module 6640 may be configured to translate reflection mirror 6660 in any combination of two or more degrees of freedom, e.g., axial movement substantially along a longitudinal axis of housing 6610, rotational movement about a longitudinal axis of housing 6610, and/or adjustment of an orientation of a plane defined by reflection mirror 6660 in relation to a plane defined by distal face 6612 of housing 6610.
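The spiral or corkscrew path of motion described above can be illustrated by a small generator that couples axial advance to rotation about the housing axis. The parameterization (turns, steps per turn, advance per turn) is an assumption made for this sketch.

```python
def corkscrew_path(turns, steps_per_turn, advance_per_turn_mm):
    """Generate a spiral (corkscrew) path of mirror positions: the mirror
    advances axially while rotating about the housing's longitudinal
    axis, producing one sampled position per step."""
    path = []
    total = turns * steps_per_turn
    for i in range(total + 1):
        angle_deg = (360.0 * i / steps_per_turn) % 360.0
        axial_mm = advance_per_turn_mm * i / steps_per_turn
        path.append({"rotation_deg": angle_deg, "axial_mm": axial_mm})
    return path
```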

[0203] Fig. 16B shows an exemplary dental probe 6700 for intra-oral inspection, according to some embodiments. In some embodiments, dental probe 6700 comprises a housing 6710 defining a probe body and a distal probe tip 6715 having a smaller cross-sectional area than the probe body, thus enabling access to smaller oral cavity regions. In some embodiments, at least a portion of distal probe tip 6715 may be optically transparent. In some embodiments, housing 6710 comprises an imaging device 6720 and an optional objective lens 6730. In some embodiments, illumination source 6770, e.g., an LED or similar module, is configured to project illumination light to the reflection mirror, to illuminate an intra-oral ROI, e.g., one or more oral cavities. In some embodiments, housing 6710 comprises a scanning module comprising a reflection mirror 6760 housed internally of distal probe tip 6715.

[0204] In some embodiments, dental probe 6700 may be coupled remotely in data communication with, e.g., a control and processing unit 6790. In some embodiments, control and processing unit 6790 may comprise one or more of an image acquisition module 6791, an image processing module 6792, an imaging positioning module 6793, and/or a communication module 6794. In some embodiments, control and processing unit 6790 may be configured to control operation of dental probe 6700, including with respect to image acquisition, image data processing, and/or determining a positioning of dental probe 6700 relative to an ROI. In some embodiments, control and processing unit 6790 may provide for remote operation of dental probe 6700, e.g., by a dental practitioner.

[0205] In some embodiments, imaging device 6720 may include one or more of: a camera, a three-dimensional (3D) sensor, a charge-coupled device (CCD) camera, a narrow spectral optical detector, a complementary metal-oxide-semiconductor (CMOS) camera, a hybrid CCD-CMOS camera, a wideband spectral optical detector, a wide FOV camera, a telephoto camera, a macro camera, a digital camera, a range sensor, a periscope device having a camera and reflective surfaces, a binocular camera, and/or the like. The type of the one or more image acquisition devices may be adapted to the wavelength band(s) emitted by the one or more light sources.

[0206] In some embodiments, the image positioning module 6793 of control and processing unit 6790 may be configured to position reflection mirror 6760 within distal probe tip 6715 in one or more desired positions and/or a predetermined sequence of desired positions in relation to an oral cavity, so as to enable scanning of a specified portion of an oral cavity, e.g., by acquiring a series of images of one or more specified ROIs within the oral cavities.

[0207] In some embodiments, the scanning module may comprise any one or more linear, rotary, and/or tilt actuators configured to translate reflection mirror 6760, e.g., axially along one or more axes of movement; rotationally, e.g., about one or more axes of rotation; and/or tiltably, e.g., about one or more axes of tilt. For example, with reference to Fig. 16B, the scanning module may be configured to move reflection mirror 6760 along longitudinal axis A-A of housing 6710, and/or rotate reflection mirror 6760 about longitudinal axis A-A. In some embodiments, the scanning module may be configured to modify a tilt angle α between a plane C-C defined by reflection mirror 6760 and a transverse orthogonal plane B-B.

[0208] Thus, for example, dental probe 6700 may be used for acquiring a series of images of an oral cavity of interest during an acquisition session to scan a desired portion of the oral cavity. In some embodiments, the acquisition session comprises a series or sequence of images of a sequence of ROIs within the oral cavity, e.g., contiguous, adjoining, and/or overlapping ROIs. In some embodiments, an acquisition session may be associated with specific acquisition parameters, and specifically, an FOV of each image. In some embodiments, control and processing unit 6790 may be configured to operate dental probe 6700 to translate reflection mirror 6760 according to a desired FOV associated with each image in the series, using, e.g., an imaging positioning module 6793 of control and processing unit 6790. In some embodiments, control and processing unit 6790 may be configured to determine an absolute FOV associated with each acquired image in relation to the oral cavity of interest, and/or a relative FOV of each acquired image in relation to one or more other acquired images within the sequence of images. In some embodiments, control and processing unit 6790 may be configured to determine an absolute FOV of each image, and by extension, an associated ROI within the oral cavity of interest, based, at least in part, on the known position of reflection mirror 6760 associated with each of the acquired images, and a known absolute spatial position of the dental probe 6700. In some embodiments, determining a relative FOV of each image may be further based, at least in part, on using image processing or image analysis programs and/or circuitry to determine the relative FOV of each image with respect to one or more other acquired images.
Additional or alternative ways to determine absolute and/or relative FOV of each image may include using one or more sensors attached to the image acquisition device (e.g., camera) and/or to devices in which the image acquisition device is embedded and/or devices connecting the image acquisition device to another computation and communication device, such as one or more gyroscopes, accelerometers, etc.

[0209] In some embodiments, control and processing unit 6790 may be configured to retrieve orientation information (e.g., orientation sensor(s) data), and determine, for each acquired image, its relative FOV based on the retrieved orientation information, indicative of the orientation of the image acquisition device at the time the respective image was acquired. This process may be carried out in real time or near real time. According to some embodiments, the acquired images and their determined relative FOVs may be retrieved or received from control and processing unit 6790, and the display data may be constructed based on the received/retrieved acquired images and their relative FOVs.

[0210] In some embodiments, an imaging positioning module 6793 of control and processing unit 6790 may be configured to operate dental probe 6700 to provide for translating a position and/or tilt angle and/or orientation of reflection mirror 6760 in one or more degrees of freedom. For example, the imaging positioning module 6793 may be configured to move reflection mirror 6760 axially, along longitudinal axis A-A of housing 6710, and/or rotate reflection mirror 6760 about longitudinal axis A-A. In some embodiments, the imaging positioning module 6793 may be configured, e.g., to move reflection mirror 6760 axially along longitudinal axis A-A while simultaneously rotating reflection mirror 6760 about axis A-A, such that reflection mirror 6760 is advanced or retracted axially and rotationally along axis A-A within distal probe tip 6715, to define a spiral or corkscrew path of motion. In some embodiments, the imaging positioning module 6793 may be configured to translate reflection mirror 6760 to modify an orientation of a plane defined by reflection mirror 6760. For example, in some embodiments, the scanning module may be configured to modify a tilt angle α between a plane defined by reflection mirror 6760 and a transverse orthogonal plane to axis A-A. In some embodiments, tilt angle α may be within the range of 20-60 degrees, e.g., 20, 25, 30, 35, 40, 45, 50, 55, or 60 degrees. In some embodiments, the imaging positioning module 6793 may be configured to translate reflection mirror 6760 in any combination of two or more degrees of freedom, e.g., axial movement substantially along a longitudinal axis of housing 6710, rotational movement about a longitudinal axis of housing 6710, and/or adjustment of an orientation of a plane defined by reflection mirror 6760 in relation to a transverse orthogonal plane to axis A-A.

Additional Examples:

[0211] Example 1 is a system for oral cavity inspection, comprising: at least one image acquisition device, configured to acquire images of different oral cavity areas of a subject; a positioning subsystem, configured to determine a relative field of view (FOV) of the at least one image acquisition device, for each acquired image; a construction subsystem configured to generate, based on the acquired images of the respective subject, display data for visually representing one or more oral cavity areas of the respective subject, wherein the construction of the display data is carried out at least by organizing at least some of the acquired images, and/or portions thereof, according to their relative FOVs; and a remote oral inspection subsystem configured to controllably display the constructed display data, wherein the remote oral inspection subsystem is remotely located from the at least one image acquisition device, for remote medical inspection of the oral cavity areas of the respective subject.

[0212] In example 2, the subject matter of example 1 may include, wherein the organizing of at least some of the acquired images comprises stitching at least some of the acquired images and/or portions thereof to one another according to their respective FOVs to generate a panoramic oral cavity image.

[0213] In example 3, the subject matter of any one or more of examples 1 to 2 may include, wherein the construction subsystem is further configured to: normalize the acquired images to one another, in order to form a coherent geometric and/or coloration scaling of the acquired images, outputting corresponding normalized images, and organize the normalized images, according to their respective FOVs, to construct the display data.
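The coloration normalization of example 3 may be illustrated by a minimal sketch that scales each image's pixel intensities to a common mean brightness; geometric normalization (to a common pixels-per-millimeter scale) would be handled analogously. The grayscale representation and target value are assumptions of this sketch.

```python
def normalize_brightness(images, target_mean=128.0):
    """Normalize acquired images to one another by scaling each image's
    pixel intensities to a common mean brightness, as a simple stand-in
    for the coloration scaling performed by the construction subsystem.
    Images are 2D lists of grayscale values; outputs are clamped to 255.
    """
    out = []
    for img in images:
        flat = [p for row in img for p in row]
        mean = sum(flat) / len(flat)
        gain = target_mean / mean if mean else 1.0
        out.append([[min(255.0, p * gain) for p in row] for row in img])
    return out
```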

[0214] In example 4, the subject matter of any one or more of examples 1 to 3 may include, wherein the display data is descriptive of information about: a panoramic view of the one or more oral cavity areas; a pictures atlas; and/or a three-dimensional (3D) model of the one or more oral cavity areas.

[0215] In example 5, the subject matter of any one or more of examples 1 to 4 may include, wherein the system further comprises at least one light source, for illuminating the oral cavity of the respective subject while acquiring images by the image acquisition device.

[0216] In example 6, the subject matter of example 5 may include, wherein the at least one light source comprises one or more of: a light emitting diode (LED); a flashlight; a white light lamp; an ultraviolet (UV) light source; an array of LED light sources, each LED light source emitting light of a different wavelength band.

[0217] In example 7, the subject matter of any one or more of examples 1 to 6 may include, wherein at least the positioning subsystem is operable via a designated application operable via a mobile communication device, the designated application being configured to receive and transmit one or more of: the display data, the acquired images, and/or the relative FOV of each of the acquired images, to the remote oral inspection subsystem.

[0218] In example 8, the subject matter of any one or more of examples 1 to 7 may include, wherein the at least one image acquisition device and the positioning subsystem are comprised in a mobile communication device, wherein the positioning subsystem is comprised in a designated application operable via the mobile communication device, and wherein the positioning subsystem determines the relative FOV of each acquired image, based on sensor data originating from one or more movement and/or orientation sensors of the mobile communication device.

[0219] In example 9, the subject matter of example 8 may include, wherein the system further comprises at least one connector, configured to removably and rigidly connect the at least one image acquisition device to the mobile device, for using one or more of the mobile device's sensors and/or image processing programs, configured for determining orientation of the mobile device, wherein the relative FOV of each acquired image is determined, based on the orientation of the mobile device to which the at least one image acquisition device connects via the at least one connector.

[0220] In example 10, the subject matter of any one or more of examples 1 to 9 may include, wherein the construction subsystem is comprised in the remote oral inspection subsystem, wherein the acquired images of the respective subject and the relative FOV thereof, are transmitted to the remote oral inspection subsystem and processed by the construction subsystem, to construct their corresponding display data.

[0221] In example 11, the subject matter of example 10 may include, wherein the remote oral inspection subsystem is operable via a computerized data storage, processing, communication, input and display machine.

[0222] In example 12, the subject matter of any one or more of examples 1 to 11 may include, wherein the at least one image acquisition device comprises: a camera, a video camera, a 3D sensor, a wide FOV camera, a telephoto camera, a macro camera, a digital camera, a binocular camera, a range sensor, a periscope device having a camera and reflective surfaces.

[0223] In example 13, the subject matter of any one or more of examples 1 to 12 may include, wherein the remote oral inspection subsystem further comprises an analysis engine, configured to analyze the acquired images and/or display data associated therewith, for determining a medical condition of the subject and, optionally, to output an analysis result.

[0224] In example 14, the subject matter of example 13 may include, wherein the analysis engine comprises one or more analysis programs, configured to detect one or more medical oral cavity abnormalities, based on image analysis of the display data and/or on the acquired images.

[0225] In example 15, the subject matter of example 14 may include, wherein the image analysis-based detection of the one or more medical abnormalities is carried out by detecting oral cavity areas showing one or more of:

• topographic abnormality;

• coloring abnormality;

• morphological abnormality;

• vasculature abnormality; and/or

• changes over time in any one or more of the above oral cavity abnormalities.

[0226] In example 16, the subject matter of example 15 may include, wherein the detectable medical abnormalities are associated with one or more of the following medical oral conditions:

• oral cavity wound and/or lesion;

• oral cavity ulcer;

• oral cavity inflammation;

• dental condition;

• oral cavity benign, malignant and/or premalignant lesion and/or tissue.

[0227] In example 17, the subject matter of any one or more of examples 13 to 16 may include, wherein the analysis engine is further configured to determine physical characteristics of medical abnormalities, and to output inspection data indicative of detected abnormalities and their physical characteristics.

[0228] In example 18, the subject matter of example 17 may include, wherein the physical characteristics, detectable by the analysis engine and indicatable over the display of the display data, comprise, for each or some of the detected abnormalities, one or more of:

• peripheral boundaries of the identified abnormality;

• topography of the detected abnormality;

• morphology of the abnormality;

• location indication of the abnormality in the oral cavity;

• coloration of the abnormality;

• abnormality specification.

[0229] In example 19, the subject matter of any one or more of examples 13 to 18 may include, wherein the analysis engine is further configured to compare the display data of the respective subject associated with a respective acquisition time, to one or more previously constructed display data, for tracking oral cavity medical condition of the respective subject over time.

[0230] In example 20, the subject matter of any one or more of examples 1 to 19 may include, wherein the remote oral inspection subsystem is further configured to display multiple display data displays, each associated with the same subject and a different acquisition time, for tracking oral cavity medical condition of the respective subject, over time.

[0231] In example 21, the subject matter of any one or more of examples 1 to 20 may include, wherein the system further comprises one or more positioning sensors, each being configured to detect one or more positioning parameters values indicative of the respective positioning of the at least one image acquisition device, outputting FOV data, indicative of the positioning parameters values, wherein the positioning subsystem is configured to determine the relative FOV of each acquired image, based on the outputted FOV data.

[0232] In example 22, the subject matter of any one or more of examples 1 to 21 may include, wherein the system further comprises an orthodontic fixture device, configured to fixate the subject's mandible and/or jaw at a fixed and stable posture, while acquiring images of the respective subject's oral cavity.

[0233] In example 23, the subject matter of example 22 may include, wherein the orthodontic fixture device comprises jaw fixating elements and a movable holder, the movable holder being configured to movably hold the at least one image acquisition device and/or one or more light sources.

[0234] In example 24, the subject matter of any one or more of examples 22 to 23 may include, wherein the orthodontic fixture device comprises one or more physical markers for FOV determination.

[0235] In example 25, the subject matter of any one or more of examples 1 to 24 may include, wherein the remote oral inspection subsystem further comprises a user interface (UI), configured for user input-based control over the display of the display data.

[0236] In example 26, the subject matter of example 25 may include, wherein the UI comprises one or more of the following UI tools:

• graphical tools for marking a suspected abnormal oral cavity area;

• comparison tools for displaying display data of the subject from various dates and/or for automatic comparison and display of changes between areas in the oral cavity of the respective subject over time;

• one or more display data orientation control tools;

• one or more display data zooming control tools;

• one or more image quality control tools;

• one or more virtual topography control and/or comparison tools;

• virtual planar geometry control and/or comparison tools;

• one or more coloration control and/or comparison tools;

• one or more navigation tools;

• one or more image editing tools;

• one or more inspection opinion drafting and communication tools, configured to allow an inspector to draft a written medical opinion based on the displayed display data of the respective subject, and send the written medical opinion to the subject.

[0237] In example 27, the subject matter of any one or more of examples 1 to 26 may include, wherein the system further comprises a digital image acquisition guide (DIAG), configured for auditory and/or visual display of guidance instructions for guiding the subject or a caretaker thereof as to how to position the at least one image acquisition device, when acquiring the images of the oral cavity of the respective subject.

[0238] In example 28, the subject matter of example 27 may include, wherein the DIAG is configured to receive initial images from the at least one image acquisition device, analyze the initial images and output guidance instruction, in real time, responsive to analysis results.

[0239] In example 29, the subject matter of any one or more of examples 1 to 28 may include, wherein the system further comprises a database, for retrievably storing acquired images and FOV data associated therewith.

[0240] In example 30, the subject matter of any one or more of examples 1 to 29 may include, wherein the positioning subsystem is further configured to mitigate at least one of the acquired images, based on analysis of the acquired images thereof, using a mitigation module.

[0241] In example 31, the subject matter of example 30 may include, wherein the mitigation module uses optical flow detection, which detects patterns of relative movement between the at least one image acquisition device and one or more reference locations in the subject's oral cavity.
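The optical-flow-based mitigation of example 31 may be illustrated by a crude block-matching sketch that estimates the relative shift between consecutive frames by exhaustive search. Real optical flow (e.g., the Lucas-Kanade method) is denser and sub-pixel accurate; this stand-in only shows the idea of detecting relative movement between the device and a reference region.

```python
def estimate_shift(prev, curr, max_shift=2):
    """Find the integer (dy, dx) shift that best aligns curr to prev by
    exhaustive search over a small window, as a proxy for the relative
    movement between the image acquisition device and a reference
    location. prev and curr are same-sized 2D lists of intensities."""
    h, w = len(prev), len(prev[0])
    best, best_err = (0, 0), float("inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            err = n = 0
            for y in range(h):
                for x in range(w):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        err += (prev[y][x] - curr[yy][xx]) ** 2
                        n += 1
            err /= n  # mean squared error over the overlap region
            if err < best_err:
                best, best_err = (dy, dx), err
    return best
```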

[0242] Example 32 is a system for oral cavity inspection, comprising:

• a data acquisition subsystem (DAS) comprising: at least one image acquisition device, configured to acquire images of different oral cavity areas of a subject and output image data representing thereof; a DAS positioning subsystem, configured to determine relative field of view (FOV) of the at least one image acquisition device, for each acquired image in the image data; and a DAS communication subsystem configured to transmit the image data and FOV associated with each acquired image thereof; and

• a remote oral inspection subsystem (ROIS) comprising: a ROIS communication subsystem, configured to receive image data and FOV associated therewith from multiple subjects, from mobile communication devices of subjects; a ROIS construction subsystem, configured to construct display data, for each subject, based on the image data of the respective subject, the display data visually representing one or more oral cavity areas of the respective subject, wherein the construction of the display data is carried out at least by organizing at least some of the acquired images, from the image data, and/or portions thereof, according to their respective FOVs; a ROIS storage unit configured to retrievably store therein received image data, associated FOVs and/or display data for each respective subject; and a ROIS display subsystem, configured to controllably display the constructed display data or part thereof, wherein the ROIS is remotely located from the DAS, for remote medical inspection of the oral cavity areas of subjects.

[0243] Example 33 is a method for oral cavity inspection, comprising: acquiring images of different oral cavity areas of a subject, using at least one image acquisition device; determining a relative field of view (FOV) of the at least one image acquisition device, for each of the acquired images; and constructing display data, based on the acquired images, for visually representing one or more oral cavity areas of the respective subject, wherein the construction of the display data is carried out at least by organizing at least some of the acquired images, and/or portions thereof, according to their relative FOVs.
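The organizing step of example 33 can be sketched in a few lines. The sketch below is illustrative only and not part of the claims; the representation of a relative FOV as a 2D centre offset, and all names, are assumptions.

```python
from dataclasses import dataclass

# Illustrative only: each acquired image carries the relative FOV that the
# positioning subsystem determined for it, here simplified to a 2D offset
# (in mm) of the image centre within the oral cavity.
@dataclass
class AcquiredImage:
    pixels: object        # placeholder for raw image data
    fov_offset: tuple     # (x, y) centre offset of this image's FOV

def organize_by_fov(images):
    """Order images top-to-bottom, then left-to-right, by their relative
    FOVs -- a minimal form of 'organizing according to relative FOVs'."""
    return sorted(images, key=lambda im: (im.fov_offset[1], im.fov_offset[0]))

imgs = [
    AcquiredImage(pixels=None, fov_offset=(10.0, 0.0)),
    AcquiredImage(pixels=None, fov_offset=(0.0, 0.0)),
    AcquiredImage(pixels=None, fov_offset=(5.0, 5.0)),
]
ordered = organize_by_fov(imgs)   # row (y) first, then column (x)
```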

[0244] In example 34, the subject matter of example 33 may include, wherein the method further comprises displaying the constructed display data, using one or more display devices of a remote oral inspection subsystem, located remotely from the at least one image acquisition device, for remote medical inspection of the oral cavity areas of the respective subject.

[0245] In example 35, the subject matter of any one or more of examples 33 or 34 may include, wherein the organizing of at least some of the acquired images or portions thereof, comprises stitching at least some of the acquired images or portions thereof to one another, according to their respective FOVs, to generate a panoramic oral cavity image as the display data.
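The stitching of example 35 reduces, in its most naive form, to pasting each image into a shared canvas at the pixel offset implied by its FOV. The following is a hedged sketch (the offsets-in-pixels representation is an assumption; a practical stitcher would also blend overlaps and correct for perspective):

```python
import numpy as np

def stitch_by_fov(tiles, pixel_offsets, canvas_shape):
    """Paste each grayscale tile into a shared canvas at the (row, col)
    offset derived from its relative FOV. Overlapping regions are simply
    overwritten here, which is the crudest possible merge policy."""
    canvas = np.zeros(canvas_shape, dtype=float)
    for tile, (row, col) in zip(tiles, pixel_offsets):
        h, w = tile.shape
        canvas[row:row + h, col:col + w] = tile
    return canvas

left = np.full((4, 4), 0.2)    # two hypothetical grayscale tiles
right = np.full((4, 4), 0.8)
pano = stitch_by_fov([left, right], [(0, 0), (0, 4)], canvas_shape=(4, 8))
```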

[0246] In example 36, the subject matter of any one or more of examples 33 to 35 may include, wherein the step of constructing the display data comprises: normalizing the acquired images to one another, to form a coherent scaling and/or coloration of the acquired images; and organizing the normalized images, according to their relative FOVs.
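The normalization step of example 36 can be illustrated with a minimal brightness normalization; matching every image's mean intensity is one simple way to obtain coherent coloration (the disclosure does not prescribe this particular method):

```python
import numpy as np

def normalize_brightness(images):
    """Scale each image so that all share the same mean intensity -- one
    very simple way to give the set coherent coloration before organizing."""
    target = np.mean([img.mean() for img in images])
    return [img * (target / img.mean()) for img in images]

a = np.full((2, 2), 50.0)       # a dark and a bright hypothetical frame
b = np.full((2, 2), 150.0)
na, nb = normalize_brightness([a, b])   # both now average ~100
```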

[0247] In example 37, the subject matter of any one or more of examples 33 to 36 may include, wherein the display data is descriptive of information about:

• a panoramic view of the one or more oral cavity areas; and/or

• a three-dimensional (3D) model of the one or more oral cavity areas.

[0248] In example 38, the subject matter of any one or more of examples 33 to 37 may include, wherein the method further comprises illuminating the oral cavity of the respective subject while acquiring images thereof, using at least one light source.

[0249] In example 39, the subject matter of any one or more of examples 33 to 38 may include, wherein the steps of acquiring the images and determining their relative FOVs are carried out using a designated application operable via a mobile communication device, the designated application being configured to transmit the acquired images and their relative FOVs to the remote oral inspection subsystem.

[0250] In example 40, the subject matter of example 39 may include, wherein the construction of the display data is carried out at the remote oral inspection subsystem.

[0251] In example 41, the subject matter of any one or more of examples 33 to 40 may include, wherein the steps of acquiring the images, determining their relative FOVs and constructing display data based thereon are carried out using a designated application operable via a mobile communication device, the designated application being configured to transmit the constructed display data to the remote oral inspection subsystem for controllably displaying thereof.

[0252] In example 42, the subject matter of any one or more of examples 33 to 41 may include, wherein the at least one image acquisition device comprises: a camera, a video camera, a 3D sensor, a wide FOV camera, a telephoto camera, a macro camera, a digital camera, a binocular camera, a range sensor, and/or a periscope device having a camera and reflective surfaces.

[0253] In example 43, the subject matter of any one or more of examples 33 to 42 may include, wherein the method further comprises analyzing the acquired images and/or the display data for determining a clinical condition of the subject.

[0254] In example 44, the subject matter of example 43 may include, wherein the construction of the display data is further based on the results of the analysis of the acquired images.

[0255] In example 45, the subject matter of any one or more of examples 43 or 44 may include, wherein the analysis of the acquired images and/or of the display data is carried out by using one or more analysis programs, configured to detect one or more medical abnormalities and/or to distinguish normal oral cavity areas from abnormal oral cavity areas.

[0256] In example 46, the subject matter of example 45 may include, wherein the image analysis-based detection of medical abnormalities is carried out by identifying oral cavity areas showing one or more of:

• topographic abnormality;

• coloring abnormality;

• morphological abnormality;

• vasculature abnormality; and/or

• changes over time in any one or more of the above oral cavity abnormalities.
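By way of illustration, a coloring abnormality of example 46 could be flagged by thresholding the colour distance of a patch from a healthy-tissue reference. Neither the reference value nor the threshold below comes from the disclosure; both are hypothetical:

```python
import numpy as np

# Hypothetical reference: an assumed mean RGB of healthy mucosa.
HEALTHY_REF = (200.0, 120.0, 120.0)

def flag_color_abnormality(patch_mean_rgb, reference_rgb=HEALTHY_REF,
                           threshold=40.0):
    """Flag a patch whose mean colour deviates from the healthy-tissue
    reference by more than a Euclidean distance threshold (an illustrative
    criterion, not the disclosure's actual classifier)."""
    deviation = np.linalg.norm(np.asarray(patch_mean_rgb, dtype=float)
                               - np.asarray(reference_rgb, dtype=float))
    return deviation > threshold
```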

[0257] In example 47, the subject matter of any one or more of examples 45 or 46 may include, wherein the detected medical abnormalities are associated with one or more of the following medical oral conditions:

• oral cavity wounds;

• oral cavity ulcers;

• oral cavity inflammation;

• dental condition;

• oral cavity benign, malignant and/or premalignant ulcers and/or tissue.

[0258] In example 48, the subject matter of any one or more of examples 45 to 47 may include, wherein the method further comprises identifying physical characteristics of identified abnormalities, and visually displaying information associated with the identified abnormalities.

[0259] In example 49, the subject matter of example 48 may include, wherein the physical characteristics of identified abnormalities comprise one or more of:

• peripheral boundaries of the identified abnormality;

• topography of the identified abnormality;

• morphology of the abnormality;

• location indication of the abnormality in the oral cavity;

• coloration of the abnormality.

[0260] In example 50, the subject matter of any one or more of examples 33 to 49 may include, wherein the method further comprises: comparing the display data of the respective subject acquired at a specific acquisition time, to one or more previously constructed display data, for tracking oral cavity medical condition of the respective subject, over time.
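The comparison over time of example 50 can be reduced to a per-pixel change measure between two registered acquisitions; the fraction-of-changed-pixels metric below is only one illustrative choice, under the assumption that the two acquisitions are already aligned:

```python
import numpy as np

def changed_fraction(current, previous, tol=10.0):
    """Fraction of pixels whose intensity changed by more than `tol`
    between two registered acquisitions -- a crude proxy for tracking an
    oral cavity medical condition over time."""
    diff = np.abs(current.astype(float) - previous.astype(float))
    return float((diff > tol).mean())

earlier = np.zeros((10, 10))
later = earlier.copy()
later[:5, :] = 50.0            # upper half changed markedly
```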

[0261] In example 51, the subject matter of any one or more of examples 33 to 50 may include, wherein the method further comprises displaying multiple display data displays, each display being associated with the same subject and a different acquisition time, for tracking oral cavity medical condition of the respective subject, over time.

[0262] In example 52, the subject matter of any one or more of examples 33 to 51 may include, wherein the method further comprises fixating the subject's mandible and/or jaw at a fixed and stable posture, while acquiring images, using an orthodontic fixture device.

[0263] In example 53, the subject matter of example 52 may include, wherein the orthodontic fixture device comprises jaw fixating elements and a movable holder, the movable holder being configured to movably hold the at least one image acquisition device and/or one or more light sources.

[0264] In example 54, the subject matter of any one or more of examples 52 or 53 may include, wherein the orthodontic fixture device comprises one or more physical markers, wherein the step of determining the relative FOV for each acquired image is carried out based on the physical markers located over the orthodontic fixture device.

[0265] In example 55, the subject matter of any one or more of examples 33 to 54 may include, wherein the method further comprises controlling the displaying of the display data, using a user interface (UI) operable via the remote oral inspection subsystem.

[0266] In example 56, the subject matter of example 55 may include, wherein the UI comprises one or more of the following UI tools:

• graphical tools for marking a suspected abnormal oral cavity area;

• comparison tools for displaying display data of the subject from various acquisition times and/or for automatic comparison and display of changes between areas in the oral cavity of the respective subject over time;

• one or more display data orientation control tools;

• one or more display data zooming control tools;

• one or more image quality control tools;

• one or more virtual topography control and/or comparison tools;

• virtual planar geometry control and/or comparison tools;

• one or more coloration control and/or comparison tools;

• one or more navigation tools;

• one or more image editing tools;

• an inspection opinion drafting and communication tool, configured to allow a medical expert user to draft a written medical opinion based on the displayed display data of the respective subject, and send the written medical opinion to the subject.

[0269] In example 57, the subject matter of any one or more of examples 33 to 56 may include, wherein the method further comprises audibly and/or visually displaying guidance instructions to the subject or a caretaker thereof, using a digital image acquisition guide (DIAG).

[0270] In example 58, the subject matter of example 57 may include, wherein the DIAG is configured to receive initial images from the at least one image acquisition device, analyze the initial images and output guidance instructions, in real time, responsive to analysis results.

[0271] In example 59, the subject matter of any one or more of examples 33 to 58 may include, wherein the method further comprises mitigating the image data, based on analysis of the acquired images thereof, using at least one mitigation program.

[0272] In example 60, the subject matter of example 59 may include, wherein the mitigation is carried out by optical flow detection for patterns of relative movement between the at least one image acquisition device and one or more reference locations in the subject's oral cavity.

[0273] In example 61, the subject matter of any one or more of examples 33 to 60 may include, wherein the method further comprises: operating an optical flow image processing operator, over at least some of the acquired images, to determine relative movements between the subject's oral cavity and the at least one image acquisition device, wherein the construction of the display data is further based on resulting relative movements between the subject's oral cavity and the at least one image acquisition device.
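The optical flow based movement estimation of examples 60 and 61 can be sketched with the classical optical-flow constraint Ix·dx + Iy·dy = -It, solved by least squares over the whole frame. A real mitigation module would operate per reference region; this global, single-translation version is only illustrative:

```python
import numpy as np

def lucas_kanade_translation(prev, curr):
    """Estimate a single global (dx, dy) translation between two frames by
    least squares on the optical-flow constraint Ix*dx + Iy*dy = -It."""
    Iy, Ix = np.gradient(prev.astype(float))       # spatial gradients
    It = curr.astype(float) - prev.astype(float)   # temporal gradient
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    b = -It.ravel()
    (dx, dy), *_ = np.linalg.lstsq(A, b, rcond=None)
    return float(dx), float(dy)

# Synthetic check: an intensity ramp shifted one pixel to the right, so
# curr(x) = prev(x - 1) = prev(x) - 1 everywhere.
ramp = np.tile(np.arange(16.0), (16, 1))
dx, dy = lucas_kanade_translation(ramp, ramp - 1.0)
```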

[0274] Example 62 is a system for oral cavity inspection comprising at least one memory for storing data and software code, and a processor which, when executing the software code, results in the execution of the following steps:

• retrievably storing acquired images of different oral cavity areas of a subject, acquired by using at least one image acquisition device;

• retrievably storing relative field of view (FOV) of the at least one image acquisition device, for each of the acquired images; and

• constructing display data, based on the acquired images, for visually representing one or more oral cavity areas of the respective subject, wherein the construction of the display data is carried out at least by organizing at least some of the acquired images, and/or portions thereof, according to their relative FOVs.

[0275] In example 63, the subject matter of example 62 may include, wherein the steps further comprise displaying the constructed display data, using one or more display devices of a remote oral inspection subsystem, located remotely from the at least one image acquisition device, for remote medical inspection of the oral cavity areas of the respective subject.

[0276] Example 64 is a system for medical internal inspection, comprising:

• at least one image acquisition device, configured to acquire images of different internal areas of a subject;

• a positioning subsystem, configured to determine a relative field of view (FOV) of the at least one image acquisition device, for each acquired image;

• a construction subsystem configured to generate, based on the acquired images of the respective subject, display data for visually representing one or more internal areas of the respective subject, wherein the construction of the display data is carried out at least by organizing at least some of the acquired images, and/or portions thereof, according to their relative FOVs; and

• a remote inspection subsystem configured to controllably display the constructed display data, wherein the remote inspection subsystem is remotely located from the at least one image acquisition device, for remote medical inspection of the internal areas of the respective subject.

[0277] Example 65 is an intra-oral dental imaging system comprising:

• a housing comprising at least an insertion portion at a distal end thereof for inserting into an oral cavity, the housing comprising:

  o a light source generating light rays which are transmitted through a light transmission path to illuminate at least a portion of the oral cavity,

  o an imaging device for receiving the light rays when the light rays are reflected from the at least a portion of the oral cavity and transmitted through an optical path to the imaging device,

  o a mirror configured to be translatable in one or more degrees of freedom to one or more predetermined positions, wherein each of the predetermined positions directs the light rays from the light source through the light transmission path to a desired region of interest (ROI) in the oral cavity, and directs the reflected light from the ROI through the optical path to the imaging device, and

  o a scanning means comprising at least one actuator, for translating the mirror in the one or more degrees of freedom.

[0278] In example 66, the mirror of example 65 is disposed internally of the insertion portion, or the mirror is disposed externally and distally to the insertion portion, wherein the housing further comprises a mirror extension support for coupling the mirror to the housing.

[0279] In example 67, the mirror of example 65 is disposed at a tilt angle relative to the light transmission path, and wherein the tilt angle is between 20 and 60 degrees. In one example the tilt angle is 35 degrees.
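The effect of the tilt angle of example 67 follows from the law of reflection: a mirror tilted at angle θ to the beam axis deflects the beam by 2θ, so the 35-degree tilt yields a 70-degree deflection and a 45-degree tilt a right angle. A small sketch verifying this with the Householder reflection formula (2D, with assumed coordinate conventions):

```python
import numpy as np

def reflect_along_x(tilt_deg):
    """Direction of a ray travelling along +x after a flat mirror whose
    surface is tilted `tilt_deg` from the ray axis, via the Householder
    reflection d' = d - 2(d.n)n (2D sketch of the law of reflection)."""
    t = np.radians(tilt_deg)
    n = np.array([-np.sin(t), np.cos(t)])   # unit normal of the mirror
    d = np.array([1.0, 0.0])                # incoming ray direction
    return d - 2.0 * np.dot(d, n) * n

out45 = reflect_along_x(45.0)   # deflected by 90 degrees: points along +y
out35 = reflect_along_x(35.0)   # deflected by 2 * 35 = 70 degrees
```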

[0280] In example 68, the scanning means of example 65 is configured for translating the mirror by at least one of: moving the mirror axially along a longitudinal axis of the housing, rotating the mirror about a longitudinal axis of the housing, and modifying a tilt angle of the mirror relative to the light transmission path.

[0281] In example 69, the scanning means of example 65 is configured for translating the mirror by simultaneously (i) moving the mirror axially along the longitudinal axis of the housing and (ii) rotating the mirror about the longitudinal axis of the housing.

[0282] In example 70, the system of any one of examples 65 to 69 comprises a control unit comprising at least a positioning module, wherein the positioning module is configured to operate the scanning means to translate the mirror into the one or more predetermined positions.

[0283] In example 71, the control unit of example 70 is located remotely to the housing, and is in data communication with the housing.

[0284] In example 72, the positioning module of any one of examples 70 or 71 is configured to operate the scanning means to translate the mirror into a sequence of the predetermined positions, and wherein each predetermined position in the sequence of predetermined positions (i) directs the light rays from the light source through the light transmission path to a specified ROI in a sequence of ROIs, and (ii) directs the reflected light from the specified ROI through the optical path to the imaging device.

[0285] In example 73, at least some of the ROIs in the sequence of the ROIs of example 72 are adjoining ROIs.

[0286] In example 74, at least some of the ROIs in the sequence of the ROIs of example 72 are at least partially overlapping ROIs.
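The sequence of partially overlapping ROIs of examples 72 to 74 can be planned by stepping the mirror by the ROI width minus the desired overlap. The planner below is a hypothetical illustration; the units, values, and the flush final stop are assumptions, not part of the disclosure:

```python
def scan_positions(span_mm, roi_mm, overlap_mm):
    """Axial mirror stops such that consecutive ROIs of width `roi_mm`
    overlap by at least `overlap_mm` while covering `span_mm`."""
    step = roi_mm - overlap_mm
    positions, pos = [], 0.0
    while pos + roi_mm < span_mm:
        positions.append(pos)
        pos += step
    positions.append(span_mm - roi_mm)   # final stop flush with the far end
    return positions

stops = scan_positions(span_mm=30.0, roi_mm=10.0, overlap_mm=2.0)
# stops == [0.0, 8.0, 16.0, 20.0]; ROIs [0,10], [8,18], [16,26], [20,30]
```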

[0287] In example 75, the control unit of any one of examples 70 to 74 is further configured to operate the imaging device to acquire an image at each of the predetermined positions in the sequence of predetermined positions.

[0288] In example 76, the control unit of any one of examples 70 to 75 is further configured to combine all of the images to generate a combined image of at least a portion of the oral cavity.

[0289] In example 77, the control unit of any one of examples 70 to 76 is further configured to display the combined image on a display.

[0290] The modules, subsystems and/or systems described above may include various processing, storing, communication, display and/or input devices, software and/or hardware elements and/or a combination thereof.

[0291] The methods and processes described above may be carried out via processing, storage, communication, display and/or input devices using respective software and/or hardware elements.

[0292] While the invention has been described with respect to a limited number of embodiments, these should not be construed as limitations on the scope of the invention, but rather as exemplifications of some of the embodiments.

[0293] Any digital computer system, unit, device, module and/or engine exemplified herein can be configured or otherwise programmed to implement a method disclosed herein, and to the extent that the system, module and/or engine is configured to implement such a method, it is within the scope and spirit of the disclosure. Once the system, module and/or engine are programmed to perform particular functions pursuant to computer readable and executable instructions from program software that implements a method disclosed herein, it in effect becomes a special purpose computer particular to embodiments of the method disclosed herein. The methods and/or processes disclosed herein may be implemented as a computer program product that may be tangibly embodied in an information carrier including, for example, in a non-transitory tangible computer-readable and/or non-transitory tangible machine-readable storage device. The computer program product may be directly loadable into an internal memory of a digital computer, comprising software code portions for performing the methods and/or processes as disclosed herein.

[0294] Additionally or alternatively, the methods and/or processes disclosed herein may be implemented as a computer program that may be intangibly embodied by a computer readable signal medium. A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electromagnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a non-transitory computer or machine-readable storage device and that can communicate, propagate, or transport a program for use by or in connection with apparatuses, systems, platforms, methods, operations and/or processes discussed herein.

[0295] The terms “non-transitory computer-readable storage device” and “non-transitory machine-readable storage device” encompass distribution media, intermediate storage media, execution memory of a computer, and any other medium or device capable of storing, for later reading by a computer, a computer program implementing embodiments of a method disclosed herein. A computer program product can be deployed to be executed on one computer or on multiple computers at one site or distributed across multiple sites and interconnected by one or more communication networks.

[0296] These computer readable and executable instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable and executable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.

[0297] The computer readable and executable instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.

[0298] The terms “engine” and “module” may comprise one or more computer modules, wherein a module may be a self-contained hardware and/or software component that interfaces with a larger system.

[0299] A module or engine may comprise machine-executable instructions. A module or engine may be embodied by a circuit or a controller programmed to cause the system to implement the method, process and/or operation as disclosed herein. For example, a module or engine may be implemented as a hardware circuit comprising, e.g., custom very large-scale integration (VLSI) circuits or gate arrays, an application-specific integrated circuit (ASIC), off-the-shelf semiconductors such as logic chips, transistors, and/or other discrete components. A module or engine may also be implemented in programmable hardware devices such as field programmable gate arrays, programmable array logic, programmable logic devices and/or the like.

[0300] A module may comprise machine-executable instructions. A module may be embodied by a circuit or a controller programmed to cause the system to implement the method, process and/or operation as disclosed herein. For example, a module may be implemented as a hardware circuit comprising, e.g., custom very large-scale integration (VLSI) circuits or gate arrays, an application-specific integrated circuit (ASIC), off-the-shelf semiconductors such as logic chips, transistors, and/or other discrete components. A module may also be implemented in programmable hardware devices such as field programmable gate arrays, programmable array logic, programmable logic devices and/or the like.

[0301] In the discussion, unless otherwise stated, adjectives such as “substantially” and “about” that modify a condition or relationship characteristic of a feature or features of an embodiment of the invention, are to be understood to mean that the condition or characteristic is defined to within tolerances that are acceptable for operation of the embodiment for an application for which it is intended.

[0302] Unless otherwise specified, the terms “substantially”, “about” and/or “close” with respect to a magnitude or a numerical value may mean within an inclusive range of -10% to +10% of the respective magnitude or value.
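Read this way, “about” admits a direct numerical test; the helper below (an illustrative sketch, not part of the disclosure) encodes the inclusive -10% to +10% band:

```python
def is_about(value, nominal, tolerance=0.10):
    """True when `value` lies within the inclusive -10% to +10% band
    around `nominal`, matching the reading of "about" given above."""
    return abs(value - nominal) <= tolerance * abs(nominal)
```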

[0303] It is important to note that the method is not limited to those diagrams or to the corresponding descriptions. For example, the method may include additional or even fewer processes or operations in comparison to what is described in the figures. In addition, embodiments of the method are not necessarily limited to the chronological order as illustrated and described herein.

[0304] Discussions herein utilizing terms such as, for example, "processing", "computing", "calculating", "determining", "establishing", "analyzing", "checking", “estimating”, “deriving”, “selecting”, “inferring” or the like, may refer to operation(s) and/or process(es) of a computer, a computing platform, a computing system, or other electronic computing device, that manipulate and/or transform data represented as physical (e.g., electronic) quantities within the computer's registers and/or memories into other data similarly represented as physical quantities within the computer's registers and/or memories or other information storage medium that may store instructions to perform operations and/or processes. The term determining may, where applicable, also refer to “heuristically determining”.

[0305] It should be noted that where an embodiment refers to a condition of "above a threshold", this should not be construed as excluding an embodiment referring to a condition of "equal or above a threshold". Analogously, where an embodiment refers to a condition “below a threshold”, this should not be construed as excluding an embodiment referring to a condition “equal or below a threshold”. It is clear that should a condition be interpreted as being fulfilled if the value of a given parameter is above a threshold, then the same condition is considered as not being fulfilled if the value of the given parameter is equal or below the given threshold. Conversely, should a condition be interpreted as being fulfilled if the value of a given parameter is equal or above a threshold, then the same condition is considered as not being fulfilled if the value of the given parameter is below (and only below) the given threshold.

[0306] It should be understood that where the claims or specification refer to "a" or "an" element and/or feature, such reference is not to be construed as there being only one of those elements. Hence, reference to “an element” or “at least one element” for instance may also encompass “one or more elements”.

[0307] Terms used in the singular shall also include the plural, except where expressly otherwise stated or where the context otherwise requires.

[0308] In the description and claims of the present application, each of the verbs "comprise", "include" and "have", and conjugates thereof, is used to indicate that the object or objects of the verb are not necessarily a complete listing of components, elements or parts of the subject or subjects of the verb.

[0309] Unless otherwise stated, the use of the expression “and/or” between the last two members of a list of options for selection indicates that a selection of one or more of the listed options is appropriate and may be made. Further, the use of the expression “and/or” may be used interchangeably with the expressions “at least one of the following”, “any one of the following” or “one or more of the following”, followed by a listing of the various options.

[0310] It is appreciated that certain features of the invention, which are, for clarity, described in the context of separate embodiments or examples, may also be provided in combination in a single embodiment. Conversely, various features of the invention, which are, for brevity, described in the context of a single embodiment, example and/or option, may also be provided separately or in any suitable sub-combination or as suitable in any other described embodiment, example or option of the invention. Certain features described in the context of various embodiments, examples and/or optional implementations are not to be considered essential features of those embodiments, unless the embodiment, example and/or optional implementation is inoperative without those elements.

[0311] It is noted that the terms “in some embodiments”, “according to some embodiments”, "according to some embodiments of the invention", “for example”, “e.g.”, “for instance” and “optionally” may herein be used interchangeably.

[0312] The number of elements shown in the Figures should by no means be construed as limiting and is for illustrative purposes only.

[0313] It is noted that the term “operable to” can encompass the meaning of the term “modified or configured to”. In other words, a machine “operable to” perform a task can in some embodiments, embrace a mere capability (e.g., “modified”) to perform the function and, in some other embodiments, a machine that is actually made (e.g., “configured”) to perform the function.

[0314] Throughout this application, various embodiments may be presented in and/or relate to a range format. It should be understood that the description in range format is merely for convenience and brevity and should not be construed as an inflexible limitation on the scope of the embodiments. Accordingly, the description of a range should be considered to have specifically disclosed all the possible subranges as well as individual numerical values within that range. For example, description of a range such as from 1 to 6 should be considered to have specifically disclosed subranges such as from 1 to 3, from 1 to 4, from 1 to 5, from 2 to 4, from 2 to 6, from 3 to 6 etc., as well as individual numbers within that range, for example, 1, 2, 3, 4, 5, and 6. This applies regardless of the breadth of the range.
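The reading of a range disclosure as all of its subranges and individual values can likewise be enumerated mechanically; for the “from 1 to 6” example (an illustrative sketch only):

```python
def subranges(lo, hi):
    """All integer subranges (a, b) with lo <= a <= b <= hi that a range
    disclosure such as "from 1 to 6" is read to include; pairs (k, k)
    stand for the individual numerical values within the range."""
    return [(a, b) for a in range(lo, hi + 1) for b in range(a, hi + 1)]

pairs = subranges(1, 6)   # includes (2, 4), (1, 6), and each (k, k)
```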

[0315] The phrases “ranging/ranges between” a first indicated number and a second indicated number and “ranging/ranges from” a first indicated number “to” a second indicated number are used herein interchangeably and are meant to include the first and second indicated numbers and all the fractional and integral numerals there between.